[Binary content: POSIX tar archive of Zuul CI job output. Recoverable member listing from the tar headers:

  var/home/core/zuul-output/                      (directory)
  var/home/core/zuul-output/logs/                 (directory)
  var/home/core/zuul-output/logs/kubelet.log.gz   (gzip-compressed kubelet log)

The remainder of the file is the gzip-compressed body of kubelet.log, which is not recoverable as text and has been omitted.]
T07H Pܾ7%zKVaTI]RɼG`cP&hޖ{LV'wԎ¬M|Qo3>M@ ̒ULEHzP鰂K aci x7k9` 1 5iHCH$aW)NA(Rz :(tJD6t0Y~v+Òգ28Tii4{6%hӏƬxnӺQ#Jŀ')4CL "ǎWES)<1ua{ g͏tv`T6=&SR\=]k׬&mmz1%34k#k(;iMT^ʀMfjsN];q3XM rk2Wp\`)Qeʣ_%7y hfQ;zadW6JN>0"1Bc,E%hT 7{5<#لu˫yҭ B2Y= C1W?߿]kw bR((.@Yh6c+#@#5lBv?.iK$K־RLY5¡w3{eڧ+| ".8Zd4A6:uQ퐮{!Ӭ:ƀ F 6x$(1:SA0%HzmY y 4eVA4+l B{;ɗe1 Kom>uGepTڊ,3Y˛{;V ?Yfp$tyʅ mB[WHZus}JQCZS ~tAP]q(MK'8%Θ#@0&>kbF#ve,9õݼGep t?Gw3|+mҭCeֶBےשV@_U"R#G6R[+#Qࡐx> [d,o0*v;q}3˩w 1GSP -P#F>ƭ~˩Vw1 <}4{t¦~*1) |vQl Q.˅`)t@ܒ  3'LE?`TG0+sQ[eG[X\#"Y;xG>kȉ:xTGNyc528SOv4nQm89uzE8)ʁu1 ,v8>JgB f`Uy|n5gUUy$ $HBZ^\Z A$Q8hh@Ddw[@.UD«NJ^8n0, ĴV`m؜N\^-?,MbPV[A)Fhk{6\ÖϿ ䷶Yو?Ut 9b`%9濢ng0#W&'IA Ha7oG`C)@ ߄izXA5/ٗ `#$5sӾI~;{ݶm0{IzM j)U8E!P$Ӧ;[T93gΜ}fv2yb~&Jy>?7::n)ܬ "XbW3+dѰ*>sƹyqt+wuNG$A6"ق?V$̈́ڂ3lxh-8\ NM+WΠ&7(6JX9!A:'z FODEZ['{[p,A(/F CkfݯG]h:Mk_ڤ%8[p|WP %g㛬_Kᱼ9874'ӸL~4nJG;ʣ!X`}ھatN,r{jAZ:vEI:vE~{Sy<ψ:ws'/7wϝ~r&ə'gt&MBtPbiETg1fLYx gDQYi5wwEԉ"+7wC3L?fL;(Ep( 6@natSIK+2슨wtE~{⓰}{Ĭsd =_ȃΜ}3lK(§Y/.(HAJ*VJ=Ap(8Xxk0l]uoK w3qs?e͌s;`i3ü#J]#xP:A)"L>NdPX 1p &vHzgMQ:fpt8|2fb} b=D$c?=_Ǜ>LaXu _UýUp1;[ )E,jOfrgN{ yY[|wIGeEmD -TYk@>en^~y=Yd~(hX^KOёQƂݐH'98p&@!"B3mSRTF,:Yձ,55k)F'Lz>ܸwQCzũ!\#aVKM O1 1kQRp<EDX/5^#V;=w'1E>R<7 X&[׌w\CLe')նgiُ;xTW7h˴@7AݵB4:$*DӥLI„W)F"8q׬90IyA C5kfY^G*#R]1fMwXB`ϥ ׮ѹNJR^];ΆxjRrߋ :ICcU;Gsu>g各cb<v6  c86\3܀ &z`K,"q!{!Uz r37xhp^=j/#}Lv$aFva͛EB8 :IށG>)*rltȓ$4ɾf8L 8k(-bl\8amհMʺpޫ+Qeke P;  G!H Lݡ)pfE 0 uon%P0 +:mqkr%asw#x] `̙Y݂:?LڬSW$9r>\fcS Oߚ g棞а =*škaoV:/d8Re37긧|g:gf_poϦq3Μ<'5]^ҵ(6&AJF(-GGrNbsp6(w NX ڍu>mXUj$it#!>ߏ&. bvZ#|4M2,hT!{ lׯ37;~_2}~՛^cguIg3UvAvo:X\A?j7GMBs39-x_mYwWDg1(\l<٩)2PRc;@}?n2?lPdnkHeܙDvFpUӌ1ѧ>Ҫ%O)ԼQbWe5O OifKTocJSMpZOwJ]%y[j, Z8y4H)ʝ仭)NS__bgu2፲Js4>: MZyfeU h˸ďn玏%tZbDp&:_Nisu8KƩ[5`73kotcFXBT0J $2c(9JSDKDP{IC;Vuxb%4ZFVK kcƷ%]K2^c0yߎ"jtSPU'qnjlW9WnZ FEfeՐ7q@5^$(sٯ 'P/ `XL!񱖌-ai"ggMht( !V cJ#6iE_jXsQ b[7xX1|EtYFӡ6dÑKӐW{hxȧ?̲_'*I:ۻm=.-}:, ;d9RqHte_婯:ۺ=ߚNjADei>ΠXGܻi:^?871qCwK!kMLjSƙBwbV-6k@JdvD!irPSQ 8H4GQT黲ʫ rtLdh)Er-&4R-T@CL%~#M i1(KC cVPF DT7V!%T<,:k! 
+nnÆygC:_wjH6O`Y/2CԃOcD2XXYL^jʈhA #TG!eLDƑmEKKR,6bldp>Cu˻ [mj8KkV܁֚=ŝXk 62( F刊`) Nд9:deѨJWX{oM[w[*>& +7icO2G' |J|ߟ#C0!B*Caj3jX$"﵌FMCͭ75t&/'Us˖|гL 2m%pF|*7C]} t#e fxeQkwVU0:S׀iG㲂 TJٻ֝ٵbއ76Pv {UB7?={кjݫ[̋k.\s4fZeoƭ"%Z@novwa96<@54~ ̋rzk·?ovfe@ח=?:{q_+f?:dwۜ]nq|G2yȧ@߲[Niy*R7 G''I^*evaѬ1Gg6+Z! y_uoa} r }n2vA5EoN]M?O}wwK޵q$e/7Rd/] o >/)lyg 9"%[2g؏_UWWY1 ̒.䦓,YydH<3) DX,-,xGI4 Gڐw. zWKZ10w,>4WZ!Zvg;iI nj3SQ?稓z,w' f''ku'5ruȂ  2/X& ^^ݘjXƶmז6lo7dKm:.MiEI˭;M[`(ö._ESŒ /<|Ľ%Ze햃<lY4}Cbj(ȬA1x7YK6on͚7 646IvE_S;?ݼ49Pw'ޖ'-F_\w Q <$0,e:915Emʱ3ʅַZ`瑇,0E揩EkS1QJE^Wub^wՑ1ȟl18IGzy9Ӹ*C?|ޙ G';O!5Tڊ*)KLO a9EBV+Au'piq($W@rKjwzC* ےi:40C7_Cr @- P$L}5t/Hx@%}4GĦxWS!99Nza]hX? MTQ!,opbAXK}KZ=1@MF>Ɋov=-σ.mc]cCea:{Za4ɒKe~"h%rmҋ*Y_* ЇجVmmp|)< [W፸4W<y,![nu '1bePr3E5݃,Ϡ_\qXt)ΗU uUuiو8PkU;",uȬmE-(@ yݧ~mA_QZ;JD94yL(kX[ifV$-X6ОUY檪}~R7B㗏1W&D1_ FYxC5evUs<{)ΔXVTxjIHOޅ*(+f*mT[!|;f"UT/$wč&f=7ۜ.N}*p] mN\ƚ}(%>Wfx:kL{r}fZsc̱|2tu%FW>q@XE{ɱT 9Zr;!CyMըfmL>w05e4B/CAꗪ QIA.+ 3]lLA\v yOU" IB*}n44i%ATh b$0nhf [ѤŠxA_RDvEI1+Es:n*"ZDV\Q3hɳ r|яu<!nl(|=Ys*&kRpK6yI9I)S!~90]FU~YV7'N'喍WWS/r.n/<Ͱ ̈š،tiVAa|mf^;!aN l\DEZ[';;ÞV-J]cKg=/+l=j1%3k쬛l48>;ke=Qe & `_;EX5u[G$Ooʹ[@i,j wڶ0v"hw7./K^ _Tݩc7˕SYbn.o!S ~,ކ?tE(oȦg>9.NlI}'=z0XK0]ڼ9~$HcF}?P 9C-g˅6 rCѶ"v_-a8L`$?iA/aaroWt9s! 
+YN%,rγy6HQ"E/IkhCgpN@Ԇ<^賾9]c+'7.0_5 r0YruYjZS;&&VQR1qV{tHbaJ;ݞZ{7¶=|]"bIzv:%h,Bbk-VhvZi~3ɶBlsP*gRbr[wyk7O<<wt;~JGó;hq{ˇvڍɟ_yw5>!OnJt&-Ti`3UDZ;SBaݺ)V(ռL7-c"nڸeåBq00C9CSM ŧ'Z X'8Db%XH%t1 zӓq>t(2cw9 ʐ[ЩgCtVid Hh70d_fKY og3@z|>-|>.^3hefhI#O?QiÏHǃ2G eZ30{-#c+%tWhH\}t0moЛ:ICRZ ר]QQ8KKR &H6v{ mt[dfF<ae'S>ٖ9S٪w(GkQJH1 .yeoW12#XP1g&"kes褫tG6N5/fǷc7_~{xՑhd0'ZK(Iz yI"R:" fvzeE.qVqf@YGa9AQN^ B# Ҙ6$쎋Nƞi@ QzHLdu4(&tQ qZvIZ'i'-a^=BpYVՓpyQR{;D/ f2膗A:{'X}(*L4NJgS_z^buW~=͉򧵌VpFw*0sn%QjQ#+ Y |3d,:򵪚N_U~c wD8XʼxQcdkem3`#!rrN%,ČoOP'O%P0 +:mqkB%lg*TW1@Xx9'AyŇ9u~3 gͮLLT~7ߴBy2(?>q ށfmWMKaOk>]^q™GP&?d+󙾝g7-uO~,\L?8[gZ`.͗ {%]^uQl6zR.`N~|W=1^e]7&jYfw >  &*|8=^ٻM6MN^ luUj 8L>RÑKM_Aw,9O ΆKJQi#9*6J!۠LnL )&]t>vq^u3-bzӖ2b\48GSY%R04M)lV`Q)Z1 ~4][Sc9+y^5nѱ1ұ;].Co_} 9] X$J)/_J-t2V_viݳ[j0i|Zgfg&حqux ^$U/ U l}_jhd"V$1YCE4d٩IʓwO|O`Y> ET 5 ) f|L.-*3nw3"1 8V}zgԥADRWϿdap-لp!0YYk`NCzze2cJ$dep1&pcZs.%;ZNs>}7wϟ~NC{~H>Lc&=rUJ5ae&1 dHH5# 1ёb"~ʯzO2$ujQs؍G POIPtN*+#Z!]T5 V{ 4zFIGT=>ZΫ\9w54ntW2߫džz;/+=glϢd9on ՘ lgf͖ z3I Ӑ 1NFK|0254UIp%G-L{&! 
xv\58AAjALpӯ"!MQ-}F尩x:BhtV_ҹ*o+U6:HL2wgaAW&|{Fn4JX~53R+J@MLt(KJpPz"2WmV?_JI,̐~Y^:6p-É{Mr5>ikUŽDaa,Jr閮iYaRYh'AqXP`ET6Q%mzÆ8ݪ4@>uˊ¶_f9Ye\ǟ0^r1KuD޹ r,J%XMNKr|thnգCv0vF}qWw9X4n;J@@qqYJ''N@H@1Tz!xK㸣Ɨ[ H:\s@GPIKp=aB>ywnҪ:~0M;zFP&MFsWՓbjS~6bw {er4ٓR澫֋O3ʛyFu[s@+n(=roEfč1MS㉼o8Vn 'yNh ԃem!:_Ss@8BJƅ.RYJ_z Lehe40Ϭ1&dVF-hgqs?'=B38lJ֯'fPƏ kƹRh$R&c,^!۱# 4%0nǕ0hgo8U܄i]܏o'HHKu]ʜHHb0PKd.NfqW\*Kw\R5)Z)CUYbBΙI"))5Z8нn]ea};+ptk=,O] ~@bP!5S8W`)KyfHN/#QdeehO>'6=R^ɝU|<|슶@|fO9<T{KNZrqS NZNXhO!WgS DP g7Fl>ȉ\=xe#bH P4SbDD^ ;{0q͎Ȍ F:Yb>2l`&YBFg}2@rDӫY3w_•5^6K|:E١ nM(g>h=|Py?h%O\/?|h|&uJMP )EL$3De@_\U+wupϺ}P}L{^PS*E*j˩U)lCZULHk1QkjB 9Z|M4'v ghy=|wvt$;!6v.99=-9ʘ[)S<Mt)2ja]0h+>w@?{=Kedg?fa>_㻧_ Fd"{.x B+ۨ3Ejl`XC4hSliXJd0)A$ 'nB_J?p>fi@ds54ˠc]Y(U& ٬4QE|@RI1- ,-fC:rдAM;A) %}DuFUgqyWk{\/޵\~@t4'ߣFW&S"UĺIw90ter[g_"_CM{n(Xx<=RvB; =oϻMM~fT?ӂFGn4IOI9 gO4&(CVl$15}ZNnt;k XY'ԦQ帵A.~x?-mMZ#tZKdFjSw)27±MXRUvhc=LZ^ NG_k!:A fUп ,LbK; nz|]$j4]M1Jb"dPq0N ca;ǷEGgW}(\-kjTmK3GI@OKΫlj0o$ՙO9H-H%>JK7ygb䦋n 5BWy/5PPS:b2$OF"*0oSym@b愃Lz+PyN*ga 7J(\@lYTH9,0}(Tab| Mtke g%e;AUmgQ1 S&+dPy TyN=`29,'A$4ho:| 5WVyh4Gx6g!itN{P6 稘2fA)(UJ(RaP:21ܞD8@EY$o!3Ɲ>μ^PM,U{nK|5j3߇B-gxRKk3p,`sƪ30&aZTb2[$x`*Ek3^p}Ȅ5j0Vf*HFJ%D'BBʚAyq>j0J -A˾q KM:ţ>j0oFxe׌d_ʃRELV@t>j0o,gRʾ9 9廘PyFElʨ(\K K!|7s*jc"%;zc2Ī2)C%Q.A Zֶ#. q)όb(6yfhC%}6J En!5UP|Yk1tђx_!u0߃B 5u^zcp]9{E" BILׇB ^%AHoA|H8"@ؾ[k*f*: 1,T,cP<j<-ӻ8M[V~Jo`60RX.L$(&Eb q,ӖB>j0/Tgd$ZT1El䄢[V9&T0o(A[)bD8Ńcz<܇B ~gZD4-paMQ`΄>j0o}oBr{ %W3o.Av1߃Be\fV"1#1:0}(`ͼ鈢K8~+',bPyv6 %q+XD{P¼ 8ʸɸoJP1\%q.3XPy X`d( wXJgAJA#GLDgфr;7;>j0o}Q=-M(qfbXQHC ޵r#"eb I003@}bvYR$2֭k[,K99QSTWUX_x`kY12DZӾý&3 ޜx/03h[Jyв X.Pf\ثO(6! {i{uPx|"VX`0( xqǫk2C *1QJ Rh(Wexc!&&3 /M|PZl-hƃx6%B( ~7”$8s5Aڀ*تa=:d K;61ш|(39õtC|J/e&zLQBE$$xuMf(Ab)Xi(YeF"GJ4m/Y ۈg gDSXhp3?~W}'>>VnOFKn;;&~٤qY¢AYqKE> &_Gpqq/Sx{#4y0OVgwqpD?Wc߄q&2m*IAՀgzq c(aƒ4جcǵ!hq Fw+.<ʧ2ͱ"YM(La? 
S]/la`kǭrpʹLQm}dI*JD( P3u6i݄L9PKY q@ <#bXviN{J<|!xbY/tGkcȊLJ L"r `'/ 5:fA'$0'5HvJULƜ $ׁ8X$p|v\kz,F9̏dBlRh҆ sW2$xfLt\$ F`'Cd$W J8൐ A@89Ȁъd$1s$ |2) uLɈ!i'S`oEFED&-x60ƒ:Y*0X` 5 m{b8)e; #YI<#(JEv"@8S!a`N#(m,%a<^hPrW`mB:X$/[h*ώP2D Tgkל[, `tfRv$[NZk]t:8RH3 5I;FG 4*($8;b+]!u}Uq_,G3Uqv7r SXfO` @)&xXǒ$ dSa=fþG$xyQú85b8}}fe1tHfk_ȲAQeE*oTJY? RN v#څh ʀ+ jXz)oq ockdis˹U/>꿽zh-{B:lzAÍ \kK򞀬DVnA@A_iN` i.p1"Wj6 A%{l[Mws*z&ú4d)+$ڍ2⽾P ˽\;䬣'I7m&ud 妢rQo=36p*)c`8S#O9R%f(G梬 څ'l%g'hGTn.?AcM?_+j7fn&͘yu~g _|c^7KCN^n^?ώK昺[EKDg#]jq"RbQl;(smA͛йeƞE޵;r)&(_I`{£/4x؆6P= %u-z^((!.+ !1J`^;<=˹HcբW B;7ڰʙ."VF^RX٤pq 7$mhO|VKMDĤp !縦` F&rD<ɂۓcO2(P?mneF`XP⼾Mn6Ͽ>ERjYg6Ƴ 8 mV nvp ?+?뷾ޙ|=ɴ,iu[פT֝߇pTC0V~h챝ޞ#I?ka#@7hd"2,\籖[i1Hb3N}yH+q#I9(NsLT@9B6|?6{<]ڎcfyB11-^=^FJqZ^/ika7-èJ- (&KQ8`qM;y#?q-/>W]1efG+B >4_Ϲyr,[$vㄹsp M8a)5 lQY/N-)6х&½/Or uߝ|{b+CHd|I3 o!Z"eZs"8JaKV3v mY8gN};|k딳g{f~`<-/`Q=/wN*J!$pz  }ƠSV s&"v-ơծڵxLr:}Xlvy'vO >X0,qx m9Kþ;Ƀ&hk(7|L+]bXq}WaIHeӂp`L (YTL!b/iKu!= ۨ-ak!VZ"3`,g$ Ӕ52JX\1~m l5մVNia'P<޶p&ڢzU|gV xۓ3Ҙ8CG>oԵ=ݿ+i˽Xl{gibSz>ou" - ;'&-[uMk].p!% g+KU<&3:@E >ș_EUI`9/Gw93=SϬp]XWvhh3=u7=׀}o{[m`tEwOLWunnVH#Ok3/<8i0smnyOw547Fl=`?]B1nќ??T4x|kza< /ͬsk}3JHU7n7,֭NIa[gC/ǯ_oQaj{Y7޼IyՀmodUA;w#=yƀCn˽Z0buh^dn5sf*sųn5t^ݝYZxeXZXiH7^%N6^E&4l/8?^om=.l`roi[ ;ay1#N2 , Z`É_af>G{e.vbJX~,{mrArL.&1y,+e<Yf`6=a|NYT3%J}f6rU(jܞ8Ny N F߁^{ꮫDHKYߩ?{WFr q_d'MG/:~R(I/STMWFn{JyO~n kfV*58Q+:мMoCRjs&w,$S&rbcdC `$.]7vgK1|ۡ +B9z~"N9L`g^[5E_ ib ǥJM7 (=43,X/ PTdZkP僐f.%94nol3^TDh5*~  G_[ΖM-7QŜô ΏwPWhvybOoR]o[y- 3;TOSLu<(BZΘ&eAh-GR9DL0_Psβ3av~CH7uɤ[3hF"T$h5a  2Xq!#$%d js(Z~ *vY^lY! W-Hs V$kOB'#)c2Cħj HגAEU6<)#(qN2'%pYРhl !!-#5Hopi.dp)q>n/u 9  Sն?ﺽ8qpGWt<Ô049O`ޅOePaѭ /?̨ɟyv]Oy0VU~,6*4r@[zq1.hn|b4FAxszL7&H]9X\4yU^jTg&^ΞqCc _},L; z駇[w)=vn]%L;|4GDYdD%Btͽ S=0}C]q*9RՀ<ѝf_+wݿ&W~.p7~aXbnܗn?Wד}y_EHf܍0ćk^Gojrnno6+2aLA%p.M>g=zg;^pޕw=d]kU[񼥣H3(7H> Cn+wG]`x!ʤT!M{2Y? 
k$?˯~,O׏?ZOJ%w/_Zw,ŭ?^qkԭ~[ om(M>t÷mk>ʠ'Z1wq~[wJ,qZEJYc>~Wp7ƴQ'LVU jM4E6Pþ&aw o iI.f/c ᴼj"g~#T[.4 #6N&|1P+#5FȠ-K~F׿ ŵ3o۬Tt*]r4G98˹ gae:)crzfݾ#3 M*fz#?$Rp 4CEx].sHOaFPGu1xԂ Fy'xmC,(͸:LGaaK)46)ƹ,db1@I ֞Ydno{w?Y}y&(R\IBeE Z'Q:IyԿ0(h,2DzIlp e& dEE#JqqRc٢@:[z|졦T *}Mn, 6* &UGٻQ 2hBZIP:A@MTI[nP{fS޹cؗy;^u-FSN`W`qƗrI\P+%bFe`Zak fLPLlKJIr|Oք9 Zr3o캫>n1'{f0K@A9F^~^d{yj $ÝL%.[=~ޝY/ mI$~s;s(6c!b̄^3~ǖo9 J q\ч(cq+9]ֽɻwSЯOi{W86񡷫RO5k 95֪RRI>f޽b}ICn@!~r"AQOЌzPDt'9%d\邰.E%h30! <Ƙr4jNC>ڇ8S;_?P&j=@5_.G )Vz%v|e4Bb"hΆ(/QFECYZC0)9 h F!hD3ZD\ j΀聗9؉,H3-lMb舔Z![*NhOl"YO/G 11%`gąrSr?7>S PU;A(FƟ*}>~#vFho  ɕU$@i8T!LV%bJڴ U2q88``Y{'Ȣ}cU60yٕV`@HDLNFoEHəࣰya)z;h4UrM[!k*M ͎5߶͡wѥdRl8Gz0Z .ъa@ LIa6%f~֏~~tDăe1<ڹG4 :QfCx6;n##gddUf .  x (NYiJ䆁bcD.P|,UޚaOE/Cr!owo[W\+?1-^=^ZIz$ww-7 W%aI1H:қ |JE-zݔ*>cz׎}#B8ʅJj/S>.]&Ѽ2xkWTJxdVzOSr(eoM)z`༷ 1iVg&rd.{g |窱kw`QT>q*zooN7sL@Qh ʜ6FfDz炈@zn|!Pmyi2E!l,YTT, Bh c)fԄ6,UzQâWݰnsctϋ=_=a荒5&h?o49o]j^0SqE F3Ү+~*9UQFHČR!1[ϣK N1 .MhsQ-wm]O+*6+XbkNe8 b72^oߣy2@WX E. ʌ:#f6U9LrdʁV˞e HQ2@DL1Ѡx&(Ե4,# ɓv {ky31kES & YкCFI%ŴH&pA>@m7e" AU TntڣLAc=Jؠʱz,pj<8X#47.$QŹd$!Y?/.X 5{?]ǮLTףZn"qwɘzLj7{fcZA- 9:dxhb> Wʰ1E4e2{U2u%$+3f&pcCʁ9l8 x8sM寶ƺ,6$mޮh-Yjjv\c$ XTReE!$IGuZqG#o-=خ,G~#)OLsRYᄢ:Xnp*,(hw*ÁZgQ7䩚0T o?usIϋ\Q7.#SY?.\0^}BMpte4^?=ngaҧ1-^ׯjBIpӮ3?Ψ3c|{#nC:3Akh}+Qvodzqan-χ,]CO?9cd9YM7G7;f%[YfepppU/#6[WTYC_˗ 3[lpdYZWf)ώ'^b!mWjMzJm6w/!mۄ+<=r 0h1I%x+ޖCeG7'#seAӁ;^4{mrUA[m34gMRF2fs[ͦx54 uYDE_oE_ӝܭƲ8u1oC|_wJoi ;w Pٻ6dWؖQ2`\d&@$/HFE)P4( X69~U9U]AJ ,gǙ:D\H` 2^C?ixsmhs> y4<>YzPlčW;J@7qDM 755:_syBxQ8e&G=xt89:.;ξj+.tV:R>ʋ%.OB|{WKvYȵ=TJ_ Njǃ0r/?o{[~ͻwy;ͻ7{Ay %Ung{5=px-^mz5ojaWJen|>7bB; + ~<8~;ryۜC xX@WٮqF1?mr9 mM*AhU8b? 
tمߖӸx8U$qG:f+PLrSN4Ֆpyw9>ﺎƾ9"hd2HS2Rc ;gnvd2si37.#.,Z0&JQꤰ@qAjViCG;4iCxNpH )*C %G Y\]"j1T/PۚSUׂG/a}nF k k WqU-mWIY*_i|m Rs.XB#l:c%S#^# mr2K;y=x{xs )W`Fȸ,YE!/uWGk cR(s, 4p6FEM1@FEE#JqqRo޵ ܔg큣^.=tdkCrP>WYx{Cז]C7*NJ ʲBGhq`O>&:hSZ B#Ad:*dPWd::?Miq|1cqǸPW8palvR*PPUl:<~4'y{3HWV+@,8 {ų?`~8RM[U^]6,wO 4U\zhZC`nwgbz/"|፷RڋdGpKfð(ZRկ c$1$%^zK2^'sY.HrDfV 56S"Uc&% ^3,3ߙ!IQK 5qȰH0;Q\dQZM/w#N8`l?x_ ҇e,'mf6i F]'O_,jz/ūJgV$%+t">*A>%/~Av/[oᓨL41i8B*dX2x5ڈTK5"0nAc`j_ 5Wj_5.Wj_ 5Wj_ 5Jj_ 5tpHyEWj_ 5KGG 7gpڡMau־.ND}~T&hHU qQdF j"՚H@" J؀2Pܟ8Eeq5BsBYi% !t`TW"ՒDuz66zތ̖ t]z!w>?Y)cOC<>\UjcZ1)da8RLq*2x,p$#t2zȽJpLf$MΪTFu%$+3f,MN鲝%;{,5f ވCdHWϞ|T[[xEe, kZeWg"G)5Y(JP$ikՑ#تqȎWHՖ#)'E&AI9pBQ]p*,(w\JJfnf]18-HgHYg&rd.#(Jg[svd9sg57?BCς8-gmsWP>鴹\F[5{Cխ޽tԉ9iIxg2um̎7O4MKi(zJWenT% $jݽm{7W C͗Jn_u׷|ݨs]<2ݜq/Z Z˵^oq^-eEwuD\Cr8yCC/1[YWO>/IKH\3GiL]e"LQ1uv{Ŋ8]GFd49>.T#f׾,s6?~0+?z'xy[>l+}`~fnH6vܭml\t,O7bbYڍXa*;?WUrsr4dBSr9n DQ\IQJ<ӞIH10xrS|uym)a8,v%VFkV-)wH \vedn*U0O0טuzȂU⁅b-_L4kbXһj>NQ75V*o<77py{#{Ҷ}qEB׫Gi[3םn~1%<"9M[fdMh#e7l=].!sAm=7^[\u N>n;L*2qw+eX&g`Q#yG?e^;BKʺ4tR]W;rK/+]BM剡Y=q؟d86a_-My$`@IŝLPOˆ22Yix耈O79nk늵^UVOQ6p-ްH$ʕ 3&pnCd׈u@l7 \T E]6O=k_ŹWѭAwyK宏]AR+ܖsTˋ/>?z \h'AL5(bMTI[냮LuOL n\}7z}R㱷܇9^|0 vӻ\I驰/59srHbR"fT\ OaBI(T68{ +۸<|+hdߛJapí[8tcbYd(ouK, 騏$ȍ% ޓ' h$VK d*e%6I7rk.H A=t)@e*]NLWЛ;J܀xmѾ|V ꒅy1Q7䊼F BGTX&P{86#nApcG=sB3*PEP})@(!%BHu),/m2dT2pg2^_" P6c&<# ߬+{$'9wdz̈́1 X$R&c,_u|B+]"pIGr%LKB AxhJyg)em&:J h!p,$b=IE,3ZXh.P፧:c\ ^=҂F0cYwvI;ZWW &ZL72kv >ٔl|*^ ` )xHx({#W0COvP'7]LWW)5a Wr g( z-uDV"֗pWpoj!6gA[M& ]߮ڢy'Ҕ՛FQ{NwbEbh18o3 RHF%S69k8*|'c3z3nu~2H#WD&Z s\+WoXy"iiVwu #F:*Q(Q$gAO!ZGrЮZZ[֝ugƋqqW8\?3Pj]M*â׷9N*f{kUMrtKT+oRqE1D3+},2(}0!&hc(3 :ՌRBrd{uH(tz~*$X1dN@PIu;v!JT$ӛZng&qv{ۉ&?+P`y?\'q{1βRy NB@PTBhF&~vOc(^Rk]HXRYﴠ&n<"`!/YdKYQBx: wWd+1)AcFd) ~EAry QFI#Ʒ a]X!z"iE)i/Z &\zm+hMUZU/HV2,UPINTHÎhEV_aqhYA*ιYRcLlIU{]w[/£q cWeWR1'8_X*Ͷ - 1rnAW mԛap/0+ 5oq6 dlo?F{#`L-FMVpPʏ}^@X{kmkMWޮ?4V 6ޢ5Ko5ΟmS4`o4Nolz~7 ^ۼ2dֆRk,m7gpݎnƋfqoiI;vʅi@ /п@ /п@(^ *tɅ.%B\ ]kzkR +tɅ.%B\ ]rK.tɅ.57\J.B\ ]rK.tɅ.%(^]K.%B\ ]rK.tɅ.%B U ]rK.tɅ.%B0YpDK.tE 
kK.V$%B\蒋;̝DQMd$Ojcϧ_JZRʱ؁g%IPxPUe ϒYRO/ZI0Ed%DTRHfmdRs="hfK85뫵۷KG4(Fؙߡ 6а4Ű {|R{x{>{Sw#٢3WKټrו"\f_ouwvA~<C[[pKWmͰf@6Pޣ8: g0bGm3ѓet$mmͭ6lkuNUGl#7,pm4sLPw^>Cho4 rlT笲pח_ޟ~_?|N>ww? Lω:Z݅?с{M_Ѵilikڴ&Dt9Үr-@˷\}Wkޯ7_o:o'_:-w%XAvz vjl?5hx"x,=i - QuufUwv^I@̊s+]Q/w ə_+>_uYwk&Ų[M"պ9tt$tDuKo%%HZ̀*!"DRHVONv=Ȭ]Ceo2;sv&6!m I7|:;a,%ԡ Sx9⼡"d9hKm`u4[mMrh%NRՉ<'OlYwK85QFvExb:o"zئuŲʰLcn2-g`vG?Gnf;:)JZPT;aq؟vtp7 M , 3Bdyg8{!Д, ܅ Ly)ѽv6Dzt6zIHSM:wɛL1<0t`owHQB *HRBVHNDDc )ɴGHV'x."naK'@Mn+2΅N $ 2I@e$gS9Zks bkYwW JhԲAzRl|ͧrɉz{|y$*/4M| gt0Wy"bN/+mAU2 #jfU{pnCٷ7k(~{10/%lr3ao>>)0%yt7G-$*NloFSAZH] >X..ȳow5Vjﯟyytpyٹzd|98]a_v|B+^B lb~rz t`q}:B\z.j}(2||Zz_\{\9 {V/PEC&oO{T>(?D88 76U(Ύz2FfFd[Crk}go٘gJ2}Vx`ԝx Vޥ3{^ӯ]< oKݵkտ?\z%߽7c}}W=GJt>^ս˂2evwΤ!~\CP ]wٲCz\l/>J/Ė'8x4AZgua~h=mCVp{~"Ǔ!k&]㛽VAK>//:MpeI+.]:kGQu)܋,/H˙? ]z2-?1B)*X($(o2?AAxXּ\xxS`7֟<4Nydβy?MBsZoUZK$YT6cwVfK(mQæb8 Ere7,\ RYGhl N<5 )M 9'KiqϪ J2"f_\l;c[BXyDVduZ5<)  ՜FtYl@$DoowmXs6õ |[4qZgz(ܑ[GńE~Ƭ椡c(QلDIJ!Ԅ()ܹ_`C>8^.7 <ܨmXXtˣB%hKhV&^#x(uPKu@8C*}v,d: rQr2@]Xo,z̗5XX脼Ԓ"4D0iŠ *w0qy@?Sa^|M%!YZ{[fH܆`m ]C WfUS]_ uGPnlj6H)`U0Kv߽ܶ7֛Yw}2Wo1TXk#tdWhc٣.)1H /f90hT`%P\HaLDP-F L)ڽ%XRr.A 6#N:G!v,H d%x+ jw*XE4f@j!ik< az'e#KHqϱk7,BZ$ fJ4RdO&CdC@b{cሌ%UjQ`&yh-fl9m2AJ&?P-~t֑p֮ippF+gc޲ J)4I[u/djw+$a:HĀ"@U}Ao^%LA`_R ֆ)4*Hj'?I^2`Z071+Ju[1&UCQ݌lX4ET3XT @ ģtF$<QNМQUY3VnfZ+5k׃*U|ShdD JH՜Z'k#kDo)T?٨/Q!B0%V$N!P N%#<paV,]`-(E=ʨu!1^?MД(h ANz|d$Nz9~`yOv֜%QR3RzX`eSFwL¡jIR)g$8]`MWrjAHҨ5.S.?"uWW42o*B [!- R GVŲo_zCqP Jb!Њ4^2_իkt-*&[|1i@Bg0V%w5QP'R{TAU@\{H e pBiI D0 @$L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 @Jl /˧C=>ɐ@VO.uI SL1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 @N8 @<RI D@# S$@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@/ mxZ$PO'D hI~$PI &bI &bI &bI &bI &bI &bI &bI &bI &bI &bI &bI &|Hw֫pz~\ZuwszCOsl~/@}5=DeS ?>>Y˿_c*=N^߳ ̞˳)_φx9Z%ϽgRiH1Anw+`^~Ř|}IN<S2e(uپC(mC8,nF 4 L: h}J-o2)򸌻?fϦhK| n}.c_ {o?d2vu#~l ǧk}n _L x&R'Lvꤝ魯O1E\J~A.)."A8 
yiָX]z8gU}5WEӲ&n2?aۜx?_/:ӳ2y1~V~'NQ5춙z@)C{зys6kEKz/OVVxFF#K9=U\$CP$x{v!6Ibʄ@19bPLLԚdWdvWamzR?7~?Vwi#quHQ EZaMN-hk2X[|JtUS69«S쓱Iq)U?D'_ƥT*Mr+=7f\Qg2Х(US21׹Z|xեpbXpw"?54gaUy1kbJ~e_BHn܎vmFor^Ic7KJ 0砽X{\xUPu ]h4۵~WĄG+zͅ,=upS*1v0׾ˀ+?>:mk uP_A=mq=ADLT`a Yc=(]y1b)cͻTLngg,VPh6MGa}s:?Y-3 {*f,7ԥqvM&s@v69`d_\Xu߈R=&*Otm*/5]z5O-_./"i̞GG-Y U.! rgϯ7pq ru?zXpEZ|}QqAn\_2mǚ8 G?ј^Vz~"j=r>4mn3GB맥 ܍~W@sDج*HîMަ(ҭu}F/uƉ|롲d,4 >T\/87j#K"+񿋟}70}b$]F,Bz5pjbּz+/n6A<t),zwxcr3WHje7,„'pkvZHJbqo1L$"ܤ,XT&r1`;y뗒o1_DbeZ((2pW}#%3at8\!XMA :N|pK9rј>< 䩯qh"lJDs"z ñLeI c7%keC~Z_95y`Q&RT[5qU+)HJx#t?Eʅ$R"m ;utRp՞IseC."9B2%es GsTF✋I)I"3YirgѮYj8t`~`nYdPж;-Dj\GqON]E<xr^>mZUֻS0)z9qj/%Ԍ٭wYU,(s ( P}Ydn8mQѽULFh n@atg91r_ PD%*`<@Vi[O5z G8rsQ YtYjN1XiOy iq\Uhڕ֣ -P#=#+pY1|ʓLΨyf@g[ "O9uO C[U*?ve)f>2cd>XN(9oU|>GwҜQb),HpEaq'MYɹĪ@{qpl]'sL rG$#gO.% mV*ґP|;=DY# N&W۲$-E}|)( |ŝdcG"V3c;d u x1j ,{^&Y,@ "A*ʬhs¢w?r=TKUc }MFCSI#}Ö`CpVO c<^!Afl.@b3? =cG@FJfH3NAjP #a!BZ΂Z$lW$0Ϯa8Yي`ra*e40Ñ-X7h\Sh!_g>m2*q0( D12a?/]oUJ~G^]u<752b-@1 i~q%`(}(o.Hew%Ͻ]*soWs&fr%hA 2o0m$?J!f:z0h;i0ե2W V槺+GW F/磢, iZqw1 :$.T5Ţn=0#X^}CŷWmp8^s?a4֕庶Efneյ/v<8/;GlH7]a((AOJ5jDK>>5 ], &Q8>j3ɮQ;*Y.i|#q`Ʈo ň%@IuPpONES{LMl9܇,e}g.ډwL|H$u|heT9mF x\9x*sG_vESsŎUyZ#GDe@<,`,jL]RI"֓X3Dp&I#i)A8qBy@_bMגx$]m.J3r<3Ƈ>~z&,t6?rYB7*ֳ.ӷa?6s>AU͕sJ*EP@Gjs&l2&ACqbp P3g$$` 4i}4FWjU0i ^#Չ$g=9ϳ#wOVuQd6UmnY>gՎDQկ Ͼ-=oH9QJB6~,%#jSy*I&7u88r)9{_OS?lAQQ6\]EvUٽ/9G}fk7gE _n?.I?аxCႺAagiO &;_1vi76}*:|Lrs$h :GC.@%VH9 Q"3AJ 1%8h)& ew']8^+m m-h;MQflc>㻬yQ8eϒA2ςy5,ruȂu,[42Qa`+ىtQ2a&9^Od"kBEs>ZZˋYoq*kߖY[X4oY$}Ľ#Yfuܰ#ϴaW"Au#Tg5)W +ǭG3hgY)E&uf`4#g]Ńy'-TjHD trF_zW OCQx›(p 6iy5*ؙF1kʨ!("~L JĸWx~ 67J^^^W3/zX=uI'ׯy 1~9R)ջ!e=dlarIɸ|N7UAdG`dvQJ?J\2@%e56΋Wo_@:^*hRK"q ɅϨ<J+ya#}v]6vٻ6dW{GGˀq8Ib`B?%^S$MR[=|8-d3~LUU?xkcV~|3mJͮU7whv۫j/iuc /^Y%$ IՐƀKq4GLiHT mWlQbGoC|bgyՠ_Q f3FV!7P.BS}g!OǗL;Z+E;zhE֕횮ujsMUz)׍C3o/ԣZBR)|X_|ST %IP88:XVK BpK_|%(t)$6QNs="hf<\[iNS׾|^%pbO xĴۘ\<:KHVkϘK?MQ7DjȊ- s.\'%>dnTuKn[Jws;TB"WK' Țik:V?ā)j%[> yduHDᓒRЖY"WB1bV[ ЎSrR )*b?2΅H& eQss ǔT;݁V1m6[tG}1N3t'Zy,灠'seBЀ 8I|E(y4RcZ6PD&@B.tGJ6c  23@S]$ 
0p&t`as-CA@~&nɶ4=[I(l^Xm7G1Ay\Y8sY8sY8sY8sY8s4\!sY¹,¹,¹,vTF1 ¹,¹,¹,˯HΩsY6 p. p. p. @ $|qYooQ=꟟y} L}˓PHzbj'y`NE]Pرlw鐞.0:z.|BI <XC#w&@)QkHъPv1wlg^}M:@֧ϣDzvΧmuu@z+nojw 7|I(0[MBݕ=j8?Niݻ_pUnjrBqMݺjr9YG&{5^.į1߯Vqof5 ucq~5Z/F5_W2~d௰ooYEBT mJI7?y> /Q.q]QNj /Cb9F!kcCsrQP;I@{kEVDv"J]H6<ݝ^*^}"Ml6JQI$-^n8uIJ֒hD(A]@P+~h?+!,ŕFI8L%MQOT #"JjvTC8>m+-J-UrxW7z'o+EIK6x*X N"<!Zxr(j7te?(.;<-hdK[Wv6w(Pک @&$DxқLǗ3nI.N % Hy5Zi+'p *+&! f|}<zNbhEbh18o3 RHF%S69k8y'c&] KY*JmS]t _Z]l z s,\sDc&ZAER;5D4ȄMjl\|)1qٿ=ZG+sEYr/'<fKt1%A8@TEo ٔ8KAUҀqBTS s@KncAwxpV[8Q"8F|3A!:t֝-K94PFBd!=&~N BX\;Ds*%;&8ly-yxq]>r[$pahPJ-.ixiFPG &% jVX%:JD ^_UGT"(R׊tsgp-ʄؚxv2K+;tK?V.Ittsʝana]IVË&anLqњTߦyY>ߔb2>~*a&տf^ ,4&tӽ7'l'gu͡1IΦ GTCTQbщ-?IRԀZjv{3M+?7??x 6tɖ9z敛y]  ~=67v~B6֖@ln鬭 fV*aacp,|8_.&zrf7mp.jXWS:qa}>AەIh5mܰaP2ar{So~Oo~x2}7~Wop.:ahM뮚Ms w6 ]z>۴krKo7{j@YzA7}8Ms' ;UX|j~6KNߛTDMzW~+?Ņd_rfٺGZG_ ;^lJÔ*zH~?c9\=.ۇlݜ㶭o8Z[5:ODk.2pNNG?`)2X`(1~ e1A BE7|J8񑅅e#+Ѹqމn{72[M3|tvJYC>ltҝ&4i}FR1"OQgR〥D=Cٻ6dWlH}Cn$8,ecIJ|T9CRP Y4{m\͓Jś  &kd̗FFVG#$YҞ-UYxs8{6,] `HgϞ|6YoSLrey$ŁK,1~F"$E(qQDe%*0[x ˑEK#)O TRIjn$*G#)F9&G(mò 얮]7qNFFklR\ 6w{GgHSxLغsbzsіNgmͫYv- lY%ݽM{yg-ыz^hW_tSg>]x~ K9|KSe-:ʬ7X~t@V6Oi([(i`K˭͛厶<#\ i>?K+ߑ|uq݄.ytrQxkVKO&,N{ǝ7 q {/McP ǗwuFG%A^_9{]YHwfX~[Wш񊙵{mRLޟN) ̂.k妕 qC 4&(4hgRXRrT@ dZϚ* zOO~˷#arԖJݭeot ̧[TG\\5 f+˺:?`BB1K+7ujb[66 n;D66LOLc>Z-wJZnjŬ8\X$j,LY\сi߰0#0>,n863`㕹LдIFU!W5 S1ΰM=bn2420o={c]ŃyYYg"@'-z,\g噡Q{p8mqҀoiyuD\J!3 ɽhJQU ܅FL|Bkҽ6Ji4N^Wtp^BB한,yP~,VZ(I O**t'XPm6ZbTSY% d& O\s""YA+k+7nk/+c7g(fV3liģ9%d3Q5b`,ՐR֛x[u%Y{=FE8;uڮDZ]SgcY|0/ש|Yo֠X|\aY$'y|Y"+v\eЁL'k 45"Ha$%$92Zmڝ:ޒ@L/&55p*lm~Wab,^y{ݻGE嗫7ߛ_޿az3}__߼G?{ILkBk`qvf>L2gK&5N]ƒ''9Ũ761 G m}aȡ2*VUW\%r-x/ mA2 2y:8wlH=GX/Ji^_SX6=fӄ;|0~l@Iv9U_[mi9sq*~ab.w[R0}]JnLxHju@mrxC#BkA$)i[ޞhoO (0x(@´G|)y.@cE8D0ް๖E5 ?Qن?"x당m^"[OmG 11p5k-^4"/).'y )POL3Eb9̩[2N,k]6:pi% PRj"ȝ "tJT@A!T nV<sSZ{Wt'p:y֐KOulg;-;TE]R~Z}v H]. 
Z^8l!)a FS>ROvʕUdJHJy:&SIgSEmFN^ Սҙr8O0o|nM<] b];PhJ%XъTjڻ ]p»N OgS5 G<]W H,إ5odB4nߔ*Ew4M'-.zo^#R b(H"Q\kG^ӆL!'p2 M*睥<: D*Ft(%T$5 Mka@7hdq,z+`$S@1rv[W&ZrǛ!G=57mD],yFbZz$j0Pv/z%k[^[-8!~{9n?Un$Vh7JKu59^lѢi&ruPHY"!(OԝTƒrm\g]yl|ּbd!JMDLR1 J Ț+7<4i4C˺Q9Q#((-#AQhrJRD.bX^=[0v-Xa& W&YJhm><9.kn˭%˞ʢikΎ?.rE\` J(DDL2'p$zL*!'' F5*#9^j oOd 6X59(`*5ccl׌QqtaSX] BƒEyn캬><,OJݎF~/Aa{^}O>Yye pgYmr'fJhmc"~5p˳M]cA?,jmѰnl* wZP@A4p=gXKr%b:Tu!@<ڻQYۄꞳYZkel%C"!pjT,eAry QFIڣ a(n/M7!c57J!%YK"Wf8^*j ,N:?ȣI;P= 5-H nl.9W;72^o]tr]^0~&`в3vM&heo\\qg%_1aDm.M`OQѵf7.ՓPCI/֑ Y1(l{ΨF ;;&g _̎FFcTWgg_-IܕqzB1_J 3ד?nI?䣡lu~:ӏ_Jox׹2i6yG4(Fǽљߡ WnZJw6a|{eI~Ck@Yϴug<]ɓ?ތ/[gk1F1Wkz݋ɺr3_GYq(wi[ dH  aÀfQ>: W0b]Уݫx$ͣVv9@#,}>uWd9=˝*S?~_~?}Ho>~Gu+0>GՒ&{a w=`h8^54װuhMhre\ke͸\]?WKo7_ݑϡw=':wIzl?tB$O+Tqb*J?<ʧqnmNuaSU̖T1HC6챂o7O<ƾK"hmEh<1"ʰkC;u0?)nN9j|CsDِ5[P0#Y+}<;H:k}J)R&)X% ICmr}WUT+=A@bh˸;}E)_O"xL732Q]JCQ41]hi`"ڀHw jWg=>] v`^Fx xʠ(:!7E-u ϧ9S XhT V?HI߾;9Tf,rX`@X̔ `* pVS KUPH4E Fntm_G1Wɲ3 (X~uށ~ݤ#ñt/F_ٶ[tDo=;tt$tӆ^wChjV&\^~uChL{S(\,0.T.5|˶svYGN9H8rB0dh)$ʸ@SX1ÂfEK=OcD2Y+냉KM-a$EV 11[wqxpr?oNv9Ȭw2s0ӋBC>jҚTpj53 .] *9AH LV4o,1@!4&PfTet@ȢrD`L ٣C&gf g=9FvE:{ht.ݜ-+:fy$f _='prW昂 e#O|$WAjeVItypg(01g3BQCIRx(LK+a,+[Z#w^ɠkA;3Pu} o_uHN$%-M+S 0^-.t9TJs8P̆vlBJ6Ayvުy^xsIK tJ<]oԙy=U%+Pq]֢=_zlk6on|w=!LG [2 Fٹf c)qW9<0E ->"UDR01ocDM4︗>7(WJ1eQsg<7I4:oK&x1|varufj1&nܿ6:FlWL[<,X5 &9EYetrsr%T8 yS~K 4D )XZYh6䐛tw* 햞7abЖPݵ%6[{Yuk5ih2 N:<,w'wHf'`O0Wu'-rȂ ŁbmoX&ʽ|0lmON:J7u^Gjnܷ;\oLmd.t xP|tZ܊ tY-?k,5d}J_31؅{B}|,^qopY.7l#m6WR{Rָ\LV>n[L*2~{+eRm30eK?{PiNȁN:PU\s6vG&ݍ7$mj5 gVsƙLP9g8FC~L9bFPM;9&l½f {idM'Nڹ~)$'ۮם\[j_[؊B2B0 Kc{+R!. ,7p ].ZbTC)F6X8N11&b|:}  +Qe* B?vzԎH>kZa ЂE#D;G e^#´%Q|iGhGQPsԨ)Yv,YE{ aMZ;3Pe8<I(7N%w4 mݚj`@c|~5l *.02b/OЛ|_]1םk֦/9 {i-/K;d2ܽcp}6/|X1fz=NMҺ*eʮ 9z<*ϛ~JY\&JϨkb 'd(sPBDEZ['WSn!8 ZI`%}_jeOSGWFs)+X`T"J A<-Ϭ)eQB9O S z )%2*ER2*MRȍ"aoQo/.SAURjNa>Ni*gn. 
JgjMNjOϪ^ٛ^l7?<,}X6S_<}D'TKAô br$=ْ:Rj f(H,rj7ky Yw;7W+/.A1K[j6 vt:rf\]?.9 kKHp0J>]ao Jć[5Lr̦ᷫm髵sa84lL} Ywo|6Ӈz) WvZۆ̘ZHF [5嬧/xtt»`֣) Fbùl}Fb,wێ(MoJ j?ؤq8|9R`8PPDT)GwT{i{iT_ډxh:k0 yiL`z$$X:#pGdʛU]GC6FY; ZDl @a^R0#4'(x}g;.˫-g9FfUfɺJFps?k2"q*EDQ$wE8wC CWHA|p,XP&v`[~;mpO߈CԻEֿqǞy-̷ _}H6v !HD!Qؚ+!-3QϐQi\90$5AZX,@(vGE`0 ȍJbXX R"2sl,|60N'ծ.r˶O{(!0QM԰hqG=9-9ak45a&n)D ɪ7_7{<^k.bDꨥ̢>D#0)%aϪh-JnB)A@,yGm7cL%[3ff͸GՅ,..{6Yu2TT˻&v{/7L'*P`i?Ekzq=gNRQDyq-s#,Jɪ)r ɬug>,I}Ʉgϵ?Ef(:i,N; hkbKFbR[XA`Ir aK/ڷ\=jS}'i=j ۃPlHW%4,UFţ;iױtŸs$;#fT4&OYFDFDƤKcҥ1ҘtiL4&].IƤKcrzܕbH`V \qNZHPG*)c8@BI &Wƅ_$;̤g.͓T&-VDݣd + @ :;@33K;r6"< ]Ϛ@0]iC(XK\qUYw{_ZN 2 (wuwˈ—ޕ)I=K0/Ou}6zb&h;s[暺;8|"ϡTT-.ݫ'3#|,uR>zzQju|~Q+y]06v=-30%CΖ }wt iof7[BRG YL'h8Ldf?]78[[%h} Zmt>tH s>],[:,U*њJ?q.`;_߾_~Jzz4 r.s(Cn5Ѵ44MشBu=۴s7w;_:\-w߮?8n:4o?i~bVC2CO U*{^QWҗ*D~xh )fYz|-GZFf;bbQ=|IPZڤ\?"o.2KA+Ã2:)E|j#+\~Np`7$-D"l/ IQ|ǀlzjbaveeNGg4!0ф+=A@rh˸ĝ[wZC Ea`( x"9JӁ&@*14>/o68ЂoBc.< kq5 km) k "qřЪ2A-v7SX,Q` e +gIqwU:Tu ɩ@Ձc1S.x#YM)x\&F /*e`y3l*Cz  J{1 (Xesg=5B+G K|[/gGW_ɩ2˝2*[Cnv IN_ø[%HZܛp7~v)ꔇ/PQYxjjGM&ޤ@dNJqH8rB0X24ȔVTe\)S:aJCM1ݢO N~dVM+9_yF9>eĵZiMqT*HB.e$F8؂*c!iNci2ZtiDkM̨3h!2'h gΊm8%x#z GYvDxb嫻+9fbk_yg,5!\q cJ.4<4tR+':ϱ%ϑt5#) O3AD$) NDd+NӉB& 4u-pnosp(_'`^ºt&wg=1ӂ ЃSm8KuMwF굲i` 0BP0A)GJRݗ9NxQ>폍䅳sPR4mœɅ*wJ677CXb Ÿr.`9V&AZmKE$"_hY"Tw1Z^ `׊RTz;gKܭU< Tn Xc6,>|m FImf.P>1׽o"zs60= oKT!zڔA[! 
ZCjOI3I4:b0~xmZKl0'ÜQ [XcI"mC@S'+vMR2 a䒘2H4e)JK1VV.i{u-tIS9*x/ӵP}Lm7>1RS>gA<yԛO :3L~OV0_R{7 0d[j>̋# u`0E&cgC |כ~[ʪQ+5_IJ$YnJ}uozSW޸| ߙ} sx /G*0FԃkyW_X3beҫ~żμM&huLGPO,>]x t7GdvE.ZZp E"E2;˜< 9({D.c`\b҅V:F PŤBXS4#;>=O8.NHur )N\ r277єDh:w<`U~ x ׿ KJײmɻso39\"ͼ-`(Л;BVN7 | ?7wOrű׸OaTa_m^(&3+yF:H*0ԖC釷;>b ?I.w)wf:,V;]Y&o+=,j9VܽaŠÈtzS-ͷUh^aCUMX=>%(ƪ%oº?>:[zĕ+c\5yNS<*qr"* SɆW5V`AFs^bh+1ޥ؜J 0W3F6Q]\ٮԮ*L%>^JɦJ=cʫt_Cf3n25ە׿u.E {$DH X-uK6ބ!܃]>gC….e:.V'm=dJ(c~3$|fvR|nZF`rRL ((u)d@VD *Œy~4/I.rE%DLdwcKW(THQ|BB*UDBm iE&RaQBv$oS 9|E7K㸁J, f|y=J-2vDzT6|[Hm>Rr{K;[wddǙ34 (:^6|Y$ԦBt+0[ lm@fml7S5}|VYB_͡> "~>)VK磉6Җ_u9PMB6ösMc.:zyY}͵Pu/'zY, ݿ7Xp;}QϢ2^zYSRWO9^/C4^"pfoBswj'3ۥ^+)JX3CŠ XFbĎS-{1;yh4BkM̨3h!2'm8%x#zrb)oWWXrW>E_+:LDRҶNVGSNng'7,|_鈃"H <ǖo=:"Kp3"اLp%J7bi1g;Ja8Cnd]@;o$*OJт[(fU/1 7:.%$sdv sͳ$t\9`uBqbXn u2-,z0;ɔԳLjnܷX7d&kC%=T1$-ib[2տ"|Vtw,!Tc{O֙cZi;̥U [RV\mYF1n b&A7x6J+: ":סZچ݆͛Un兡iwОf8ɰ7e}w&(ocdYywbg2A JͿٻ֞FdWew[WKٞ;#]mhՎP>"i7\`CgUʌ<yT$A*meH%z2ó3yxv~aNwVG~q~.:Mw8N:t{)VsƞOGشjox %jT{ \[s:#l U 6A 25)@EO`*mXkJ rR T e&klbR{}zщQuoNmr$^(5dom( dK)X)|[pkI>J2.j/.u KUf^ڷqUu!w-wݒ*z) ,J>?\~j昣CYCKkB̶ѢP>wOѤ]ٿ'GޛQji/ Ct#@| /RrM)\P:qc׽bsC+cGB$ EzTEghK36;k:K9HN2}_zrZZigxc;ʥkO.?1'(HtcHa\P贅5Ӈ"HB:}tn-B9$scikXuNTE C(Cߝ-eP&ʺ 5Mp4"DքҠUsR 1[ i/wK+^Q$L{ŽR􌓚J}N!SWMHJs\UE[^KJ#dv8+<K?W.BirA8?YEyt%X"x6_e1ɚ qrSQ驏7%3{￾,L:Uۺ'1PشT=n!(*&81&Nޡ 2OrozFd_[o5C$]9K( ARWoM4 `^Hʼ^cFxF?Էd>J4-ZbvF^\jn^rM]?ZqDKNԬdcp}wf0#ن1c{oW-qNegNQUq] ~9/|tȏ;wح- B%PtՌntg30h(2e%^y|r|vi٪`wxɮV;8y5ӲՁ ( +'<|J+JD|x +5쎇jg;.~?|?ǃ߽?ۃ?~OΟh0°qQ,lЁ߷_ݟnij[8.VcGi,;lvl:]U( +mR[.;>qW+ɜUIوUfeo.Kqɢa Y[pXΐ YzռcCwnv&9 &'k-S]+v0#`*:MyjI^8=}HZwlɅń݅YLJ:#b?3 cJiP0ظd!4Yh@8 DUulnql<p ^,H|';r#Kk9 B9d,[ %l)BV'7 `x]c7g#E·5q1ipqW 2 ۀ`p4ȭ LvMln~}/Fvm7ڦME>RVɳO,אԝKiT[Tʐdj|zVm! 
NQt# xhXAGU R,6bTFbH8"ff`Lh>#ZoD ǹ Ze.BJtJsΐ.(X;[-B^aa2ͪ[wܴBDp|;_|΃uL%0ӽ`hHEB-Am?FvBJ$=OH&P"Z YΖ u*Af%QLG(.,(${9&s%.9'8!ɁN(NB7,+,OӦ}nu>9HgG[,Fk 2.6+Ӿ񨱪/k} U*@(42\!mEcDzS^ev1=j0YR2d .AB霹,ǸV9kdNus]N{,k34<>~ n͵뻲:=5V rӮM6&_>fyhh(^6A ߔ4:*RϽ*mKU` /$2 3Lm02c5͌lr9K9~e4VÁ52kDHk5.zLԔ)"Cg]s%= i3[8q2^^DmD yH\EE3BSkdV{)0i{ug3 9 Z|w9{;ǑT.!:JXWWj_ =aG-i3xtqzzGbq`z3)b@Vz_hA|qq,2VvCrp"ns#v]VokcL,,rpR!8\d3˜;I&2+s "TQ[S!FML3) dso0i\du~x7Jَ-K2k g[(-|R[Qy5]lV3Ƴ{g74~8_ ? z蒂IG%6{0C2\vrp&)l|ߣ@P?!)ٔD ASoK&Ǭ9bffGe;GiŹݛtSU[RmNEo=R'u%M/gxZ{c2/b3rVꩬ1US嶔*'z eUH'H$;FJN7Χܔs>(jLt]zD=WR9(Zx +V%eWhώ_IF`G)?/'9Q1,į8V~0‹xȽ 1`jBkFy.WyJr==~8>=0o޹ܴv͊#um'gqM>jZ2Y7+o}:8[OFәoi0;>e:WVHv9>,ZPH1&ǕL߭Ut5ℙ+Aiu*Ai%(VJP5߃?x3L2[ J+«VJPZ J+Ai%(7' |VJPZ J+Ai%(WU%(VJPZ J+Ai%(VJ@P J+Ai%(VJPZ J+Ai%(VJPZ JEV[R J+Ai%(I@TznVJPZ J+Ai%(H *Y#xQ&rφ:st(C6 d- s7zL}˙S`j R({/x<'Vhși);)M)\P:qc7tRY C2X!xmR&p)\T8c@oYRw$0ɨ{Hq-k.(W[;1^>O.Imv'-L(em\֊\Q.SD&D1[0^ULLMOwV |h- Y6Sj0@>9P(gv8Wl8BFxu;<~גD 7ZOHt6 I:q5yF^\ fs0Sghf&U•}=m_͍^|TIQgxhM^5-M%; yweϡTk@}h-gҺ5nՌkɏ7ףoo#ٌ+Mx]-En/ףZQvg ۉ@ʑ"}v$GZ7 [? 
ȵ(< W0b{ΗBcڣMtlu:u2|l}՛ȻR+k_3 wxXT J;kwǏ[ݿ~~w?sϟ~)oo (lZKڙD3C릆Cs  MzیkqZ1-mq7=˼d'dd1#k[$95U_TPq `^.OyQFZDVe̖TKE6éq G_oBmLc%p "j \xek; imN9j|CqDِ5~ [am֢dz:d |jmw}XёБ_IaRl>6^]}ji@v*bhHT`tATS_xTj=ZO'X x+7K e2 \1m}9s1F#x1q]6 &eF2 9T"=&bFxଯ/8?hL;z#)]BXχ^.mn<$S,kB0N'n$5`#Buv*6XJ3 C0WRfyCEBsЈ1hښP K3*/yvgO41r?+q38ǃ^o֙HgϞ*u6Y-Ssˬ+dyRP?seiBY^UTF[CU$=IHG2%GDS%)@(rITFR*r M"-*P/Qܹ2aG&Aɞxχ'b͙42%D6`OLў9g:(G46ў;f98EqGcTt%AYG >Qw-l)6d$=%Sp[$.؟Ga7Lor uǵy7,7I\S\lݰ[g:3A{&i8 ۄ흕'68egб_tav,G2Mூ//'+7JSg(8љĒ @H)w x PSK'$^ԋhxP'6fϰِv 7yN;3$RO7 q܋\{GyYn)dW 農N\|f_C!]0U^F鴒I1Ϋ"IR]/ nHm;XyZKB7q"IK׍ʹ2ro_|ݫ|-%.g"XYYl1Sbo ۬>Fl}ThQץ!΄(q͍Nȵ c xCx5'ywf*nE9#/m{%=gkհw=B  &l,wJ, 8 Q*5"Vf{^ `{)vC}\]xy {oAu.N GUG3ylmv[Ÿ;f.:CeF.߇X&Z\9eeW?~7w?v{k|2-4ڐO=f:نa{ߺFƫ@#A* 5{T2(oae;.qPA+.P,x#+5W9XE~u]n 3kQD>))%m%z%)FmU 8(g(Nʐ]l2R.e 빱1$˅M*$3⩣FgsHcJ*S6ȹ_̅Nɸ3ƉObvN>Jy4_k 3 jj 9QFAzd _{<{vܓp`k/B}[ՍZnlPӼ<8>N1 4NN߹hw+-@ Lu>ʋL쇥i݇1se^57~|>6R窾^ݒؑykV_u-rɓP6߄zl >HZ(C9ĜؒB{~7޽h/԰r߷׬ċe5-ۦ_ۈlXIvvE>Hpbm_xD]WL>E5<5vP 'ܜlK!hVPXJT *S%xO`5dSN u@#A=F/KyfO‹IUxO VhD(oeR6Tr2qp(3(^; "&{Ϩ 6"KFjN"Ozs\_rj!U`,.OxhJpъTnhwzKV.wޞ{2ZXSv> (vet-͑- 4ʮY&ŋp42vyDxPJ-P@* TGQqV^S\&ĩL$"6dwHPF0.hѡP']mo#7+}K-@>dg  dȒǒg߯ؖdVKMY Vd]U|X13,W.{"Vyx!rMNgTQ+2=x`)j+jpcF/8;X`ܙǛq7$v^4;4v `n0 ǫWro7ɏG2fy lrI@ڛI!%ݣn qr~Qo %)i˻4|{l؈MneKu59\$IɓUYXf.ч8@"\}IRA t\W`iH-%x_ٝ$<gh+Ǥ0N2U.jĚU ]QF`gdGW0s:pMB&/?r$LCrcRׁ Ucʶr5rh:l/.˾/s` {>m278%vU>%M-^a|YxrV +}?Sq.L 8)z­7Gm#9fטHL%a\v3){Gfe pjFmM1J6f:R= '삈>E3xZzLJfFv͸F⌧B e ͮ#a{nS݋ӠhENK PJI00sr6{UDh`2!EQR4Q#\4=f]81eC U95vqy,Zw`d᝗! }"(gٱ[!DZLUaVx Vq !Q&L"&Z0&H:>!fQa5r֨_Dx,G}ȷ.#-է'8JhEl^ht n=: $IZmѕ{ZNX@JZ&R[EXk|v2dn}zGHC7˵' k3xOu'E T*]d%b@d 4y^grI1ބ[5حN=kĩSnqQG6vhۮ5ԨYitp`9L !L&iEQB-C/UpfWGT't~L<؝3T~~c{=nͮ.]2V_" 6Jj3ƫmjB Xwq?-^*5xԬR:ͼ,(` VkdRĶ eq%Hh){]`ŽLI{ |i)mqU[#_ZgrљNj_륹\˭c}pI7K xdԩ*iP 56rLYaM_Oӧ{(aх,s&'}FMfe4>0F%fN.J<p;"o808)76Wln7,̔l$7՘-щw[WL r!hp$Å1 SPAΨM=Dy8ay(v#cGN]0G 5W//+JFo;N ."pyҳOy)=XH&-Rg.OJs2IFk^`3Wn!ݬ^uF^:N2^JhZ! gJFV l;-d䐘H̡[twt5%Q[8ܫntivf]drLb ,ҚeV*yl GV6xI(ȯ jG窸]m4m+( b cBW3-q@` 6XF;pAY`ȍ1^WLdǫR5%. 
-`B J:Rt6@.ѱy5rv4|JB+ V~:Į~ǻeG\?ĆLGN6eÉ1&aS1+"O0.y(d5Ӈ"HB:}4)E;2F7: "IfcikYuYCD$OTE QXyl4 EQv1 #>GXwa5wW"HsD'HkBH$*F i !hA175"H$#2JKx*ΜdPs.1Ex~ &" =%;#9Gnt4%+^!O4 Vn \ G",p,^4L* .SNy +@L/<|S0ne?eؒuI 6}'ܒAPTLpbL\SC/K\bjJ+G?YV:5lCxn:opxqF}wȅ?~;uD+0KHI' $YXx??bh[kho1uh˘929q\3[n !.Veߍ߆S,'׋X:d *zq'D6yk-wR{TI-Ux[b ?< t/qGF$wqv⇳;}VT3y+~y A%u4 A^%K| `NEfThKO~L.J#۬3t66`_lnj aT)G&;g"XIĘpYnZa=G~IF/u,Ș R Kil|.>@b1TcT'tnh6QiFy 3 ޤKG\&⊔WtLK<"UR4?{Ʈ*d16 4n*RYuLS(iPD=<:203nE* W!2:j<:RFxm^nz"֬C͆X]ҳ:Dm zGԃͲ߇uR9trA0SbU6'H5Lb$ȁ#S6"/~LйL瓟_JƵ.s!qNpW_oONNЗ\qW8?rxU?'o٭@P:=N[HDǛPU'\Fdt PŸ"$  '@#z rW#"<"U+DcB&RB -V@Ȥu`s-(RܷuuNk]Lbɪ:XKF%͚B[aUA)ݖ)^Y{Ko|QbkLwni+Zns&&8RQpמ i|1:{jnT:` (֠Hu2cu#.CK VU)Z$.*kLd!]ŀIdrva= e+P~6[?4$ޠ1)(sxܕb~1˷gP9o-Uoݔ3"3!Dא}d#۩囯9|ZNԡ{0>s9 0v-Y>m4u!KO:w\ Ӿ{q@{XK,ʀ\e{7.iK.2*rɭL)CFw.G.A7:-7d^Sv4adx &Id_7a4TUln!`@KGkC) Y#G'+E^-RRbL6`Ďj+PH^R*+?;~jucNiDoP,Uhb":ݖ]E<\rxkh{&'yT,yJ J7hmN,,Z;ږ捅gK^Ur@>dQqԙH}AR\jDcR&6+25ʬ \EzJj<̕Xr"[g:* -g;2nf)^R,XhG,+ޮՏ\_t578G[SB\ yQPc$mPEȨ9ZʄEr**b Qy쿘ƚc^Vc(csOJU C^َ|>+屠v7x)jΨF5s 9XEj+X14P@bj!c@ĆTauc4,PXL&eL^ci5U\]VI1vnَs~ل b7x)"ΈGDqX,)zkjr6UPjaMTc(Pvз$Ł,z6#˙ShKHbVFh#jHSҽݖל#gKF,9j5½=(EIDo]0|#NX0Vr:vC*G{r"`j;M#Q]1`"bQD[|LT 5oߒk=VZa{؍}2|gW7ۑiiY=ȁNm?1˳ַ׊mnE\vCϞ,s,z퓎/pc<êcI;}Z^ ^&SkX{1\"%\N%9(|v*#Ty}5;*NE7voqgc5^4wi$c}Ѷ{&g^hm)0(T}D+9`MشЛB0*`<^d|k]'XhtZ'e^CpwGiO{5ZP4Pk猆\ݏ{{ppK9:l6D2eEtV@SR;KPX>omO{ҽ'*S]SJkٮO}V;#;3.ȿEO~:\_^HA0@v`k2(?rydxaxYڷgi߬EPhlqUVljPV 1XE-SJF' q<̌{%(ɧKUYY]4*BJN:ޝ-gOL040^>Oels 75ҏ{Й۽]\cN(OȆS-62G%V,ZbH-/ܨQ{ w370$ƯbB`V j{COѾ9}c=쯏"D_=B3=9ruipuCA Qa6hzkW}Nso݀JmWl^a"Ch"AeGT0<恗󬒟¾TTطM#ٸb2b}~wܸYkzpV7\F6Offd_V5Jd F77O/s:2k|eXBZwv}4Wݛy5g<\ooyu=";JɷLl v߱]o`h.o9t仆iލyK5f1yd6͝utEoXw;a ;`<@y`PzZC5^Oxy>;;x3ȹ8|EĦf(MdFU""pq+G Zlqk+{Ʌ. 
[binary content removed: gzip-compressed kubelet.log from a tar archive (var/home/core/zuul-output/logs/kubelet.log.gz); not recoverable as text]
qiA"JW+-e^y'5FCMHBs\T[Tm / r}q#L >OuivO`ۈd8J w0ñxE>|0RL1I ~rSٱW3_W.ݕISK޻+)ב'1dP،qKAqP1 1w>Z .㞟(fsRG{8rZ =w1 K9eh;x=_hB?NOiB[޺Ox682YZmymWv$F^{ue&zW5'W̐9j ﳼ{]ʏN/ެkqN1G8*9?۶7ox5#ƮIҷGuHGquF@Bpi"Xwi_Q6R5:HC 7[0 XF>U^Kc5/ݼT\zÍcR\$!sth2nK xdԩ†Аj%جc,.F֩]QIgR`&57dQ͂2XgGGPFm}&vlEf 0xJdIQK8؇[5#O`b[MEXUu+ =GR#[c{5ϧ]Nv32k@Zzf:l=ߏ>\5k5Q"kfA\Tr*6#b :b!iYƜ,yBG3vX^ esb* s>a䢟~J=fc =CӪ} wf>gx/RC%HWyJ2z %#5BIS/꼿pOIUM'KNB6^VJ ;#|wOPaEdj&캻yxsIJdZdiyD;6Q"ʇ_vԅMD ZYN*")=v7rr,OfJ|akнQ"TVrb%i xH#FyP9Jy)}hҴ[w P^/puVRoxec/4^IOrVsDF $!p āD8DH\Y0 Q+nDpDzCт9G%2!--b"TXo`\Җ>vսZlS̕/+-SV"_Y1s/mҵjܛdGϋ/sZ)Rlq9 D^/H`k^e??/o},WUm:/љ.LS)Z\<63m`mžDP$6MHmE׬3h㷏/b:)}eOμӭYeAODta8[j+ܜw{&u\bu$Q{s1&VvJDPDjzytb8/NCzB&U@vQ *(*%}Vƫ?ޠjl4;xP4if8WXTJ)9Փ[`2k%xK5ky^o3 rȕa2QyɧIdHk}dl& [ xY(6%W2:&=A|zؗ78E`_s˭#:F %} =s; VR &c^.ř{ݷmƼpn $g\Kr1-nǟմA:Ԃr>H}ܚ:%&֎Mg shgK  *ߏ8^W"BmdJla@dڳ%Rt qያd)f.͓Ҹ[ڸf*)y쩱 .]` Ar9ACb.n%lENjb"u^?k~*a]iS`M@ZX.S㰚\aAz$N Ieb׳Фʨ/1Rԫ+UW_).'ŏr@M9@/m`URXiïS+ 9*li>uj\CmtrWz F*л$޼ bw{W+u]0 &A>}}gW #@u$Wt6 ia]Haa)Yݲ,t1wߛ~fmUj:_"9|9 ǥ>EUc|.Ά3:Ѷt*']uT3;_xw7oߤ{uo՛w7WpV`\ $ȃI ;gCw; kmZ!Ļuیr˸߮w˥Bs{<@^gM߿ozKqsϙH7fk"$O3oR7ބ],皨H9Uޗ"D~tVGl#=yS;;F 9-պmi$t^O"x_JXa Q(.H %CQ4107DkN^CTWg5;jqU.>ϬZ c,fOsD0UB8)KUЄH6J(\tE?` RMTiC>  ^ wE%EJ1VَqRglQ@2joSĻ T5=Uqc}, =c6^gtK :GGPҚ' ar[H1<䀆xδ7̂ƅ@;URёqƍO7~,ZGt GNF\ a>2Dh,4\h?JŮMZ݈jNꊾۤIGvZiMqT*HB.e$F(؂hXHm-%14-B `(3*h2:ZPȢrD`L ٣C9OJ p8mHg^:E(5zYH5*@#/8xpGEr)σ x$͑5(2/-k\+p+hldQ=QRNiWd 6#:ŝ2(P]!zg՘IDe4zl"Bj4BZ":ƝE7)hcg2 $טP"V2j1N\@V"tPloݬY?I(EFgZ/k.Fg|. L>nƇHACj3\n@1M<:oChrHATG m1~0cEeZ6W/6~1 ੢0I.)t]l6C (=ĨjpTYvU^t7,$PQz9nQ#y%dٽH0"o%S'iJ8 t4.d;[eU칮SMg'Owm@n񁭪?{88gnF ߏ>4fvlAێt߯(Y~$NI4tl(U孚4Uk;ߡ-uzAuTNدZ^Q)ZO 7cGWlѠMF?ߦPmաܺlZXsj2Hyֹ]vyٸkm+-7C=F2 6G~i֛nMxsṕ@_f{m] ZtPUb؉{:o+0NYNx EZNwۡJ16. 
3%>91(9yU$'%P>J%\9mh>*^oQ}iMB슠Z-ɼMBsN!;C4cLPghv{Yb<0̮J/#L+Dx[=4gW?ۮڤoz5iOru:x n\Scv`<lT㷢ZM F7xv~]zN刌Xl)W(RhڳoߗǛ:-q~>VUM$նh:33ɅR8wO%3Q'OQ|vU;ML\?dzH`9L !C{iXh@<ȍS֛C߾pYq\K,><&LXGȊу}Fbڠy9{ I_φ{;캟o|']l70<8+Z3e4NhQZ\ ̼ݞ:}YƯ3o%;cdxgֽq/W3ʜD"x+0ѨG6V+xa"Y$AAii$ 8CN[/,@IqC$NOô.F.pz`Έу&9Ü5EatēdlԹv0`Πk|+&/=Lγ/?+$UE\20:yQ(J(|>y/ɂ/UUQ|&߿+xWP¥^g+´1475|F_o+&͛XhmL8wE^7m9~긅͟kwV/Ez#^%cA|;NYF HF-PBJ֡/~;SE{8A^ ~rLxO?7脓&|"SXT磺ED&O?Oju1Ѻ Ϊh'30Q~1Թ1;||s2Uqt40ib߇:䤞i]>s"=84ȎY~bU8[2NGD&4TLg~|}T]guvǶk2klF̢)t$\ܙm\JDF1yyWd ~1^#)gy}~[-NZrW'vێ֤p~)rR>y*,*ɪў+pN//j2IO2jCpcꪕ7FbVKiqk_~F'Bc2޺8F3|[meNgs[, zM;Il;I%^[5?[N/!|>z=טJ4C$2^`AA.`<SL$j7Q4LjM]1Ƃ \F"XMBmi 3m8;8f]/5zA E4DΨL%6::xQ0qa/"x`ߵԖl%HG'5*+5E1r.YLy+*azU FY]#=(\$,xNr =GLZTT~E74)n `kpLttl2BC̝u^WvmWv]h+fxaD/줙ulZN[Xl}7YKP:@ ]dh=Z'ji7usCm-}HU |}ya^ǹ=h:l )URt=?|L 7Kp\aVώvzU&N`r]Dux@3 \J5%ࠂLC$\\J[9t\'rae;^];^3Kr JSà%"+'ZaNF˄csc7F$OGhTGA=sܑz2BTk,1gMG-~omP@@2*X1o*]L\ŝ[c7f&--fj%)o⅊ϧqR\`\!ֹb8ɵܪzͳdyc.gh =:X8/jeyPet Im\JHD2c UBxt~*%J1''K/$ vPP{ =(pwd=# i1G-5s)F\Xs*RjR!o2Υn] $Iu&AU FzjM6GVwPnoپs R4=9I>pi_ 3mCX#< D nRTM͋LU$'K{E *Y ԉ+J=d Pgȯ4W:JQ`N ؅-r9Z*lQC4FBkdXʨlC1>۫!_^X7m1bRksyr?ugS%2BSRye.$&C 3!2o99dvϱ ):N}t$ T& 6CT$hs8^a-|r<$TZ`'!:UǓGQ@NG)坃ٚMn_mxE^1_Ckaw=|71?ÿ6hOF!+OƟH !Hb GYrcO!%@h[F9IK:YQ= ڹ(DiAt)%Ԡ/s&lkxTjb.Z᠄^SV2gpv/[Lr?7X2+7ч)|[(t <6;4WC#D e@ bDx2+R47J%LF7ZU9tqSSRDKu,=ZoՀ^_'zЫ^5ѫ7Vp`-a"-"YeK@8Iֲ1:421C7ȅMyoDB:Qu;y=S=ކcF}@V{ӈ{_OQ/5QIif9NirP8@ q ?{׶׭dEs:bh / A^8x="K]N3n,kKHqm[dqZ$9$qN&&gcި>]t$By\b4RWs@8ޞ+Zڏ7e0o/yxV˓GyW?ݳRI wZotg#!'O{uʵm7kˏN.R{a_ʡK+aGR22a]7FuP1֖Z9ՈJ% "#KR"m`zz`T\˽colgkZW ~d\kecVBXY2lg/,^.9zzp7l0b,d TcNFɺh2h@BiBy҈I^d$ yB3L3݄Mϱ2F4.8팤(uJ ێu竽߾WTܰbkx0뽗wښW<'8ŅkfvZmțI1ЦzrS:YfO\Cn,?=k Kԋ}OM=~ OGO@kn>%7AW )\eSy^V+o$S=TR7ze _vs [#ztfN}ްl?j7vWͷ|n-_g}<'|xiU!0ݿ2?hݜI>L%pf/xSM -UiP0.$WQ0VG48|@x\[K[Onf3xBl֮c\!9+%H\-o9咈)϶r,r^ٮoWWwjvx<#gߍh/_\\4?lEٍl1n7E3o"EV&x!9v׺Q/D(\J-͠4Wwn,h68 ַFKs{5]ㄚ[EӊOӧ&r=zpKcP[qZ:B0#-nH c}!4[d38ƗƘ!KF-9C޷{OឌHfJ>w -l*16BmU{^KC0`Ɛ)mB5 2,: M,L3y:\n-E(b^u0vr ` #Jbi8()$Y 1C% `~A[*O2 p)Vz!ԤB9K0%pa,aq!_iՖ][Y^uT"z84yҥ$3?5Nh_*9z`"E HH( 6z 
9P@ne%'=FxPF̋ &/L=iP6h/[ /M3iΘ J(No* zd-0!aw#weczpv27=X -xu'V11cnu`D6CP # d3,&RH`4dY|Aj6${ZmSU&:sx#bǨ q9!h`! z+yL۽A("5*<^ ')zGJ>Z|Ox $(1G4_qUl^n} [Aq*ۀ8Ǝo,T]\ɂNu~T}g:yr.N.jyGa$Ӑb|׏q'jeyRM/`)V !e,#{إ GЋUuD! 51t+"@F# P P0 l`UD( 5Ŝh Dܱ!-P: *$5 R4I*g  @8 hyּCwν|D xd+7s`QmRgUifN)2@!~Ѓ@X!o8 כw dU&R0@>yVr(!c[9QSTZ, 0U['&B5weV$vH@)FÆLw7C,e"ԿUn%l'k1[)t~>>͋JPC`f. [ Ae9؈~yvY6eړ~h mWk܀,DtJT{p RBJXq<l[M1&s @?a7?+CaKۜR([J^teP87eQHSIFd \"@U h5+ &1FdcGbCLm@hUXAִQy6b=Ld2cq7 Uk, pKve[ud$9j ohqV9X(m@ Zx9XBӪ5N E[΀z122.drXvQC%#q]U{:o^QP sJiMГd~Bil "WrXUmK%1RKEu,&f=V C|^.H,vQ@õ1Z4&VЧH]!wԄ#$61@\<57B8▣k*̥7Ta4UldyF;hG^-dk?Q\ل6xCF aT_x޷{Zϟ;{?ѽ _(XkVs!\g_RxN D)I@_(YhlR': N uH@R': N uH@R': N uH@R': N uH@R': N uH@R': N uu%gBqpËhF;f)KtH@R': N uH@R': N uH@R': N uH@R': N uH@R': N uH@R': ; ͋rwxy9N 8mg V:H'P$uH@R': N uH@R': N uH@R': N uH@R': N uH@R': N uH@R':`'М@7BHXN/ X@R': N uH@R': N uH@R': N uH@R': N uH@R': N uH@R': N/ dޏ^|8Χn7ןo˚ BnI3L40ïQ7oׇV?EŻ7(4!C~wdH!TўSm=;CAVʞ{'Lfgv%@Cu-:S1fGmq-Ovu|e,sY;)ZfkR:(U͛^I#3/Y-OSP ؎Dcs^lǵz a^3$Dl2Uv4JF"R clc5 v"Re3fK[ލ}Qƅ)0L?B3dk^.[X z"U;g翛mXdyS}J707aExySvٛ QW/ Κ/G1X k#]˼ڴOON:%Kk,,a4v)ud00Rer" ]G&'cꮔRk1dSk\zKV9ȉQ^+$TsR˖G#<Տ^Ͷ?Kʯƃ0^eaz.,to@8„y&'A~RC X{MȺ4_W_WWx E´ C|h._AhSf ƹzx @ e ~cgݛ$;V+d {Jzb#3Q6+! āN"/j%XZ!`}")d0e5ݗ_|| 2}m_y(QMRrnT 3Xc.>W~nI?{L^RuҤ }ןhx2I3(ٷࣽuق?apޢvN[ߖ*k EDgM"@h-J&l>Y'65^4Rg3hp-rXar|2&WMJ1Ӑr&n֥/o]<*k8,~45erW'P:AlƯWz 4Ya~R&5$Uet|4o`P~ERz{PloziiFGO7 PUj;{/|ϷTen1)dU$Sz`lh6ݫ*vUV DD;N]wj5g_lVь0)VdtMf2@f ,[pQY/SސLW2L.ZGKニˈ=T]hi?zH4Yt .11ct2B[]+UnpI9"+D*Q#F8(e+6 &ڀH@mk09f{/}}in[:lWTۮ Ll_xS#@3@̺2AMlSΊIQlݑ"v|O7)F~2|x4o<쳶vrv9Nk\![QRh-W ,, RBj{!Ł9C8مfs7 ?Kc*<y"JEƟ4!ii{} O ߜ,BnooⲀc x5{/)I5J0/oU}4:== ''2aAMcn2\]n),E9KP!azR=w?U^YѸQ27ih|f؟mպM6UVץ##5́}O;8R8Ҏ >ƽqyrKJ%P|s%>7߿Mu>|sL>+:ḑyo~݃[MO{4jګmmZ!u5໴rC?iw9˅Aw Z, Jz??7ܥ//oq̪qREW󞫪U"MT/Dp7N@tI"woBqv{ϛ^Df hexP`'GJQ&l:iFYR8&tm$`2(}0D[%̺?RKX8Qo(a!0feX+D$ 1(JV/3-Gib:07D(Cr uPMĝb? wۯaԁ{a-nt:yaAx\ɄEZ'V'7d\Bkq~1-`Q iT)Oɬ`)$]:'Nx'oqϽ{~<rLdjJҀg!4RІ`Q)MإJ#RHm' J{1 (Xe[IM73 !Q NrzP_+? 
ck^`%8U9vlZ]vbX ^mq5~j)┇ Ϙ&SQYXJu]$FR j!$9!A,|dJ+*2.($!3,X\ihIER7E1hyVdVu#t9첺vxQd~8zGT !qbPɩXFbĎQ-؀{4&EHA#ZleF͝QFG YT I8A;{tȴ[#g ^Cx񮴴o9r FV4{j>i4m $F;e&#Q`*CċX1c)ˈiDhnD;[#k `g2 $טR"V2f1N3ISzQ][o#r+`vimEs{`AI[YHLbbٖ,YlJU}a0oJ^â2u1I@x2UƧierTG ^vݫ,)Cζܤ2Tl_F:֌ f$}jfe КbK Lł}J~58=l D@:5ίp/"fI?MYa /QrBYsvԣ"T68(󳗧Yubs<)wo]5[5vǥTsmtДy2kؼ| 7Lq:YkMrMj<󯽥s ѡM[ _Қ-h@a=&4)Gӣ++m'n>xoO}pۃk(ϼtPJ97ty ۵߆W7xVYbmi_n`=R+qq2=Tѡ+-aJT,;eqIt_ҸHVE<ӭVYhe,oikU/֪^UX[+KZFkMgڣQF NFAre F۠uɐ%&R*YV*v: Y( yΖahAm̵ g(tH[fKJ'rFRѤvewq)jZ렕)H,g˰:}jߢz-P`Jv@`1@ y\tNPx59¨2+%SKzmL8ִ_$ȳk ~rڴMT&iGiG Fi2iS38_66Ie?5 Shxy=Jޣ_~iq͇?>||4/}M.·4M{švd6yc\wonX7 K'۳߾_Y0F4+H9X@0988e埚OfYsB|@3.P`;&|N7GY2ӿuF 16.c'G "Gɘ1_ _@]g ?~ =շA({A;gkҒr>5-%?8l%p#_xU@%Jg*{]N>((z?ipgSpknzimdMNśB< 9>s/q$ˉ%?6ӫthOQjR#%.%{"qҖVLқ_^o7 %{tKE~vS?e'Yb-]Sfl[=pkfbkMٓ*;b߻}M DRi >:FD3a..4W,q1R;Jl!ISxG@*((r ]/MVM Zag$Ɉbitʐr= B!w`mXH}v"EȍNU`J~QW-eҖF YyT*.>R v.1]cc4d8P/mIm-xW/ v`(٭OnJKq R1[L!i@|Kq;տRNK{\3AR܎c!✄Cbc $fZBd XfILIEl`&JQt6I\H6A7#0~FrjksUfXb!!;Y2im*{{3MǫOT&'%za$TT\I#<6$#d2(AV܆P[s!y{ʿyDe!s#\^1ϵlubq ]Dy=y1NBGd ǤFYY`6۔:$ɼpÓiIC=b,8AJF$pqa4 P4!Y5C 9.P3ӯnY2xG\L?:IEC3Ҁ >V5>O,?G:Z$iB;Fd yL F쭆>pT Z&މ:zԤЗQV+g*–Ōn,MYUm>!$k#w“[V*~oz;Gu4kqt6 H"'׵1.rS>琢:v:qs>p @]m¼R cloqOFtq/nz;Rr|15hQChH-Z+JNzfWBe/OS[JI;MvA蟃+\[܉ {G!Z\b,C2 f6|VG71 lx2TYW (ͪ"4 M.Ԕb#PssFp[o(Uu&wq{OGG75e2DwiM6fX֒T9> $K$TCWT+oksvmëAaErzsVCie}a-jSB:i.L )CLVu8L;>S} ʠe-2,9& GBZ2m#Raʑ+YHb?.&du=񔼤/1UWR9P--ZkouNЫaU^հ$;5Ca T*]dX40YM^9Y\R̫ƽpw11vzK7zom20K۳Vlu*dړYTHTNPCnk BSL}]֛r׹PIQ& WF뒳O21ObJ:g̱/p)p?wP.j~NtxrDu`5,w>?-ӷ{JPO5|oPC9^n{=%T@x)+$94Om·3aUr*9oCnC= E $Ld:Cx$##LL WlT;n74jY4h((J nAQ{kbՆ&F1q\vK3d!kϣv#l][+,\g/<:`g>d1`&ՈM8@U NFÝL5y!JǗp>yg=.!t †C6}0mLt;C+m߼R~ t!`r9Ħ<u vekmH@8@6A8:ެ$C3),4߷z3(C4_UWW ,+33Qh9UJ*i=dJ3a2?i7KVxh:k0 yiL`z$$X:#8Qh Ql=N;[B7)fW0 ^QǮӢ`S8՘hn< as;н Xu*0 =5x?7OJ]SysOW_-U謮4(U0 f@M ^TpGr5;;qr-vNwUtU[6z<(*n8e/|U6ؤ90'JcݐtpOnxu5$BbUP(qS/Tj:H3osXpάQ6;BVBAuiVintEMX yIzcϧ⭢ʚ L,<-yF9$H*0Ԗ]{iǟgWӡ1MAxPiR(So֗JYѽ$闈W~ =ʧ:^0 Ǹl"&qt^鎒*ts ҘK}_//oLk,IOeo _y/{UIVsۚov\;eVLXlgٽ7OsP/}v1@ Q@A7B>ˑfØU 
lHLjjBizFZ~xH-?&*pIo lFm:֖뷙%fi.E\Mc>|hU >tJRU))fkp<)DE([UU@p8Ejchq7<~_G!v]Ft2Lq\8Ky#Y+D\)0䕯oC" `RlW(oQmD8W-U)_+,]807%߂nu&k|TH-ѾsJK{W6ѻ>re "h۝B.`.g; ~`Iuw4{1ߒ_cz"˪e"rE>r1RMQStl<ЦJrgdgNrm4uf65 (BPvv%)Z*kbs'R*AD ʋ N /B7&[O,E UuLlݦ[PزEzl-iH.{<|-^B& H&v9X,ygjP[Y]ࢎ]*+E+̪]qSLCaKvD` 3v[ R2x!T m6O7_F|lo-f3M R'b02,ϼȊwƩc e\r>m] xYmmfVv2TgcoYVsYމ4InemHd9[}.MTfFf\م%ҹȋiy w[ oo|ֈȁui.mP,zsbڃm3X%5\K,TͼJhAB_߿D)JL,fGܩ#Ũ2AA,! miE(q+{ c oCI#SϠG M:;ݹ>?GUg^q|ȷ}o)6rHQ~,x?r^/0R6`" bƛ^Ŕ6-EHSb-Ƿa|۬[{z-$z~t@8,L2w˱).<͸w1,c*$YfKYW Գ6M^M[G8|g9ٚb'$,rdq.5 s`-4bKr+Ll?U&6ԙebCԯRqc!(LS+-mEdT3sL(Qm,d[֑aYTA[U6za}Ocɜ:g>g_9 ᚾⳫC>vDtX;yЇj,OGE O͇T9s! + KX{D!w:LMLM{I H%E>yʍ΃r wT @@Jk/$gBejV;xVߜC'Hl=˭t| '\ZZOV\﨨v;jT O$˵&wQIp>MQߢl:d_ن}* eDJh*URɵ"`pY@Iku;:7 # ɵFj*eERrZ+xbgiYk<)Ja:U"%(b&0%րQię=J)LS`|0[VUpi]o"ѫ'Y>\2(dˎo/RLoY>~dzL`8/,+F$;0\6ן?_gYq(A8Nw|͵xxߎSO7!]^0|4컏wUPa>M=_R̎1ï68}\ex/_g~$t] g,9#%XkQpC2`[K\u+eʋ5Af3x PTcreŻp+[d0@k~x[ELM[3p!.^1@pa2+e?3uRarhG$ +ِahNзIse~ly"ִhBvD7k85fKe=@%؉EyEExE{l^^p`lJ}Aѩ<TzEqlwoK op6s 4!E$ޅ%FM۫@M@g# =&3ގSK!7<{R/zRNir0䩘>_A[%pI, Ds)f<殪C~~/~!?Ɋ9{:p`% )UIw= SCs=‘edH׉k]#w+QCώx/Էs_Ji:еtiMASxBYxsұә-u:pJ`VZJP"aT>LI5F.%,M)P)zqy1;{?W`n^l;L@&r}+FÒXʮb#f^FVP` q…V|HFvͱ#w\b ۖXEdť"\B6x@DMx,Aگ^n,W%c)rdL2Ɖ$R{QfTaOS)=:Xkd*Iu΋AH1y#ZqeW)׸WnʭB[invamI݇Ã$kTYQsDL'hn-y@%b~+ފhp[q}ي۰/_V"Y@k. 
Mš DzR&zǔ`)u^H+PR9t7ZA۠FH@ic3BR Z tyUwI ;X_1 ޚCe[=Œk.\*WN =DFSI ȨQ3 LRe”cepCQyܮ$NO%{TΌsUd BXat#!D@VcZkN [S93(tJF9`ŤMIdDԈ H!UR aEFӎ嬳RĬNaQYڐDRf$Rs4,L29> E@"Х-[OoSq Lqb}N<,JX_%̳ wI$?@Π\줃N?=һ4r*mhH 4Z"qZdTDh`щ(z(-'U!w= k/W0ɋGCp+K*J}IRKL ZizQFi]Q*=,zDo?nrb_3TtW=HV<0Q2ODRe|$ER-SN<0,M]qȆ1مtlsEZXLg7#I!)%H'u@!(!<ؤMUCPZm9$#M2~ָC6=Z< ,ETzP⋩8f-X߬^ޮ2#}t,KSМ0bQGVyZz)#"bVa$EV 13R/3л B5睊| 7'7“!w{z6t Eg|쵲H"a A}p TQ͍QmQ'(/$9;9~KVZ&x0a> hg7(B}6u wf<V~4:v~$@>5&eO.`QJDjd~sVk/)E%K>qb*n{lq6SY"*ɖ ݆,9Sۋ0&\ZoAPzu{Yvm rE'䪩vM4]>{@_>@+^w f䆡xq{\(ѻfsMeȭQl‡LoTQ_ToNƕr94",z@Z4wmm$-}ȃ8 pND*qM4Ii-OpHQEiiuTUUSA\ʑ2l&E!D;MMA.sD Dy &g G d1/QݻU'xEaoɞ9D) TsEd ϑ2'WkXI6N3"UQA@aHG%J@ÓY)D<PH'%PDpJ j1rvV:ͥIu1 Y*'{,!KO1Mnj%tUǷADCgʯFklV(cfVDY$9$|GzyyNFZsD  UOkB2N AR-lB%UZ3#gf,UR.ºwpIQUF-YWuӛ ?PkljEd%('NA8'-M2SK]G܆lwq&Kpa&&10L>&.95 Êy,Zw쫵Ea-:!؍N8:{8Z"R$*Pή&rڰ\sUi V3('\N$i+ DЊ<"fb!ֹL{1"ĺD lDblׇQN!fx,ea(;ičvAKաJΥS#"$m3@ g $2R#F#:E5ш3$#IEy~;.8!(Q{4+-ŭ[H#gF"vU#\,%{E(N/nDL!Z4szIy&2D5&+S& bEY<39SWcx= 1uo瑈mO*T :-WQ Lhxi"F9T͌c,L -tXD`z1x//k m1uwF3]̽,0 µ݄篚 y:yK9R gD~ʜsu*WCz^26JP %exFLQS1i0Jъ98r+mcx=j?a"I9ĵ?X|sX|Ͷw8;LPOH`PCyz+bFD+Ǚ$%LhDSJ^ZIЬ,i*2xDv%$%QgvD8L%MQOlD;l&9Ire"jB} Τ&rb% zV'V>1ԉ-uD/r}^WnaaY*繻%<:eNK6x XN"<!Z“CVV{+^Nx" z{ h)ذ:d )BVC>v37uDxޟ&hgmu>r]PJ-Њ@* TGQqV/o2v-Sv/Z[y6 M*睥<: D*Ft(%Tpc\ka@7hdq,z+`$STRаi_P]h~y\d#9Ӷ.ŗ~9%q_p]3G ëτW`UΪ c~FDL" Ud\PqƒNDIA+{UvNF ރ =(Iau\HO -($DJ#p'Յ'$?`ۼpľAmNdzŸ}ϛíϏC_J?NmA4@ ,10/ѤG0[MCƘeN#D#d^qS b!v4X`DַFׁ:"Jeht/~: ^߼iB̏zH_쐌8:ӵf݁#6OӼ?nVUU<1k^ʂ +mp:}DJ1,7[˨"N )dv:Ӫz:4A"%fDJhH$EeRJiFIN0`3U d^-0E')%rWRQNXR V [ fj 'JQZQDo~>Cx.i]9Ph(vn[UUy=a8LAȁ;[S7?]M7Fddgx9~>Hvd4u"mqm?ק ݢqj~0k:+gǞ y/ѠoVxv4 :_}À#ꝳCp2q9E34y/#?q$ɯu /l/w۫igu t6Uѣu?% ]Nڛ]Ba-|y;|Y#x6`d | Wa[6浅|w:e_ۡFLǞp̞j;뗋q傋Hk\,TѲ6/*xFQ- M Ң8gDgw6i&i΄7[,IX6^|eNo14yGxܟl$˸Ѱ7ׂ G#F| ɴٶo͝UJ^RʱdAUCcrio+ V#Wj&,4?7 biZ$(lи W{E [5z8l821'խ6LRtѳeQ{Kd8&&͌YNlhKt`o6,p6 SmL).+) HKļ}8=2xj40@199W('v:nps2"<S5?$rWQ Fb 3ד%7jO8YErw>9֙u72iv wOߚjmmJD؉a .{mi*mM+<+.{it i4{͈W>]./ޯ53/MQsii?+7(B6jVmO@ڕ"}ݺ+]-ڗٺ*|pTѼzC>Gbk/M7'i몌l^uqmޫ2vKUG\G^X 
u4y\bz#h.˓*aR'u+#1ˠ-vm~Ɨ1^+?S:텊x$\˦]LY4%25Q"RbǬRέ{tc#'^ģL•,J^pwNڥa T(mCن:ڦbcbڦ6~C6N< k6 ks e\L:nӹ `qUT[gL-Y ^ )iW**lٽŎ*g*'+_cqSaD!0% o rEEh]o[{CY\irʂi6 "ЧEzz+qP $EqpJ&fu5 ༷T,D:Q"u A!wVz=Sqg攐Tg J TR;YcAMkA- @Gr/$YwTAaxCL{ q9ǣ0]Fu{Qw^_\@C9 B)M2 kliA;jX63~8䵴[' S1ONu?MB+`gSWh JwP}޶gpnGbqSnwؾ^tm6iFoڊn/> c+/yFMef֭aS̚IBgbT uTQŜ/jNp&,1wv<,8<#qbv'?xÕ"K{"ư'{\4JH9tVKΟMlǗmLaŸX7e55[ml[02x^k/ulZ-jC \$UO{lO#YwJm*8~AZIDydۨAiL!&$\9ǯ-3 *زc$%Q3wQ9彰4DJ"iIVaH҉cS`pF zQiœ3Y.:YKxI,\Gp0Bk&~ůfS^5EC2aT`ߌU7^O7z.oѺ|*h<wU W|ۥkvr^%oVo?LO) Fד]Es5~\w8Igw/YEz~GˌYVPw{yGę.%54 O09TMU;W";~TA޷#&a'ܟ5Wy`߷A!e޸f|Vp4m1+Ndv%ͧS㿟~ L9Wהc7Qi?T?@#GW䜬d)d|0RWR !+K&WNMe2%8:Itg5 7R>7_.VnpKg;)zR>7nj%d19=6B:ycr栵魌n&]ͭ{tw犯98byԮݔ_kgOڢgGL-_/=$IUK ]"%t9CD8$@Z UP9Ŋᔤ+Wn)  "8 .IO#pZYniOQ'^q`"tHNj4Vj 3IA0&WC$89,vrʽD9hy8fsS~2pFeJeN*qkp Ǵ<Jv6vB66D]l8-hGhkƇzKڔ66_߰C3b@55j Bz;b[GTlWg$PDD<$eY0JHAD@"LA%%g3l-^eFw:2u29X̳E0@48PáXs<E/4RYU2GeW#RyByQ|Bi0DFI%̡:c8 &!y/A&xPA0&eN,NpOɂW+[\3L*".dv- T(󮨝9Z:gl[֖̀pKY$R8qbNOBFaS)!cmĎiC@-B:.r~>v@;aQ| 4S?w_z*FIR; .C?ڀ9WZDɐ-AjZIACT%A1CZ+(bDrZ34@'#ŴI(ME&R*6&`q9/-V[%1 82wڠ^3}¢[0V0@#y"A>!-}`|tHN;g {D(ET Dm5 9&8$kߙDE]"ӭ I!ҔHFk:f\0nY ܨء"(g }J0Ȍ6 L0x]N6/qXࣃ/1$|tʅK\b&: R~NΰK]9k!D]{<XpՖLGg 2E*6CT$jY{{'T̩S$q2G>|<9籙S8fbDW鏟z8 ̐h$MQ-1>ib7Y%Kh,Ic Tx1ğn诸m +@ݝBN肖mZ4CQ|j[  B". _20H^z['&k0]H];^ohGz \V*l< ,mЦSR b2$6 .uѵ# H"j~Mzjdr}Қf㧒:ym;{!Gz:|wmH6lvf,|mMdɑdg<-OشHHb)Ȯ*X9T@x'leL J={ “cc{c&X@ ĮBtL$R,[&b1*pV !'S%+㥨ʀB+/|f@P S"{>֎`Fv./:㋦qJv[b:c]O1~Wuo B!C$:Fu,m8It zӆ"s VEoc)Loxz}. 
:ZfQ|=lńo[޾y&F9*p6^.գoG/j4PFsmY1XS5Vjƚ@BӠS_jJ!x.J ӆgn dp"e#z.*oGR)E3B(@52=x`̕b5rvHSo(OV|yxɜ&.ೃowy.P=0Q+g«F:m*Ƨz&#Y3|zzGbq`H{3)b@=z6+KC^z}-k 1* Rc,2tL@N!YmAX N,Ī zR* l!\} L2'sN ,^cF-Ll]LЪ5I@vU~VЋ#CFdl,١MB^8AArӦB(DCۜ܇xr~H.ICWh6Tl,AUڤm"7<9Ǭ:f.:.;31&0{\V[S!MT3) dso0i\Ϥjkjl׌J=]X3 ..|R]xEQ c|{9,G_4hh42g]ȚL & P00s4I (ڇDh`2!EQR!hFd2v̺,!8#fF=>O-rk0&拹6Z;Uez#؍NRx$Y$P-&r֑ˍHNupYicYȄ 9%f FMt}C*HF5_m5r֨_R4b5xFԕ55b7q<Gj RjN!dm18N Z*[,]76FDFֆr:R!h5jn$Hf`7bp^;9;D_ @zq,%:q"Tub7"aq=V ,Ȳ.@-F;*K&z)PVևw> +=芠]x[;ިKC\LяV1Pol  7.FJN7Χܔ+>(LMfoSW,NvwP^<M_Yӛmi9ƅ_~ŞqYNJ2ƐnȘeM =FhQ(B 2_Pr>kl«ZoDȊI4datsʹŜڦ˼\P:qc{tRYJ2uD.K -`]୷AJrKK^RalhkmPjaC;X.NvNUn cL*'87b 1+B@ L7<{KfO'n9$4 ^8:w*gu"SF d`0M6l!I"Г H=׋ kerDž"R6F/JXt5wW"HsN"HkB@i*ƹJ%/]Y IYv9/ޚv$L{ŽRwRAi:d Ix^j C[U·BvgǃRv0 W.w4 FG GinF|$xWɗ~b)61/`OQٙf6wL2Z~GRjV Yoicz?̷dJ߯~[jᄏĴю|4;;V}轚?^qR☧GP0l=X4/nFu}7b⊘SpGźJw7Ef_筣_-ckGmH֏t5t07 aQeN>.W =sx:o杣 y$F휫qzIgWQV>Lc|F+GdqPpON Щ&Eg N?/?P?ǣw?|8?]s8h(,<{C_bh[kho5uh˘929qZ3[n aW[ލ_X\ӟrtc.a94H$槭\N|ⲋޡJ2'U!@@_r5yVdF6S]xĩ]VT3yO+ ~y A%?:c A[В1X̩Ĭ*'鮟ϓٟmu&N#ƧNw&E2)Pfݾ#+ ݴKO2y)Y&5H#N ,\}ALPU^TYj<*?0X3ʷQ`q<W{xF.6Jkԧ`+@n` h}4( 䢏oL|W=R{bof6b`u NJYKM̒vC1(ikT͠dLE(7RڶPvWg#>Y@ϋ^⭱Wlw3ss3#=vektt2nK xdԩ†Аj%جc,.F֩]dQ_/w ZK1. EA0,z)uVa}DY2j Łk@•^VEDƗ602`܆>d3N % DnӑY3sc5rvOtsMK"OvdV6֋EU\ ӳ=Vjv)9ZY:x!21-AE&șGAF!d\Y7'Nt\e2X.8G ޢS[r ,(lFMQ'QΞ#=9%x-zrr2vevHf` OM6ep>R""+=P.3y__:Adb Qgm5f$=4 8 02 MQx [DaW}dW? K֏-$-=OΆ(p<8Qd!)m@cl< Im,!N.n(*ϣ6̄ ?˲hkb ̥}J~^6wC >P|Ѿ?/L5XZ 鿈ath֫{--\b[,UCΚŸakz@ȆP)a=-ߵ8ְEj\7z)UlGt_NyM;Y_uv^Vk"76趡c͙/fK,xHc.[1nu;ҺY=-xӫ6[r9 l; \e߽M{;/Zbzr3?WNw]sޅw*vKa4YobWMwVUu}zl'Y:~L6e=m1!J\~7I:4t*H&h[+$ 3%5gL :A;l0΢C@JK)#T"BݔJ>X&,̈vQ٧a =N.;~mZ@:=VrF#Wŗ͓v5IxSLDm&0 eFa8 5“hK6CC1dgm'O! i6Gks8^jGbU7!ˌ*%q ]2d5*N).pLQ(*+jp *ks+bFH>얉[[;vz7E]`ä_4jҳ7xm3]U$4 KƤZ ;\Zg Vcӹ)c0ƚw-hs 6=-Ė@|?~Й3yܻ //&lSl/?-x)8X-.~>*7f @_v;Xo>u=X5)L̻['\F1\6XQtp?M)On+ܢ`x^>;{ciۉֺΖ{q.XKIjj}H/|s'wlx|6]ìܪ˾ur<߽p6ۿf`v? 
#<,Ց:Ɲ9;:۳tF\sD_ J]}y=?Hmׯ NNzXfaܒ2/Q--~UFHMyǺ}tH5jeOXcZ>i3*.eA ɷ̂A~&Mb}Y9s} N&ԢY<^3ҊBz`k0*BV \=5k]wsrWwK^ KZz<{ z6Am w6yWy0x?{lxp߬";_|/kQEnKNt]ew߲|BIl9w c;6p痑a'1_+2cqȎxrn~6|2,l; @\ a}Zټ4a|W;o?o`;Qwvhvfg,}7nvޚ yp/ )Mj|_o7kmCq5,ѷzSo`6"&mԒ٤3,oJJ Ø3HO1B.׃qsqo>Z`a[;=$S cvOW s<uy-k~WF_źs!}Ms9pS+NkT>I܋,gvzF4&+6tG4FHEqDrjszaX&q:ݼ톉\~ɜew 3gsd'w_IFEVƷK4$m dGj+5(mQ~h=߾M# 0]~]w s?N~ٚKѪe"fK[r\?hU-%h&(A=+\P)&jQBrC(cڦ2.TT.ZZ$5' s'`K4nHq`5+PqѕbhAӷ:r^}5 _|5crIJ[ PpTT3&Kbki{<]M( 1M)+d06:eZI@t$wVZN1 늳{pGK83R>EZ&nY!)!d*-҆iN6G#4) oR!TT2B &8}2:ּk94Yܳ5Y`uTMÄF]ЂtǤ)~G{1,1KƝlCIAR4֧D!۪y|\&ys٣*K'jjdU9挑s( f[4h"ɵ STBf=LiḞs4|(̜ 2JHJM!!S>im-Uल8DIcOJ=k]:gICR [AioeM52frYxL+]0}*A+ե$P2L%Dt Ѹ hސ L9jFb[Q@Q l|ZS̡mK :8Qv{1@yUc-}(eI+!OZF%µ[M`s28@YAG\ZB4TCZf1őjJʞ˺ CPV)Oơ(հl2< + ջZ(Zl#(r2؆qBނf]ئHck(yi:G&[0JF%H-GKg00w8;bgQܴƁ|ѬƂB'zPK()dBUo4V8LtaXry@_ />"$/e76T l*^X:7+@JTU UĝCVɒ3Lp)icU0 vnl[/ŝGUD|ҷ{%}}6Fx2= RŬ<$n%\oDPDP-fx  BR{pU$\`tBtXY1(3ZMV78t-Q/x=EȒ,\\5vEQpYL$E}4" <@ {pw FD֨W 0$1dD NP4eR+xA+ #DI+fc޲PR*4[y/ۤPf$l@0U}o^^%0 R FhӴ󠭽rzwYlScj\ۙI4T42qfQ:Zlѣй1c00o4ENe6ZRAΣDjaK7fS^El-gEk8nJxKTlЦĐ 9P.Z Hӳ%\ uFAvP ,)aIP B>^rc X߼UaӳY|?YouEXIBivYyižbO((c)U2QLYz"i,WQth#ʅ0AT80vlΈ$h'k֪T-g/8.d$0r$jd-ҵN鑵Zϻ[i՗7:b P[;i* Y0m@…YaFZ&]@-L0E zȕQBbx~)Q2`p gs(6gc)khX.}C+Lw/2ciVMrE`$ 9D. 
#&J#׸ :ojg%!wTu)XorcWouN{qjutFP-QC@ V#Έv5KKS]Rf %:!*neDCov>??J N I'A>% }2J XģWJXJg @‚@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J&3))`pPOG BOL XWJg=G%m<@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X Jg\?!% tSQZ#cWJ+@Q +X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@W OOI y:J “Q+³9*Pa%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V(.փ/ZԻoմ^_6׏ }Zzwȫ@ 'u u/ieUEbnl\~u_z4l.W^}9۾.ID-#.?ýn,Hnݼ~yrqP_ B~7c0ץ (Z쵻#`u Ij!O(g+\bqAmئ~Qe^t(ٻ6d  ;R߇a{P$á$_ OChI< 3aO*0OZbH &˜ﰂ@!| E |!02 <feKN%Cꚜ5m1J m]TcT&”0`֝(5({pf*Q=퉬b$ۇ jUCWGvƛ$#ᣃV(M;)?f kLz ٗٴ-F@"ڍʪ8@(#P'؅cSn>AȖKjDaV%Zd>n̺CtbH"; ߇Q{U4VoYZW}jAW0>oNpgt)$CLud}pNŚ.LN[zab)˥G,weODd%s϶>:>Ͷz5ޞ!%/ $ߦELegOSSF–cmǛ@Wefql9FPɐdޔ.:ʐ#jzϿZІyUU[`*PWm5wn՝Gq#~sg}آeAG՞4z7uwR}WRy~mt["%ZHAڇ@ Qadց8q'xb2K}$HtYj0 +NdXSUBnJ<2jhgtl 6Ӣ5§(2ye z0&2DY{o驦^78( @oQ(JWH6WUU31h_h m_=Y X¨ NW|S,wz/L^.^}>W̪]2xSV x{X$99k<꼹ҌcݚKxaIbGxJ}pd\x (뷤~SsXVBe KZwz tZ4脢F 4ܜգ4xsN_2:+w8&11^Z 5:L.߾#lk g_6 N/E #R|D{>lT/R(R]_Pn„"}g¥ߞUgٿ$vAc9Ѡ NWSꖉ+ lsK3 : SNL>S(z<=璟7o*8`~u&;ٽBY**/%pbfRAHOzή\#vq5Y'<пosS0eQzPegA4bc7 P_?yS?UMh΁pQJ{ )[Hlu׽ p<ŭ\^2J&g_$6[69[~xLMpcpgzo KX|e4xL.lU9;="q{|7٘$ăr`/wN:#`::8e8%0OEzV4`xF4m:wYi;I zDIK~ᒂ-C`ǜ Dn`cXr :(,61be>UC> )GSp,1=._q"?{Q9fTaR+´$u9/bԝ](7{\YEZVZ u0 /m]6f:TKa Cv}nMۭ2Gi_ׇjoH?j$*$1[]6*p}'YXQKDLL=ʀ0OBlu[q{m>V|ıl9/b+.UETO?uQh!PcJS8l }7r/rMat1!;ف}Ey ZBe#ŒC_ygkSy~dwrcN%1 F)(0!Ek98(L 8V6<0;&\cR931KPTE,KcRskLcJd5NTqR]nkYfO,tZ>1uqc)!B:J IPFt@dr@}@xw"S?U[޷^ 4Z|jd(" %̳ wI$\V6&O' En !r*a^K% "5N@Fe A!AF ), сj>2NoT'Vі)ի;k/G0]_9e?ԨЗJpj@gIS8ohLgܮ.ձ_6Wtfᛥl!*{X%7WF+*'SL"2>")[KJ薩N<n8=ӔCM1 ^Uɰ:Og-9>ᔔHYBBlL&`ij'lumη\>$E4-L4x$=K(\CqϜeqϒe:d禾*nzjP㋩8)~Y/J)DZ5.5Ft&ƟU؃z[:)}8'4&u-45!z=!ߘd%Foνmkпazw*XWa MR-u87{-1^ whןxgֳ0O>ZYKY*$ 0BP0A*1j 3ꄷszMr|[:+zq6Yill L. 
zf.zڢaجql{6~4:>ҟgX)@1zvY4a#!ES ^ S Jx->cEWu۹?vB΢# E 6ƒ0(&p_7}gEkxɵ BͿ70qB|@/UE,yPKP-i]ARMi R )~4H5;}MU{ "JbiQh 5(_+LwsXwt4968E)@Lb0bb(#)"+-XY!| dښsL(QmGs+:2,"#aJ&,,eh$wM]-t4Iz3 KڸM6$SfpJY8}6fz `<I8T5ARI^R9?E*PK +ҬP <&8=JGNĞsɏ5d QrE8Fǜ c1;P`\HLjlU L?5޺l"mÛM> 3AL%0Cs.3) 33j)SMq{JS +4YW,ESl|5g`#{'(H!9{Z5ĈH:pdx -𙧳scS_\eP+nw:|yPpS 7K71q Bu2$8z ^J(V.YxE%gǕ2,xTij]O[ ^V^;뙜.#ҰOCݯm:w&X0uI:j`k.H>3@cĜbRL#,8:56]聩H^mo'W#x֧YJ-L˕ǒ:}mˉ[X=Sx4Bߖ /DI% U`hi)1H4;ѧcOJ,JXB+B1rsU{K8Eyklܿ~*UjwjPD\b5R1 (QKE9}FraR4:KDgnm+GkQ@JB'>jK ^Ȍ ^`AM2Tn͘m:k|X%.BYN>.(mrq6}s&Yu[Ij 7cXcc1@/&=gNRQDyq-RҜs#R-(W&oiy!9{D%$ItY!?{q/s0R_o f9k 9ȃ#}hS̋lקz8C!)j$QY%]SU]_uuJ6`d17J}"gvQ1hWܱkmkނs!x7#:8(@Bjuv5) X`5crµ T;-( BFːh#L,Ī7^: .$GPmglׇob<eQ׈8APc%RJ˩I@O ՑE:h3Q;ny-Ӯq]vF$pahPJ-)nQ ;hvtI N^Mg/Дo-IܕqzB1_ 3ד%dO+74sR2N[(l1V޺uqi+eRͧJ{TUJl8=;ᚾMKS.5q8q̓ӕ)Uq |Ţyv3]œOՃyŇblI̅m0Jb]Y Gr8[ dH -}:#] ڇ:MFBh\(\y'BOWc.MuTF6Lmֹ*WK:<@xs)Td1=xqPpONЩDEgqȎO?~맏ߟ}:})e_>~w? ̢Q$GG?uWCx\֡5!Y ˸)ݚqWoW+aQ`=&:ɇNso- ; Eh~d_Th9ݡWTP~F@_c>{Hd#ݶ$v 8սkeST F9{lo7O:mGc_'p -hmE \xe7?_ilX`c>X`!Q6$o c(3 1:4+PQbsB=e%G>`9t@&\gGEqdzZNM&#"a1T'PUV3xe>z5LzYZG\Hĕ!D/ӓĶ7/J]_ q6>WhXK"J-ZD)Ξ{R{JIuGRzg8GeО؄q7DŕFaM4,١Fvm{V3_ay:u缘8 ^tWxk,nnuB%RXC]ne$>x1jvkT"h LNҗqDz7]},߅xqƑ-C $FsŴLP"ڕwj92`kLHڅhp`2Qf$Hu mm"c˱3rvOd1q> j"Ovˬ1B׭m[χg67>է \\>cfp>q#ʧsPqR!(1`.L񆊄c,o5! 
K3*/ygOt=3rKZ; d<#W+_}MdcEnv\#BFrɽ Ids<.ʨDV99>".F9KKQIwe4ѠM@+V136R",9ĕXK }鼇۫4i TRIjn$*G)F9aG>^/ٕ_$YBRHOPZp{_p7P`Tϧ7TNU89CS^BĆx8"}ɹ%Kx7#%Cbّԅ\7>޸PSL2GS *!CsZN6h)Pqzظ'g='wvBwf-IhQzX:l$ɴw-&.TTukq!,^\ξ~F']-]װE&\p?T<[_ī,.i׏x-uˡR-RұeICU7Cr<_֌%}፧m]6Cv bW]B0wtOSmw$xuZx lړ >1IHt1g^ΰMOMlR_gxzhwt`ַrg"pg}55;goxvv՟!e$LjEJ\x2\b%9&27^DÃ.xϧ&$^PGgeK8C" .%Y^4UUa ܅^u2tVTIxc:=rvhz|1?Lף?޽!ѵ*lr-e"Piï{~*W,TŻ?XiqY=Ѵ3]Nɢ,i_(glSq_2ju=.n;EPUU`Y)#XP} F@_#PkX% d& n\s""YA+4V[5:y{zw#=Sbw#whi&!Y=c{.iNTK5dx ^X]uڪ{ NE{ߌ[zBd.A9vPir W}F"xJG /,09&ִ#֧sۓ'sp2KKswT/c/eJN))"eLXHkV$B8\@\Xt{:EP 1xVk ƃjǹ2D&­!֣)c3rvKmz0F)|qdypķ%mxGˮwo$?O E Yo= Um 5FYwev㺑5*x0` <:eIa=Ւe-z)jEXɏ$tSA)rZw j '_Qm.yYtYjF^98#qV,1CE϶wk.R2Yb"L0_yއt(tFݘhLS7L)TeO:p6}n[.]VKt,ܒ s|υ9~㠄.̝Q_^twk>r×Om +wKkN}Lw1Du??DKBx };>hNA7f9BEA߭]FAB==:hm„xiZg^~pа5 Һm`o3U-\{ywa_muOA@xx`kf;sizv5F]׏X˱%C5z4SMHΪtN݅]RHfe>o.lzk=Y{K OZ=}sy3am.̥LO(uw5R ={M^ؚu]f["]9vW'|燚=*IfUktNZBn3lا)՚A{U]*;:[h$kusgy@ȍR Z@nIk}k5x *ckds-6hFk۫N=hS|6nZtm1Zz#|wRAYiMaGَlȉ]șbq_ |jYUE{r'gj*uϬTC*)IW6՝CQd.< K5wnE;If)0Ǒ֑ ~B_o欍(-K|NEb@tAQ]|G[_c 37%}^\*Ht=XQS( ٥fݑ#"L#mTȗ߅*9cH3rS1ؽؠt@<)h-a]K68E @xze<*J|X@A[,<[]:A8FW򜱘+ɺh4 ]K ƺz2?ǡ@5eB CP;*HJC6^.P6PB kGU((JU7@mA'Sʜ1%'-tVw5TMX&E  ʄWgs$4 ( r HM餻P+*өPm5;@g}!6@@-V(!ȮhyΆpYWBnTzCZ ALAAXg-@zIc-AȂA y\;u9".FC(dԭ9 :k,,[tƅE D7S/%J u:P%@ ycAA.:2];F*ACM5VНDC4GR<;/Whϒ^ 2#}!@O) r{fC[ FU]%A[.#=#Y̼Bj|7 !ѿ)9o5R.8(YY,k h^! hB9TU5p4/dPg9d_X TUűb)"9YhQ1&ByUtbB}o3ϰmgw6E-UݺZ01=^Vu9$AC00 ƛC G\Ge:Ul!JrdJUe ,ÇdQ|ѢƢ A9AJ# I"92)0RȆZÌL%1Z6%$DGe YN[[F#H܎m }K/d!T?Q y E*F#mZ .k$ Sƚa|{]]|5di*S,7XktdD0Hc c.O9|9(P"Q)_C݅Zˏ[0#L϶sG]I0SQAb,hXX-D^T3rB`X-;Lzx! 
g@ 0XXc~XdY4H@DZ5J+1A2BUCEDME תѷÄEeXYw wM2BPʉPk>Z8_vz.ڳ64MRU#YP4jylUrDm52ߢn$F YT_lzd 0L!Hm0zu+[#]NiTMmönT&P-":{tqt&,4zR6\n[})IHu53ZSP'ՓFCoׄ bL=H'c &% xKd]\SP.O(7"fh6?g.:)Jjn 2T!: % 34# A)wn1Q-F1#ˆ xa+Aqj)5NnB0#/|W F-(Ca( j#-F:j#P糞wuKhUp?{CڨlJ%cT-،znzgm͊؁dXf=ش=CɗM6X*8UK.dP?!\ۜYg5kPnB4c})lo\ ֫٢W>85FD (|\tժ (VV'zɌ}!?ДHA8Q/Z{.qzݢ>j"VN0.PrAv#ZAq IF i>tIJK't\- &!K#rۨ\]2rt)eo`j?oz?˳ʍc YO]5!+ ]&(Cr|_~Y kwS'8tAì< &q^ol.A-+箿!KOOc@f% L>p|+?67\+:?x􎀯g]xIQ\`v/ν +Xya3\}CJ+J+J+J+J+J+J+J+J+J+J+J+zZjKqr+ԋ`S?>l VF-շ\qbD+J+J+J+J+J+J+J+J+J+J+J+I06^p@`/&>{ V<"1J+J+J+J+J+J+J+J+J+J+J+J'B|I{AW6g\E/7\*""J+J+J+J+J+J+J+J+J+J+J+JE w qwzE1 Æ[hwi2 \yϪgwY>h^ a}S[8^;lC}jݫxRu??.^+a/xΦOh5`&]4O3Q/Htzt~Rcgtvr|w,a۶ 6>nC{i9|XF{^~?>WIrqz6pՎk8Wc-G~( ڧqN7A"Ȓ&%Ff;Q!sSԡT?7N[ia姝saw;ȱz%gjN3M%"937ؚ'N>m mjjr-xMwq en .#۠$t!7D(8{PGe~}8=25koCQo LtjA Tg^ Z4P)i%ZL7T!o毃1<=n[>I6[tIj(9*ko+]vTnÐY ܸb:(yÕ>H,Tb›I.%Iz&4Eѷeٖyo:٭z0_IzKIz誓v)? AGvV3<|]olϧgg\ThxB P\k_<|p\6yE֘.bop\7n[a'/x{c:\a &b=+H$<ӆi&u$ .A[ֹ~8t#wyinZj?̝"f*~_hnb"/4nzލMeXᕟj$)xBpzyPR%Ř^s.4^KAhk>铂-La_roSUMW[L"،$|g& յ_!s!U&ps94Ef0ˏDI*^ *bKK\sQXBeU\Xy}2q}̿Wu z7<_Ejjn+=7}bh"sZ6=>-pvưCBÞЌɠ} ڲC!CJٚ4mMſQiB0b,TXpw89B!qoNzߝߥJt`Z/Qv$~\:9!8Mpk_{AY/NMh𜝸jAYjn+5ʆ_~eh{wBC"0gr?cz`mS*zA.-Q)g2w-#Agƃa4}m ٙЛ>jWBo[]r}6UIYZT"Md:c%}vW ^ms3:t!ފ;Nʜ+s/hC]_QUAh JCu {cjD%Q`LʜE$m% 8rlFI"Q%n@B$q+'647;!9N'޵*gmlQZ#gC93&g ]O6BEC뵍-}޴9YaA^]y( #? 
A;\hg͘)""NBP5OIOdgXE}LYkEHXy>*ZfCɇG&9^q88:?ەͬxb6oFyjpKKdѥhc2 >LOCIߛ#"H0?%.t+l+|ek2#7#^-UFF!""08C&ܸ$bfnًG{1c% ξijCM6Yguʆ\-e}K9ep*J $rJO6yd|R^U`9eB)(AW^J"E3s1q-yq8*X.JC 56BKn3eRzsOмKmfS @]ry^Zxs+J5Ђ ނZ/$OwKFgejV:+c0[+ޅy$iI.ٱ٤ޱY9؈*.Ob}^qk=\gSp_&δ켠E>@U,\hX"Z$֔!#A*!ù;ݳQft=8,E/OZPy[LU%3_zYIUb.Jnmt;vc.# i1G-5s)F\XsZm)'q]~.%eKܺH|HOM%0|2;$=RVZo?W$ڭkV/+̳d,EQ|<%f 5FBˍ<Z7CRzy^|i#VUJA(3IJ&8N;wĖ- 3Zx $4%# T`-hF= l.4.;$GfTVlU?4޺l"l͛xu𱇘15sgܵyֱ Y}Km,D ܙ2DhQZ\ iU/vdӁ+Z < ֱAG6\9JxFP;!IPnH J 9n]PN["% 2H 8$ F"ce98=!p0grQEʓdlyOK0QgBt+t&ztbbd ~Z꓎wIQ/hg1I>s쏢CU\p|ʍhW%\/_6?+>xV_+,F|Ë+cx9xNjLɋŋwKzU z]o]=9,'ڤd3ԝUO{՝fx']7ҳ:3Ս(/^z8yCi%.JWY 2F?{KEs8FT^"X7'BpR+2OE)^֣,^9>Qo4@j]b\TſCA<;^Q~ywToS՟˧I0𓣺p3'BiYۖ9yȎYa xUWR-N/!cPQ1y*jarrIKdU6՝ޮ Gf]X{x㏙RDԇ:m&?f "ƒUxu FRҸ?_pwm(6)l =5RbQHZ@sSQj<~f,O pzqɎUשBRi44o/7]#hʁcڣQ&/1'qfݼzG?|t2Np F3·FV٭";#;y踜dU"hx * "%t.Rx$Rcւ'u;R;\$.IOCZXeԥDEI|7,9/F!5 I uRRPmϼaZaL¯sɲe[y[ՎG&fsB+ԨL)P nhKz6*%?Ɗzrijm卺55 Gڰ3Jvoՠ:61<w9!q[ط) #-#@l饷ebyhicϣp%c=bV_)D* Q!{D(Ix0$6o}Wr3(@fx9[m~s?yY_ݯd])|L|~ Ym7(ߖo} /&)xD;aJ/=gT0aC@򇮁Vɵ7"ٕ ωQdFpGjDN B:&h`$Ǫ9ǙBrc.OGp~ochg޿XrCTf~*q|yMmmPhu' N"WN>)aX],s^ˤLz*ǫW\4JJw'[2U|ryzUҸ\QQrKڛp>FG'zbЫ^}x1"5BDKHA HVnAN@y5*:F C&f9a )K$)Pw?$_D~ԑVՕFvٿ>ǴpHŞjP4^cF}D`VuA͟wxq@kzxǤI KItFd0 iFڜ`` #0ͳEoQЪ 4 J)JYSLN"yZ3PYQCiWT`i ?^Sr׼*%ӕ'm. K{rvJΔ..nI=/Y_(gZojTȬp:d.SIhBRc(2/pPߺ&ǘg%][3)Č^DIQs- \&#U2VgeUZ;c_[mlAmJroۂ{^/vɷ'm]?tuf.cMFAJIrah9y4Q9]fuu’-C!{. 
&R) &iF1Y&@C,<{}YZl~2(Xjc_*[m5Xntś$͘Kt`CuY%X '҆Nq[Ł2!CĢLDL,hA4Tj1cV"8WVg=H2pcǾQWzEܴ#1UO RjdRm08JAKw6ЁTc ghč#key;R>`!餹Np%vPM-qvH=ŶY(1:="Tu`7"ah:!0kJ-F `ʆd4UY&m`a'CY< +3苠lYȭ;(8w~9;X5G=Oq;OH\Bd3kAsSR"]A!SZH*G(;wt&၁&'F4O#Fl1@Ȗh$~(;UT϶ꢼj@=:hz'BCk.HKVE=ޅf:!r:ԐeOFZtUh@EXk\FK240_$;\=e.0bHAU6#+5:R$Sp:&Ŝo Z!eɭe Jt $)#t`ֹ̰8;^-x~sVd/: q#hU-Qx}0FgkS5GpB0eBN60ci3H!K* WHCsKM'Bm6'M1'AKGJ@YM"zm2i ϓK:t2=r<:m4,8HWXO,'|b%1B#%t +U9M*(e1Gݚ o3?ih,s6z[ SmK`.Cͨ oaziPK|| nVJAk{u _Hi $.Ÿ\J (bPVy5qkzBȆPutr~pAڟͳCgpw}q?+_ֵ7qG~8GP+*m~sWL֎SK~rhmi3+57КSYnK/GFw_F7ћ[&]5q5 uݺ_6(r[^}]NCs1'M<䮻FS[C{w󖞸ضq6xޥClwN]y޵y:A#%*pS㻞5ˆ'7< u̪`T#JcQ(o1rfZ #YӗIDA:'81<" Y2)%΢C*%|hLV9c&P8]1XEeoT~-O.\}t`w^!13&ܱ&Yiaec=^mFCմ>(}C m@Qmfw ]Jx/ | ;f܂$f,W\[o%qEbbȒ ֊L.ZȬIڸ's3qWާom Y ):=x7uُCt:).!:FZ1_ޗd oOf?(:' a:%5N/$:,v%'Ez3m9?}ab}9Cy6 .#2F7lt&YÕ+QZ=7f<,ݸ%6noE?'yon׮o(ʲQ$yV/7aJ餪–ft}3ONt¶( &n.nix=u;kn85l 4^\vy-Rf!yޮUӓǻ@O1Me,NOυ7~{^?co`ԵLn$ϜS0e7v|9k,S 8 xl_]9w,"f'ҳ@;%赚|Z;&t§?)yxvv53#+pK8Ę8a'ĵmW' IfqmYr@+N%+S:ŵ JxO rvyΗƳ.}B-\Z~t:cX%Zٟu˗@c'a1st[kwu[-tElv?ѣfW⋿>g/Jk8_]_Uú>r6K*gJ6_A&@pj+)k"{ߪݸT8eڤ @! 8pTvD@3=3} M,}Mopk05U\$LۑQ+X٦kt;as|+Au䮷Ujҩ9ü|~MtWS|QqC%z0xX$;_^zu}y~˥={˫kLϯ~y _`fY Ndm~Eb5x=^jxT/Z!u9.Շdy{myJ֡'uN J矞ަibN:?/by9tZem0͇[6'>IKOkTQ銪dZ~8X?Bnsk/ rGILD60yiB$7T V)C-2pCf1wc8,t^h!F,c,1Z!j$~$"rxs.AGyL3&lUwsFe  {e ?-ϓ-luw״Mȥdi,9 )q"-ByhR;d0.Y⨋Y9r56]U8L`qLD"&CTdR8$u$-G΋/pklq8,cbw=z6Y:k^9b2Gey6+s&+ͶY)%{>D6|=Dɇ"$P" 2-r6w 2|UDyq7~ϰW\< SK.Q \^,!39 4 v8> cOy>YFxvoWqh؟wOɏwhҴ<51ӶWT1Wg?O72^d۾tb$=Sh L X $;*9TN9ylfqϷ Ħ'ިYg`F=&jrطaHQKCRBB!WHE\g?CLøО5& ;uSp`!.)'[AfDʑ,lڗ:I4݊`KRXxۯhљPDEI)FrMs-gzfs- ~1O?7擬1ճzJz\jj;fR) DՔf`#%*9˖JgWzO8N-m}_2qV37R)BN0߽Y4;*c{v{~.] nE4LSb!,C2d1(>gEDmt'tl#ohv6Ji(%U,)Ӛp23B9RO(5ak BJ||CA w{{P6TبO$0v&s+3:[ Nb#N];K8u S5@~Ֆ\4h$&KBwme겜ژ.dͱD+>lCg5azE3rrź. 
1aG@oؾ _q}/D5*>GO6hFf\z_8Spfr1) pX%Eu++Z}4WYqcKj?7:kVOx&fpt- pfH*$$ycbEqfH,:ͱʣT RLqT@5Ozj{vO=k)[4>gI4}gD(2m|o8)L hE;|dwH MC XRqXHMU`4}3.X"qBR# IpЙfR# $DŁʤ;j!Y6y^BƬ*ulVSfpHg6fuk֝>8ybL6ֿc"MѸ̬w tqKl/)sUn%dU'xq)İnS Gl.1v60<8g!.Lq49E;"@[4{w;&)|ulu(%Hh.4IbqJy ɔ,+'Z|U7@Vnj~.>XffT^meH_;Z)E:JGBgZK3Ĩ)QR$Š.1!u""EHP8?;7q!XSb]Ks$~5ֈUTQXy0g3g"FT h%1ʹ%CifdjAUr۽={@1 *Z)!<;E˜1u-c\&ښh#ʞ63l1I[vѤb(:n5Nd2[%jNys_|VӶʑS? ug# u)[cjGZƴ茁V^v!l+Y0[x$[XX6B֜jLhaǺgcǺRwwM~mֲ63>Z v̆-üb zBV~%um'eV0;fcbL6a.78)QmVp\$bb3Ϗ1c1sti xbhW Zix .h[ŏ{جi+p]3&z1)i]W϶dFbL//˺#$K;ɶ3@FgcQTp|:T+8ll[J=ݬ`|J{1Kv(t|?eإڱCN<FtׅPp)b%slvʼn/<<u$d0,;Cx1X!`KQKDg f#D:hL~b]ӑ=\]\1 ϴΌ[yrS̔8}S11k9>t1Zq,riVK(X?fmjG8:6֮"77D8jpV1]Isȑ+/|9_>L>$J\$Z$HI),)9:-HDVe~VpLbW`{CZ]WVnw~B%Ji%ztϫiI2B?BFP^cp՜!ע<Ŗ)}@OJTn)47zJCaZL+9n v?g~F4gE '}ya 2KS~i \8d;iLxW[7x 827_!dмDz:4k7BWiAH3VJ}2vg$ XcFٸ}䣞"yU8B.h<"нHhm3hKvTN[uiØTn Urb563ꅘHw&'u@ " 5([g;QО œyzꩊ!`ŽGc/%FalEX]Bo'Ўu:w<Mx$y$[bԃ0b}`T&y(^8%C8Fa$!P+y f,z$z{YgB'(%4+w{nf1m_HiDY%86ƒrD,i=>u+@YOscqR$ii>' HDBBCq*Y3ChhչFTog_DVe x^-Z(4ΠV aD"ZāH:voB:]_ HI餕,E߅1ir@edn{w*sJhd&H&H*X6Aq-D&>K)BhlL+bTP8XdCn$H9‘t>EIimL&y<.7=}nax\ 1"N"L 'i+5"(u$LxӜ}w$lDTY4<ũi{Z6iͪVV늇 aۂ7[5Ā』Gueq's8KD} IUjlBR)%{ VgSx-APDbGdp$# xw7hxߛORf=X-0JSNe} jplҽuR#M֩&J҂<huՂvBSMxPyI¨ :giLMKUE-$ԡKRNT׉ 5~뒼;ix{\h5JfJ>˛|<(mj)qi:^,>V bf21#9 7;{vrv6̦c%Tm >5_Y.ݿfI>dM;?9=>Ua*~nOkMbV#^7KkdqP{sO']okd%=픡$fdM'[¥71kk^"Fe":w{ngn/m2ݤ.1'wRv&*?'bBUwJ wK 6[1d|VJ*#PpJf TRE+zx?CZ'gΓ8䨇;M772bR< ^sތ΄+A%fi. 
[binary data: gzip-compressed contents of var/home/core/zuul-output/logs/kubelet.log.gz — not human-readable]
Zk+P %jtA7uK!ă]S?hfS>Be?G%QJcrҰJp/㤎 *x(@VwX-lpzdrݩdEbYjݘl|mD0?t)n\~n`fV+ɛE&t﬐iAMK\:BĞh4x Z\@tc,`nZmo#,]쳯c/R2/\2cL9d3 y0e-Sd'l|#Wh2GFA(F JؒXfsBZ+3* TlkG }X3o AreђO,k-Fb uZGԡEx3h- u9Du0@ /)H1h]j]FМ`F Z6"h"#viȨ9~0#-(\rϴ=FkYm|C[; nCe8uxA>25{5}/A;six"xq"K.GجEpZ>NKz,`T˟ A}LfoYW6_m&8ĵ3AKf\a r Ҩ;X3Y-zjasҚϹ($}P ْ;xT{dq_%=|,6ӧ)#3;AzsOr<&ٶ7?;r?ateӱ?t9藘 ]l{Ww;r?@Էn/&ٖ:otaRA;&-#6맃ys0p ŕ0ɥ>)w NžHp4$BL.h$íE[Vp%OLUP- */C[gb8=5qh)o!/roT(*ߐnݧ:+6nH7J :6,R>Fi d^Ri<%>T8fC;LØ'BoJTMފ>{{]8&Z-n5#r'tl̓%ަI)),o P+: \տ9P]ޑ@'AЭogq .}^(f5ݶGLj}C,i\G%H~Ѿ{rksF6N:VZ?ѽ.@D| M^aSeJ-ek_[<,($nXM&Dy+P6_g$%s!m'~E'$-Y=b[$kfX&~bY-@1N '6ZK~JğF8~*=-'ߊg@('= (Eƿ^ "RWI1ZqTH bR:g>3\M]c+g=\{5l="ViÆ~gUE4* ֚þ #=.[uxhv'k%6LKSlS{$R+LTþs2Fiq$!]~nH(MCQ>Q>c CaN^`x$-?),I'сTTrCTP!q,nHuN ^.]ٖڧ5Kbؒ#R]X0\pR`F՝V O(W_~5WuN3 {'\QircDh`x,X)tR:t )aVѩv%TYMsc)BA@HdyNb,%ChՎZ!yieA=E_ظhn]Nk\=`?+@]åZ0Q)wA @+.J)˰ Гj ߧpw>i PP1P(vG_\ؘ/KD<8oU$D )MBIN]n^ݫ36l#Ġ ("UD $JgA;'(A E ]ȉ8Y(mђ}<*'$f5u8Bݨj:;c2PZ%qzETP5=$'-L&{ci=s};v(Q]oe bwD8X) % ^[m,:Po,̜r \7_TQ`M ^%OMVEhkqIW c,K f Rj > `Ywq`9{_Kֵ[5;] ⴗbjki%kԮޥ6bLI mA𩗦*Fp|HPU|+ V,PA|^mx(GJn>eb>Ems€8%mt43P[ Quu GyOb$Qҕ_ IQgfv"V+@,|uk5~.J HP }#|AS)&W|;ӏ~8g@@VWbF&kOߞ><Ū$kMۃ EĜ ?6W^LJ\sA2 efooMUoӎ飯%]~Xk1gЦU}E(;L۶5Ղ T/.@)Q6{e"il6X 9-6 TyP?24GpݷmYB1-ܲ.7̗S;.A d[N \;ǍAћE/M/q;1v*8q]D*]9i9S|.fg5IQ[euA[d.pYԚ.\D;3&ds%I+O'Og%6$qaL=.$Q})>lulȕZOo -IIq˯Cƨ ,IUh U0R"W<"``C3.*##˝:eܼ; gAr](e9Ȁo+wϼ]N40p +r`రi܏\`gs;~|8{K/=C`܄ӀdBŮ'ЬŻz"TXj P K"rA݉P0}tP+ P $Ut`4,N,!GLn!+S+Zy\Z%iK5TH$RĂp+3@ߛ8y}Yrϧ Aސa%Cs#LL͔q ŶcX<(55K]0v N6.8? ꉄ?H污|=jfMzz#@,Ȯeg #4Jmy߂|5k_xh v3*#ॠůL)$ "7 >E?BΌTP TZ[J8@N>weʓV_/Q;[*8&~n;!3%jpAG)cP8OieQ8*%sƒV0n#XRe: r]^9Po>>v(WרzF#Ror X(R;'_A[ U[r["@l^ H j[rVxNV$!qy9'CX|X"%%_P HZH^XO4 DK@SUO^C]F8'$)y SI\T^yo$PLK (NĻpTѹp<^cǦ3T\dW{<ȎAv)|3SH5) O5jTs.$ˉk-xϸ2)yaCMQj2G㺻3;?T2G$z7մ:ܐ2ߧ\ݦ=U~2F-8Y0|%ȇl9%tە @{yw2| O3]Ո4LB\rҹldgoW.n/T=Meg *#<0*xq= {ڶ #;^|h#5nu B"Lf +k[lTLC'9I>GY>GY>GY>GuQ·\AĔK(BEØL&(N&nEi2uB+nuN-w$PsutFR;rq|!3LT~uG  Q&p П*ydN9㼌,SA[4V(- e9bLdC)(j>\vihN\[CYi0|{s&s4ۜBrXZ\ oW[`"rLJ8}3V:xj? 
]G4O׏W/Ϭ_U~'7GpӚuVBb֒W~>Y4}uW\c18[.hjY$>i dчن3_ɋI2Y2O:Sׁ8C@vZN"45h< w 燋_m&,@7Mx䇏߷5aY8a9Zzu3)\Y|{mw߳t?gG2LUZS[Q ly__%o8$^-`HM&h%܎ZQ4=h9}֧?F&hKˎ\E$(z$)w@ w.rma,T4ٻ1?,(_j-[?9?8Vp KHIH*8'k#f4Z[PZ12%s6cǧ)7>0 Ɏ 8A뵺PDa#R,s@)aB yLIKwcd$Ba|SWI=K0$$ѲA\ bz$f? I*\g uH{MJ@5nl*!L;jSBU$STh#MȎ?.@>i?aZi{>ߴƂdhꀑa՜;G7f{ uTO?dL.noe;^CXo`6ol5`}ZFg <+Źv¨osuzĖKՙiAqmD?Jg.w.X  >2Y mN͙y\F{>PIOs^@-~mqI?ՇnESO[=}mE^ʪ2d8ed/_05ѭglvX-nYԖ=dv|bHFi?[gb5'kwbe]WCa)qϘwC` Z,Z cqUlhtzԱY>\QK])HqhyDBQ)HzTDC&9I&zT@TX;$A@=j(hB4gĊ輊@ff IQh!<$&*5kmĉ؁jЦ9H!x]NtKYȴDP8cO z'uFbJV8h#*~|Sx]oA2{zzGp@Ѝt=$-N0uû-&N#QvmP@_#S%nmh=a:L%("}Wmג3}T2{$KhQ ,\BhNґr)}iP ht,qLʓ蒖ra@5hgԡ7#;*Pz 10bag[0G0@J% BDHXG@okw:&GHRTxMRO hD0(QQx >QH8Q3 FFg>M)q8yYv<ka( D["G_T*9fWHb0`d$?ޘxE'ݠS@5՝?M5ĞXwoߟj1՝/[,S- xve}ƼrNv)7daZ:z]Yg:᭻O8͋ZBCy!pm/ͿUHS>f 7:F&ޏ s@Y1v짏6\Cc1/8?*OT__:oP1ۘnɿ}? MXByN{k?*1+H2\u`eba>f.P* 4t1a\x4E4cle$HcUi\3ީ$cV*SM#vim򀒭O_M䶡Z2BOTĽni1oյ5e_;! ?\#+BmύR T<_ &\YURg2o>-lچZV_'F\<A6/@Pnz܏0MLÓHӔH>{( J"gZ xdb(n<@nj@*ኒ.J}gF 5jnVA|MO"Zv?+c7Y%oJditrWyb3LL̤$8ν!Rr_Fa<$AQ?eR"}/װT)8AWf\s.h? _m`$tuVNE1EgC%kG4C&Ǵ ύ5a4.BcqKӉkFQAXg@PJ 1%{K$mrΰ-BEAΆ%9цlhTrG D7 PWJ"VcB%I5oŭ Cr%bT^+zk8fa^uJ}˛ "ި~M}G>d;nŅ]~Aqt/UWo_~q6?5?ܣTLwӫ_&TOMbs4XO="Ŀ>>_]^RWZ}9(Lߏ}h".炞SXUu*wri HkR}ޛ_̮5kyD+! 
)iq"5j>7ω>><ݢ$撹8hVd eƄJf`J`h=H9yVPD-et>i+I4dMApJi;$Z$Nz&O?6r !7 [, >ifMfp_0Hx-N>-tmfp%5> 23LLG hj/MǴ s\%}*L+qj@p*̾[ZҼxT+IdAaNMvjoS{ڛS۬MPŢ!Ah$,61C(6P2/A=)Aa~ڪ96p:,)rNt(Sg:0ǁw&)9痔u5"kj |&$Q*2BĹ|%%gtЮOOD ݹ驘P޹sk8Tg_zz?_H"zW*Ы,錨NW6X)]O|;+x/E~ ɥ)Võfx8VA?B Q1S>P(%zi~ZWƀæzw&Mֻ5M8wIȘp ůy=I:$B92YtSn{Vsx LQI+kAb]k#k (h,&SKgqk8U< r)?u7bf()a[[ZՀ(?X-SR 0JZkBhE-N;/ ZA 9}h9޷))|EW7ڂ.KLMuNqf`5$x5mkSS]Q>bMuxҶ;@)q g+%whWJS]ǖ C9pf1Vrv sQz38Yr"LKa ~cTldlEe!rl\uw$ m.i\EG#KB@) CR Ll[@Um,}B >Y8}lsgr&C빭 +FB !)zBNLv~ZFؿ,u JtF#d]@;~ ]j[KN mTqI08 .b6mW$ĉWN7610&F/ʓL rNHcd֩>9s ܙ1>3]%2 JLT-v~5lĥw}{L "PxҩU_ɼ|ƽۻ_j9d1,bf{;}k9cx5[y]r$c.3X,}xQJ ^CӦkaNGȃ }Lfat~Xm\[,] g4f|[]l>~av?vM%Z 6ڝnhhenu>m?k-#ՀT"ZzhbZӈʰ}ZBC}~"дu#m.Ӷtf}`GCc!SK=%+lԘ0jӀRSCb[mW%$8@PFwe`E}IRKHŷs7gvU `^R37Uᕀ@gLy%V1X!ՂHk\6,*X;7*|ߎw31xZ1gx34&Se ݪmݺwn56$kݔacn:hc:(ng}0ad0V-h[օsX-FyFV6SvIT[m[Mʦym 8j*.5Kۙ@իۣQs/%t&A_۾Z7H/%b&hM_^*\%O ]N3O BKp: qn m<'t0(4"W'P&%O f^a@ŀUKjҦj~dv~5hFml.]n"5k](me5W-ovwP耤+{5ڕv/oWZ]ROwžR4% #z|464{nU}q5;4k%u)~4d(H!ד ;A"!bZ*0Y ?\Gs?gOpZ7Ik4gn mZ mf[Y>0q{V5Q`p6$)NBH !2ivFy)ox*(, PUDŽ+0NFf!58]j,oCXnpTkCg4uJF8- WMn-\GLpg.ox6O|v=NnwOWվ.wP>5\ s{=Y+Vu\y m %Khu/{84OW ر*{ '4qSn$=:rA;d{z.l"2WC@EL̤B{=5n84>}}o!_U(ѣuQ&$לF-9Ĭoc{MWD@p@ (4\꼕F(M$>4c BĪ &;Eg1\{!’IJQ[DpScո:JSJGˆϝ,NA-NnE=d࠙x1INu41THl+[.)q݋N0=̏]AU$N 2b^dBDd(bcT|XvDuځ؛]:hSUH?fӺT 6=0Nۻu9T"k9{U.مS6ƙӢAZ`e6UFdLnWc+착(V+h5D8yt#Y[^Ωnt 2b?GkcRbWpsC-05"aQݰ ɴe/ՖK%T @Ʋ8M!-|lQ"$  ɧ" zaV ( 8pL Ving~:>KC׭wuBP47(`ݥD\ͬ*(OgTiNfFCCz-o=A|v$>552-XBtN=Ɂ¡OqɅS &S8`ThQU_ ʅZc,(_Nk_RZg jћR桓njma/z4rV+kIԌqN4XwBsFGy&͡Ob z{r2_d\g;*!/:+eϿ4(pM;įr[1jmyq)Og.srt>daFS`ʨ*Y <KHk`G08thQcҸF*P@)z9RxyTo/`}eW,:cT>ҢүT? 
OT ,fuEccNdRSUqh҂8δIdP(% '3"(ł J%S{u,:fshhF#lq.i")H:"p:9R) [M&SE1PD b4Oy1Ur^AWH(zIY+L<(*Rmg5~Y'%؈[H`"b 3秫\9D7We%"lpy,}Pe='ouE:Y?=] i|<|~ո`C>32q|LπLW|_,7ݧ+S3i~.v~vy{ƶ…k˺~ۧ?+g eF)r5w=Ը{O.cmI-JbƲ%'S+1Qa c <"ꔨRs=}#LVaff빏N=`]ǃLc7_SD~-S+yBRƂT L1bO'%\-j,+0>'W.+|N\gQ[PgӦkfboYFAeTj勫ڳt ,]CTPMc' Cn@uڝ֘\tLwUʻ`9s"IlpH^J=$Y >XsfToM~GBrPd 6 $&jAT6h \ >p* *ޜI%+tܽF.Pc ) ,pkLJF*WL{g|kr?Zk o]+#kP0$R3͋Z{>tTN(6͆u&~%n VK9@{t %J`+K8ALQM3>@A@Frq6E&WoSbAox:Ѷh{Fi w7CiIO[Ӥ i洳>5F IF\M:>Bތ7><i!kCǨ$ KԩJg8ACCg%$v}l?=n{mR("% ׈>)8ΉHrGG%C6A:.!/66 ʙPgb<:K>y1<mCc] aSW2m{B29tm1 i0ԆXU8T7#n{h,_%ϋRWBHmT,".i# N!4H!aK GVbYst4@12ŶxKi찥רrk\>A{{*Аk|].?|Ȭ ?a)>JYo @;E|o?W'an'K7@gӫ&M?`A^_ '1toְ,Ip_fG Z4;.ՓaDt,pKb=^pj`PP k-Ric?b@ns!oV7<¶nĭyx"٦fMg=R{)t_MS1jaz 9K=zX,s;kyN`dtNZ h@S4QՉ¢8Pi8(CqzB-4Qe@,p)d ?{Ϣ㶑HpE$F &9Jgl߷)׈I 6!UUBc%aC0i<{Mzs;;k qoEw$V.,W6[ _u9Soa-6 58Ip;V}p;vp \l<([XJ4V!Ҳ9#q,^]uKOXX\DkTaz[֍եAѩ*")SNfj#Zֺ5!!֑)Zs|1ɶu㈞Vƚ份XyZ\ѺVҾ[h-b矻ضnsnuiPFt꾣u{|iA۶n-kݚ\Dk'gܲn #v֭. ʈNwTn/<%n-kݚ\Dk%gm݈> ߭. ʈNwTn/(>]dZѲ֭ Eh+7+ /NSg'q6 ?ōP[0Px7n$-޹*iUhҾ2\*)_=HpPs MSXզJ?OW'S?|};ͬn@x7#5 ?_r=t:w}1//G _| ^jHzXB~ "uaIׯcGaO$byʌl"x VtCac~,9,<.=}1|+@Z_+JL:2-OWĬX z\ ~{zmF&;p`0vak'U[,ZEjE"[AjImdy+P鵠:iNJfǃ慮$^4ǙYj:Aw8ߎmaUd11AQ1q^zMY|X*'؏ H>×?Q'tf`={t׊_W_r ]b1F)#DQ"dĘˆak9cHD!+` Nq٤GAh*~gbm="0sQ~DpV)AԅZpر\sh$3SI5О5 oiO)bW÷úD]駫ut7opu,?~^a>8%.s} R^?}[QȇTOobA l_h5]^wVV>lGm3_z;O\ vR= 3үs?H $p)ݼc|.:D qL*"Y,03,QX+t;C#`7 +C LM 8z54z/A?'`GbU Ԣ G@᤿ ыXmE b3|G98)X]4 `gC^FVwM  p;!!ٚ}/A{@snSҭ &,D*A jMjk8Lp V# Jʛ?w!=QEF߬Sb% ;Ys&8r_]-xzX^]GwΤ±SwF^QI M1n 1%F 0A8<`Ayk);^b*L.w93݂%{*YCnLoC࣑K J]L Jb_b~~E?{ׄO -l"<9c }h$ۮ5n8ulnU{ ۸sYZuy}og2&+M%Z!鎓n4Ϥ}"/F~ۀ޲;?|7`_]- %{HkVҎ.wuK""JB/o}u `sy:+3W`M-^7!aP 8R;Ң{mQx&i{ D"GccQ0q%XDƑ$(8 1s(T'p*L-jk5iU0rf|'ϞSӹv5"gc%O׺‘#6J Rr ќb )))mmջzTSU B9;USCSREbt1sI4U`2/ LTaƍ 3$LdABr)! o^]|9rYcN>ߣ*SK)vVIj'*.*M-^X %dC<ZO;ݖplHƅQEV+1%\3bF42&]d-12i&v+V|8_V*G#JZ#KDc.)P&81Fi9q$&V&+ \!Υ[a@YS&V֓O S eN 9&ZD$6XB849+V d"TK ԊK"Gb&Xi M01'n`ۘ0 FcbJ ÆX!-&2(7D2G>*pI+TDs#%0IP[E*/?8Ք? 
$vDRVm̖jlqx|b>g0 ~O 5|1H0HURDٵ$a\g[y LRf;&/-aq8x~0-p4Pa8o{||\FB@Չ2]nUD0Npd,JI5\8sMk@J63փK+IkEjPLųqwn L桋36{fI3Ah:NMKOA캱fjY0QPDX=g_Mcc!<t!`2Ʈ t Kln";|:DE!" xL >A|s(dg 3*H61؇'lon 鲦cjv߻,Nn׀]Sْ2f] 9j!Z,t6eD 3* էbtKKI ڃ{ poǚ@K Mލx qZaѪVVPeUᄖ_aBmގg뢥U@]b I>.wkCR-oQ2.XV.+9LW@)U7BPsy"\Pꆕ~to%냉Qcχn) y{] $T$3٭K|@XF>:fjoqd`Ry%`FAefU H( '^&.Nmr<i[N;E9Zd9QĔء8;V* 5Eu ,(atʃM(nE'cc hih#U 3głrFYt<sXKيbbyU +Nzx;!XbOk|~Z=I7L  #Dvvhy5bf/`2%A;+ ;k?k8پ֟0=cZdIӰsw-mg\؜v j%=(+ϸ,Ҩ$v`V OFJ.n'wq7(Y%*V%u!8kibgаVP|_3m>$g>+H3M?(6񶽦2O+ uPК.wp>wʝnQZ(CL:mZ4֊n,;GվTvng dE=D@ Um3w[~ݎos~u!wwX`m(/*B|3h{%j@,1O@NѮ2Zw>QIJ"OU8TkNu / rZh~iw~yrjTr cwOPg;[GZj dXSETEmxaB=Y3\HLNnl- ![Z`i`~G, TQ1nf{V#дkp#to̻?*'1_NIK1V{Y$Lh!Sh5!m綊;54CSZ el &iBo B0>A?ˡ 0g &;>#R-51xiQNۉ#vL\ 3"' P]3OksZwԷ۷vx*)mOimܾE|&5)>wlbfwyۘJ@Xz3>Eh5f- _{W䶑 /;^캏ZUV覚RRk}ID(}8D/22cM[c5xz^5s{5GZu`ZUUQ29|"sTu;5-pk,wbCV..Trk v]3wڔmO\U`R+U'B14/yʉH*qbCՋ/V Tu%"MYjN2o9{Ii|' ,)DJٙlIhG ; 1'J/T昍?:=~ՉlhTN"(x9X'" ĦEr". k-R(eЁ:Ydx|X,Djtȴ"d:@tm%ui#.D A$M+wFS!RF*Ɯ38=h[(iBL R`GL{OձR-_UVs3= O֫=kT`4ʷh Mp}3SZ^(%.wRO1suo0!;,"tEEU3{˃(iJpQ;)Z# PM1 LJ9o\uSRؓہg.P+{  Uڊm! 
5*y^-]C 12X (sT(i}#W+D]-jmpkʻNU[FQ6D2=[RbAp2JE5& |!BeS(--N2 L9OXNꤵm4Am8ƳѴr^^ hX*%s"(R—VScm]͒-*kGSG[h=돼`R@\maZ8%AE9s{Ӊ76%]Ad'ק@)I1uORwxM"";uz -S|~uSMJfT,1~|l@2;mu쫢[<"D2::L#qrI9!9gBWo.7sqӹ<._ߛ6Guu.Z9)B{h-\pȷ?^:^23(TbYPg*+,>~+Zj0giHIdUofJ^(OF^ -4<QO7@R_]>Ӱ8Y$J}0=ZjR M)$#쪏d (Q֬bo,P ;VkK˱dzc03"!\*grK, UPV!}JZuo`[E_ x@OV).8lx95/RYMJ4RJ<ϼ~鳊?'RH;ERZV'-$w-?G+*PT)a@2ɪf.D`.F|CQ5!_@QnOQ&Il~6%SuL!dmMS~pݬk~iiX !;.)UY#a+Y#Qmt ^VU1УfbIY<ץ:^SQ!vߞ0ΝWq(:z3o|]'4nd1 m@FWoSbkae{ү)J_l[BV;9q\ >9"m 2UEr]ȕO]G,M%;HJ9ю2*EN(DCтz xfZ+Ni0M!l6.&<}N DE|r `."|פ Wj\QDʥMd}] җ9r4q:|rq8hS7:P |>4R.uբJͩ-AKr= 3+qj#f!mٚ Jſoټ~eM,UW{:Pk,d́~#¿W˙$mXҔ]"Aο6V].>,k&]خ8ֆnoW\I{P{*Dsb)% OWià D%+3}1I4I:nʹx-,L?1xQXind” 8}c%[B C(PC TH!9|kMF]DQC?*њ-rVZ% kXbSfJ~g_F4~`,4ܛMY!pr9pҋ#@B3ؑaoHil h.ݮ9#0^pZӵJj+ޖߥkm53Ow˅o1،y9Lkt/\-4z ?!,.^'t>ݻ$\9<64c?^{h,ӡj<yӇoC:,9꿙,{[:HhV7VjN p2xk>i4ޏ<=cU: {ee;7aS F  8)a%d{% V_(~z~\_f$VK΂OGn":ƱLq~:N|?Řx!cQxpN#Xo\ )wߤk3E:3p{4'a09ꌄaU*ah89'4-\L}44N+ hy/Ycq37RiӧbB[jp9iI,6 *'~Դͱ * &8quhovpcRڻNUt޿ٸl&C34@3;_uȽ^L|2?)7mt%%G'O"6'Qw?-YP%]\8d/ Ï`,(]xcb|>:{ ~Bu|]#Bq\v)Nwb||7׽0vm5-Qnuۿd{Gu ?n7҇o0iq `H`8,DເI0:G\x@Ď҆.cĝUE*O)D>HԨTͱS%`~9}ltHRqLUH p( OD(f ͗,nP#zWs;lh]jUlLv]ŵ+0әKF,H1ĉ>b\O' c}0>)AG" ]/ngQCLlΣ i9^Qg7ޫt1봂e0 +xrF+vbt;KZ -ٴgɚॷ^Tj ;d )isܫnÌ. 7W5=s1uMC;vo;@K*O'Ơ/3 t(M?($Jm-2337Js F( F5#뼹2z6i*Dq)A5E&$7ڞ7B|()0x8l(251CTka4]QE2$)7|b R6 `:7×\`@A>+1nߝ9&_ Ts\rۋ^mlHV5̶v40W=# Wfی9=|kَ\0g\WѢE~,Z)8r&6w&FXw+3s}"I:uxEdXSGhΦi筗*?QLZK1bg;Zϑis8dn!;je#Gu&Zӎwب$B1tNͥwpGdgwt/Z쇸2'Y_ #/Ƒ?Y{!R`/epn19n @ϳAu_$8k.sicOs{7= r?'^*XnԿhF 3$1uI;}Ԓ08()ӾMougL[/I>hXFbq_̃^ڤ,5ؒoz?͖Lw( l5c|]6W~k3x8}|9Y>s4!+=_DDuBz6.]hLddV},LeVT% h}i8cT5@I8V^Qܾe _y*d?w 0W9=`e^Z6㱢ExC #Ls/4RaӦ|kcs9`\^0(w+&4iTh^3d4g]ĊYq83^|vNl^ZzbdLi%)F_ṾVO'~e3UDdBE;&C)U}z?KBNҡ*Y^)JzT,FR<~atUv<JPazODG4&A1ǰYF<O8fBa~㟄ы|6.#Sq2TT " &!lR2QL}`!LĜ Vq*r)Mb-6جpckf7z>[Q/f >ҿSy^}QaI_C-]䲔~cdߒ24G7d6MC4|fj>Z9ׯ T^077ۊ_JxRa]mw]>r\>B_9Ū#vșC*rf638mX᳷CdX2bvH[ohXim vջlCƮi|rjvI[CvXўƞc⿰b!xwM!m<n!u hp. 
l{.KI!1%X˔ 7 \+Dd$}A qΆv`|Q>5=1zB=T/9 P*y~4RO8BޑwmsP4FjLsEZk,՛#KsX﹩57L1aZҢ?f#Ҍ'F:n(^D?i4-uxK%<6`Cwg0`:$;/,VǖIv'YaIKԮ2" ұU;sX<>1_]uzWHK곍eҚˤcJU#`~`=1 0D\3Z< YLNPy\83ÍpU; _kPm6=/Vj}FqҫMNjS565F1ͪM o^mj8AbjUN״VUڔ2sզcͪM_jScզm&ćđ)b^?GL+:l@^P9' (\Vv}54yLM+3dҘ<||.##_+c1B2~={Kx?i#|0Uo ۓQ1+p[;1ґBN=:V+/ϸ'K;_>Xi& Fnm^ N8Tֶ"5Ό\d9ꏹvۘ(Maz})/čjRenKM!hT׶ 5Acm/: ZPm͓ Kd^bo˧W_S+0k=\r4~kDQ!o K;pӢ&8bL%:,NI& *ViEݥplNlŀ d~xRLi`)²] 5,ھH]tfx@-XT\|²:#l \WW k!2"[k!2"k!Z"{~k!Z"k!Zl"Sj!'zZZYS$hedФ3334LLmLLLLLLLLLLmLM9~㙌]~HMFJ_߯iIR>v)m>Chk>ChU>v=Chif?5G?djs%CXK}>CXz>ua]CXo>ua=CXWa>amCXJ>0G?d ǚ+[ kaIR>ua)m>aCXk>aCXU>ua=CXif?5@MN5Hʇ@K>)6C|t^!o>͇@*C||H3i@d? BuB=eކ+4hnql}j+mktL܅藓)ejLF+O/\\rf؎fDyw#2I x«; iY4'&q.x^@fZ'u瘷ClТfrl{##3qFsK+,:Qѫ]{cIh+fD5QDPEEi~媄'"5_ b, וF݄FT=xZdB% mKFX& nWb8/_*EF{PR3wPIB ? \x$z6}TGmf@6CD ?Z6ELEdDy0GiSD/N "kIAptREf1;UI;9SP*sхsdwf.W}i7W,0m6kxkNm@BbϿ!V 1E ܔ^gJ[>g GOS\A X2`*2N+v̱zMT~ufE.Ű(SpREF`K3 xPN{"cԾՙ%2J1E5"QHg`[:*FPqg#IDV$GdZNDVnADlU͕:74qii8y, e;ISWKp~xgt̅`>޿DP4Yo9G?M/ #Lfg4η~ɨY~Dcw @P]}k@ֆ32?wd~{8D6i9@ m N8Tֶ"1Q rc8NƜk tL2}? 
-c| W~v'?t'*DIPE;ACs5E"1xPm<)aHRᗨ/( m̃nSMOAfa<Ԭ=@;d;2=q~aۘU#3b@oa:MN%a!gKHsv˘&6e6Nd|N#j7DxSޑ.O<8~Wix>,<;43ǛR_õ&\vv;]r9b\&D?R5n֢#ڙ0i&I?𗕅\C0ϸ>rJa H1n6ɮx O BԒRV?hP2LC}r̴˜#/N{1>lħwl04^1Vd5kOs>,h{)Q9m,89lofxȶ*n" < BD3Y!y$ ]Łlqw ML%Z8e9cx.ù6je7l}* Nb~qvrzgg=T_ϗes>~:a[ ˘܃R7oIշOG>Iٗl؟}}+ `-PybK ՌȸՄ79dAEƂtd GC(^[3r!$&YOd(]BiݤZJxڥ&W H:G +M0Fq&$7<;+r%֒p#x٪NXԕˢZQOIN)= I7Z5d`P幧d$D= %`Zr g>-A`\ k;P2  @bz!5i/򪑎Y \b>t$/ٹIÏ/3JOCT* w/'K¯w]~T8~ɾUK3ĊE 6C#ѯ"!Gh |8ׅZ ) ~cZ맥 oT27"Glvv2q˫lw~s?K?^Qx ~I ,(xhձs,W7&üp4DC ^qY4 ϸQOozyQy6那&zŻPme$X7l "Қ)ůL49IBf| -q:=0"]l!L%]6aR󰸽I CBJZD ' Nl}) *ặ^(]g6hbSYB*3 -=d*'}mYMZ9k=\s9셤T_H C˶$.ȗo%Euo-ĩ]zD3rQ9ޔv`I`@ Ce3/+d/^Vl`4i-tx"dE*h+ېzCə=Ȫrpdͫ#9].^SJwӱIm\1Lb*N?O.^I>,s.,*9Cm|ߌ`˸&㟂<ރU0[MpK"mqrM[b\oƣO+x[ HFq6k/o$qѝ z֟=Y~GWr̘iWzEL48|;d۳ɶE ",tyƨ1w!.Z`*))k."k#t.~2~KWaQ>5|d<4$[iV{q`ƿ,k #lCJc}l=R޹[j Mn;z֥N3!XP!rBJu(I&YX|0~D˚6N#؞-XՍV'$ۣXjE0:IN'Nҫ̋'$?zagkܬ=xMz)2Cۇn4xx1Z z.W?w!hm8Z߮*4&{X*,a hhT=Sk l7ֵ !@x-pD/h*k[cӖ `XV[Citu0>^n S%n ՜#g[CfR3jbȿ9TW%ը[RdDŹt=-u=s.*ݖ ڜD\%عj smnf x((~6L=ϮF% C~.znrz3t#ZMu%|XĖ:rvc_%Шi?= tyKE%h+9xNo7x~-ڄ|F "mQp%(Pm *[]?_?29xvW'B \-~Qʢk˻W#Z`tW[#Q-7r hohV2KzI3qv% ?'*@ߊܟAgiU_ZvNRfmsrb[2m\[ xU٬yfq?{ ;>1N~*?F99"NH#F LMP1Zz}ԉ(X*Ѡ_QYB3.E GL)5ȎXD@ fUeX>" (Ea aDJxdBtk)kcKQElPtb !-#SQk q Ne[F]6F@yD8xLP9g8ƤxT: K9bFP^k.~+Q`Ô6\J 3s$6#`1tR|D[v㶁#.?L&FLS H9hƤKSB K@LG jo{]) gL^;9~ ^W w:luF0͔5!3Py:@m!L0҃;c ffx̔5PqyR4 FTW6}0-2V Rd!zj,>E) )0@19;# 󱶔F$ FL=NnqQ~0i-f GP{`{wvx9ճ\TҬewW~zq:^: p:yN;_ILW]PvB/t~was~6q{%Ap?zA6 q:NkUy";];kPN0Ľqj4)ug3M ) )٥ $iͺKօ/:8r9ە:CO@ьQqDH LAAHek$sJ,HF\T[ppJVQ~d;}%^B~!"3p7{[ p Ed `@T1ra4ey@raGyTBF˝ zuya_t3ZI.٪T-V Onfddo;9|osX83?-jvcF.o52Ț?M^<~)=Pя='ẓn"-w?d7Off݁Oq7!t)oƓa:+ؐ|l?4'\!J Pø\ k6$Y_Eދ 1WN3=13xvN9{IŠ!kyCYEV+TZ?T#N* @EhE;FY*R_EC_6UU$m`x+7=wiMcIGV>RK, Ag}ۨLB$RR'hՐDB/bH3zMWE*ࣥ^|suhZ\K?g1s0A5D>RM}-oC$ #_iy<"_uWJ T>ݨv,R$) aA2[iBhxTiavG!ww6E?[m ]O З^*5,V pj-N%3˻Mv7[A3U[`J<ȗ$Am(&@Kj\ 4G[Ru@]Dr^%$7Co JnmUjD: h kI[,;'"1fLJTql*Sz!9!3@$l͂>y૛M VvUq_wo0-O tO+~dN]3Iz"BH.#~$F@V^yE- x +)m9=+[7 d郐찞SKuj0tOev(g+YFTڐw4#jJ[>ަ\lR] 
ݛ6ol1X"eǠN\0pcUDӎ.9DI QcGu)S襸d+8mp@ @t`U B>!&2"GQ#R 6HW)f ZTiF  qOs1jGCtDn $G`)d.Rc040u'9"s唄3HN<561]+V<8Uȉ@@l"2" p~H(NkȴBql^ٮ;Ƒ14m^W;|p2Je@`Xs0W: N {EXT3~~k]qmvMk$.dq4f,-ht=※]g 'LV&ag7}6[83? 2-ο)Zr_myUzq+pƓޡ_,7dVBsZ'>8?x"B\C7Z2%$ ~[] N6nk۶n֭ h-=MZTmlcEtެ[VnMH\Dw))8pM#z.7]A  O^x[@oY=lKQ;~"UGM=Mˮ3] /ᩄ5ԥ(Q\zQC+3U {). /~ ft38X}=N~: <33+(W{` +zpOz50t̊}?ED]%{w`Vgo?fMZYT®k8jY{|܅na84?>{Q,UqRyĐJ6 JH ^j~IWqoxT6Mȴ]8*\j́kGKWy+/[MP?^v{Ug})ڔims YfN\:u:j$0+1BI޳mlW}ؽ4^`ۢAr~Ɛ3Ȗ#N=C6"m3g΃Dc_>  ˖ rdZ lssmU(oe[ιU{&U p۪ñAjk8U)&;t>1h U3#jزxPcfԻgf`cS++)ܸf%n.Hoߊn]Ǘ>ߧ1jdr8N;k';?p㐬48.8[y\\ Nܬ2O/u osV[K2ށoy#$ٕ@$yb $y7'B۸w] &mKunV/n+,*%GJ{ w-IQqk5dgkOvϹ5VFrV6n~sVn%v>QUj5`1s2N'c1I?ďS.]`{#'GHpكLG)>7"qAycRб<]8eo@R?T}رńJt䃛z94r:*D|8ل@XSfR(n/zh{@B'Gt9#*j. I*hcEL0<QEa"ûwhcJqEۦ)[ ɖp ًɶ{m.TP3TiIآI8{w֭)[ c[f)6KBobq bBf_bωū:lj˨N྿ .QU_d 92ZDDӤ~#yv]NC rzE=x/Чc /[o/_MGP SɵJ3fئ \2J31S!ɑw~|(pw\ ,M:땣b+s[9Bf~VǾlea|0$BJý;Mw߽+ݶ WZკǣtX.="k&%CW6H2D8%l2o>˃% He(V4e(N%&Yn1'2"4qb#/̛G!_-JoLU!aa'beA$M5Lg#P4iQ$)&*ce @w8N͝bDgeZD8E5nFt[z)/lBD.*"OY;U>/ xz\}|oxY7SW_#0ssswgq|y5A1V2x y~BgſlR~gR-<\d ݜ]K\K`K;)NA8UB %eE^;u[/.>&G?ymǠ*mlAѷv5@/73Ό_;NHA-1CQE>,09Np[|owoߍ g:O9874DYf)F;R"8 a&я??\jA#ZmSX5&Ry$3EC^ 6>lF?[<dUp:k6xnR:R+]9/׻nէE&(CN*e, " duǘvCRV"!o|. ~_A޲ƉkǤa$5wj ɉ a 3)n9IdrƓ펋_NF|,햃R9xҳ]q'9X'z0j.]_]KbI"$cdś6Z '!7HmDIFrRHqwRGZڤJހ=_r-Yk3Ou2%6ݮEHt[`7 K-nmJjcyƓ+=vchA!O&1$ST#Rfn[~)l]9Z."kÝ mY3zNd4YW9'?oa x KI.4QF+s8̘Tf6S.rH%TBV"^7ΓP_GC{2׵8A(({\XF❴TWȓ6( [ &TUD5j PSeTQIyhR//5vŷ/#";>-9}EiΩC-;c! / fvJK,Ҧ \La,\Z$QNqnМt8x1e&6V8ď4ijch4"رTa|RE%db. ['F1)We(𗳁;؇(eyn3I3¤e}g`CHD3 051u EUOXΌ}&S_-7]#0"9>2^nFf |etz޿[? xeL4%`T?wzV4wn&ӯ@g?٤.]]"Bַz`u693v#+ig$ YBM5t@9Vj7Wx@]k)hk)R@al}ubAju`%C &2)Slu0bZE`]FH(\58pNpD*s#NujemTYAC1E .LE! e^H@aDckħ#ɵ0)S0iQ8~5Ю[+å p܄l=c$9:;x9BR:bQߋj[.ǤrLzƤ4 s0 @wWOw`4G{BΛܭ ,~9UpQyCݢzÄ.n}TLTs" >q.IJm=+.핟"!xK=G9f+Mk_f|O>5DUr1*gT8ͧfh@d)XO \qAY"or:^AQVqN{w~ԎzH7D}? 
)Lw7fqY@z}vl6-kħۄ2Cׄ{Pwy#JEI \xfB`ρzCbK}H 3SD*kz#~6w`f$3l:oQ.NdXFUFiV&QW[q遇Sg,aHApVV'MQb-k< /AJ?d5$Up0xUF:f/0ׯc66Dx,:77,(^>/|^z::C3T9M Z;#0IaLq` rny6MIu 5XpZk\ 4` rl:DgmkH4`2⥈MϸEE؜?\pm>>).T2I3b4ȷ36ٲcp`5+6k#T]:qiO.1, l'.8pJ yf'd>@DH2@l" +J4Yb੨A4#9.&,:$]; S"L+ V1S wVu{ˆw HׅOຐeAX3z8.dF+u!8Abz}N9 -#n?S:D䘈危k$!x|4'a{EӻWk-5Vs^kP˅ZyכLﮋ{)⳧){A'R0DsP$%7h 4(LQ:,z C0,^/xk¶u3Kwy^ |ew/(B @ɀK9. /Lݸg˱^hrzozWSˌʦdr66]9Y!y-`}Sa]c-p$^}vx"ĔoŮ Sh(;uV8@>7!H~ GHfŨ=:rUJt^^챱x 2K6KmTv_i1Ow`ƬQ-1;5˜CnLtW^:/1  k,ĝ5: j#Wz^mw4KmX1`Q#<h5F)T1!,#|H}nK)M6&Ղh~–hF'̪tm7:&J&)+%SHLdt`w0i5cT.sa8!R2N$lՠ$%Hii Hdc0.p$$tBzYs"X$9HT0 "!1Ak"Z)?MBXph$|} ΡR$"y7T0A0{9jIH5B7n>eD)MJl޴WV[gլ[MFו=_P^T+mZ ^eצJb!:Gb[*Ŗ;&L)M#t} R۸l6m7rmH,к2_86ڱV~<tF/.L>%<\\%w?V[d] V<Nūj^X bgto@f.dEZZ's WDXG!nȸV|nv2{Î<;9X"Snml&'9o@r죌]+]쏍ԲrA }nM$\˓ȓ҃{8cQyБx9A{,dGg`dvy穷~]Zh̘1`5H#Qk.`GF@R.g,: ?~ye[,kr;WlܘXBuu9o3X9W;R)WtyaSwanKdy~ZSFt.U݆9= UCJٛG X P!)݄c֜5:gʩ̄dSjD0hc|åN:ZS&cXW?wnbVM˅'+Xw횱uu;3}1QzRbreLˮw.Ƶ ʐ9£, шF7 E* JQ;eT"٦ T-f$ه,*>/'zĺG`}-V&dґ'69auTdZ^?9 -Rٝc\3B\+YGqb8`:NցCv%vzi\(ʍD}]bh ŏ=umF^,tM 0j=ubhw.Q6#_|(]-K cn.~iCt!Ss`"~E1?Η`,~XSa9utjԌh]1O!xN/F"uu.b]XX$Oc3XǔP-hI<t'Zs$uza=w˜AORMz1H[XC| /b}( ix?ƃ/ϒJOȶ+J!\'[plue[Pc)g 0(5L*klr08hx:X-Jdp$[H~kDҗ;(gr H9 LiKNIl0D6{B}]}\&#73ZGg\a_X ;mFsG%Y95^[1!94bzþC֖e9 >1J0ݔ&.uªt '(j%Bf d;Iϝ-[̩< 8EBk=)EkA\v)%B$%ZphN>nH23cZV=Xj ]'g:9C.;CX*x,"zXl_ z#9I=EBdRMjK5|K:T\1iTwM Eu1CQ4e玢POJ5[=ׅcpݺשeD)zzZߊG%/IAH_ׯK̋W|? 
WٽYtv |O)dF?^gj&߹[[x>XE`XK| ӄoå{G6VT"Ar+v!,O|c% :w mu#pv\N΀,DB FO5ǁ^}اJqF6Ά⿩uoXt~6A\h].p^gA\E@ Œ ZHqs0 -eq׺1UЯ.3߹/./б q"sB R>';ƛ1$t-/@/_B*AjH@;@9c W 2, '@P_EDm[s($wg5+W|m5LQ0mi2TkӜ ƌf &JX-!i$Ge9OxXX`\^lujE5T%Zjqcf~ ]1/6EqZdVIoDt(RjHJ'bm֔$ۜjn\b [%RjVJXs!yXo~}s5jAO f.}5 [_^igq$ׇOfWR w#P5hd0#q s\@7de)7:gXe,+%^;Y,q~zz\y+jpjR;atAg$UW.v?%LŨ>#Ko]mRuϋmD˽\WL)w/g=M~^MSW(y 3)Ŷ/"ud\'9XNK5dHQ#yf64"ETi7-EՕpT`#,I AIjF)8q[ksĚo)~*9͔fǖ͜T IN-`>͓8IHݛëfv^]>y]1 1l2ARr-uQur$X?TX+TM)]~} Ă$.Co-EcBa9:`}Ɔym 9/eow>DMXP>3mLSnʫI9G1liJW靄 g tH0!R e.zƈT# !If䪚'\;GLK&$uj}>6% E9vϖB`opv`f>Vo;;<,l9V=ܷ[5aYu`zmv~*n-À LFҹid=WzwJc?`;| خOG{gRSrqa%PrW˜&D+5Pү4?#yܠ%MXڼic/0mv]3FcهPhAb4Z*0 p-J!bJ3y`ܐ}k>QoB0st`p8T|P/ѲfpNSS fӽYMƗ|JTnfVT-Hәɛ[T JmtM| VK %- 'ه??$#8agw0ۻU'=b]='K5{~pڦG^*4:IUG`B9B@ $7[ƙ*Is s֩PR( āixm6p|A@oF|[H& X "u?pXk!Ug)_λZlvKUd*ǽx'rx秇G ͜ƻ I*-3|n?.Y_UkZpJ:.4>^oOuwp]j0zfz t-@'uvk-K?LMPn8Epl"^~y@ɻ*]Rab~-Y4IW@P:oF%A TCz"RZE<҈o=.tK$7@s ? H(,K q'Ep2K1Bj 9pM0Ji7䓸ЭF鋖aתhI6[7!)Mt"c&}ws`.E=[F2֧)W!\R)#hlwڡh'= Z%nE`#U`V3Mc}t KO9$ ?J!C^V]YSػhmVr DLT_[.+}ޕ6r$Be֘-v^@k,Eg!nٺLI 'HJūUbQydE~Gfdg%I&cNJ֦!Rz׻qN$eFv=j70 =H&U=9:F*`b%ʣ44b 5|rdeUA_+bvJm܁9Jۺ _׿}f?< iVle_"l]~Qu߽2$pvmdFAENb|V1h3cvtb糏r?-=x縤CßEv4aJ"Cw>!Xhum8S esu-H XFN}4swWiɖ7݄}~[)^QTeV3ұw[DS$7ߩ ,ȓ OʂQjkܧj״Z#xBnc:=rylZケkmP6(>AhԃaVa$C OwW;$E++V;Wddw?cdmޞ#3Hx8vh& JPBC0뤽c "mF܎ׂÞf1= @nԃ\Y92)JJYm*oVk+r A,laÃwvw{k_`{W;3_Zy ~+g TgAÒ&|+)Ĝd^QvȖ=P[9 # ZB[7gW-٭ Pgb^SpfBd/ZF}:|ιh<`qb.]Lkaußw7@O*cOҬE/4QB{=Vх u!nPݵm}%9z+;Gs)ٖ v_ZP/>: " ۳XO7F[دܔxɺpQ1īT@: 4}xgQ3~EaN=XO9VdT7& \R1hM̬ܐ\NP7l_+mhz#`ɬfz͜I4nni$/5X"Vs۠@ SúO_t^W zI\[0!1ch)Qbӳ ~XJq (eՉ1 v]7/lvE%^)aptW cӍ`Cn>͠5 SBQd7rpq{ (tGD OaaZanoq =r,Yf9arL\p94ܳͦϿ2(bth6Z3caTCP)H].$)ȟ\҄+2 3g9 GɆyg%O*eX6jAN]#1kc`f"/)BKat63B,"K9FcJRGʡ2Œ~>z\RmCr*ti/+Vue\MPvc~L8^R CTj㽷[}Vwg2fx <,Ȩsy;S6*zC$CZN`]C~xL!HjhRqvB  Tԁ(2~Stht?IQ#QJ 4CHPH$`rkuDAPrІNt2\ ޻鳲MS$n2ܮsˁ~]˫w-NK5OSlֿ&Sץg(mr&u# QNu5 3-' 'oہdCހmoI;:Qd/xRxRxRxdbIEy21+t]45 6&$gU D4qӺIvj粿+Ւ*jѬ'  (%XQm&9g`%ir hA5fF j ЀM,Р"K!E%W EIR`Eg3s]0RA#E4Fc~oDE+4>B€?(5&e$׭]s7 _1K" 
-D|n)Z]c&k/?"O+GAB=wzY6rRWO/tV2I<P,yҥIk4G}zaiS7UpݕXĠ޿R2PsA&/h?O"y岒YJI R!dIՖg,lDm׈bl_7gO|4峹1.*,"(dArςc%M%o NJC!ҶD^h}hPIcMFgG'ifܫ"t'F؊n8ISZ EW4*D^D?ޜ~/!?5x}hCə+*ߝei,UJ4.oonoތ(񇛛ׯ^qa&X|!?_;݄'^/Q(&vu+xdоrV,f8"2o ֤u dcM=7cIM*GWa֡{s7#ω!]ߞ'5+BBψAdr_Y%$\ݖXk% Cf3S|MqtQI/*IQx &Pb &f$мR*i  晉>LpmG'8<P{ʲ=;{N}<ߣ. U+(uB9Bu1pD!dm~>&Yח1a PstBb/QatΛ\c6R^9oxL 5 H[<ǃ X. 1ƭ"T;YvpEu'5-@Kez6m # EֳPӣM2⇆MUଓ"@6 =Yc Y~0zHbZ .8(!5U-ePOI y~驝'4gRPc䞑q#CIFǜmɩ ,K)̓Y L~o7U,3ȌVf,G;?MKMQMiϺGeOQ"9I%ڄ9)2hKg!9P\ uЭ  'L_ˆ73:+j~8/Oz~*A ]>{n<B;~x9kr“~sc@$Tޤ?^иp v~ՇbmdLFߟݾ?T=i;O>@>?_+1͒lb)G@d͒i~J.spe冽eIS^sN FesBw_^/^u4Ȓ+!P(:d#SBGI'fP[dz!b"fq9䵏Z!EX\*ת9\@q(m"=1S1O&#'%dM״B8(F3Rg I2'fV`4@ KdN.d^Eg3S4^l˭/2뉙Ĭ\p1 2ir0?3V(Vn8B'픏~|G)_3FAs)yu{͆%d6q.kqw?haAm7a irJ؏F%gJmd\4nUPC7e&1Kȴݖ* xhԂ^Fgg-$ 1[[ $ऍ\:o$+Ax(ɠɉx3mdʑhm3USB&pZ,(&Z%ΰ](I< 4Q s h\T;s A(//I§uND!ӰYro9ȧLipDqٌ4楗L'mҧJB%:#dV(Y֪;kV`"ȨT9/VKgMuPE<"m^%+,'&+zmAD1r1# uҸ-lPƀ§OqlP0p -鰗>Dsw1oIOw4UC8/d8M8a]Wĩ9?\̿T|Z@b֋ UMT!R}T/w;q#P37;cI#mV=vYȊ`;8Ix8I.IGCyc)uj ' vcO[$(CJg'z$ԕ}\LO "+#^rɔh]ڣZe!d(FL|>=ٳ\q1ZD;攵ۣB%w%jUZwܽy#L iOEHc&\CVX#C89L4 MT8MVͦ#jTEPƹ &q&qh D'@-.qĸ'Nuh*wr;c|VK36JO}·Cr} ?T_x^Y+e2=,5Bs5IeK0QIkT̂L:?%[[I_|ʕU4{Z(W=e.|{B }#RVu(<ICw(܋Qw$\~AX=ՃpSM}`+"ޏFrN/n#xztnFOm$Pi=cUNNzV&FsTӮg;&:#g@qPywZ4yߴ+VxF}S;C׉o( -!ct^jL9:WQ*D\c>û !{rnqyswqщ_?1d)_bHiD79λ៩C࿏ӆѹ#])S |6Z<|z?<*j֣^8'l5[LAP8r/ DW!lNj1jOy֒ oi3ةC&Hy'ZKKd\Ё3xO4rc+Se#UTzfDρz_Ue74=%$@4I4(GTS a^ECy˜hYsAL5!ظu wAuZ"&mj<eHq2$*F<*_8)hYQJ{lN*X2P($m‡LRcw;:yƂ pȩP.Cܘq2oIR.xƓӬ&H$6M"㫩6y(R JX}א HUcu\22MHL{=fuRFBdh8Q+vUY=S垰*_ ?lw bjq'+诓?/.@ + 4{3y^k9LΒy&iVՑH'$:3jHF-x!-4jDC;9E;;EmQ@9Ӡ߃VL>"u@]0 %`ta&HBUGT;NYc)(F]dIK</TG;"h,x21e8QAro=rQqBc=ɫw-%])mJ_~^׶ܫD#^a4IN07`pICBryy9Z(&'"g BmHhM̗||_W:MLF;#SxLeR( ѣ[&RS lv]_uDׅL $.~? 
jc^9qsyYB5@]ɣ-7/(}64)}(ALLpI]M%Jk=FDx 60!,1P*{4*8#}" HJ۩V!E{.vB!o¼e7cCoCe}˧LFßǺ':,򳽷 LhjBp%W^R D*#'A(%Re䦢 x, ɮr j?wTۯC΄ڭZ ߾;$+ʈ:Y4S=Qpv~;}َĨ +Qlo+HտQ)ݗhPYVqپ@ 9vHELjZ ;PqJW;|rT弮2MUSK2-"̥\ y6ި5Wӣm2*E5؄К IFKo{%89J&V> 5þ QWw;wޖBT^V(5ˏ_Lg&{酢ívݹ1f2懾o_$W*~~/P֜تkwD7 |k KdՆkN]&Cܖ7|̡~a6-2{ђL&s=[=և[~vw~16g<6Z0?.rԿ7:ֿJ&xjTRUw2¼AE@'YOi_x-IԨ3ëןŹ}\uR(\{x]AHUȐ@tI1g1VNwA:/Y{\X0ݍ-L)|"w.;2XڼFѲy}9SZ+\oؔIZC)i^){R>1% hyi"$C\k^ѻB1#L R2M.YimʡK*"̶8\VV:O/5:Hģnm3`h&6_!4מS;IcOVe|lxNOs{{~vuY-$i]$sFWG0"QIhvhܧt"J6=Ҥ *D>"ѥދ{3{ F,ċ>#GmέM˖"/)9Ϋׯ uYg??^Dd/]UpdInX3A#N?^uBcZGg?:@lScw!kB+].y kshy xe05:$p4_˓g$ꩬU"IDW 26Evq_ܲ4]J6ޜМ+%pdV,r2{DT&ݱEe5&Z2TT;xphиdx\2[ko L!tf :}!I֋ڱ=:$.3$A@n&]+$BP;>b#w1=p&"u=Z?]:)0nbI6R,|f 5DZ@vkd@{:@kςxr7tRTbS3ֽ83(U \%>J @ u5&Ѯg_V1f{ȑW.`&|?eLIg78 d[r"vb#dXg:bMVz (DwrVieZn٬9V9nxrr ,L/I͔`@ B7jxQ= T031s'4~+Ӵ,J.mUxrf󉞖<A8`ŝH;cw8}ߵco 奀olUf`Z9+8ʅ(Iϑ$8CPm^p3M˃!F|F EQp955e vUIω*e˧f^2*cEu*pKK%E4hKw8.T4_U/b$$`@ISZQD<DRc^0Md%SXophz2eiȍ;&X5y  [(|́1 k]&I&BP 5jQ׀8i[VL_H_,6aRm8Z8)1m .h0hmET#dwAڜuLՊbX3^>ern 5$$̀C`w5g.* XKC>b5۽#΋۔bQO@9UQLi <.X'mܻW鄐U @:QԒ3۪4O _zߟ^ZWpZ ՝UwƪWG`,yOtjXĻ͆Mޚ[B'.g]S/&Zͨ@CMx`%)FLq(GMRuk@"DΨIl.[ ~fv!m ,u S' ) l VI0^$)<'Vimr:ޭ`Ń{~Mm9jS2SذՏ:2G𨂁@n {`42RQoc\-PL`b^*Fζ(Ùn+ * >")Ƌ.؋UgIL MNB"]N_c%<,,3 ] ܴNSީs?sos!M38 'ED(~.ǸY-[ gqŶV҂פګRD{&ވVRi)r$I1ۯn59927:CgӼ ڴq[{6RmulDY'ܞS":=ˎéIUYMrj}8~509RQ:G omׂ qo3g:g*%i~8=no5&t 6Ti)2D_4"Hjl՚@S61 4OFJwZ(,q G>SVe3z3(u-1M;p1-cSI& #k8 2x%+3Yq~kGKLan5n#Zr%%Nh.X=9-1!9%|̉,TFc\䊮~[bM`\9soW5+L Vϔ&[|\9Ωϛx4"qHZ89ѪM$9պ`&ifd 8 DAvQ'n.DZ^zn 4qdւ}u{"y Z$%IlAN7:-f`8ܦJh*9"Aɋy[S eԹPG\qʽm)`& =0JT^Em7H6|AD[q.(#ADYuTLk4oMHJ],4y,,I2ggSbbGa&+28$(&922d5Fejr_.XA[ .֢-1 So"5;68TlMʧ2Ѝ(\poQi^FTmt#}UI׎H? 
Q'jGtzVo+)/c BW s N.||qq_[cl1lUشSX#ʭm.8ʸɜ &  UC`x}Ek:Pw0YÚar b4`/͇0h>|k>'Tt4&'Rjt:a H*__b[|ܲslׅKUyi>i{՘ J }<V# ‰T~tbLкJf8I'`Y8HJ3?h2i <+՚ćpbaDM&jH5!EbU RvtK3Bǝ²(wS\231Ƚ a?A遑\F)Ɣy扴J1PAoRPHsC1<1$`8H)Sn:Prks"Hˇ7s-$5X:$;5Z5 #`I8.66-g(1BA.p`֚('\Qhk3X%$c>g93BcXzBYP`AuJĞ @``2@dNsf 1*6[x 8EȼPk˹zl|pD2-p",<\D8-T掦5⎦ZF*Y_j)67ـ2gx`9񙦎@-!(@Lx3ƙ"cS4/xsMsX %`Á gsMb)Z_g1Tp%z=N/Yeu'~Aq ˄Sl (JTmL,Gm_S?M nfbef^Y VV|%U6T|[6Ds,PQ[}"`4ZU+.!F?Och bcAll4(66*RpUv:LPr%<:IpN7!<\ҶvӁMD k|!T;1iàTq ߓ nxa ,?\Q].x~yW~ GzdΆTھґ{3/p0b\Q3KU2fM8!b.,[,Oyg)|z88ؘjbB`!5ryz?c_cM=ևb!['#X'/Gp~p_'g.;\Z</Wo~>Wczrܟ{\_n_~ovݫ7/W??v< k .\Y׻t4]Ǚ]vl~}<./e}]7.R{/C6:W_L*{?&v%~3T㐇ȔB>8̋GX x2~t%.ZB@o&/4-·ggv R{=SN`[+N 'B] tG`w!-ktG^a6._Yi2۟üY򊌫zX_K[ϯ{lӏ-b'yj߿x{_MF c :O{'wo5, 'c߉4ܼGcx?Omnj 9ܦp/>Ol!uA_=`ǹ OФ_忞ˇY !rhCkmV_EiM~pc &{ ։mHv5JVVx֮Krf83!9h;s۹zT`{io/^wށy|1wn>%4n\9zѽ:v Y3iyJ/. (f폅M߽饯4՟`ں`];Cz}u{vAd=}~yF'I'?&~̈jŃ5D-׷qӿӥР %\N|r4!>\%T9n޿$w!yclz[Cqab MM\3[ޱ%1B Sws6m vz=FP~\*ΐBp˴_zgzg9{ S@(٭d5*\+Ո!W\r/Y5UN#d02BL,W:R&\N71xM\=ZJ՚ZWkjM\mq΍U)Mbjw@Jgɶ%,upF*60kirƬ--W$rczD0KsGa9JҠY™5!UߚQZ/?lM2]4V̨@$(SFyq@>Ȳ\V@-sஇRw5] jpp9*y x10V7F^UXQcm ײy%bB,PK,khzJU6w2&b:'E|mPVSS^hVQgX&-–:S`pBe*x*%\prmғg-j6TEƗq1^/^NZݹ-.Mc)́ycB##8yu9qͥYke4!i܄%C K&,ф%Dh K:(sNq$^s#O>|=,zYٓ8Zm38OB o$$kb * uHp p9$Pn`lbub (#S`"!ey2bB' qr0@\JJt| ~!Ŗ9ɠ{!)uaq߁#Z42AɋȘ)tH VgԄ` y9Ǡ@昇(4 ̺@A}+GDpSJT4|Z%uww3~LY˔eLY˔e2epE^Km ZCUZ4QkgKUT 0 5E RˣU4ToUT 21DY{HQ?_^yJX)/^~s/cb[rμ`1o? 
c0i߭7iTiwOu%40s6) ɾxW8|=0nzӪqT51,ebY֬\j[ȴæak/7 WY٫bHr#YЀ% T˘x}ȃ_cw`m^Z˖*Y3XR%JOOs<˽Lxk2i* #%e?D'>*8a1Y u^>o;|%,MeW{u'yFe 1c0X; ?,vZd5qyрВf#DJ%`N15HHW'ӯ\HҊIJ&ʾQ;]bYߚXml6EKd-lH|/uY9ĈږJKWW )-c j[r|lɚJ+}H@h~.j쇝V3(RDUx}3۹*},x pN>nneWSk -3(md߸ +!"}p-gF (P"A3) C($Za`LV!nXDj[ʨ-P><_̄TxIU 5㛰>+>0ٜz]ԉBSX' Bc"Ԭ?.RٱΛzNzb$|w5_1 J9+RơꢢbD/J0.6'(::L0:i8tKqK(qds3CU)aUʠΤuqI Oɺ'ObLb9Uq &fJ.HäK0*QevqfɊDcR ,@?;.)?R횐;\u@ԊԧBй~ a F|5YsUHwAg*IR]bΞaSISAjཬ@bl[V/$y DNqh@<&BjJլP gB#ԬljNu",-^Yl-RӢoEj`$MmhDk6}wاޜšz׈6}q( Pc4 5C' Z]Xz ifq%tϳլ.TWݭ{d1E%P߽l ZҨC {.~r 5=cZ> \OA:D«`JM'Q֫gnު ?r9`wc -Ws:-0-T;;n^#즢 j0qۙKfEx SAւ7#pjdw5/NFt)ӉbB&+NǧT] }.睳y{_=y'7P2!fYO.QF޴6 lHfό/ }4=c\ܣoQ}`uި{ ] {fw6heVxlt׭,+NfJf;W>)'{T<;Sё8~ ۏorQxS-]ԕ%=??><=h67MQ!.JX9:|qg/ϐ=srãg/O/>pb燧˗)*eߡgOO|T:Go:TppfGn]ݺS /[ܼ8~i2\v7PiLZhO+{dt;s yza:Q6Fͤp}Oc虙^'36}FU^gO`!,J:^= ݞ9+*r_?@ЏbA]5=c;0|P^Wlϛ=˃7W;CoKsy^ 9[G7W?`FN^t݂yo o|\B_\tߏ88sQ śK{K_1ri?u=ew>_{|_O`O/~:NDk fAjx:[LB?7Lpm2:@EQsJK?W ?5vk*9tro\ #vyIzn,YVa*UTX PkC_MZsxHܙ/#&JCQ`7#c0p4zAk|fH(sl-< rD6!Aܷm m nq:xqJ{ķђ~vE* =չ<2 ))Xe,9pi.s3@zpMR6Xjdx#ϚhӠ<$cmwޠ4A? YO榮zg79s"Ng 5!VqX2K?K0un/V{X[_F}FִU~EŬ`1 )BZKl:e\oIjE{hI;I1~L$"?4#MnJBIڰB0P!%' s4%_GD|øo8I%cK0ˉ"I;OF]];wy"LL:{z>Y)S0PkFI,#kP& doou}~ˏ!+C1?~qўu6#e; s%䴘յyda1wYwmqz9'm.,A  A@-9/֌$G=Rf,c8e]$"9K~`V֬^^5abt 76BfBm!)~y./_׆XtJ).Mni+7V\Kzثz@B`vd=/r:gϡ"Tm)j3 okge!3Lg`ٳ&>oG,zOn܏G'iՌ+2b`O!5>g?=>27__Z?es Ҿ/OC'7T{m6Id5jx@jVI-%lW@RNaV IBTU[ZUHec$?Zڊ'#AA/wJJJҷ?8_Ǖ>qAJm}+U]|+` Ա  *.BҐV[?ajpa7W;>,wg٭^WV|F-8,9y,b2è)BJCU2Vs >||*9\$A@|v[s==s=7سcPyΜI]8՞IymQH.' NY ؎B֚/&_@=B?|6@7Z}<.~ USk`x":8ﭺ˱%/a|DYސ 5ϸJ<c9<_LѼ5oP=H=RKEq -}b{|$գpiQ=٧Hׂ3UIEqY4ʫj!_g/?.~͌z<ֵuN{s-mjYaV۹W_V?},d^F\'I:;Wx B؈<1I8& g'0hB8Y;,'' Ă?L܍Ċ"IQ\a1)9J[C;9CR질YOuښti}KJ;={{A˝i}ivڧ1tQc$ww;c$g-:vUFM Nc#/&vi!'}]C?{"xv,x|ͰWE}'mPFk)d!P4P|)0dHSCLI9rfSD(X9e27`ݎO"M~ÃkӔhĢ|("]I9 @a2!| 9a}O'̇oE?AQu?.;>2uy'㚠u4Mx]vּGy?s;k9"%6{],l%n_=9/\Ňk-%U^ Mzt8,xq|M6|iכv銱?U㤂0k[e 5G8mŔb,oyNdK1PxsUͦvJJbz&T3C8g%~`Q$wн{z.V6ƣULYBb2Kz寋R蒍R 7ѣEg-h.aU) lX[7(:(ځHDGW;@ ^_<:WQo>%pu? 
}?g5۾jك3]~}Z57_ܑvhΫϟ'>?4iY,N>kWV/_ɗV;h?٥@}Fh#Jw|$Kق/"\vrzr7ދ;ڭGk9<8/OOe6J0Hnt@F^O@0K~oj(<=+K<[<,)WfFf ~jkjyoƛǭrekSQX#h5O}uHBʺkhЩcH:䪟At۸ILj[o! [ JRuًWL O; K*!E W_q`m^FU)TMR[¤i]`kx7m[YoNoI\w/KIֹeȂ V,N1/X9_)9N/6Fl.Pe:Ew2ZқH0>6%0YBTzVlF&) CS":;*v {l@b#'v%0LmUMY4TQ9{8#Sl}PVĥ-q{+aN㻧~p &t @lزbXiDUE + ~څ3s.|8Kx؁l" jnfNky)'a"aU Yleׁ XrG9V^ǢA>R+B1ĠԼ\)-KW1'kB4A;0HYNwC@R )B aSq]NAUq:x AAPCvACTS#YdM&V9D{6) ;DfejoJ+bZ-SCѶc{߇5(Drf06GJS7BnƠmRt} q/;&9vY]cl.$Of/ڕ}ȘK ϕ R^`S[ N-6mtd0q`ЅB(NfߝHMy u`4lYF蛚"FVl<9o jືy]jaBc}n2S +UÀ6Th ` [kdVRf-M JUd[ȷ0Vf dV6KN^u2V_3r^VBQt,Gt λX JLQ )QXXl;"[QTѤd)#^̚"bx@Gmw`0EXx>%;eh{5ݛߟx*%Յq]hllB=hXU"cQDT٢B2"UآZT@A'y4$ 6nY2k2LPt~-uiS99>w1htwڐ1*S XQoZ@2z7՜gx NBXr-ޞAI jaZewTiE?HiǷXUFY[w3CO5~+SŪ,v2U7odq} ɊbA Rb?!ʠX>#R1zE mWj/ Z>_ӏ9b(ycq>D_Xlt%0hğ&y%Ādw7ʘ[nڭ7F,%*!q]tí'<(tͱ%/1f鼖Zg}棭J}Mpё9Ͳ+[UKzdff?OyM0ol VI8+ޠ G&;*Rk *,K``Grm~C-y6-ݲ';^^&do/ȊMpH<1Rב)9%z 3fGh KF6Hn mOIr֘.GjgV:kS 0#̳g)3OF`1O'.>Kj=5x4oBM Zv27*Ǘ [.6V uQ)X.E]8&{x~SGt%%KJ C+屮yI[c7o`V9cmxClo^dVSom]wq-AQ~5m,]R3R~4Nu||>1U։ ңly {]F"-& yf=$@#ax@gxY UNH|A*LH]I)i" ˓c>ho%Mj?bKf>wne ;/iB@+f_q!u[N2o+&*1Wĵm*ϤwL)YR9>5'}JB3Xe>o z{ ~Q?n}Q[h7Z2Rk52batۙ2 w  摆):n2C+t gF4QLV >T#b UzD}}^bFmFSrV>,ځd=;o9!#5`iM5.[.ta~nCPÆBE`3q` g)9gt`׷\.I' UvqB _Z[֬͹/.,՞P\q}"D4se@_9{ UL x8eŒ ,!ENJbP0ܭ+5{,q1Q"5 T-LЦĔr1LLxeIV;ũg;1QJQa:XZK#2ȐR1L؝&O{FlHWN %ĠJ=&SQ~^Y@hЀZu8)#f 7Fm +4 /Sox孭}?}yYFof~k++VI5tC[%x[]>cq~0UvYmL:͸8Hϑ S-%ob6OLP]I]OL]jUb\>j>2G3ɉR"%X8=~o :Eŵl+D"]'UYZ.ju-=. 
!ևԂKLdg4`xuaM !ʊ šg'q{Ne}ۤJ]~v`N*Tl,׸&} l2@Y [8pozWWy"q.+j0NQwH֬ڽ0~5})oOOǻXOMOzsWϝCMB.b~R^629r}rCcԃ5&.cA)R}^QRĿsT$-W7*ڸ9=ːgt(>8(PŸ́wA2P>8͑/(fJ%Km<TNVpK0ܑbey"dIu 򡰬 2Uڥ*AP,q\ =T eVpEADYL*4"mG[I=eP[6Q_~gK1Qvk_s-h'  ewN0g3׍B!)в˽HI<(>)LQCE8І&]>Jwn^ br (?_ǰRg ̝j%׼/3;q>_wճ._Wo_OVOC[F20kZl'9>JeX#DZv\Z\'`UlW/ÇT~j1?/:X|ћxtiӷ=0y4@AфC10]\xsڀ\e`+Тq19lއf#t f# Q'V1*!'tǘ TJҧJoEndg ШW g]TKpsNrfO)҃j6)_᥌Iaeԩ!&{\Vh/=Q] x4%=L8rWT㛿F`T_"_5+n}F9 aORݯyJe]U!-~w_ /2x{ ΏoV[]}NJrb[zx7GWE+|]EkuRo_ sW)Oyop[mcU ?pcS6M9 SYԩqyrMM Ǝ>nZcn}ec:}Ż26whwB^mSSSM?ɧkjKlpLɐ1t`M2 &@]ڃ%\ZbDTb]݁1mg4m_ƒvԻ3]<| :^ߝݗ˳ 3:fͺg>{x:8E6r};,gFߛQ a\'yfIwmg7߻L`Y/GS:м wxc%v1OT\gC d#UZǘ:_~PWuiԁXT 03J"S'R0YKpҘz'N|>JF(R&gE-V2^Ji$=RТFѡT2u 4?~2*iLO J= XyH D_,hjF*hlQb%a#!@ֿ}zv蛲{f! /_]וk9 ^F%''ܖ+Hi: Vj&5./?]wi/஗(M -Pġ1RNŖw|n/%׭ @;նtW #'fPy OMcX|6 7I01K3k$x ,?&@ŒF''B*eb: ,i<-HP^TUъRDA\HS!9ǂ⊗A!o=HWڱ/]wCȴDfP'r-5w0NR|e=XYlSr`Ž[U>6Y>Z'J>f =Rߥ̃xg 8˧}3P+誼><2T(b-C`,񩟏ȞVd,R]͹*өb%|MߍN (& )v!15'ڦ gĮ৛*@"Uɲ4_,ܚzany˖w|jbJ=R.uD5Z+!W'au>+wG^PHKrn"fkIPWz䭸Xޯ^Yy8_N _}{6zdmFYKY? 
lPMyqwլn.3< Fo;c_κ))ǕMܻ)o\ǏS:` /x$ 6 9dxwI`?"1 $O3D@]G52"pm(^UxemWK=ÔGSʉh猚aj5Ή~rbQnW4O]P!aQ|Jk9@{o}7Y;[H<  릲Spe0@>c&\&ܒ9UTNxU>U D(ޕfS+?Y^g.9Q4KFa Q!0ZYKe 1LX(Nk|b}N_&eB6ngC}^f 0)a)A1K.\[ȱt^@4bbj6 o]')E)1{{@iƨX(ʫ)U oQJC*Kψ3˽ACs`؃N)q9S 1#LWsL ~ (5b}͞OG dhE8t8|1~z,'o!Ix@$h9C2a,M<1MZDg$ %:$B=网;$BmX37,|d1-mvFcuw=ew4uvkB6)-U2(q.:XQD .5Oh#ma,ddҺމj!t`j!O7QtJOD4 w)!թ̩踶ü :8Epwp2q1h+|Gue;h9r lե^;՘B$r,"cn-(-9UtM-l6-w9!rДF):f.iPzep8݇\)hc8wPD>+i/]j98|+L!J>|naG܀m8ٞX6'_[ى4nO= VmUp}w7-Of{Yu6ynJ6ψ ƺz~JίI-%F7Q)á6*}\2`p_C!%ܽCEvv&ɯs.EAl!yt)$SwL{"QHqӁdQę1i0i0i0ilZ)u 4T8)I JQ]HE `5Tμ܆8r?hbX2$duq2`Z8C$)o"C݁f-D W!Ao$!,@20TfĻV@)81\8 eaC!+B,k0XMq7$ݜ^I)A{}R_}oˋM닺L$\)!Y97\;)n>Zu!R AYLDҷՄN0EyrA.6Z4b>H]γIT'î‹Bպjx[y~v | ^ esex"D%zO  㛫ϓ{_9n?~g5+],hh=m?1Ln5ev&ae kETz(V|޽ލ@LPc!X`Ǯ^Qǃ 4¿?oKb9aĨ%ۨm܉ <ջYĝhYrLr#؛(>o~:u/wA=}}9"N$ԔO܃wb.P(OWI+j5u*a֏ݸl1y6X1Y8+o8)f;L ›)W1C@J(WDNjNjJB]xDdESKX}J2A߃IA~Ǚ[g7n.j>˅B0>2BƟ>|w%!]t^dt* ox9;/.lEZMb'4 ]H,U )y){R<jia2@JB?!:_j2#$Y?cƤ(X p~j) XM:4wf:`qۋn01x~0r& '{'DQGȎLPn0Vs U^ay82kBHŁ%!#0aq-%( wBo}IaYGگ̾xW1;=O*['ד[`̃KywAǧ^<\mE85!}uoGO7~Bg$-su L=GZCR;&1Fcۯ+0 .O9udB6::&툍OT]cU (U:(AQݲ:("e"@^PlDa B@j ;Wrsۨa58R@}~LJI|g䮚e50sty8 FKl,%hXMrw ^@^3}xeii4۽@Pܝ9umVAH65 W@QVb[-E %.r4 J3CtrH}K7]ZT1r9سA!EĎne( r, @ˉ" Y\mWځBNK)eO1}u0ոphnQ quDM@Å8#[Pml a1I;Ib&1v>&ݱBvҜ*^,O{mN fc!_3_ʠ2(tv/G5.'~fGx=3.fm(}G.<{KجqZKsP-HY O3Qwn>YY:釬LyZWi̘8sɹh%S)j|_hqB^0 $yT=ə;fKNf5؂S3e31eW֊D*j"xNG;Z:PBԅMȫG 0G"륖߭7&mv6QOޏ4ff΂@>A-pz}"Tr!, -1gutm?[*胝?2ܬ} [3Y͌@̓somރ넹u\u s{˕{9W@X2s Dj q[ٷF?_ޱV'X 3\S{n I=^(%:Xmko6޶? M 9Dਖ਼z1^Q<п~3a89Úlzl9scLβ[&aEr'ƶ8#20DxU~N zwn&*?m80d`^7.Y,~sMbv\erLL~$YyV!s9~.h\z{H BHy^)0^N _0x@:Z>c0)ߡw߼c (Z{ԩ}gXHVHgooFlX2g 8fݟ!+;P &uvbS1!'ǜ#/A7OO]~b縈w糜5YH>/|C8!yD6 洡-KPP@y|{G?:m7{>KŒQU.*kŸmL"f[s Rn.&̼x̝&OpN=mh١v+Ht%$ߗޮzr-]?(R!`)HݡҸrȖ(ŸY:cF'zc kjԟUtFg}HvXRI]=, R,)Ve3bD)CLF(F&)VI yn4;!=gY It{{zLgv jƸTDD@x  [Hv^ c iw)%1pťKcFSNrN  JՉOTVM%Kcq?)./LoUr0>09 >(wi :Ulw;MU풘$jv-sT[C=MN=*'zoFRwezznkR=כlVk κydu>ĀtkgɐrU_w2<9Z$v'wjEr;FzyoF+5w:dw'Fjfw¦$.) 
kzc܎55pY*I dT}N!vE9 kR+ΩS&`B3Ƅ)0Koc svB5 IH^mպvbՓh:{X~%tȷhOw7v߮Tv=rD~R ūB=Lv;ʻ2yl$8Cg <5P9>{5v,nb3-*hN !VbtoAq)SSůh~TJ_^ +% 8-'_,KZ32Ln$^ ilSg#T y8KNbx0! '-W:{H\!VΙT9۠0K5qˉ#tL6 %VƂˊ yy h~V3L~"bXȃvr ݋,XԌ1dO$j҇CXB At^gWt{/L|1p2tyz[zہ}kENo\}h[&z[&^ѻF؆mD0_@/ξ= j0߆ "kVVfp]ϵ 6Ϸ8*՞n u8(@Dl=ý x~[F)2ӽs0XK#TolN8TeL¤Zia v&MXdUB>W_,z3'\oge&K蚝nf1&cGw. #c}ˉ=M>T~৛j4}OĎOy#xV}lNįMn^y=rrC/KFn?/!zMoֈZIJ|rn$+j<˞v#R!hX BD'v6|1JPX}ZEO4Tu!!_.Sve_mTgzW}=Kؘ]@ DPŒB'$% h!9W GJ\݉y-RáqIh\A yXDoo"wሚhlR, g~N& @$j:LsugFaYYuaPzq$|r{PݭII0{ˆI59:WQ~vfGV1m](gKi{ѪoNsLj<޵(R[m2n üt*NJ^Qq۬UVWV gh~MRQ=͹ 5-PO}x̭[m@c"6+{h7fV3z?HlSg ?K^KBr^tB; oU>`JaÆr%WRzflV)f_~; rOp'\yߢ 7]B-'l93lSneЊMMkC*E$)#c('tLB^Nno ]mgU:c$D08` jCAƑq:&2(En}ik!`'c"rA\9=`n4IZ-=ehl-/7ڱx+!v}q??c^ԌqI,?bKÅ_?Ҏcس-ӂ?^i {N xÏh'_D<1(NW@1p<ֵfškś5퀖rP/Άi6c4_khB70 9e;IZ||*_G8:P25_&Bulr&Su=lD wMM`eJ}iqA?ݤwSr?~ aLG H"5#{3X Uu|aEΝN/#Tjr҆lLn7'o[?x<݂&z% +9Ye1&ٻ6r$p|"q_f`I,y$ٙ [j=V?:;xb+dUc}U86٧rkKr|1ql4XI} ϬÝ7|m5Bg`P<0*<]p᳜xg^ߥkU$WILv5ay ˿_,LF ,kGgXX{Wt9{r/*!kQz1Fr=#oBwLwey wT17=^y07af7"Fy|wyTґL!KjŽKi`F(᫔STkDc2Ō) Aqt' 7Twčgf[O@f&}Ø7 L0oY3.I+;^vm~~E&f筽΃d"]1:~0J1ăDI9ƺp*LCLÑ*Y|z .͆_r)\[LF)B!mЙ@ØSd*&xoɘ"KT0o.m/S1mQmƴjtLSn3SA;c2X*<$A.[gVjJo$z4=z D4zQ#A[MS$άDQ@I{bvs]8vBMZ8PAfqxd8߮SY a1XD /[RQO?|x|B*A(CW]wOz 81ۻ3HK1 g0dLg}3'B=#3_5Lx?AmAPR+tOXt)P^zXG=T .֪B%s)hمU\? , % -$$A"q#Z":A}-f,+/lqpGgwr܀Expz@0\yU%hep<(Pq&%a 26Tpxmʏy#/I Ȥ~p6&,L-&,֟B.뭻.kVow C_RT$>{c+.ohsj@[^~wruZ܂+<͂ݰNo<0-jIyˣㆩQQ:N_S+%%mՄe;gq](X1@q68x.@0xug4:eD1+qKk ﱨFv/*NN-';w/*Ald9zBs{^@ZHLzN PZ#b<<>YGPM 3 ] b?Ƀ_It#5AWjfP |2j3eznr\f <'T*@02\8P%4T21Q+ ]kkU;~ۇYX0>3j=8`p26cYRMX̅(xa0fԦ1YǥMb\NOy]iڕY. ==.y/|'Ѐ7܏8RR_Oϵ1좼w c!yhExY@yok;+ɮ]ˢ3x~RR]68[0pc}g=On@b! 
y*ZK }7+zDVթ*~pEg֭vCc[UNqL{qnZ7Y(mvDVթ*혳!̺nhukBC^HQ(Z<&qkJgf6>RPR8Zؔ< !E"zLRi&-RLE &(T='oEV(g[Xy"%otoZC E]G%xZݬC]IJ5$Vt\-]a DA\AmǪwZnHJ@ĠDrfKTEBD)Rꆔ:^!;Ϧ7f|}EXH8vbKEy+j˗(^vi˜vf|c D1p)QՉpOKL]SHRUO׾1Pvkkќpܳ/n,(ԗ/g%F}+beڸnUNxJ\%U˃&}JRG ک%mǛ2${R '_/RGrV$Q\\mue( %)yRA5odԒ SZZ.gL~3 ^wFp+.U[%8B1pQ޸U}Eyc(/ j`!̓IA;+ʻW[p}G\9brBQ5yˢRSh3Np*o j:DXf)oS $άDQ#iՙv0ޥ>~5yYK9:2]?q1 GsW!j+򍨭JrDohP-TXhGG *2.~{2 5u{_x/V:߮L>M]xe=VJk]v8:J|q2\ n 86J|!FCQ abxw%^<e=#R|Nwd5S$I#J<)^(?KX98p|KȈf}[+rm4n\*Dt f i;3Uf*[ r)@pzkTK[:kQ?Q[ ׂnGI@z=5agL&KǒL oRZQxq;+MF<@J᳚Cj3~ϮtZ~Epxԣ-4 DM^$*0 -VФ\Ϊm hN[m>GKe=p=8J.(^( :4V$y X%!6C3u\ ~;RpW c$YKQM)w6J&btHc˦2nA ˳]䷅3'afw;9ܷIϵ1wj?AngG09{N^ ^IvBY,\)+Kmq%,՟.'[eyEkCUN΋i4A}nu1QbݎN g֭vCc[US), Qù)qrSb+7EF6^4@+)!kH s*Mc.a\Jǰₕ Gzżs(Ҕb8c+() )}XHZ"hjxTHxjԤp~[bh$Ġxm *5u.(tC0gR\FwP/qQL4z"Yc#`FK mJYY.`{87UCBLF֤VP3f$($>S( Rn"%pqAm3Qu'[Ty.R} >Mh(2lp ҕT0n{3TNS(Nm8W@)#㑬6ݻ疚nzZؿqn4v vaN_KPd*ˮuaf J2(({p'kUkK錠M:bdw0.z\9nO6f@V2sI Ecpp[PI*JB#Aro ~ A4%>+POߊlPζDFྴT%4m9;j4aRLSF.WP ކyw\&&aaX=kCls˻Čgtsr7>c/RxB-׭vAƺבXvM׶'eoMEoa%b:-/"8/HtĚ$5υw,t Qjyo%L8riŰVxQ([ϺrĨBw`? J08,J$,;*U#;gV{lG$VD{ɍ]0_{d;@"ϑ3#Ejk!{Dv=b)U]C^\F9cYx.ϱ׌ b<֎OT'gu0#8G)E1I_聗f⅒NjKA{z_΃?γ*!HTQ*AD9ʙE*@"Z "Cyr)xNBe +N_rfmkpj'*rk|sk3Rl%y? 
Yw̚{^8 5<_gsJ&G%$E 'c*NȨr*J,^4@IL)Ж#C3)P4H< }E$Cʸ?VaPc%ѩKݙT2فq>HPqDYE޹ҟ*cwiRncF5( ^uTIBeNckʬ=Jֱc_Gwرd[HN&E !ņQ Ƭ;2jg:C0m̲ݒ>أѰjjҢlyh綍>?9=7߉o|Y=.][.zƗR(tbAZ}ޜh^<8W9q8ɹ{<1Aɹ|ΘL_uNMjtJD:xIÇQg_T0І-fv=m{;wrtv>vv{vtI`ńYqrnfy z*_X NW*gmiE$ƷC⯩DF,Z6<.gO!m ,Q78/iR$DqbHdgWH&\51)_V +45DnE+M<,%Jh] KwJCI0b@tsf#:vaJN&s !2DǃX,&Z=r/[܋.JLThZwvy \Od:T='FLBrvnNYiu0 g5(U8Jt!Ԋ8xM:8;gc( zO7|9ȯhGdfyL+\}d_v5*E?oaN}_|h)1 ,QZWs.#yOA'&D8ZvғN#:O$ C"y' ~_y21r#k(q"Ej\Ol+$ !a[z:<8͔~\""(t   By#2Gф97ipv+poT,8k#OP9JM6: XԎi^1mPYanUOs O޻YwO)9.)%yJ yi,Z67$Hڀß~Z*}{ZT,j6O??{,E'!Rmo+"{>%uۋ۫Ϲ߬I>q ‚!g P,2ҷo3sw^?}1,y[݋Rʑ=D*KՙY^ ;/6Yv9M4i/\& i!~Y~iKqo Oo=ZܽuriJhiվp.n)Zj}y ( *O˫{1.b6"f?eS~ϡY;7qaoo:O4?joy+;NdQ5bD3`ڛ"a6ZJ w/浱?O53裂f\ު_uҝd)sNdL4荅5F#K tzX-:T%¤y7aԠMn#.$ZdڥǼX~E|Vߗ$y|/ \EzDr$ۿnWhcw)mt/KED۠K2ᕊM^/c޷@ 㪻u1l.ǙR88,_ `4Rˍ:p^ObZi{O^/~k^줹Mض$$^r9s'.۴y=A~3Yp4'<c;|zb,:Ϣ<>/i;+tjvZw,7cnpY~cm ~nr ~D*']\ը78/FDdKPxA dܹdN>RÐ1lA5٫J؛>VwEEs۲h E;t}(~4wo(Jn#UĽ󿎷Z3%}Vl"ح\cUx#x>uXӇYk[Ƈ{<|J8OK 2n+ S &ܷ?JKl̩J]X>,-m3y{fv =ga١ru6Pƻh>%P~nwAIvWS`l?л C.ڧOŹݨ9xtߨn"rdޭ\л C.zt9 \l*%o[Q^G)W.%2]WfG[?]гQ lY|to,52yE!~q([o՟%6g;;JfΧ7dGyx$˹fK.\<'T0")%Kfc>y%}j;keD$08]i83E$BhJ> bWA"ޑdl3ź H>"QaAS" 3CPLKn=<|Z~N}V|qw[fb0s.CX=koGEЗ{UZ r8 .CO[+YҊr/~=,(ꙞR#aOH0\ȂО )Ì{Vh"0x##Ce"z*^z)ס2m X2 =C<$\ 2%&MT*dT]TʃĽ҉șdnaG4 PL5h+ƢhT8Pvf<'>Y{աЃ![!^[O6X_peZ+'R9XW J&@Ac&0 mA!Xxl!Јriϝu!9vA$ڽUpDJpLz z@Ĭ$ۮ@nc8 FQ ,V\!qzEK3RJT+1Te"Ϣ!JM)5vBĶL6m;a7]cv$ `VFp.܇gs-(4jX,B)SzF>77G멶wsu+NߐG)ȵdoㆼmȆjg4lly6^!V7J A)+:EY׭‚sn2=aIC4)qk5H\shBe֗]\mD"w!pYM6 :{|YkQ3HN=R@1c`̘d/kف<V]d6/mh9ezW̝|n?oOv7Wv<Rj.o>ґȖ_ewv ݿ<),Z"o xC]R-[ Hw\%>8U55(A`2DGmֺZtSO AR+pU[!uɖ; 3ز:}`^D'-hTKS-}SyX(` xJcYarD`/+_;]CY6v%IrmiЮHZ$؏pW̩ۉE6} M_M<=;71>Ŷ_8L`VsC~5{DjMIcje2"eul K fAf H9p6Hh(V ;Kk$ "uoPl&X5Pkfr\R`ySRu5t9Es*k4zOh$䅋hLu~?VKѩ}GG+Tٴ[}B6pNˏCk7vAX^|S3̥EtMP(˔RZm:MQ>RQjZb {_^2&W Yȶ/Syw*W#JUšzb fLRT{`t& \YS=jȬ;R;<6@>v6pHr\͗OV˳@N̳g-m6^p>=;71Rh"eqn@WrDjDfФD0lT Ɉ#Ғ ܖڏf)[6XOv=c] q=ihVW pabluP+&w`4\ݻjfNBe?vj&51%wN?T mt1$䅋hLiv'#F\n4(wnd>Kow^F.U2[ȏn4(wn{z$ubLGj1$䅋Lv#$) Zzޕ Z–TP*̭6ulHY+;rsl( 
*[[-;>c)aEI"s[mC$d'~c?]_4~y^e!jۈ/Cx uhA{zYY]ld$l:{s٦f_hW !8zo#(iIO͛oV_MHµ:t> 7.UU"zzuqq=`*k' } KۄvOi\tZU`^]ImsN)86ku14j$#'V6[ )jv*J^[!`7oblkJW QOUf'Y]\XʛxB긃i3TaBKj_\,"g"8&06d0zR:(ڀYS>Lt V8J O5zHp,Rr)?4S!+k茩1&D.*^S%!bU>vtFఙEH) $ 3jYe 7!Gg<wWɯywV)`D‡@U`#cc>Ƙe'OgM< ^`"Һ8Blmկj6Kk .C Ti4mb)eސ !cQY2Ӵoji(u`If1on_Xm-kQN4,-վsӘjav(^چq;pVqȬzy{c-i^-Hu$Q k Q˟"w=.iyCsCzK$[Ё epkBe!J0ؤD!2ӂiV%_ewr6SKII[qdK4&)Lhbr)*JF Y+6{?60>^V@oj niZ‚T:i4-yR%\W;ؖ~0.FMWOݴQ:Z^2Ogg'7fPYmu룏 0 h4Ď?ov Axd %0*wqKt~j۴^6F@ţ`ɴGkEȻiQµs (f>[IJu.%U&2ޑ!~&^7J@mUjc ml .C+m5{vPlNHҬ\Dgo=jͥYofs9%+l l;갱tCJJލ| 7iZCPg_gl̦2)m>),Ԡ̫̥ 1|DSA{JȅONj)cTQM y˰H1`P|HEx3澠bLm/:oGF 1Exsxڈڈn1ƈL7dmJDcH’*D^e"UYbukY!PI)?{$Gb.>e-A  F1Fkni7X%G=Z*4SݥL o;^mJ~4jJ-Y,tZ|&7o>}xn <(pߤn@fѯKhv b RY.7m=_~[Dz)Ǵ@ `8T%?- 6by 0Ĭc GKdiuG1+DBoc Hc?Eb[-%OM(%M_+s(ybIxJm[֧:JTBBOTz ־nbMF[k-4p,atȆd'jO=OXyma A2T%Y? VD k0oP^/)| •s*~`~-.lڔx! Y#'bnf]9 V5}!Do.<5|zk*#]{#Ϻ/Y%ېBtx!Ny\.$SZ $!Bb 3P!vHg,@DlPMb:3)fy`iRsZ@ +QSɫX$vKK!5M{5󪝳)AkPQp-%d6SgDvca4Lqf#nDbͨEk4!Qo5P>2jeF&Q KVbVr:F*`vPB &Q9L$:uPd(t g6b}1I|D[Ogk n؁ľ#W#jED־ΣQDZPehМ| ,+6h+fycժ)ITWQeIݵ4Gq~f Q+*hp$~Q'Ql ɐ[xIh%xpȪb0ʧ"攪Nl@wJbid Kd r@% PThOdPo2$t:3"$Y)6/jGZͤڵS޶liqt^-±y6. (X|:6}$_3TlffbBRze 8ld1 !Hx_Jsi"r|ŅΔ5ST |QoR6CD(ig.̩7IA,Z˾}qD(E`&Ix="lqDIVNY4 d83 I\s aD Þ"i, DU@1] G켼F/;Tz^F) w\t$"p {sg=ĊE&~}yi?ވ֒|3pEW*p2FZS!b$5E\$3@}Bۏovn>|_Wxj0MatmD{DZeX$5C_z#&MDdgx" Ž;<ёg7۟a^0]_EZޑ 4^Lщ|:%Sq|iW$>ў'T2Hæ@Ni:I ǸPOOm&Jp |25|?bº'*e #E6CYmEJX»BǛ2 hu.~׷ 7َeI*RL5]wK쨁y&3Fm/aoGAaXHs֮\T%t۩; ;^Ot0&CpB*8h O) jTY)Xaq;(ΙY G]g"?O n eexf- ԉ9{\h6z* 2.sB*݉4:fq1ׅLnp&kWAȲWz]Zc ﱲ^G>ޢy,z;]hHxmJ^+7נE& 6QP/{yr{xնvS" ]"vrFuj!Z",W`X8,adfWnv&#C^eč O}p[Obq=|mZWԚf?뷙 8y^mTI?WيzZwl4&7MS no$lStE T~O|Tjѫ#)\,`B|]+?{D߆#k.X|t;nsϟ ZQkEu@沰\a#s)CP.!z_^{b7>y]}E/_eۇŠK|+۷1/~)n/߳46ZY_Y=uR|P0r4/2ɡK{K;mC:RjoQA" }%$"0)oX,};BnTڧ;skܚiVYR'pCߎ`bIi}=B`}srח{GXc42:jܓ. 
Min@kiBt-[Ɯpϊsj88`^kOH&H*L7zlCPiI#!3U D=Oxl0l2gP]y{-_Sۣ=i[ Qqa~e߅Lto;z$q`qN:D/%^8:8jtNGׅj螹N+2> lɖ/8 Zki[V57g`z2*T7Xj[>!g`0W';pZm#ׇڙ@cȮUAjA`g:\| %Kb b21JӔZ9$PiOgiZю菁ʾKn&k ydWǸ"vjC6 ~@64rz7 gbbIrPM,Bр%%}Q @j/?APb^Tސ4 zM-h]"k07'( N9:$VQʑ !jNϧZa&VvD2$QDnev%QJ (#-hav dm`c\ɝ N.Q%c6doԸP V_0h/4=_չ#c[4֍ pe,:j 5QD蒴;d[jvV]S.YO鱏&'G[IʉNFgϦJVqsh3g2C} oo2'ݠ6pFkCM+`fio<,DFXSw\Ot5M5, m-G{nhq.p #bwbfuV:<ɴub[oi}&4IKCTU7|~QG>|leI Nl+4_=î+¸qn輇yrhEfIn٭O`ABত}%qnJzPwt}nGJuS4*t@mہ/;j_7EhBhwF4nN=Oʮ7uҐ5ojp]@9QɛgGiȻ{}$0M(r-aƿ=Rwݛ߷o][c>0}έW7kE sUY)i'9hak{wo݄:=<KdSb[^v:Y]H3ե>*&ë<B31]*"׺t mՈ4]'=gf CneJ.D,&{vrO coT\˔LT)DCLDUrL5cY$|ߥTQv:SCT5Fg8(I9.j*>Z W eTG2Au^-0aNVidaߗYݲJfUv,vK)f|e0{ԈKBNa-gX9>G%6) P>GɍD.3P >*ie z)QyY}4Ք<@)U>XFB 0Yz#fxO_ݶ:~c$; 7,,ar,z_PZ6^Rh0P4XeZH`[&kF8xdQ&xgPH>! :soMczo>5HMdZ #RWc~#.NmmJV $r8"ۏG^Z+d]/W6[+eu%"_ >,/.$+!5dERX?!5KihǤ~@j?_1V%RGvSJq@){#׌plRj|Tl! ,CY J- WCyD[ybbJfr%H1ɏ+_3BI(Yy;~ѐKN.kYA gXmՌCy RvV>'X-w̌3|]ħ#5yy/OʒۮGMf%<5P;YŌCc'6T6B3"oY1Z!'J8wIZ)Xϐޅ"u"RP:DvN[֠3IO^6T)z-2a( Yӥ.IPF5IG.^lҩ(%d1"}PzJaVVe#QJef5Y'FԠL AJ=#lL Qy9!sf"̐ ~MbXOtJݰlNW ^[ Њ\k,$< E2N2,,C_gp .QSd1|X@DsL"4nMC5VLXBaIes3&SDOo1K?IոIpՖ8iZDv uZ&;N&OWJKs3Ķh/$/VR$faO.`SD˚ۖu`%pT0eΏ=u B~U"&V#zOX#9,;vȿ$ Ke ](HtQO%X9$WNl.g͝iaBk\҃FIέVRX]k^S~4o=Q]Қ!rA (]ADA̖c+ +]tAhf^ގ`cZPvbγ @p ޓG@cra=(L@ jC`Q0?B6czvR4m3 >S )Dlu-;Xq1ς(H:vc)# `<[dR#mby ٿOFn'ģ'YJ/H1" !L ,)fZa+g4ruؒ{SI%C|]~)eXGpd+/U d;Wzgעj<>JOl|1߸/*kR݄(?u_}h1C O;BvS0Gp#{rؑP 73."sݙbK֋Zy͸vuL^n=_qΆR>!.K'`QI32;zrSYRP%"N%cq^7'I*!/uQV ,7,@+ѥ"n&&ko딂9l*րg,r6H5,j@ae7AaũV8B0S)h!2Le}VE126d5E>:Ɗj4vaD͹GXiDܔB,9 T1|sij.mzO79d}>ӍXr=!A]j'$z%N.,!M%cqgXZpMaUr {dDnr}MT [FR1_?XZ4\~[Pm9JF[ ٛ$T%=@|iŲU,yb[kPy3!ldQl=.}%L sZOv+If_jPQTUV 7R N9͠iq> fP ȚqoT{įR'Ku8.ܤZ)?P5.p8z&ӫGF3$Y] iOO лu} 5R{.H|nI'&?-Dg}>_.Kg]$ #ڬR&}^oW\o4NeJ"}ᄭF6ؾԳZ׫,zќnRD ||P~wTew+FsVƾ fxGOqt|Tө'fizin$E꓁%Of7:M@,6TUv`^lܶ?$xM^JbjO^er_LdN&R{3J9'L#'2Q^΃E g:~4芫Wr|Idă鳝a_Z#΅%Y~2zwa 4/5U;&p-|/灣$ K^G)vkMpA5vLf#bn)4+6 z]?_=m~\YׯXx>W[35vvg̴\\gޚϦk9oxlݽDɉ\(.qFDI({i0Bzk+h7Jm_M"J!@ Fm1Z1>@h$W&A^EXrx dV jЯEN)D_ 
L'6:Fp{Q1gּɬYBgdiՆ5->uW׏֍JH4sTLвk{f_ \+7r R k^%^Vs͇fD^y:IVJm?`fϸ3Z"]H86b21ގ!v !ܼ48vQ!:%)Q(C6^|GcrVȴd`BnǨmU~uw+F~1lׯ&ϯP`)n܌JnInoxg>>V] ͦ#eۼ5Px[@QJgQKbvA1*s{'#Ey_6LɄ@Nzd90wDTMQQU,գʴیI[V Gm$ źZTT0lm!XEN%CsRVX}l{pPJdJMؠ]۰BAXesۦm$QYŠ__9Ftpa9-:iK8_[DV+Ʋ!"iCه5#3MyѨ4mta:!e''+_3|wjnj+QYJbsɝP82;tP2z)ګtԶhu"<`N"&%<1AB$rE0CjF=To`A6zJzGUJ Xބ<<eԲbrʄ$u'W]jh9*AI@E磊h.%j.2+%RYK]yF~ ${v{fi=ӕ lRٰ,VURv`vjScWd::5\kX1lP*!SOjdWp-2NNj8ֱ-_w>OF=}N?-I=/ƤRv F(ʜ*2 LhcMZ0-jVYQ,f%TȾup &s)"WnN bߠvIM>Jn?| މ$zUywz_5kgPXQh>vGxFVj6m&yٮ$;V+A6M$wdH_xrܗη_DI-A lr4,R.4y Ozvi˗/]/}pn{˘uP*wB7aZ;T|%&Pד&0O fhͰBrt7]Y &qZEb5e{o]BQ:,DhgJR .;5E3ʢz&ŝ(oHRĂ# XP+p.ِIj\*%B%yӺ Y8OV>Vz.NMvք>-l%[S ]Px{O&R 502Ky=XFz|gk:QýNzYt8ICͥV('?kD+}y>#H|D{!tt@wq( .N&'1',v ;س# {'YlBhs>kĞl& ֽVtcTǾ7k$~E,)~hyRtJ{ssR>?w?$#6 R@b^L}':.ac*^,SӑqbQn.OK|L([v:L6܏ ^|rm4ZjRA((0\ѠYYAڛ_ˀ u0Ϙ)T74iꈶi zJ%/eD:KaU6٫AEwi e*Xt]RVܨXt~؄saKLU hSٓV ֽk9k3/Bx}e6jcܒG漐s]WB3Tˎ/noDYGs376 f7ɗ'zB԰$s}ldܬ꨹ZVn>ƚ|>2qNJ"igud)>TKꯠ\.lL苂L5Sn=b̭tw*=٠}^f0]ԕ=icc)3o!/Ahc#Sy'ΪfӱlJT-e,_NߎĸPkn6oBArrJ՗[6bfa՝|v6YNзf̮Z,Yl?><.V#̟ +s7B xX\pUlddFgytHۇ>,k*`#שH>8"ۧd6xaeL> MZSi_ZZ5ӗn1͊*6¹+kd/d[oQ7)"`,Ф-:A&=}qy!\2Jt!oܠ.)lFhƨ *20M2P5  HJ)>2Ĥ(Yἰ2PfQEs\":6kK+涗VQ\P8YIi1 Ocs6a_x"ؔdmbEu7m]RegzzW/}KC_~n cj}^; l4D˜ُ$*rRbvATqQ9͑ԦRU.,Q?(Cy7ͦKBrҠmC&|참9ul6`Tn|>A!rl\lbЛ)>>g?7S~qj'Qm齴W®K`U BaVqUrͩ$ **$ZRB >j*Pe%ܶ#T" .gj[+F#l>CXk5NWf %Do~q]a_T9n(G ayƨI i6D8gal'zDp;al[ۯwv~pPU-8Hhq20g Š>m{=uӗN>o\e^@:dHZ6 M$| (F/:`sV6s1D zNCj#ŰR"MNNښ8A- ϐ2wOQd7?RT:!6RJV}h{ICJ:qNpD {KgJHTgZd9/gU2'\+tk>,uid{{[ZMn۶Wȅ~OodEi!Ǔ~ ͽmyĩPұ?[Ɯ!^9^كom @|zg)$J wӭ(~ä lϮHms *0,9Ҏ~Zm)\Ftk~MrߗHbA~?J.B݇@@z duN;i VZz|}!!Q= (cSQSx#G %ïx%G SۧZqtWKДQL1SRS[l8Cq.jn2D0LɲSGVRs/fRs3[,]waGͽ| p}K`Gͽ ;GԾIpjGYwK+>EB] !gW~4NXk-i4>L@nvеP in Dr3 %RJأ\qB0ju3ƀ)I6Ԋm'2p\'ilZlgbX>WfJJi !+-rDXps4SQ=y(Kq yVZ( ^imG3JOJ;"&0g6z6Ta#*˒ yJA2mT*c# ^,Q!dPfXa,Zw|/kZ:!ObDPZKXI<~TEl`u eUI$\9+SJD3&7ޓdW}YP0 ,`0ݛ*ȲF,+R(!QHwիN[GɖNCKq2נeLbIܸkk$T18++Qb)@D&q 9"Xt(S: "& k?FF@,gϲ8BvA84 G96~@FKp_ =x׽qT ([(Q70 > ƜV{ 1Z~ W27-7ޙIvy@\ س|#T ]X C^ ? 
'8wwE6]@t+~v$zHse͖f']^a%\ }Z t5^g5 rns# ʖ*\&%_VWB$YMU=ȇ,@Ҋf`v|FCw86{f!sEAlY*)'[dƬRlMf$  ne ?˷m\N+asI:.8 KRN+L.iJ0ZvSa%K[/ٷe.a u%'(igTPr-"64AJNXl˰QVӗHq<=& !:kt:a&,bxhN#Nֱ&ڤa8bXeXI[,]h˷϶bICS aѯ{EMwmjX4ͼ"z4fnWs@ɫk?b$>g_64b'[Os=q $g}:!\fѹϲ&r/V]Gɿvl&, ^ >}?.Y.Fʝ1G yN${5S[Ҏ&RH"1E\%ީm/vɚ^@^ c\Ӗ]p0C/\Wft~>c-(4!1{)iր_!9tSbD!p9ř/dI~ ~Hޏ3H+ }a4~40Yލ`?Xc,&)P/9Xt9>Qmj^o\kGlN8zj|C !$5KX_k!$Ti'ά AuF{#i)'\aĺΎ+c'V03J=|(1ysI'b5ett\t8GHULi"ƺ1qz1|(wܵç;w_gS nhLd]K[0noمk.֘F[v{;" DX޸P3?IӨ5LpޙIJK֒m^ZaK=UyۅChpzrσnW=SuYoYR$^ J7~ZHq9Yee~oL_뽓œ#M߱9žG>";{ }n7s%l[ېlGn~=lݹ⤬% EL ~6zOq,9h4﨣Nw*4kּrvkCBsm$S=}VpsD`ʚHĘ:ޔGGwVFs -sYju] kׅJ!+ +THwewfCN›Ι>V6kΙwwas&9s#aa(%&.볯;c^HDQ tp=i,,Yՙ=ZOZR=i=*M{v9ܐPp\:^~'zJ)sgcfK?[nsgzgs]F"蜸9yu|Pm GEis*QKZq›:|[TjXN)qW {auI5ye:F9 V1Z0cM%rƄ)'LhĄќ`߷L&;Lb$uI`tck/]w&Ni` E#D9ͧPL< ;~s\g$mi cf& dPfAlnnFHZjK{L(͇_?:b|7}/0 3%%ǣ{,YVUsfk7Vp V[iKһm1 SۥMß&\`$dip*l3 l}N;o8?2y %vsY݉TT}L>yE_eMD3xv7y'3 qN]j@kxZc|)xeEon>N[eENUcoFܴHڈib2<:u%]mq(Յws߶XZp9Kרbso8ym~ڠ`׮&.fZ9}/^;߈L7*Sݛ i;鏍xfze$Cxq]Zfa,)m1i+5v[-UKL}tٯb왁xqPzd#]YT.vP-3gw7\sTqG8v<҄J+v}V\ }*7V"$ pôJEI'a)yD4p%S~/}z2K[` ʨnfmr-ɪzQ2Ϛp[oQzaJՍB1p%y΃cuye ~zy>g9!IG:l558:u%jR !M1(GDTqq<588gwQa0ʾq?Dr$)>hoi 9 15)D1!-5 KD h$uBS |h7NZtԄ;8]rD9DtE#j7GTU9#[#G77V8\N , (sW#7I1"{rEHZ1ԮS!HmL1$%&⩥~\0T`Ll8%pjE]~SjSVc&[ σ/o ވgT ܢsNfp7]:׌+topq򂌰+%%*1eRly%)411HaP#JJb`_uȉjgUMi.O#ꒀܞ(*/XkQJGNT& Z|XT#Sm8S!`cHEUqTPk.nu'QLTCHSe>fӧ"dLk(#4CX"vV髛ODFc* #DӓxxVŒ~>(|@[4AVi{56@pr/oʤoy "jḺ̘9Iڄ#+%8km( Xsk4c6Niec0gD(QY Q *҅-wl4 GEr1m aI9I@&?OOӫaf5y OL~Q%d{,uW<8|Dqx)ZDdtڤ1vi T t9$}gp("wwW`耓M~~ߺX,FB\Pig<ȂZB#*1qM#0SCV0/ks#Ʊ.;H A+\}_4U=ΈR[; FyD˔h&OݰV2\ìT# THg@ 8=m@^\!%¤bpzBd1)-OI #Q)Lt "K-p00q  r?78RD4d3#RDʌƾ Ϛ3sk/É@z. |]wm2@8{l0. 
6щo+ɓoQVK-;$}bXzb f]H d&]fO8 v 1i8P"lcȝQ%VJQi t eZpt+OeƔ<01 ɀ ^ ڏKU?+^üVg=FӾ.-cX39X2Ό*V*l io$T5Xq Cs%ASXW"q)"MT'#1L H 5HWWF?/+K,7eQ-N$H98eH&uo%"eЯ6B9blM+8;e3Imn &Z"v%c# fŹB+VTKgX@RJT'# LHA0DUr*BN)Y,7(99a; `H͊IA؝-)fqA_%1uea+CJ\8x6հ,8bX(Vq氖V*)0*JzqfU3"9, |->UZĚIŅ{`H͸&Z 'x,}s>gHj`2ŖH}}B0.|vJDX{3ᅬ)T>1JˡÑ2;x )e^2V)œV90i)(T2e);A[X RѬ0-7(,s~#a|_ A|7g%qΈR Bp BTfcK `!1/?8 `Y_ ZAEZT"m7lb‚/q"#$`'QZ w})h(LyV[ X`[')ڔ5=ҋ]k#vʄU#u`de57ןuŁN \=mR=Y`z#34)U}> :|iI\]rrpeird)1cҚaNQ0TCckvt@s$ZVk#a-LC9X Fr`SẶ5=-P)l =TnSE^F|(]$[Jc9\$Y<.F'e6i=7 5wQ u)\rr6|Aј p >rHyAQkʕJ\4rEA To l \K:'BPɖE@So/FLQ*ٲjrNekuXͷg5Jv '_O;ja9AOނ|>Uݮ(Rxifi=G;˽4th|Rau--}x-6ӝ_d |QeSRɿxp[&*ƺk`SxW~VO _{M7Lw|6ڳ- [U4䝫hNUNҧtZ7]IѺb:߈n#cY<֭ y*SSLGnN7bۈ ഒԷuKhݺАw5 Lh踻OXTw7⁃3Yyk;_ZJMq2 43ʘu  k UYFL2AbjJImMZ.($İj6pIl$Ӄx\#UBJ97}}W6F _BA=^rƶ`o(ɣcթHw6EA%y ND TTDׅaǕ"1mRo-ƔTNN. BX(P?i|KnǗ\C԰i1.8IM ((QYpYU%e%{Yph;WvJ}`)XT BT'1mD 8UW" n]h;WO%hrJCW=qJua777jvk`^%f}i˱bx5pwnSWǟ ,HC8BIzlJvI(>&)mqHDu@K"JeAEgK=7s)DNftJLgyŚ"NgI.+A*+#HYz:ѨHFXj}z6Ǡ& ӻu˫ zmӍJ%IƨY^%* c+> jۀwoكi_j8ZD o.5 ]@xr:|4c6ZV-DO\LX27gbQM NbQq7_@43T veH`\mc4MzJ7'&֢y%\^T+ܦIYZ~d~,Z5kE^5j^{ռv1XcUḦ.T(* LCCyNM)$Wg>߯ OW4X-+"0/?ZdT*>RhFX$%- -dn$B +N6rDN\`(*RY h͢ ]ݹEh!ǽÃ~ȩs1 FX]*]o1]p7\>γoԧ?,}+/2 ]M36ضL8h{/a/ 2,Ć n-tFpVZB!ہݯL5&FA{9\t z$Z8$V 0s5+*KPS`lsb|"CHIDh.rҥHb^Ea&k$=g7-Nx+;%3%$Uh+zfT L>>0"*ǡD $c s*Zoã{ SO(n}IO|4O.|{Ȉ/PpdFL3.>p UC孳۠;XKM lDEe̚]RVg{H/MVF[dB0"F{7ݣ-V( Qm}BR0>BwZȯ"< ZۓQ0g V-,Ҩ `hC6AN4/^Y&9EpӎWMNS,e/5 8'ŕު7DY?~^{y]R)e^P˘ԆrU&%QHP҉Bk Y!T đΐus2~tPe9~[I4)48񴔶(7hHCҲ4hFCT`2u=b Jaj[+Ç"4x#eaVk$dbA .yz[94n0=ZL{#F'ŴB:C$K#Sr{f>LO Y;Haw(\$@0룗vh0k:XiD\VC3 Vm0)űDP2\H7Oq1 3(DOl%^'%tzUd6k! 
eAHpt 8z`"߽Iw6GAZoqut{i$Cn/=FcX*$S N.d1AQ'_{K#:zT9B^_aW_y5BV$_Do-5!JT.LՖ|KUBT B`~#Gx}ݣ% ҅sM)'R׭E)XT BT'1mVwTSW}jd@ɧs>Kr6*]=<V?|/Quz#&rII>nnss{WO •OKo=YP|!d])w]𞗛o13GG!;EN5bH;2Ϙ.LJa39H/T;歰w˧3f 3.$/CM(xbelIVZA&%-yK(Jr RIA){R@aɡ($` y3VEʩ)$*r#-\5L{xnNK[((=n㊫Y.7w# :$2a3DkEׄwH\%雱>$tiPRɘ]yp i!hK!H=)]WVvFMp0aj&%r2+_ΔMhArpHe.Ɓ\m B -J1.Y=sS|8No >5M[D>>w-DØ2E"ñS7͘ö @s7ͦx`6tܝy A䐆e/R=!ø;䝰tVeJg1"ε(;wk21!98Y-Yx D!?۱YxwK!u-{%O ^2%),n)ĽG^g%u6.Y;{?cO<*7_ \uK Y[Q`d:s:M`]b]RG{ƍ /$4˱[['[vI%[* Y(E9o?g(^0 R%gi4ݤf5ZuZF>(iU}m1wWg:N`5Fb~KEnOJk1d.Ŋ`8B 9l*8xGYs^=σ?B/u]OBy7hiU&/?Φƾ8J:r'#37@M&Of1qIjfF'R*L<,IB8 ~f0 ]Z.z3X3nu :_뾏OvW4𓕝MG֙s%މAz,)xcMQjgbS걪*H#cμ $`u֚'55B m9RsFZLKq"em'BV[99;5bOHFIdn=P降&QY A%03wzէ<_ #9~WAܔ½w+Ǻ)FۂK!Hpz"P T`RR(MR41)DE_]:WxZ  ͂ʄWL!ً?ydzpt3-ۺe% 3 r!_@lqYg_I+Gq.Vz+{Ӈpoe~Yofl]O<;VIpB;wA^ޭA#}sTY;FmxoEki\rT}KB qOgKG!y}NDu,3k#bRiBR#:'K!p R'"yS$2$Γ2'l`3tM77ބ˿?;Oi\_=HVQaLc>׀sNA {`C񳝬to6g'=\ "|(l{ ~Wq |xʼ] 3b!6]cBFP4/ځ1WDWcss0 {dbl 󾏺ZQ$j,VHaRCˣRx'!'Rc$ho,/X2L-fCRO\ZzZJorkivS9'tJ?-Gq1]x%_bqF|SVZEs {ﵖ`Yk!ëaAG-%ꮴ]}iLzl*BKr|A,aq.,XՖ&d 3YcӑeR}C#6 nh33Y]yvJ+&(;+sT G)L/х!#@/)0//X)p1>1&*NI$܄08a;+AT&) -d*$Z{߿"@$0&\*.Qh׶~܌!E,(MSice)2;' "컔uU*nTs[h$! \:VCx]cNDq D㟭^ciD]ai]#iWx]x|[}bfO i=+OgTJ*|D0h P\Ʃs)`GwOtu?A{قEM~T]Z0:?djiQB`FѾtv(M #c̕[ |2ۙ`~)wЛtkKxSpbFQnDD%VNpj5br?]>l\}0[#W|ٶ֎H=ٖX-s[k$rI4o}F-%^K/[KAi)Ƞ GlK$z-D-*NK Z*LsDɶԘZzZ"7(sp|)-Tak2*T(W)$I*g_HF@+lX"EAj s.xӇlK)%-8-$S脖b,RKX!r]lF5(y{VB.2D1]kYSpFQ1% b :UVjF͸U\UA*,)E콻~ۯIYsm`$m[EyY{bX wj7kI|ivK'opyԁR]\Wgu=AMgր}ׯwߨP cٿ^>(:{uu|r2N8Gw9\-L KXewZg0H4WAֵ癮[B"z#֥KI<9y2ޗ%..]|Kٕ#9vwlOv.IZ&5OpS0eQ,j]]Ee,̔mT2W3獹9D^c{1GI|5iϹE Qj3QHu]j>CQ1kZ.OPzZz)4i+YtLXDxmξ txyᗏ+v mSFV ≮BD˕@>M2z1#0 0LrA* &B*%~>}D(jmp^ S<;i2za1o! 
Ⱦ Mk?O)$$ &QIbko& kmvfˑ"6ah붏\*"j߆ %Ge6;yYPk7"!0RPڤʿ?y?\ 3YxL,:g}cL hq.WGuWrt^`/+o¸=ZnF񸣤7]>X{+iJ9AgF-2- z7>SdqRSʣ/S9콡zMM9'Ck4I>> M:ohAhe@*٥tR\3[+AV[4H'nqZ'CmFn״a xt!VHb`_SNbw- OEJho{SNP"}U8Fq=؜b#I`A0JrSgA4i?%Nk'ïNR"P% SD8kV̤VS Os.Ie}KT3!DxZfnBVy\u:O1Az]G?CL=>h8[]죟w] G6{ ~pঋ嚟Ot~;I*|{pw=2ts3˺Sف1ڃWc.q5?=^a``@q?)O4ޗiロ Ԗ' ZTٲԓm ϝ[^bB5)XvBKAii&5 A2ڿ2 -=$d[j&^K/XKDhin;}6-~Zޏb䟻Jdѭ򯤸+)-WK}]fAy.Z9ëUgq=6eaQa7 N:&# jsb[_TK̥ aw&)fblFqޠJG!N%UNi'h3̅.cl'TKI=Gq$"݀.QMzH`he.Fct2&CKZD8Ke:ֆ8s@'Dpa"|Q##߯W G Xԯu+g /EXLIh%e6/`= wW?K`PJ Bx\>U_Ho(E( ֯$C:X/ԘN2ySUb~CY gnd!/^&K|Ӷ8c=&c1DUy 쿦˷FU \%+Ǭ݌HXA4]BK?7!Z+3xq.qytșhf F7Q]]nhg M.m&U/P2sLef?h SDUqj_ .\ytUT_%!{ܫ,&?NF9$^EOGnO)y|׵oVFtʃUy;z/Y_8HK lju &A1^}֋4Z?*`psKF^fzJ|~wIdǑ/qN-=$d[jewZ ȲD9 NK3q~KRt%PSKI=y%5^}FB"N|KiP92c G" ER\X4B:-:d#`"nneRcۄ-xruAK RO&8'uⴔd4ZJ NK315Hs$D-ys{B 1 N$f=푢1&HVJV-gPj\aRPf#PFU 5y7 t VY8S%.j.P"mg'Y1*A\nUVCu7֙qku&d2!Js(!rKH6dY州o]]:Ũ%(:t`k2{􋭓|T/ .%O%wl>RrvAdS&#N6Rvjfu6n'Ex:ܡ!afW=PbeKj=$^*WrTm%֪#Pc"pql'Fk? pc`ӯ.b%刨FOV]᜺tN5벜V;lKĽ^ȢtUo5~w|.?,d[Z&<<鹾]U|,ty(f3ݞ-r pC0;{nȻ@W~hqz;tt;)_ t,QiAj(ׯ;@Rfc%z<[oMT׾@x7ڗA򠤰l LA9z~|fLRIqVh݊tS ҈ўg-"F3SW6A@KiqL%iIP)QYe!~(D\~>W8'dzYz(IVz3ާWSl.6V/϶đb1hN3݈՝VY%"A*猴+L@" 96©P)pjќ#q[U#՜ ?K@CԤmlҡns,טVOiae*|_1O+yˣY+}!;4%R!! 8iߖa~˜3xjz)|.8 8ՄI+4RվW˅jCWq&nGtS;!:B(D,*9uG, ^! 
pS*.+ SRnuM[aL0Kd/̢;M 21EH3[?3ZZybH)W6A|40Lbh =@b/j?;=M;@n>d{dYh^lykG籱dzfLj͟C0Bp ifr_XS ]=- 0cWΡΡ3kWpAKO>Xd zY2|}r s&{=z' TL6-섾ewŜhH6YL`z7B2~ ;wmsR"HjZN=O3{ 2'3q7kŏPbDu[F#ߺ&jji91$W˞uob(^ut/6|ٷv7ԝπ7ಙYUoN~68 jts34Aw`7no|0 Q3;exܼ'_ mIIYPmS(z7?e4E]٩އhnwM7Ih ~JY6`2@ S~G?"Q+ 9EbR-Z2E|Dd<}(c٨7-^}?#?3X=>O߀qKrIfƔ7n`;W]|d\уdgq$dIkTfBRxqkl?zn"f߿:Rv2,״[{f=MNnV 3mnԱ~K$[TXa>DS_"jjXwzQ:2_i?""Vjsb 0Q+blhH 5"q@BZeBjXBr3u ả"s|6kk?2"R v3ő.ۥEġ$DZ#Fca$Bl$Xb&dXtqϼfn=\44 (Tp8d"Tr(iXlĒYmu`uH9$U.$35nwí[>@ᜃoHBw7IZb 8f5k,o͋:Ƃ$mȦ}S^_DD|ӂ6Z5>~x3J9 f i3%ܤӍ߈"T/q@1 Z G;=AZi5tM.}hulU{:QіD*o»Q|ibOqâEԜT~g`&W#ݛ-iu/ٗ=rxBT1v7;E&{i46H2AǶK""mZi}6aPf(`Qغǥ֧q+t4.% 9Bǿnjyh;NQ 1)SW){ 0=]<}Vv0<5G^8O[sY6Ht8On GЖ-,f mbdJQiZu")AC!Q(Ɣ+Fq,8R`d8#ĢuFB@ s/j#^mUo|wg Zb&$C]q`PcFL"EB80CB00TRBlH#2"$?8WmJ2ƖIes(<(f6f8# 'ALJUz]V ѩ벺e`? niBiXqhÐe4"\+X!"FF Y Z+b;XbB`UhG{ިP$-YSai[r714DGu Dx&+WHGIjT&:&2<=8bqJ&lRI5Kv7\8tʡ f4"5I/JQ. BP9ݓ5C lI,Y$YJh^E-RJu-|u衙 ('M&q5Sx8_?0\.,gG.!@N{Q Z-wV2+iqJvOum;碄҈jl{G/-^3TO{Q;FhߝBwc>`p5?bjU@dlG;m1i>Jrd= Wl5^w_jN:_\6މ~5 Y[mʅ$"cF#Mǔz%T͕ڹ1׊.ʫxJg*Rr1tXTL&dsΞ5VЗۻ/$Lp.)О%T0"Pp[F붌JBɩoսQ @(9%{n𓗃K(v;+T-UՕi#Aݨe9fQTp/o~=0.mޯj TG?nƨP;?$`_I"& 3^ /,fz;2]ڮh`@SSaFm~^M/`0e\̗nYNwv\`eSV|"ZI {:G?n`vT}Gv8нiꁀڭ EL8k7^(z@Vѩ2퐫Xv~dS!!߹.C~2PKhI|f,1pkɰ1 Qp[%\Y .Z剿߽yF^wGt\x Q;a$+.ht/>fZ!܁'L t- we!bZYyMT}!xa<07aQۏAu${&ځ%cQbr a%ȃd.X8q &4ִj51,ny4>vX׭Lk ̣xD1,Ϝyo0wvFXO6ؓ` #s^cFOÑ|dҫ(g5 -02\c(KdSq6VK#bDC*1^cqq7[\Mi*ڵ\&ES+J pg?=Hnۃ}-;?c@^7tȨ[)*I aut`)s0Hx=r(WG)9P53߈ϋG27I%r*upS0xbϒ=r 1.q}'*$Xx,`5GD$QrÛX˞iaM\>sBKe2 ԅ$0LX#lQE*dq( ,PXqI@wlf#~閞0\i斄fq郙+輒S&Z}Wd2tGz6Vl7VtVyڲi2Xa+Ld VWY8WuL1;|e`]Ijf0+Qw)({+[ސPnY~Bq{ |)9,;x.xEKaJAA~;¹wwE iboeٹ L2mgucA}PW?Z'\C'B.]QM:$5XfL16ā+k1XyƂ3U](7Ũ+OEeŜ/O- 3|y0Syf)E;?eVySfct8FM2)?>nJAD7mDt`7 l؈s7l;H|nH] ")wA:Bi΋/C'|z9Ga]\ Q4W LۆIL Ai`'4 s&lI0ߢ&0[W7gig & B"k1\1Jq(rs.Ԛ%2jmO"XO!1̗u/B֒NTZ Y׷^ P^)YLA ޵_{uw v vedFXȖ!lkc[Y(Dhcz- K 0[r;G^vu}b @ΤlK\]mGӷoNt!2u!8P('r DIJ 0*KVF$zv+iFr1R.)?M5s\k׊JvV@눚F{ +4hU\ Nymtİ\1rMՊzʆI-*7xuAbxLhi5\ 245'TL$Zi(C\d;Q3rvLiEk#p:i SyL;s!:{ƽSJ΁!25F>D2H>"ӭy{RhEևt)>;/tE"֚ nn,86< H(LݝL 
L!6;]@\B',3Y[f-ʻ*1,rbCd.lDDc*ENhj+^c,3yi}&8$vC2)dݧK rpH[Jq5_JpN ])%XQi8+['WͮS_ʑtuJw,[(GqmN(dT gp2!xP]/uo/w} (tgZSrw(Ltkm?{`uībm.Ojs)n]QA?{\u[-[vr{tIE7$^_xlQԩمH{<:MZ;Et>[ϡ +no*ק#Y37*T5L9n1Ukn21oxBϨSlޭY?ӻa!Dl hwNcIV),өFw[u*|ޭS5׻`!Dm 3m 6rI =O4>mG^3/buӫ(XJ/̒JtnYRigRdbF g\%ZR {/N,Ղ6Pމ߫8.wiWs:P)1㤋@d8 qM3b\aP zt^!Fl07n,wΈ->iujds^ėm()e ?),|I( Q=Yj> #|~,i@4ȬmTuJ;@=hksPM;%y|]h$B<ӑ^px"rZsl<8EVzQƍBcG`hTL{>ͷF͠lh"H\ZTqE2Rf$N0!|FXD Uo| F P\2Z[&a鉚srJi}׊Z[!p-CTs0J‰rQ+6çf8Jw:G@Ȃߛ%U@T EP"ԤIdr('Jd y&ZeSfw?[ rL%m+$?w/ncX37(V {7[ rL%mCO)lޭ!ӻa!D۔4kS㠳eCgyEtvg20QʷoM8!Obc#H ]ALwo,DI\Ə+U&R0qrWR%2& IW- h>'Y~2EW^:=wBJZ7ۛ_ j-bе~Cl,npnJ&C_k!{)΃WTrsY*PYrzkz}MY# e)֍{[Jԁzy`g*{El-fxJM n7hxHL`4_V(:W59fuZU^zn8B. 5uƵi`\Ҏ3f)WZ\/ljʮUv\w%KkG y2<f-%ҥ`ZFu@xFfFZa @5뉸5z~E?n|Py^~^ϟퟶ>ן>E> I&E2#4 qq ^Qjf"E}rpDN- -Мi[fQnt.3gՌ2h|f2eWn%j2xi#w蘗V HRk 64FqRǹe46-L؍(ω@\0]h^vؤM8A%ofRjyPl;(F(AR-*x" 㶥5 }5l/Ԑ߼;`O4Hl"ŖM|MWgbMYNƖ}p1݌XI0 UfER7,{=3rx܍̚0҉Ѫιj5qu5Ys>ug"Je.!,o:)e=<Y0Bm6䳡OϮdmVQ(ֶ*$fE\ YypUVU*xtJaФltYɄ`y1.)ϝ}" \R]/QO@c^S6(QK& (l z1Tت+vS}ax@j 2 F FrCҩѳad@cX37BWS{7n21ox墯Sy2L6&i_neڢںiH01^"DZq:eT0q45nH/K=VÍdPO؍.MV~uttyw㼦Co2 1 gy&|xBZe&~„s.z[ oRkjvu[+o4o5׾6ǯL #ZSS$@&VWn8k 757hpWxVׯ2??󧥃WmfRM3qp~oڶƵ,EA@IrghG/d%j< 4Qp5  Z;(T(ɘ&2+:eZI$g`_,#WxcC"-'$سu[d RvKSϏr˃A8eˏ5"W%kωQZ+/%a1~ly-^ ߇vl#TknvԶrdS3Xuq,OϮdcYQDRۇ J%!УcgN*S/Y#ɄNS;أY+!H.]s]FIv qfgO1fHm-*+9PdJWZUm{ X5,Q 2ޟL! 
GІ\{" 7mr"LH ~uh(!`c75dr >d$LFrvIJ)Z͖X]$ȣ&dȁqY: 1pN\䬇s{2:'hԨ{Fn$/{-Y$} 6 / l6{ ɓLWV[SҜ`2zXG_Q7KQ/yWM̟BGj[T!g5%4{yqړ%)VF_3iCOMi8>{ M<n͟y ~|("g.`:/$dxۂcyEYXBu|c&ԕ'GׅZW!0f18)eE:(8)-FmRzR*"CJ"!TD, {49 >kK 4ω)>B1}h*wϤ(giBh0:C<5.22(y[Rw{Ƚ :\)-y 낔V'CMr8)&HiR :hZgӖҸ򮰮7=)+ZB;N[JR"Q4 'jYJOPJe_* H%,,"2S!+UDUR(BlR|ECMdƃ$fTc-_j\v4re[S`e8qʶf@0ed0\8\؇|5>-+TDJ֔jSy{ \Ԭ/ƼMKg5h#yԊGj0bg޴⍤89J?F"AJ"M< :b )#AIF}N=qک̈bmn>ʌ;?aTZ9( Ȯ'ȺnTzE)Cx!Kmfe<`s,2׊;0[;5$8)QZOLκH"ia;`]B 1Zn-mnzۓ&/iOқr&N5.Koǖޔ`Rzӈ KoF,MJoDd( Rſ ӻҌjқuAڕi<* 0ZN~h`8o6!Pz]zü>g$I 3dNm™LfV.yDJG"_v.ewC}js {◲,ɊkQH)qRZ@MeEuG:\R*QH)qRZ@M??m)(Ұt%R*dPS8uS xii*%3a8VR%9>' kMsվۗFHĵb9[Cuz?@DپKn%1}K͈UmF-]7f w(p?',).""he$YG3Y&֬\51eiW e4bm[trս}?%b^ ⽣Sm#U׌e/ߵ9^ysrQA}c`Ns1c&Y'U͋ 8 @9`|H>bly/j*d#u?sq8.Qq>{$doMAl }J cѮ{y|g֋F4xYI=$wVf8KHFjhW 6+&?| WpK,W]MÙXڻ.GrbٻZ D\nS$_hm+@V *kiח8u'؅ ̸"BQs -1Ig($ +}>a}u/fp<5BF17Dxxx*X,>KHm5鐑,T23V0clhWMDnijCMy.$eQƁf΀oDRvaLu֧, E nSNi ĥӐYΌKy БIۑd׹c>ۺPgྼ_l,L>4I\=;\R}z52"ADoͷ_i^<ᛙ/~pst}z'Lo.P!^M)%1L~yr'. 2-M>/&wvR_xr¤b=蔅au1y\R|Vk[U)Qr#4>chIMᢝq^w!R.6_n,יD3FS鹓 J; x4@*йr0u#Gkp¡2MfdV ?uڻ EVs3eT_>ʹL*<;Հ-/:tZ(CFm{6r$ї=A39z#0{عLOY,IM]|I(V!|%QB"{rZy3KyKgmun/^y߂Pѳu]'|FFƑvmOhnbh>CՉtm .s_`E'av4 5 x_m}CJVqœ=~Ch?S[LL'̙d{eQP &1HiȮI%hd/ߐ(0jku֢[P*ni7HGyeԔ!MX~R!&$%4&M! SI.p"TH }(q_ix_8^(&A#k:e,?lGE*t/y#vu;mdo't]'bsӏWTc[,R/*-Ł,tҋ8HY/{_-V )MLyҟ\Jr$sGPr |"jt2̡csgnL/ &,1lG}HhT$E],:scAg ]aWhIƳYKXa+%-9\p'j gѵ4J G֭%!܎@yEpJ2Vݹu$Ro]՘`i9F5}vc9@& ےyv2CFG;|J^i)!\֖Ilo ӘlU ׸#W,RI[/ʦ ۀpi@.-9)!Dܕ0RJ5бsdGƞ$P]N!ܕDR43z]¢nܭ~jAv!NGU Dˏ TXpK$T؅:OCe姡-+kX*(TЗ=V*#>5 =Qw`O9\z䲻QJrzqqD$jEn2ncB  BΘ{`Qj)r\~}0}aݨ9d￲-7Ս@='YfvD3fO6Sx&'UQ S9}T,ę*Ժ59Ku|{ g:e~vq?娽 ~0vKza,Z)N ^62HI鷻ƀ럟ߴ \pu ݅~Q 5NH=i%2WVE:y3kR% eြdy<$ Ns!w:Ni#4!CMt{)9D?%_ Ps$De󪆉3dtD9d`GED%$:Q(̘b5h ^$DӄWt¹n/ 俆Ox0p!!A8M::ctݰFX͕"] Oi?ctPHfS7T?x*٧ϖO0xl3#y">g|m<\~VO\]ctv֎pR=Sܡ8p%k˘N펬0ڱ +_}S~~+IŐ$2~1}XcO  ټЇ uW.yE;4Y!6/Dg1cNl. gP0-Ja>rgoLt+z6D+t KDW Et.LES|Afš%fݺ?MMi.|f8Hb/N+~ K (U3g.PHOoߦBoY2U-I C&0Xd(J׸|s{,S ᐘJ3a`Š@r>|~qSj3Ob8R+~A(FBjoJ5/3@Yz3Ongl&3B~!g|*IR|'?ه! 
FȒ?;Bkgy(2wlqKy4wKTx͊ 0ʞyoKȹbCA0r84V*q%y..Y*H 7Y;k}҃Ɋ&/7!EH)ullQe]^T@&TKu$& !,\A/J)]ܱ(u?o"R*ir!` 3Ge4UP<JG6Hfw1@dd=?o~2@*L10 R3lseb.`R*%< x11~!F} FPB'pӞ5oSy)d2m!d&5Z%iM PF?.DT^1mGPYP|d?n1IkD ,8F NSa(δ'a e1mv'v*#YDVlG,HcIދm^~ vϗqF@J4΀mz0vY3d UzVs=Y[a((U8hʴVJf-VZYVJʡxR`[y7m╺!YQe,8ݣ ]o“ӄ%S`&mWJIY(_SI<ّEo#l}%+){;ՎF_uWjߒl6:g'0H.3crV.H"zgIaFM7٢i.&(9d<0 _dybIMz,;*rlkrtdk3:.8ȑJ/|g,d'[|tA*RY!Updvw4]A&#:12g3DQ{͐v|Booyʘ} fQU0^dDʦ(me7[+ƗCw6)Sq'ָS\TJ@IbPBΙ$Gګv =\= , uĀU&FD@P1ܕ{^;jMŹLSp]=52{fNԌٞ=|-<5ﹰ1e`Q-zFSrW" .08.ʮ#vnKI]CijOZgOt=a9Q4PGf n!(fR a:A5% hrSt9)]ѣv2 q!5 L1zmEJj .kaYF\Pbb @c7OFv]mT׿Wc #c"bg=8H֛\ne/j+UEc3 )=9Ie% `pWiymhQrLDrU]Op[+Cٱ2-:7{t3gqn!'ˮɞSG cRڨF.@eIg R@(UJ(5','ܨmWpx!RA`kgB8q+0kjW2 ョZ 2rObOL)Z9çZ2ˣۚ%)hǼ[L]H+W1,]q`̱a`DW31z> ֎ߛoPiź7!|@F]k^:'Me^1*]&M/qS|R53R} w3خ-1S {w,T(1|UYZd`Yx]#+G1q$ Je CQe!YLP;+>3t-۞c@(#69EhlBϬG ܖ2zbegYRVqgT}X +l.Hx6棶di6*,s&(ሰj %G@[CA$`tmi F]rTx,%4EjH lt5-M4^Ҥl~=[=Z1 t4dtdl$C]0@aWGui]b29@T˱:P&8ꊀ^ΣsY.YLP6<*.uA2%>BИi:?CiZrHE/- W#S%z-N&U=_ӄ> A> uT$ix4đ Ge+:猱cµLn~Yrm9`sL8֍"Iu{u$bҖxTȺ[2AY)z"do {\3ڕ^さne{nr#Tp|/ nE3HiX%t-O p&AoBrS Qs uX&u*1+PLĄq@O24l_Re~d}|u1eU>mM=|WrњX2I)+!.юϻ@ $t]j|妓>2|8$2D,Ϟ1RHU.3᭵׌ߚN:MS,tLRL c])x7enIRklbukPL9BGD̸&.Õ;ѨM uܰetˑ<l{sv4{w~e[l١[9x:fΓѿmE;ďwiĨ=إ>?ZC(u-C<^K&~諶Ծ%Onb!o~W[W7 nC/Ê#  K7mvWVd%/ڽߥcy ]Mw,GM((E rϔV4"dց. 
Apr 02 13:37:20 crc systemd[1]: Starting Kubernetes Kubelet...
Apr 02 13:37:20 crc restorecon[4666]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Apr 02 13:37:20 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Apr 02 13:37:21 crc restorecon[4666]:
/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Apr 02 13:37:21 crc restorecon[4666]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Apr 02 13:37:21 crc restorecon[4666]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c263,c871 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Apr 02 13:37:21 crc restorecon[4666]: 
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Apr 02 13:37:21 crc restorecon[4666]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc 
restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Apr 02 13:37:21 crc restorecon[4666]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Apr 02 13:37:21 crc restorecon[4666]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Apr 02 13:37:21 crc restorecon[4666]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Apr 02 13:37:21 crc 
restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Apr 02 
13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Apr 02 13:37:21 crc restorecon[4666]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Apr 02 13:37:21 crc 
restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Apr 02 13:37:21 crc restorecon[4666]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Apr 02 13:37:21 crc restorecon[4666]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Apr 02 13:37:21 crc restorecon[4666]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Apr 02 13:37:21 crc 
restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Apr 02 13:37:21 crc restorecon[4666]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 
crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Apr 02 13:37:21 crc restorecon[4666]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Apr 02 13:37:21 crc restorecon[4666]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Apr 02 13:37:21 crc restorecon[4666]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Apr 02 13:37:21 crc restorecon[4666]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Apr 02 
13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Apr 02 13:37:21 crc restorecon[4666]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Apr 02 13:37:21 crc restorecon[4666]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Apr 02 13:37:21 crc 
restorecon[4666]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 02 13:37:21 crc restorecon[4666]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:21 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:21 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:21 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:21 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:21 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:21 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc 
restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 
02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 
crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc 
restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc 
restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc 
restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc 
restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0
Apr 02 13:37:22 crc restorecon[4666]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0
Apr 02 13:37:22 crc restorecon[4666]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0
Apr 02 13:37:24 crc kubenswrapper[4732]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 02 13:37:24 crc kubenswrapper[4732]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 02 13:37:24 crc kubenswrapper[4732]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 02 13:37:24 crc kubenswrapper[4732]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 02 13:37:24 crc kubenswrapper[4732]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Apr 02 13:37:24 crc kubenswrapper[4732]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.107517    4732 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132213    4732 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132252    4732 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132259    4732 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132264    4732 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132269    4732 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132274    4732 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132281    4732 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132288 4732 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132293 4732 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132298 4732 feature_gate.go:330] unrecognized feature gate: OVNObservability Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132303 4732 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132309 4732 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132315 4732 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132321 4732 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132327 4732 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132333 4732 feature_gate.go:330] unrecognized feature gate: SignatureStores Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132338 4732 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132344 4732 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132349 4732 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132354 4732 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132359 4732 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 
13:37:24.132364 4732 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132382 4732 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132387 4732 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132392 4732 feature_gate.go:330] unrecognized feature gate: NewOLM Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132397 4732 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132402 4732 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132407 4732 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132412 4732 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132418 4732 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132425 4732 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132430 4732 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132436 4732 feature_gate.go:330] unrecognized feature gate: Example Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132442 4732 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132447 4732 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132453 4732 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132457 4732 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132463 4732 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132468 4732 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132472 4732 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132477 4732 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132482 4732 feature_gate.go:330] unrecognized feature gate: PlatformOperators Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132487 4732 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132492 4732 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132497 4732 feature_gate.go:330] 
unrecognized feature gate: UpgradeStatus Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132503 4732 feature_gate.go:330] unrecognized feature gate: PinnedImages Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132510 4732 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132516 4732 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132522 4732 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132527 4732 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132533 4732 feature_gate.go:330] unrecognized feature gate: GatewayAPI Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132538 4732 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132543 4732 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132548 4732 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132553 4732 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132558 4732 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132563 4732 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132567 4732 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132572 4732 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Apr 02 13:37:24 crc 
kubenswrapper[4732]: W0402 13:37:24.132579 4732 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132609 4732 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132648 4732 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132658 4732 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132665 4732 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132673 4732 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132680 4732 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132687 4732 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132692 4732 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132697 4732 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132702 4732 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.132708 4732 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133304 4732 flags.go:64] FLAG: --address="0.0.0.0" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133324 4732 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133333 
4732 flags.go:64] FLAG: --anonymous-auth="true" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133341 4732 flags.go:64] FLAG: --application-metrics-count-limit="100" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133350 4732 flags.go:64] FLAG: --authentication-token-webhook="false" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133356 4732 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133364 4732 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133374 4732 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133381 4732 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133388 4732 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133395 4732 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133402 4732 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133408 4732 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133414 4732 flags.go:64] FLAG: --cgroup-root="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133420 4732 flags.go:64] FLAG: --cgroups-per-qos="true" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133426 4732 flags.go:64] FLAG: --client-ca-file="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133431 4732 flags.go:64] FLAG: --cloud-config="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133437 4732 flags.go:64] FLAG: --cloud-provider="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133454 4732 flags.go:64] FLAG: --cluster-dns="[]" Apr 02 
13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133462 4732 flags.go:64] FLAG: --cluster-domain="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133468 4732 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133474 4732 flags.go:64] FLAG: --config-dir="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133479 4732 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133485 4732 flags.go:64] FLAG: --container-log-max-files="5" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133493 4732 flags.go:64] FLAG: --container-log-max-size="10Mi" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133498 4732 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133504 4732 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133510 4732 flags.go:64] FLAG: --containerd-namespace="k8s.io" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133516 4732 flags.go:64] FLAG: --contention-profiling="false" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133522 4732 flags.go:64] FLAG: --cpu-cfs-quota="true" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133528 4732 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133534 4732 flags.go:64] FLAG: --cpu-manager-policy="none" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133540 4732 flags.go:64] FLAG: --cpu-manager-policy-options="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133546 4732 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133554 4732 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133560 
4732 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133566 4732 flags.go:64] FLAG: --enable-load-reader="false" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133572 4732 flags.go:64] FLAG: --enable-server="true" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133578 4732 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133585 4732 flags.go:64] FLAG: --event-burst="100" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133591 4732 flags.go:64] FLAG: --event-qps="50" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133597 4732 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133603 4732 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133630 4732 flags.go:64] FLAG: --eviction-hard="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133645 4732 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133652 4732 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133659 4732 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133666 4732 flags.go:64] FLAG: --eviction-soft="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133681 4732 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133691 4732 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133698 4732 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133706 4732 flags.go:64] FLAG: --experimental-mounter-path="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 
13:37:24.133713 4732 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133720 4732 flags.go:64] FLAG: --fail-swap-on="true" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133727 4732 flags.go:64] FLAG: --feature-gates="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133735 4732 flags.go:64] FLAG: --file-check-frequency="20s" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133741 4732 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133747 4732 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133753 4732 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133759 4732 flags.go:64] FLAG: --healthz-port="10248" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133766 4732 flags.go:64] FLAG: --help="false" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133771 4732 flags.go:64] FLAG: --hostname-override="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133777 4732 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133783 4732 flags.go:64] FLAG: --http-check-frequency="20s" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133788 4732 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133794 4732 flags.go:64] FLAG: --image-credential-provider-config="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133800 4732 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133805 4732 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133813 4732 flags.go:64] FLAG: --image-service-endpoint="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133818 4732 
flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133824 4732 flags.go:64] FLAG: --kube-api-burst="100" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133830 4732 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133836 4732 flags.go:64] FLAG: --kube-api-qps="50" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133841 4732 flags.go:64] FLAG: --kube-reserved="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133847 4732 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133853 4732 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133858 4732 flags.go:64] FLAG: --kubelet-cgroups="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133864 4732 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133870 4732 flags.go:64] FLAG: --lock-file="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133878 4732 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133884 4732 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133890 4732 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133899 4732 flags.go:64] FLAG: --log-json-split-stream="false" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133904 4732 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133910 4732 flags.go:64] FLAG: --log-text-split-stream="false" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133916 4732 flags.go:64] FLAG: --logging-format="text" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133921 4732 
flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133927 4732 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133933 4732 flags.go:64] FLAG: --manifest-url="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133938 4732 flags.go:64] FLAG: --manifest-url-header="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133946 4732 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133951 4732 flags.go:64] FLAG: --max-open-files="1000000" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133958 4732 flags.go:64] FLAG: --max-pods="110" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133965 4732 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133970 4732 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133976 4732 flags.go:64] FLAG: --memory-manager-policy="None" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133981 4732 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133987 4732 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133993 4732 flags.go:64] FLAG: --node-ip="192.168.126.11" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.133999 4732 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.134014 4732 flags.go:64] FLAG: --node-status-max-images="50" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.134019 4732 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.134025 
4732 flags.go:64] FLAG: --oom-score-adj="-999" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.134031 4732 flags.go:64] FLAG: --pod-cidr="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.134037 4732 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.134047 4732 flags.go:64] FLAG: --pod-manifest-path="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.134052 4732 flags.go:64] FLAG: --pod-max-pids="-1" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.134058 4732 flags.go:64] FLAG: --pods-per-core="0" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.134064 4732 flags.go:64] FLAG: --port="10250" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.134070 4732 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.134076 4732 flags.go:64] FLAG: --provider-id="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.134081 4732 flags.go:64] FLAG: --qos-reserved="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.134087 4732 flags.go:64] FLAG: --read-only-port="10255" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.134093 4732 flags.go:64] FLAG: --register-node="true" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.134099 4732 flags.go:64] FLAG: --register-schedulable="true" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.134105 4732 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.134115 4732 flags.go:64] FLAG: --registry-burst="10" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.134121 4732 flags.go:64] FLAG: --registry-qps="5" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.134126 4732 flags.go:64] FLAG: --reserved-cpus="" Apr 02 13:37:24 crc kubenswrapper[4732]: 
I0402 13:37:24.134132 4732 flags.go:64] FLAG: --reserved-memory="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.134140 4732 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.134146 4732 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.134152 4732 flags.go:64] FLAG: --rotate-certificates="false" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.134157 4732 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.134163 4732 flags.go:64] FLAG: --runonce="false" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.134170 4732 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.134176 4732 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.134181 4732 flags.go:64] FLAG: --seccomp-default="false" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.134187 4732 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.134193 4732 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.134199 4732 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.134205 4732 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.134211 4732 flags.go:64] FLAG: --storage-driver-password="root" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.134217 4732 flags.go:64] FLAG: --storage-driver-secure="false" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.134223 4732 flags.go:64] FLAG: --storage-driver-table="stats" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.134229 4732 flags.go:64] FLAG: --storage-driver-user="root" Apr 02 13:37:24 crc 
kubenswrapper[4732]: I0402 13:37:24.134235 4732 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.134243 4732 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.134249 4732 flags.go:64] FLAG: --system-cgroups="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.134255 4732 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.134266 4732 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.134271 4732 flags.go:64] FLAG: --tls-cert-file="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.134277 4732 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.134285 4732 flags.go:64] FLAG: --tls-min-version="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.134290 4732 flags.go:64] FLAG: --tls-private-key-file="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.134296 4732 flags.go:64] FLAG: --topology-manager-policy="none" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.134302 4732 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.134307 4732 flags.go:64] FLAG: --topology-manager-scope="container" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.134313 4732 flags.go:64] FLAG: --v="2" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.134321 4732 flags.go:64] FLAG: --version="false" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.134328 4732 flags.go:64] FLAG: --vmodule="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.134335 4732 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.134341 4732 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 02 13:37:24 crc 
kubenswrapper[4732]: W0402 13:37:24.134472 4732 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134480 4732 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134486 4732 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134491 4732 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134496 4732 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134501 4732 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134507 4732 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134513 4732 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134517 4732 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134522 4732 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134527 4732 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134533 4732 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134538 4732 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134543 4732 feature_gate.go:330] unrecognized feature gate: GatewayAPI Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134547 4732 
feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134552 4732 feature_gate.go:330] unrecognized feature gate: NewOLM Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134557 4732 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134562 4732 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134567 4732 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134572 4732 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134577 4732 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134582 4732 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134587 4732 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134592 4732 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134597 4732 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134602 4732 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134606 4732 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134640 4732 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134646 4732 feature_gate.go:330] unrecognized feature gate: 
OpenShiftPodSecurityAdmission Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134651 4732 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134656 4732 feature_gate.go:330] unrecognized feature gate: SignatureStores Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134660 4732 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134665 4732 feature_gate.go:330] unrecognized feature gate: Example Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134670 4732 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134685 4732 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134691 4732 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134697 4732 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134701 4732 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134707 4732 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134712 4732 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134717 4732 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134729 4732 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134736 4732 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. 
It will be removed in a future release. Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134742 4732 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134747 4732 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134752 4732 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134757 4732 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134762 4732 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134767 4732 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134771 4732 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134777 4732 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134782 4732 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134787 4732 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134792 4732 feature_gate.go:330] unrecognized feature gate: PlatformOperators Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134797 4732 feature_gate.go:330] unrecognized feature gate: InsightsConfig Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134801 4732 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134806 4732 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 
13:37:24.134811 4732 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134816 4732 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134820 4732 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134825 4732 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134830 4732 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134835 4732 feature_gate.go:330] unrecognized feature gate: PinnedImages Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134840 4732 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134845 4732 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134850 4732 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134854 4732 feature_gate.go:330] unrecognized feature gate: OVNObservability Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134860 4732 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134866 4732 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134872 4732 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.134879 4732 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.134897 4732 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.146788 4732 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.146820 4732 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.146917 4732 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.146929 4732 feature_gate.go:330] unrecognized feature gate: NewOLM Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.146936 4732 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.146944 4732 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.146951 4732 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.146957 4732 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.146964 4732 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.146972 4732 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.146979 4732 
feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.146985 4732 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.146990 4732 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.146995 4732 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147000 4732 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147005 4732 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147011 4732 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147016 4732 feature_gate.go:330] unrecognized feature gate: GatewayAPI Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147021 4732 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147026 4732 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147033 4732 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147041 4732 feature_gate.go:330] unrecognized feature gate: OVNObservability Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147047 4732 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147053 4732 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147061 4732 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147067 4732 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147073 4732 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147081 4732 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147087 4732 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147092 4732 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147097 4732 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147102 4732 feature_gate.go:330] unrecognized feature gate: SignatureStores Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147107 4732 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147111 4732 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147116 4732 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147121 4732 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147126 4732 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147130 4732 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Apr 02 13:37:24 crc 
kubenswrapper[4732]: W0402 13:37:24.147136 4732 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147142 4732 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147148 4732 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147155 4732 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147160 4732 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147166 4732 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147171 4732 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147176 4732 feature_gate.go:330] unrecognized feature gate: Example Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147180 4732 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147186 4732 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147191 4732 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147197 4732 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147202 4732 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147207 4732 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Apr 02 13:37:24 crc 
kubenswrapper[4732]: W0402 13:37:24.147213 4732 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147219 4732 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147224 4732 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147229 4732 feature_gate.go:330] unrecognized feature gate: PlatformOperators Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147234 4732 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147240 4732 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147245 4732 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147249 4732 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147254 4732 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147259 4732 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147264 4732 feature_gate.go:330] unrecognized feature gate: PinnedImages Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147270 4732 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147275 4732 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147280 4732 feature_gate.go:330] unrecognized feature gate: InsightsConfig Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147284 4732 
feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147289 4732 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147294 4732 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147299 4732 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147304 4732 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147309 4732 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147313 4732 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.147323 4732 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147478 4732 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147487 4732 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147494 4732 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147499 4732 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud 
Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147504 4732 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147510 4732 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147515 4732 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147520 4732 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147525 4732 feature_gate.go:330] unrecognized feature gate: SignatureStores Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147530 4732 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147535 4732 feature_gate.go:330] unrecognized feature gate: GatewayAPI Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147540 4732 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147545 4732 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147551 4732 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147557 4732 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147563 4732 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147568 4732 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147573 4732 feature_gate.go:330] unrecognized feature gate: PlatformOperators Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147578 4732 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147583 4732 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147588 4732 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147593 4732 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147597 4732 feature_gate.go:330] unrecognized feature gate: NewOLM Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147602 4732 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147607 4732 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147648 4732 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147661 4732 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147669 4732 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147676 4732 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147683 4732 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147690 4732 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147695 4732 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147700 4732 feature_gate.go:330] unrecognized feature gate: Example Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147706 4732 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147711 4732 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147716 4732 feature_gate.go:330] unrecognized feature gate: OVNObservability Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147721 4732 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147726 4732 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147731 4732 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147736 4732 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147741 4732 
feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147746 4732 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147751 4732 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147755 4732 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147761 4732 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147766 4732 feature_gate.go:330] unrecognized feature gate: InsightsConfig Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147771 4732 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147776 4732 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147781 4732 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147785 4732 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147790 4732 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147795 4732 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147800 4732 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147805 4732 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147810 4732 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Apr 
02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147814 4732 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147819 4732 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147824 4732 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147829 4732 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147834 4732 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147840 4732 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147847 4732 feature_gate.go:330] unrecognized feature gate: PinnedImages Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147852 4732 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147857 4732 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147863 4732 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147871 4732 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147885 4732 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147897 4732 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147906 4732 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147913 4732 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.147920 4732 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.147931 4732 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.149004 4732 server.go:940] "Client rotation is on, will bootstrap in background" Apr 02 13:37:24 crc kubenswrapper[4732]: E0402 13:37:24.152783 4732 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.158042 4732 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.164069 4732 certificate_store.go:130] Loading cert/key pair from 
"/var/lib/kubelet/pki/kubelet-client-current.pem". Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.174122 4732 server.go:997] "Starting client certificate rotation" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.174182 4732 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.174421 4732 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.243213 4732 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 02 13:37:24 crc kubenswrapper[4732]: E0402 13:37:24.246151 4732 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.138:6443: connect: connection refused" logger="UnhandledError" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.255706 4732 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.300718 4732 log.go:25] "Validated CRI v1 runtime API" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.391803 4732 log.go:25] "Validated CRI v1 image API" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.394540 4732 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.402532 4732 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-04-02-13-32-42-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Apr 02 13:37:24 crc 
kubenswrapper[4732]: I0402 13:37:24.402581 4732 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:41 fsType:tmpfs blockSize:0}] Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.435415 4732 manager.go:217] Machine: {Timestamp:2026-04-02 13:37:24.43154651 +0000 UTC m=+1.335954143 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:3be5867b-5df6-4c65-8d4b-c54c471927ff BootID:69443537-6792-4af0-92ca-9d3f256e3009 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:41 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 
Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:7b:ea:85 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:7b:ea:85 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:d0:fa:b6 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:25:97:9b Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:96:4e:aa Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:76:d0:45 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:f6:40:49:90:2a:60 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:02:0a:33:ea:59:b8 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 
Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.435850 4732 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.436034 4732 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.437419 4732 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.437805 4732 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.437860 4732 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSR
eserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.438156 4732 topology_manager.go:138] "Creating topology manager with none policy" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.438175 4732 container_manager_linux.go:303] "Creating device plugin manager" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.438686 4732 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.438732 4732 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.441040 4732 state_mem.go:36] "Initialized new in-memory state store" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.441259 4732 server.go:1245] "Using root directory" path="/var/lib/kubelet" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.444927 4732 kubelet.go:418] "Attempting to sync node with API server" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.444966 4732 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.445056 4732 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.445086 4732 kubelet.go:324] "Adding apiserver pod source" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.445110 4732 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 
13:37:24.465192 4732 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.465284 4732 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused Apr 02 13:37:24 crc kubenswrapper[4732]: E0402 13:37:24.465345 4732 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.138:6443: connect: connection refused" logger="UnhandledError" Apr 02 13:37:24 crc kubenswrapper[4732]: E0402 13:37:24.465413 4732 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.138:6443: connect: connection refused" logger="UnhandledError" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.478876 4732 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.509798 4732 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.531787 4732 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.534518 4732 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.534565 4732 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.534579 4732 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.534595 4732 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.534671 4732 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.534690 4732 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.534710 4732 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.534736 4732 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.534754 4732 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.534778 4732 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.534852 4732 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.534869 4732 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.536029 4732 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/csi" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.538292 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.539224 4732 server.go:1280] "Started kubelet" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.540947 4732 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.540942 4732 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.541746 4732 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 02 13:37:24 crc systemd[1]: Started Kubernetes Kubelet. Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.543792 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.543831 4732 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.544101 4732 volume_manager.go:287] "The desired_state_of_world populator starts" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.544125 4732 volume_manager.go:289] "Starting Kubelet Volume Manager" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.544171 4732 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Apr 02 13:37:24 crc kubenswrapper[4732]: E0402 13:37:24.544183 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.544819 4732 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list 
*v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused Apr 02 13:37:24 crc kubenswrapper[4732]: E0402 13:37:24.544920 4732 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.138:6443: connect: connection refused" logger="UnhandledError" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.546030 4732 server.go:460] "Adding debug handlers to kubelet server" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.546128 4732 factory.go:55] Registering systemd factory Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.546254 4732 factory.go:221] Registration of the systemd container factory successfully Apr 02 13:37:24 crc kubenswrapper[4732]: E0402 13:37:24.547333 4732 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" interval="200ms" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.551183 4732 factory.go:153] Registering CRI-O factory Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.551220 4732 factory.go:221] Registration of the crio container factory successfully Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.551369 4732 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.551417 4732 factory.go:103] Registering Raw factory Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 
13:37:24.551445 4732 manager.go:1196] Started watching for new ooms in manager Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.552243 4732 manager.go:319] Starting recovery of all containers Apr 02 13:37:24 crc kubenswrapper[4732]: E0402 13:37:24.550978 4732 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.138:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.18a28dbca7652dff default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:24.539190783 +0000 UTC m=+1.443598346,LastTimestamp:2026-04-02 13:37:24.539190783 +0000 UTC m=+1.443598346,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.571514 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.571655 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.571682 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.571704 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.571730 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.571755 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.571793 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.571852 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.571896 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" 
volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.571923 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.572008 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.572089 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.572150 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.572193 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.572223 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" 
volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.572256 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.572284 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.572313 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.572338 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.572364 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.572393 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" 
volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.572427 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.572456 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.572485 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.572573 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.572601 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.572690 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.572724 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.572748 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.572804 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.572825 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.572849 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.572869 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.572888 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.572907 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.572929 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.572954 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.572974 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.572995 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.573017 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.573052 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.573075 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.573097 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.573120 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.573140 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.573161 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.573180 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.573202 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.573224 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.573251 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.573289 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Apr 02 
13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.573314 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.573344 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.573364 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.573385 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.573407 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.573442 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.573466 4732 reconstruct.go:130] "Volume is marked 
as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.573488 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.573510 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.573528 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.573552 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.577703 4732 manager.go:324] Recovery completed Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.579300 4732 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" 
deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.579404 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.579428 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.579447 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.579469 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.579483 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.579506 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" 
volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.579528 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.579547 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.579568 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.579590 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.579605 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.579644 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" 
seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.579737 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.579823 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.580189 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.580241 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.580268 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.580304 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.580332 
4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.580367 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.580395 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.580420 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.580452 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.580478 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.580514 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.580541 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.580568 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.580674 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.580705 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.580742 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.580772 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" 
volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.580799 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.580833 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.580861 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.580895 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.580921 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.580953 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" 
seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.580986 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.581013 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.581041 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.581077 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.581105 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.581159 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 
13:37:24.581227 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.581264 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.581310 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.581353 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.581385 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.581425 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.581466 4732 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.581501 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.581538 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.581569 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.581603 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.581761 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.581860 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.581922 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.581939 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.581996 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.582012 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.582036 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.582050 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.582064 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.582082 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.582096 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.582114 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.582130 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.582144 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.582164 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.582177 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.582191 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.582209 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.582225 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.582244 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" 
seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.582260 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.582277 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.582298 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.582312 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.582333 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.582346 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.582362 4732 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.582380 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.582395 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.582413 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.582427 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.582440 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.582459 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.582472 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.582494 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.582508 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.582522 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.582542 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.582555 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" 
volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.582572 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.582586 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.582599 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.582715 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.582732 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.582745 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" 
volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.582763 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.582777 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.582800 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.582815 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.582830 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.582858 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" 
seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.582873 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.582890 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.582904 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.582920 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.582940 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.582957 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 
13:37:24.582974 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.582988 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.583001 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.583021 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.583033 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.583052 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.583065 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.583080 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.583099 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.583136 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.583154 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.583169 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.583184 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.583206 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.583373 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.583442 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.583459 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.583480 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.583515 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.583531 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.583550 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.583565 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.583583 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.583645 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.583666 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.583683 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.583697 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.583711 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.583727 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.583742 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.583758 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.583772 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.583787 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.583804 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.583819 4732 reconstruct.go:97] "Volume reconstruction finished" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.583828 4732 reconciler.go:26] "Reconciler: start to sync state" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.594505 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.596217 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.596258 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.596267 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 
13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.597033 4732 cpu_manager.go:225] "Starting CPU manager" policy="none" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.597059 4732 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.597086 4732 state_mem.go:36] "Initialized new in-memory state store" Apr 02 13:37:24 crc kubenswrapper[4732]: E0402 13:37:24.645068 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.674820 4732 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.678866 4732 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.678933 4732 status_manager.go:217] "Starting to sync pod status with apiserver" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.678972 4732 kubelet.go:2335] "Starting kubelet main sync loop" Apr 02 13:37:24 crc kubenswrapper[4732]: E0402 13:37:24.679251 4732 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Apr 02 13:37:24 crc kubenswrapper[4732]: W0402 13:37:24.680508 4732 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused Apr 02 13:37:24 crc kubenswrapper[4732]: E0402 13:37:24.680770 4732 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.138:6443: connect: 
connection refused" logger="UnhandledError" Apr 02 13:37:24 crc kubenswrapper[4732]: E0402 13:37:24.745778 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:37:24 crc kubenswrapper[4732]: E0402 13:37:24.748547 4732 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" interval="400ms" Apr 02 13:37:24 crc kubenswrapper[4732]: E0402 13:37:24.779958 4732 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.787170 4732 policy_none.go:49] "None policy: Start" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.788377 4732 memory_manager.go:170] "Starting memorymanager" policy="None" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.788404 4732 state_mem.go:35] "Initializing new in-memory state store" Apr 02 13:37:24 crc kubenswrapper[4732]: E0402 13:37:24.846867 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.899942 4732 manager.go:334] "Starting Device Plugin manager" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.900012 4732 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.900032 4732 server.go:79] "Starting device plugin registration server" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.900904 4732 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.900930 4732 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 
monitorPeriod="10s" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.901129 4732 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.901312 4732 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.901350 4732 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 02 13:37:24 crc kubenswrapper[4732]: E0402 13:37:24.907220 4732 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.980357 4732 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc"] Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.980510 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.981641 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.981688 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.981704 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.981889 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.982653 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.982704 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.983447 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.983493 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.983501 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.983636 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.983759 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.983808 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.984182 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.984216 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.984230 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.985135 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.985167 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.985178 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.985145 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.985275 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.985283 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.985288 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:37:24 crc 
kubenswrapper[4732]: I0402 13:37:24.985439 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.985468 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.986417 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.986448 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.986457 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.986465 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.986485 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.986495 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.986557 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.986739 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc"
Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.986760 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.988949 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.988965 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.988972 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.989020 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.989052 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.989066 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.989308 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.989345 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.990341 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.990373 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Apr 02 13:37:24 crc kubenswrapper[4732]: I0402 13:37:24.990386 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.001290 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.002137 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.002202 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.002213 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.002240 4732 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Apr 02 13:37:25 crc kubenswrapper[4732]: E0402 13:37:25.002785 4732 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.138:6443: connect: connection refused" node="crc"
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.088242 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.088345 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.088404 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.088423 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.088501 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.088517 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.088647 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.088686 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.088705 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.088723 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.088757 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.088789 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.088838 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.088855 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.088870 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Apr 02 13:37:25 crc kubenswrapper[4732]: E0402 13:37:25.149568 4732 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" interval="800ms"
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.189998 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.190142 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.190174 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.190233 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.190264 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.190325 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.190352 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.190440 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.190470 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.190530 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.190561 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.191049 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.191096 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.191136 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.191233 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.191768 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.191939 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.191980 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.192018 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.192057 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.192096 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.192140 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.192174 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.192211 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.192276 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.192366 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.192413 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.192457 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.192503 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.194777 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.203019 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.204576 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.204634 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.204646 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.204672 4732 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Apr 02 13:37:25 crc kubenswrapper[4732]: E0402 13:37:25.205110 4732 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.138:6443: connect: connection refused" node="crc"
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.317450 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.337995 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.354443 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.360130 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.363997 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Apr 02 13:37:25 crc kubenswrapper[4732]: W0402 13:37:25.423697 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-a13c7b3c22cb8c918dd27619d80784051c2ea4d1bdf2344919bea132c59daa7b WatchSource:0}: Error finding container a13c7b3c22cb8c918dd27619d80784051c2ea4d1bdf2344919bea132c59daa7b: Status 404 returned error can't find the container with id a13c7b3c22cb8c918dd27619d80784051c2ea4d1bdf2344919bea132c59daa7b
Apr 02 13:37:25 crc kubenswrapper[4732]: W0402 13:37:25.487025 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-adc6a39b3a87baa5e2697c7ae52d611529aec86ef97bb5f760d038aa4cfd0c95 WatchSource:0}: Error finding container adc6a39b3a87baa5e2697c7ae52d611529aec86ef97bb5f760d038aa4cfd0c95: Status 404 returned error can't find the container with id adc6a39b3a87baa5e2697c7ae52d611529aec86ef97bb5f760d038aa4cfd0c95
Apr 02 13:37:25 crc kubenswrapper[4732]: W0402 13:37:25.495045 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-a7752dc4b62e108115cc6bb44d1c09f1eaeeae6ab4f184bd9f67a1a264993bf7 WatchSource:0}: Error finding container a7752dc4b62e108115cc6bb44d1c09f1eaeeae6ab4f184bd9f67a1a264993bf7: Status 404 returned error can't find the container with id a7752dc4b62e108115cc6bb44d1c09f1eaeeae6ab4f184bd9f67a1a264993bf7
Apr 02 13:37:25 crc kubenswrapper[4732]: W0402 13:37:25.495889 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-f29582bba7f128e15f3a77f67999944e4fbba4963053d0cd9f64f180518b6bf1 WatchSource:0}: Error finding container f29582bba7f128e15f3a77f67999944e4fbba4963053d0cd9f64f180518b6bf1: Status 404 returned error can't find the container with id f29582bba7f128e15f3a77f67999944e4fbba4963053d0cd9f64f180518b6bf1
Apr 02 13:37:25 crc kubenswrapper[4732]: W0402 13:37:25.506399 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-ba9c8a2ddcd65e6b64f85508ac42bf1c387e8cbe4ee6f4bda28b8492fe0c56e6 WatchSource:0}: Error finding container ba9c8a2ddcd65e6b64f85508ac42bf1c387e8cbe4ee6f4bda28b8492fe0c56e6: Status 404 returned error can't find the container with id ba9c8a2ddcd65e6b64f85508ac42bf1c387e8cbe4ee6f4bda28b8492fe0c56e6
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.540715 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.606001 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.607674 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.607729 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.607747 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.607816 4732 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Apr 02 13:37:25 crc kubenswrapper[4732]: E0402 13:37:25.608381 4732 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.138:6443: connect: connection refused" node="crc"
Apr 02 13:37:25 crc kubenswrapper[4732]: W0402 13:37:25.679595 4732 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused
Apr 02 13:37:25 crc kubenswrapper[4732]: E0402 13:37:25.679768 4732 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.138:6443: connect: connection refused" logger="UnhandledError"
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.684119 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"a7752dc4b62e108115cc6bb44d1c09f1eaeeae6ab4f184bd9f67a1a264993bf7"}
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.685476 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"adc6a39b3a87baa5e2697c7ae52d611529aec86ef97bb5f760d038aa4cfd0c95"}
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.686559 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a13c7b3c22cb8c918dd27619d80784051c2ea4d1bdf2344919bea132c59daa7b"}
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.688491 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f29582bba7f128e15f3a77f67999944e4fbba4963053d0cd9f64f180518b6bf1"}
Apr 02 13:37:25 crc kubenswrapper[4732]: I0402 13:37:25.689862 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ba9c8a2ddcd65e6b64f85508ac42bf1c387e8cbe4ee6f4bda28b8492fe0c56e6"}
Apr 02 13:37:25 crc kubenswrapper[4732]: W0402 13:37:25.831198 4732 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused
Apr 02 13:37:25 crc kubenswrapper[4732]: E0402 13:37:25.831299 4732 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.138:6443: connect: connection refused" logger="UnhandledError"
Apr 02 13:37:25 crc kubenswrapper[4732]: W0402 13:37:25.857392 4732 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused
Apr 02 13:37:25 crc kubenswrapper[4732]: E0402 13:37:25.857485 4732 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.138:6443: connect: connection refused" logger="UnhandledError"
Apr 02 13:37:25 crc kubenswrapper[4732]: W0402 13:37:25.901385 4732 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused
Apr 02 13:37:25 crc kubenswrapper[4732]: E0402 13:37:25.901491 4732 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.138:6443: connect: connection refused" logger="UnhandledError"
Apr 02 13:37:25 crc kubenswrapper[4732]: E0402 13:37:25.950346 4732 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" interval="1.6s"
Apr 02 13:37:26 crc kubenswrapper[4732]: I0402 13:37:26.335419 4732 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Apr 02 13:37:26 crc kubenswrapper[4732]: E0402 13:37:26.336860 4732 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.138:6443: connect: connection refused" logger="UnhandledError"
Apr 02 13:37:26 crc kubenswrapper[4732]: I0402 13:37:26.408537 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Apr 02 13:37:26 crc kubenswrapper[4732]: I0402 13:37:26.409945 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Apr 02 13:37:26 crc kubenswrapper[4732]: I0402 13:37:26.409982 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Apr 02 13:37:26 crc kubenswrapper[4732]: I0402 13:37:26.409995 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Apr 02 13:37:26 crc kubenswrapper[4732]: I0402 13:37:26.410021 4732 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Apr 02 13:37:26 crc kubenswrapper[4732]: E0402 13:37:26.410467 4732 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.138:6443: connect: connection refused" node="crc"
Apr 02 13:37:26 crc kubenswrapper[4732]: I0402 13:37:26.540667 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused
Apr 02 13:37:27 crc kubenswrapper[4732]: E0402 13:37:27.434096 4732 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.138:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.18a28dbca7652dff default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:24.539190783 +0000 UTC m=+1.443598346,LastTimestamp:2026-04-02 13:37:24.539190783 +0000 UTC m=+1.443598346,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Apr 02 13:37:27 crc kubenswrapper[4732]: I0402 13:37:27.540158 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused
Apr 02 13:37:27 crc kubenswrapper[4732]: E0402 13:37:27.551443 4732 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" interval="3.2s"
Apr 02 13:37:27 crc kubenswrapper[4732]: W0402 13:37:27.720539 4732 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused
Apr 02 13:37:27 crc kubenswrapper[4732]: E0402 13:37:27.720694 4732 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.138:6443: connect: connection refused" logger="UnhandledError"
Apr 02 13:37:27 crc kubenswrapper[4732]: W0402 13:37:27.857472 4732 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused
Apr 02 13:37:27 crc kubenswrapper[4732]: E0402 13:37:27.857591 4732 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.138:6443: connect: connection refused" logger="UnhandledError"
Apr 02 13:37:28 crc kubenswrapper[4732]: I0402 13:37:28.011490 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Apr 02 13:37:28 crc kubenswrapper[4732]: I0402 13:37:28.013909 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Apr 02 13:37:28 crc kubenswrapper[4732]: I0402 13:37:28.014263 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Apr 02 13:37:28 crc kubenswrapper[4732]: I0402 13:37:28.014287 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Apr 02 13:37:28 crc kubenswrapper[4732]: I0402 13:37:28.014331 4732 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Apr 02 13:37:28 crc kubenswrapper[4732]: E0402 13:37:28.015124 4732 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.138:6443: connect: connection refused" node="crc"
Apr 02 13:37:28 crc kubenswrapper[4732]: W0402 13:37:28.103831 4732 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused
Apr 02 13:37:28 crc kubenswrapper[4732]: E0402 13:37:28.103932 4732 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.138:6443: connect: connection refused" logger="UnhandledError"
Apr 02 13:37:28 crc kubenswrapper[4732]: I0402 13:37:28.540715 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused
Apr 02 13:37:28 crc kubenswrapper[4732]: I0402 13:37:28.698855 4732 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="890c62e37bef5a99941b91022b6cb3fb23e9a9910a937f8ba205a806278fa864" exitCode=0
Apr 02 13:37:28 crc kubenswrapper[4732]: I0402 13:37:28.698979 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Apr 02 13:37:28 crc kubenswrapper[4732]: I0402 13:37:28.698970 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"890c62e37bef5a99941b91022b6cb3fb23e9a9910a937f8ba205a806278fa864"}
Apr 02 13:37:28 crc kubenswrapper[4732]: I0402 13:37:28.700214 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Apr 02 13:37:28
crc kubenswrapper[4732]: I0402 13:37:28.700286 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:37:28 crc kubenswrapper[4732]: I0402 13:37:28.700305 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:37:28 crc kubenswrapper[4732]: I0402 13:37:28.701903 4732 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02" exitCode=0 Apr 02 13:37:28 crc kubenswrapper[4732]: I0402 13:37:28.702006 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02"} Apr 02 13:37:28 crc kubenswrapper[4732]: I0402 13:37:28.702066 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 02 13:37:28 crc kubenswrapper[4732]: I0402 13:37:28.703487 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:37:28 crc kubenswrapper[4732]: I0402 13:37:28.703527 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:37:28 crc kubenswrapper[4732]: I0402 13:37:28.703539 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:37:28 crc kubenswrapper[4732]: I0402 13:37:28.704408 4732 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="1ca33771abc8b4070c51fd6fcb9a5f88bf7097b87e6762d829b293bb628588a4" exitCode=0 Apr 02 13:37:28 crc kubenswrapper[4732]: I0402 13:37:28.704472 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 02 
13:37:28 crc kubenswrapper[4732]: I0402 13:37:28.704554 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"1ca33771abc8b4070c51fd6fcb9a5f88bf7097b87e6762d829b293bb628588a4"} Apr 02 13:37:28 crc kubenswrapper[4732]: I0402 13:37:28.705755 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:37:28 crc kubenswrapper[4732]: I0402 13:37:28.705810 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:37:28 crc kubenswrapper[4732]: I0402 13:37:28.705832 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:37:28 crc kubenswrapper[4732]: I0402 13:37:28.707032 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 02 13:37:28 crc kubenswrapper[4732]: I0402 13:37:28.707724 4732 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="18c38540c1e54421b3069733149cadd0da6b74c4f6aa0160090c8a14429797dd" exitCode=0 Apr 02 13:37:28 crc kubenswrapper[4732]: I0402 13:37:28.707810 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"18c38540c1e54421b3069733149cadd0da6b74c4f6aa0160090c8a14429797dd"} Apr 02 13:37:28 crc kubenswrapper[4732]: I0402 13:37:28.707821 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 02 13:37:28 crc kubenswrapper[4732]: I0402 13:37:28.708531 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:37:28 crc kubenswrapper[4732]: I0402 13:37:28.708553 4732 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:37:28 crc kubenswrapper[4732]: I0402 13:37:28.708564 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:37:28 crc kubenswrapper[4732]: I0402 13:37:28.708877 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:37:28 crc kubenswrapper[4732]: I0402 13:37:28.708906 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:37:28 crc kubenswrapper[4732]: I0402 13:37:28.708917 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:37:28 crc kubenswrapper[4732]: I0402 13:37:28.711882 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ece1f4c1e4f39f8c7ee97788bc59b96de13c07e7bbbf06423016e4510a40fc62"} Apr 02 13:37:28 crc kubenswrapper[4732]: I0402 13:37:28.711924 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ef96f17edb5e968b50bab42d6ddcc7c4ffc0fae1b3c8bcdebc6150e18977740e"} Apr 02 13:37:28 crc kubenswrapper[4732]: W0402 13:37:28.753948 4732 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused Apr 02 13:37:28 crc kubenswrapper[4732]: E0402 13:37:28.754084 4732 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list 
*v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.138:6443: connect: connection refused" logger="UnhandledError" Apr 02 13:37:29 crc kubenswrapper[4732]: I0402 13:37:29.540347 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused Apr 02 13:37:29 crc kubenswrapper[4732]: I0402 13:37:29.718060 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"1b40621c4cae0af5b1de321842441756156529e8dcb1aff14d7f5dc7db637127"} Apr 02 13:37:29 crc kubenswrapper[4732]: I0402 13:37:29.720813 4732 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="a38338477586597973eafe96225e3fd49deba97373d4d0eda3f7bce9f78dd5b5" exitCode=0 Apr 02 13:37:29 crc kubenswrapper[4732]: I0402 13:37:29.720923 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"a38338477586597973eafe96225e3fd49deba97373d4d0eda3f7bce9f78dd5b5"} Apr 02 13:37:29 crc kubenswrapper[4732]: I0402 13:37:29.724469 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"bf50344e79adec991160632c244c40543c468db050f176949b5840a16d1777a6"} Apr 02 13:37:29 crc kubenswrapper[4732]: I0402 13:37:29.728001 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"7d431bea9c2c5dcec52d15f7c9d0806d50d74ece8f3ba59562a6f3fd59e50a5f"} Apr 02 13:37:29 crc kubenswrapper[4732]: I0402 13:37:29.730409 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"e6f4f8c5575530ca3fe7ef2e6cd25a4f63888841ae0564192385b7371093305d"} Apr 02 13:37:30 crc kubenswrapper[4732]: I0402 13:37:30.540027 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused Apr 02 13:37:30 crc kubenswrapper[4732]: I0402 13:37:30.734550 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 02 13:37:30 crc kubenswrapper[4732]: I0402 13:37:30.734550 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 02 13:37:30 crc kubenswrapper[4732]: I0402 13:37:30.734518 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"e03db4bfdbd69007c0b4a4d3f465fd21957c5a829c84cac0dd39914617e44cf7"} Apr 02 13:37:30 crc kubenswrapper[4732]: I0402 13:37:30.736234 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:37:30 crc kubenswrapper[4732]: I0402 13:37:30.736286 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:37:30 crc kubenswrapper[4732]: I0402 13:37:30.736362 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:37:30 crc kubenswrapper[4732]: I0402 
13:37:30.737372 4732 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Apr 02 13:37:30 crc kubenswrapper[4732]: I0402 13:37:30.737525 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:37:30 crc kubenswrapper[4732]: I0402 13:37:30.737554 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:37:30 crc kubenswrapper[4732]: I0402 13:37:30.737563 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:37:30 crc kubenswrapper[4732]: E0402 13:37:30.739220 4732 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.138:6443: connect: connection refused" logger="UnhandledError" Apr 02 13:37:30 crc kubenswrapper[4732]: E0402 13:37:30.752502 4732 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" interval="6.4s" Apr 02 13:37:31 crc kubenswrapper[4732]: I0402 13:37:31.216218 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 02 13:37:31 crc kubenswrapper[4732]: I0402 13:37:31.218087 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:37:31 crc kubenswrapper[4732]: I0402 13:37:31.218147 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:37:31 crc kubenswrapper[4732]: I0402 13:37:31.218170 4732 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:37:31 crc kubenswrapper[4732]: I0402 13:37:31.218213 4732 kubelet_node_status.go:76] "Attempting to register node" node="crc" Apr 02 13:37:31 crc kubenswrapper[4732]: E0402 13:37:31.218984 4732 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.138:6443: connect: connection refused" node="crc" Apr 02 13:37:31 crc kubenswrapper[4732]: I0402 13:37:31.540008 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused Apr 02 13:37:31 crc kubenswrapper[4732]: W0402 13:37:31.640823 4732 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused Apr 02 13:37:31 crc kubenswrapper[4732]: E0402 13:37:31.640973 4732 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.138:6443: connect: connection refused" logger="UnhandledError" Apr 02 13:37:31 crc kubenswrapper[4732]: I0402 13:37:31.740224 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"bf6191436462297532ffdc93e43468bbe37e850115ddcbcd0265c07359ac845a"} Apr 02 13:37:31 crc kubenswrapper[4732]: I0402 13:37:31.743107 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5272db3013bda1d5fcb66bd2669d7714381d3bd39802499b4a27873cbccd6ff9"} Apr 02 13:37:32 crc kubenswrapper[4732]: W0402 13:37:32.356043 4732 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused Apr 02 13:37:32 crc kubenswrapper[4732]: E0402 13:37:32.356124 4732 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.138:6443: connect: connection refused" logger="UnhandledError" Apr 02 13:37:32 crc kubenswrapper[4732]: I0402 13:37:32.540975 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused Apr 02 13:37:32 crc kubenswrapper[4732]: I0402 13:37:32.747180 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"dbb15ce78ba79858cf5ae05773b302a03fd75dd3a3cdd53d8ca5ff2bdbc6d48d"} Apr 02 13:37:32 crc kubenswrapper[4732]: I0402 13:37:32.747232 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 02 13:37:32 crc kubenswrapper[4732]: I0402 13:37:32.748021 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:37:32 crc kubenswrapper[4732]: I0402 13:37:32.748049 4732 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:37:32 crc kubenswrapper[4732]: I0402 13:37:32.748059 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:37:32 crc kubenswrapper[4732]: I0402 13:37:32.750348 4732 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="8499bb6d75a647adea985e455603f74d25d327c6738adf2acf5e621c77247c48" exitCode=0 Apr 02 13:37:32 crc kubenswrapper[4732]: I0402 13:37:32.750395 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"8499bb6d75a647adea985e455603f74d25d327c6738adf2acf5e621c77247c48"} Apr 02 13:37:32 crc kubenswrapper[4732]: I0402 13:37:32.750501 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 02 13:37:32 crc kubenswrapper[4732]: I0402 13:37:32.751116 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:37:32 crc kubenswrapper[4732]: I0402 13:37:32.751146 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:37:32 crc kubenswrapper[4732]: I0402 13:37:32.751157 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:37:32 crc kubenswrapper[4732]: I0402 13:37:32.753377 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7b46da8b2cda4252f897adfca92275c92924a998d5b1ad57e6897ddea29b4571"} Apr 02 13:37:32 crc kubenswrapper[4732]: I0402 13:37:32.753403 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e000359211728ef1d437757b0246d85c224dae170131ba8bd221fa84dafa026c"} Apr 02 13:37:32 crc kubenswrapper[4732]: I0402 13:37:32.753452 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 02 13:37:32 crc kubenswrapper[4732]: I0402 13:37:32.754270 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:37:32 crc kubenswrapper[4732]: I0402 13:37:32.754294 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:37:32 crc kubenswrapper[4732]: I0402 13:37:32.754304 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:37:33 crc kubenswrapper[4732]: W0402 13:37:33.296682 4732 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused Apr 02 13:37:33 crc kubenswrapper[4732]: E0402 13:37:33.296763 4732 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.138:6443: connect: connection refused" logger="UnhandledError" Apr 02 13:37:33 crc kubenswrapper[4732]: I0402 13:37:33.539605 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused Apr 02 13:37:33 crc kubenswrapper[4732]: I0402 13:37:33.758356 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"934b0c14e04d9b1d043cf692c8195d6f093d1e40f0e5873dc489354895244800"} Apr 02 13:37:33 crc kubenswrapper[4732]: I0402 13:37:33.758423 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"1dd7aa0f0f63ba0d0047b0ce4d9ce045e2647c090d5e27474d4eca5c40b97045"} Apr 02 13:37:33 crc kubenswrapper[4732]: I0402 13:37:33.760836 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"81451d8f3aca9dd0160f159a5261e11e057c46cca08c00b80ac341330ef6ae00"} Apr 02 13:37:33 crc kubenswrapper[4732]: I0402 13:37:33.760950 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 02 13:37:33 crc kubenswrapper[4732]: I0402 13:37:33.760987 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 02 13:37:33 crc kubenswrapper[4732]: I0402 13:37:33.761010 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Apr 02 13:37:33 crc kubenswrapper[4732]: I0402 13:37:33.762185 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:37:33 crc kubenswrapper[4732]: I0402 13:37:33.762219 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:37:33 crc kubenswrapper[4732]: I0402 13:37:33.762229 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:37:33 crc kubenswrapper[4732]: I0402 13:37:33.762197 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Apr 02 13:37:33 crc kubenswrapper[4732]: I0402 13:37:33.762349 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:37:33 crc kubenswrapper[4732]: I0402 13:37:33.762391 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:37:34 crc kubenswrapper[4732]: I0402 13:37:34.409548 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 02 13:37:34 crc kubenswrapper[4732]: W0402 13:37:34.419451 4732 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused Apr 02 13:37:34 crc kubenswrapper[4732]: E0402 13:37:34.419545 4732 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.138:6443: connect: connection refused" logger="UnhandledError" Apr 02 13:37:34 crc kubenswrapper[4732]: I0402 13:37:34.539851 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused Apr 02 13:37:34 crc kubenswrapper[4732]: I0402 13:37:34.765529 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"2cf70a7e805d515372a0e5a88d43a670c0bcfbb5bed48de6518be6c42255e4a5"} Apr 02 13:37:34 crc kubenswrapper[4732]: I0402 13:37:34.765707 4732 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Apr 02 13:37:34 crc kubenswrapper[4732]: I0402 13:37:34.765736 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 02 13:37:34 crc kubenswrapper[4732]: I0402 13:37:34.766476 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 02 13:37:34 crc kubenswrapper[4732]: I0402 13:37:34.768867 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:37:34 crc kubenswrapper[4732]: I0402 13:37:34.768911 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:37:34 crc kubenswrapper[4732]: I0402 13:37:34.768934 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:37:34 crc kubenswrapper[4732]: I0402 13:37:34.769004 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:37:34 crc kubenswrapper[4732]: I0402 13:37:34.769050 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:37:34 crc kubenswrapper[4732]: I0402 13:37:34.769084 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:37:34 crc kubenswrapper[4732]: E0402 13:37:34.907944 4732 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Apr 02 13:37:35 crc kubenswrapper[4732]: I0402 13:37:35.541078 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused Apr 02 13:37:35 crc kubenswrapper[4732]: I0402 13:37:35.770425 
4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Apr 02 13:37:35 crc kubenswrapper[4732]: I0402 13:37:35.772486 4732 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="81451d8f3aca9dd0160f159a5261e11e057c46cca08c00b80ac341330ef6ae00" exitCode=255 Apr 02 13:37:35 crc kubenswrapper[4732]: I0402 13:37:35.772582 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"81451d8f3aca9dd0160f159a5261e11e057c46cca08c00b80ac341330ef6ae00"} Apr 02 13:37:35 crc kubenswrapper[4732]: I0402 13:37:35.772788 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 02 13:37:35 crc kubenswrapper[4732]: I0402 13:37:35.774311 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:37:35 crc kubenswrapper[4732]: I0402 13:37:35.774351 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:37:35 crc kubenswrapper[4732]: I0402 13:37:35.774361 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:37:35 crc kubenswrapper[4732]: I0402 13:37:35.774879 4732 scope.go:117] "RemoveContainer" containerID="81451d8f3aca9dd0160f159a5261e11e057c46cca08c00b80ac341330ef6ae00" Apr 02 13:37:35 crc kubenswrapper[4732]: I0402 13:37:35.777262 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4d79199d36c34a73bc1959b79e0ce4c75b0ca34217c84040b93de082d60ccbfd"} Apr 02 13:37:35 crc kubenswrapper[4732]: I0402 13:37:35.777308 4732 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"acc4b20f8f4f165cef3181c09e1a50a16a2b09f86e0b91c15c9bba453c7612ba"} Apr 02 13:37:35 crc kubenswrapper[4732]: I0402 13:37:35.777368 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 02 13:37:35 crc kubenswrapper[4732]: I0402 13:37:35.778331 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:37:35 crc kubenswrapper[4732]: I0402 13:37:35.778362 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:37:35 crc kubenswrapper[4732]: I0402 13:37:35.778372 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:37:35 crc kubenswrapper[4732]: I0402 13:37:35.926389 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Apr 02 13:37:36 crc kubenswrapper[4732]: I0402 13:37:36.785456 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Apr 02 13:37:36 crc kubenswrapper[4732]: I0402 13:37:36.788926 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"37d0290f801221a672f561ed91a3879043deb997a53252f263faa4b544bec066"} Apr 02 13:37:36 crc kubenswrapper[4732]: I0402 13:37:36.789012 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 02 13:37:36 crc kubenswrapper[4732]: I0402 13:37:36.789029 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 02 13:37:36 crc kubenswrapper[4732]: I0402 13:37:36.790510 
4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:37:36 crc kubenswrapper[4732]: I0402 13:37:36.790574 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:37:36 crc kubenswrapper[4732]: I0402 13:37:36.790593 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:37:36 crc kubenswrapper[4732]: I0402 13:37:36.790931 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:37:36 crc kubenswrapper[4732]: I0402 13:37:36.791037 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:37:36 crc kubenswrapper[4732]: I0402 13:37:36.791112 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:37:37 crc kubenswrapper[4732]: I0402 13:37:37.619904 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 02 13:37:37 crc kubenswrapper[4732]: I0402 13:37:37.621348 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:37:37 crc kubenswrapper[4732]: I0402 13:37:37.621452 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:37:37 crc kubenswrapper[4732]: I0402 13:37:37.621529 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:37:37 crc kubenswrapper[4732]: I0402 13:37:37.621673 4732 kubelet_node_status.go:76] "Attempting to register node" node="crc" Apr 02 13:37:37 crc kubenswrapper[4732]: I0402 13:37:37.791459 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 02 13:37:37 crc kubenswrapper[4732]: I0402 
13:37:37.791571 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 02 13:37:37 crc kubenswrapper[4732]: I0402 13:37:37.791607 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 02 13:37:37 crc kubenswrapper[4732]: I0402 13:37:37.792455 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:37:37 crc kubenswrapper[4732]: I0402 13:37:37.792551 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:37:37 crc kubenswrapper[4732]: I0402 13:37:37.792630 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:37:37 crc kubenswrapper[4732]: I0402 13:37:37.792782 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:37:37 crc kubenswrapper[4732]: I0402 13:37:37.792823 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:37:37 crc kubenswrapper[4732]: I0402 13:37:37.792834 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:37:38 crc kubenswrapper[4732]: I0402 13:37:38.042840 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Apr 02 13:37:38 crc kubenswrapper[4732]: I0402 13:37:38.296093 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 02 13:37:38 crc kubenswrapper[4732]: I0402 13:37:38.796512 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 02 13:37:38 crc kubenswrapper[4732]: I0402 13:37:38.796705 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Apr 02 13:37:38 crc kubenswrapper[4732]: I0402 13:37:38.798365 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:37:38 crc kubenswrapper[4732]: I0402 13:37:38.798430 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:37:38 crc kubenswrapper[4732]: I0402 13:37:38.798471 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:37:38 crc kubenswrapper[4732]: I0402 13:37:38.799701 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:37:38 crc kubenswrapper[4732]: I0402 13:37:38.799747 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:37:38 crc kubenswrapper[4732]: I0402 13:37:38.799764 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:37:38 crc kubenswrapper[4732]: I0402 13:37:38.979188 4732 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Apr 02 13:37:39 crc kubenswrapper[4732]: I0402 13:37:39.795738 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 02 13:37:39 crc kubenswrapper[4732]: I0402 13:37:39.796874 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:37:39 crc kubenswrapper[4732]: I0402 13:37:39.796924 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:37:39 crc kubenswrapper[4732]: I0402 13:37:39.796936 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:37:41 crc kubenswrapper[4732]: I0402 13:37:41.089035 4732 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 02 13:37:41 crc kubenswrapper[4732]: I0402 13:37:41.089203 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 02 13:37:41 crc kubenswrapper[4732]: I0402 13:37:41.090247 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:37:41 crc kubenswrapper[4732]: I0402 13:37:41.090281 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:37:41 crc kubenswrapper[4732]: I0402 13:37:41.090292 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:37:41 crc kubenswrapper[4732]: I0402 13:37:41.096048 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 02 13:37:41 crc kubenswrapper[4732]: I0402 13:37:41.190021 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 02 13:37:41 crc kubenswrapper[4732]: I0402 13:37:41.720785 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 02 13:37:41 crc kubenswrapper[4732]: I0402 13:37:41.724201 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 02 13:37:41 crc kubenswrapper[4732]: I0402 13:37:41.801468 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 02 13:37:41 crc kubenswrapper[4732]: I0402 13:37:41.802969 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:37:41 crc kubenswrapper[4732]: I0402 
13:37:41.803016 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:37:41 crc kubenswrapper[4732]: I0402 13:37:41.803029 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:37:42 crc kubenswrapper[4732]: I0402 13:37:42.670468 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 02 13:37:42 crc kubenswrapper[4732]: I0402 13:37:42.803520 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 02 13:37:42 crc kubenswrapper[4732]: I0402 13:37:42.804635 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:37:42 crc kubenswrapper[4732]: I0402 13:37:42.804680 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:37:42 crc kubenswrapper[4732]: I0402 13:37:42.804691 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:37:43 crc kubenswrapper[4732]: I0402 13:37:43.806091 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 02 13:37:43 crc kubenswrapper[4732]: I0402 13:37:43.806886 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:37:43 crc kubenswrapper[4732]: I0402 13:37:43.806932 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:37:43 crc kubenswrapper[4732]: I0402 13:37:43.806944 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:37:44 crc kubenswrapper[4732]: I0402 13:37:44.190860 4732 patch_prober.go:28] interesting pod/kube-controller-manager-crc 
container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Apr 02 13:37:44 crc kubenswrapper[4732]: I0402 13:37:44.190970 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Apr 02 13:37:44 crc kubenswrapper[4732]: E0402 13:37:44.908190 4732 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Apr 02 13:37:46 crc kubenswrapper[4732]: I0402 13:37:46.037417 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Apr 02 13:37:46 crc kubenswrapper[4732]: I0402 13:37:46.037549 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 02 13:37:46 crc kubenswrapper[4732]: I0402 13:37:46.038650 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:37:46 crc kubenswrapper[4732]: I0402 13:37:46.038694 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:37:46 crc kubenswrapper[4732]: I0402 13:37:46.038710 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:37:46 crc kubenswrapper[4732]: I0402 13:37:46.057787 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Apr 02 13:37:46 crc kubenswrapper[4732]: I0402 
13:37:46.541097 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Apr 02 13:37:46 crc kubenswrapper[4732]: I0402 13:37:46.812058 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 02 13:37:46 crc kubenswrapper[4732]: I0402 13:37:46.813166 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:37:46 crc kubenswrapper[4732]: I0402 13:37:46.813214 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:37:46 crc kubenswrapper[4732]: I0402 13:37:46.813226 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:37:47 crc kubenswrapper[4732]: E0402 13:37:47.153223 4732 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" interval="7s" Apr 02 13:37:47 crc kubenswrapper[4732]: E0402 13:37:47.181022 4732 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:37:47Z is after 2026-02-23T05:33:13Z" node="crc" Apr 02 13:37:47 crc kubenswrapper[4732]: E0402 13:37:47.188934 4732 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post 
\"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:37:47Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Apr 02 13:37:47 crc kubenswrapper[4732]: W0402 13:37:47.190648 4732 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:37:47Z is after 2026-02-23T05:33:13Z Apr 02 13:37:47 crc kubenswrapper[4732]: E0402 13:37:47.190747 4732 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:37:47Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Apr 02 13:37:47 crc kubenswrapper[4732]: W0402 13:37:47.192060 4732 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:37:47Z is after 2026-02-23T05:33:13Z Apr 02 13:37:47 crc kubenswrapper[4732]: E0402 13:37:47.192140 4732 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-04-02T13:37:47Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Apr 02 13:37:47 crc kubenswrapper[4732]: W0402 13:37:47.195130 4732 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:37:47Z is after 2026-02-23T05:33:13Z Apr 02 13:37:47 crc kubenswrapper[4732]: E0402 13:37:47.195198 4732 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:37:47Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Apr 02 13:37:47 crc kubenswrapper[4732]: W0402 13:37:47.196701 4732 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:37:47Z is after 2026-02-23T05:33:13Z Apr 02 13:37:47 crc kubenswrapper[4732]: E0402 13:37:47.196818 4732 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:37:47Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Apr 02 13:37:47 crc kubenswrapper[4732]: E0402 
13:37:47.207661 4732 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:37:47Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.18a28dbca7652dff default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:24.539190783 +0000 UTC m=+1.443598346,LastTimestamp:2026-04-02 13:37:24.539190783 +0000 UTC m=+1.443598346,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:37:47 crc kubenswrapper[4732]: I0402 13:37:47.208308 4732 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Apr 02 13:37:47 crc kubenswrapper[4732]: I0402 13:37:47.208387 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Apr 02 13:37:47 crc kubenswrapper[4732]: I0402 13:37:47.212476 4732 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" 
start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Apr 02 13:37:47 crc kubenswrapper[4732]: I0402 13:37:47.212762 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Apr 02 13:37:47 crc kubenswrapper[4732]: I0402 13:37:47.543020 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:37:47Z is after 2026-02-23T05:33:13Z Apr 02 13:37:47 crc kubenswrapper[4732]: I0402 13:37:47.816201 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Apr 02 13:37:47 crc kubenswrapper[4732]: I0402 13:37:47.817035 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Apr 02 13:37:47 crc kubenswrapper[4732]: I0402 13:37:47.818713 4732 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="37d0290f801221a672f561ed91a3879043deb997a53252f263faa4b544bec066" exitCode=255 Apr 02 13:37:47 crc kubenswrapper[4732]: I0402 13:37:47.818754 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"37d0290f801221a672f561ed91a3879043deb997a53252f263faa4b544bec066"} Apr 02 13:37:47 crc 
kubenswrapper[4732]: I0402 13:37:47.818805 4732 scope.go:117] "RemoveContainer" containerID="81451d8f3aca9dd0160f159a5261e11e057c46cca08c00b80ac341330ef6ae00" Apr 02 13:37:47 crc kubenswrapper[4732]: I0402 13:37:47.818966 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 02 13:37:47 crc kubenswrapper[4732]: I0402 13:37:47.820342 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:37:47 crc kubenswrapper[4732]: I0402 13:37:47.820371 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:37:47 crc kubenswrapper[4732]: I0402 13:37:47.820379 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:37:47 crc kubenswrapper[4732]: I0402 13:37:47.820935 4732 scope.go:117] "RemoveContainer" containerID="37d0290f801221a672f561ed91a3879043deb997a53252f263faa4b544bec066" Apr 02 13:37:47 crc kubenswrapper[4732]: E0402 13:37:47.821198 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Apr 02 13:37:48 crc kubenswrapper[4732]: I0402 13:37:48.301112 4732 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Apr 02 13:37:48 crc kubenswrapper[4732]: [+]log ok Apr 02 13:37:48 crc kubenswrapper[4732]: [+]etcd ok Apr 02 13:37:48 crc kubenswrapper[4732]: [+]poststarthook/openshift.io-startkubeinformers ok Apr 02 13:37:48 crc 
kubenswrapper[4732]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Apr 02 13:37:48 crc kubenswrapper[4732]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Apr 02 13:37:48 crc kubenswrapper[4732]: [+]poststarthook/start-apiserver-admission-initializer ok Apr 02 13:37:48 crc kubenswrapper[4732]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Apr 02 13:37:48 crc kubenswrapper[4732]: [+]poststarthook/openshift.io-api-request-count-filter ok Apr 02 13:37:48 crc kubenswrapper[4732]: [+]poststarthook/generic-apiserver-start-informers ok Apr 02 13:37:48 crc kubenswrapper[4732]: [+]poststarthook/priority-and-fairness-config-consumer ok Apr 02 13:37:48 crc kubenswrapper[4732]: [+]poststarthook/priority-and-fairness-filter ok Apr 02 13:37:48 crc kubenswrapper[4732]: [+]poststarthook/storage-object-count-tracker-hook ok Apr 02 13:37:48 crc kubenswrapper[4732]: [+]poststarthook/start-apiextensions-informers ok Apr 02 13:37:48 crc kubenswrapper[4732]: [+]poststarthook/start-apiextensions-controllers ok Apr 02 13:37:48 crc kubenswrapper[4732]: [+]poststarthook/crd-informer-synced ok Apr 02 13:37:48 crc kubenswrapper[4732]: [+]poststarthook/start-system-namespaces-controller ok Apr 02 13:37:48 crc kubenswrapper[4732]: [+]poststarthook/start-cluster-authentication-info-controller ok Apr 02 13:37:48 crc kubenswrapper[4732]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Apr 02 13:37:48 crc kubenswrapper[4732]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Apr 02 13:37:48 crc kubenswrapper[4732]: [+]poststarthook/start-legacy-token-tracking-controller ok Apr 02 13:37:48 crc kubenswrapper[4732]: [+]poststarthook/start-service-ip-repair-controllers ok Apr 02 13:37:48 crc kubenswrapper[4732]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Apr 02 13:37:48 crc kubenswrapper[4732]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok Apr 02 13:37:48 crc kubenswrapper[4732]: 
[+]poststarthook/priority-and-fairness-config-producer ok Apr 02 13:37:48 crc kubenswrapper[4732]: [+]poststarthook/bootstrap-controller ok Apr 02 13:37:48 crc kubenswrapper[4732]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Apr 02 13:37:48 crc kubenswrapper[4732]: [+]poststarthook/start-kube-aggregator-informers ok Apr 02 13:37:48 crc kubenswrapper[4732]: [+]poststarthook/apiservice-status-local-available-controller ok Apr 02 13:37:48 crc kubenswrapper[4732]: [+]poststarthook/apiservice-status-remote-available-controller ok Apr 02 13:37:48 crc kubenswrapper[4732]: [+]poststarthook/apiservice-registration-controller ok Apr 02 13:37:48 crc kubenswrapper[4732]: [+]poststarthook/apiservice-wait-for-first-sync ok Apr 02 13:37:48 crc kubenswrapper[4732]: [+]poststarthook/apiservice-discovery-controller ok Apr 02 13:37:48 crc kubenswrapper[4732]: [+]poststarthook/kube-apiserver-autoregistration ok Apr 02 13:37:48 crc kubenswrapper[4732]: [+]autoregister-completion ok Apr 02 13:37:48 crc kubenswrapper[4732]: [+]poststarthook/apiservice-openapi-controller ok Apr 02 13:37:48 crc kubenswrapper[4732]: [+]poststarthook/apiservice-openapiv3-controller ok Apr 02 13:37:48 crc kubenswrapper[4732]: livez check failed Apr 02 13:37:48 crc kubenswrapper[4732]: I0402 13:37:48.301208 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 02 13:37:48 crc kubenswrapper[4732]: I0402 13:37:48.543065 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:37:48Z is after 2026-02-23T05:33:13Z Apr 02 13:37:48 crc kubenswrapper[4732]: I0402 
13:37:48.822251 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Apr 02 13:37:49 crc kubenswrapper[4732]: I0402 13:37:49.546310 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:37:49Z is after 2026-02-23T05:33:13Z Apr 02 13:37:50 crc kubenswrapper[4732]: I0402 13:37:50.114138 4732 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 02 13:37:50 crc kubenswrapper[4732]: I0402 13:37:50.114312 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 02 13:37:50 crc kubenswrapper[4732]: I0402 13:37:50.115172 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:37:50 crc kubenswrapper[4732]: I0402 13:37:50.115203 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:37:50 crc kubenswrapper[4732]: I0402 13:37:50.115212 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:37:50 crc kubenswrapper[4732]: I0402 13:37:50.115639 4732 scope.go:117] "RemoveContainer" containerID="37d0290f801221a672f561ed91a3879043deb997a53252f263faa4b544bec066" Apr 02 13:37:50 crc kubenswrapper[4732]: E0402 13:37:50.115795 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Apr 02 13:37:50 crc kubenswrapper[4732]: I0402 13:37:50.543272 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:37:50Z is after 2026-02-23T05:33:13Z Apr 02 13:37:51 crc kubenswrapper[4732]: I0402 13:37:51.542658 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:37:51Z is after 2026-02-23T05:33:13Z Apr 02 13:37:52 crc kubenswrapper[4732]: I0402 13:37:52.544279 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:37:52Z is after 2026-02-23T05:33:13Z Apr 02 13:37:53 crc kubenswrapper[4732]: I0402 13:37:53.303159 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 02 13:37:53 crc kubenswrapper[4732]: I0402 13:37:53.303314 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 02 13:37:53 crc kubenswrapper[4732]: I0402 13:37:53.304669 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:37:53 crc kubenswrapper[4732]: I0402 13:37:53.304786 4732 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure"
Apr 02 13:37:53 crc kubenswrapper[4732]: I0402 13:37:53.304854 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Apr 02 13:37:53 crc kubenswrapper[4732]: I0402 13:37:53.305421 4732 scope.go:117] "RemoveContainer" containerID="37d0290f801221a672f561ed91a3879043deb997a53252f263faa4b544bec066"
Apr 02 13:37:53 crc kubenswrapper[4732]: E0402 13:37:53.305680 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Apr 02 13:37:53 crc kubenswrapper[4732]: I0402 13:37:53.307931 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Apr 02 13:37:53 crc kubenswrapper[4732]: I0402 13:37:53.543981 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:37:53Z is after 2026-02-23T05:33:13Z
Apr 02 13:37:53 crc kubenswrapper[4732]: I0402 13:37:53.838969 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Apr 02 13:37:53 crc kubenswrapper[4732]: I0402 13:37:53.840517 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Apr 02 13:37:53 crc kubenswrapper[4732]: I0402 13:37:53.840552 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Apr 02 13:37:53 crc kubenswrapper[4732]: I0402 13:37:53.840567 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Apr 02 13:37:53 crc kubenswrapper[4732]: I0402 13:37:53.841369 4732 scope.go:117] "RemoveContainer" containerID="37d0290f801221a672f561ed91a3879043deb997a53252f263faa4b544bec066"
Apr 02 13:37:53 crc kubenswrapper[4732]: E0402 13:37:53.841598 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Apr 02 13:37:54 crc kubenswrapper[4732]: E0402 13:37:54.157447 4732 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:37:54Z is after 2026-02-23T05:33:13Z" interval="7s"
Apr 02 13:37:54 crc kubenswrapper[4732]: I0402 13:37:54.181419 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Apr 02 13:37:54 crc kubenswrapper[4732]: I0402 13:37:54.182738 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Apr 02 13:37:54 crc kubenswrapper[4732]: I0402 13:37:54.182796 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Apr 02 13:37:54 crc kubenswrapper[4732]: I0402 13:37:54.182819 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Apr 02 13:37:54 crc kubenswrapper[4732]: I0402 13:37:54.182860 4732 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Apr 02 13:37:54 crc kubenswrapper[4732]: E0402 13:37:54.186344 4732 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:37:54Z is after 2026-02-23T05:33:13Z" node="crc"
Apr 02 13:37:54 crc kubenswrapper[4732]: I0402 13:37:54.190714 4732 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Apr 02 13:37:54 crc kubenswrapper[4732]: I0402 13:37:54.190786 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Apr 02 13:37:54 crc kubenswrapper[4732]: I0402 13:37:54.543108 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:37:54Z is after 2026-02-23T05:33:13Z
Apr 02 13:37:54 crc kubenswrapper[4732]: E0402 13:37:54.908338 4732 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Apr 02 13:37:55 crc kubenswrapper[4732]: I0402 13:37:55.542466 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:37:55Z is after 2026-02-23T05:33:13Z
Apr 02 13:37:56 crc kubenswrapper[4732]: I0402 13:37:56.542860 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:37:56Z is after 2026-02-23T05:33:13Z
Apr 02 13:37:57 crc kubenswrapper[4732]: E0402 13:37:57.215068 4732 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:37:57Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.18a28dbca7652dff default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:24.539190783 +0000 UTC m=+1.443598346,LastTimestamp:2026-04-02 13:37:24.539190783 +0000 UTC m=+1.443598346,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Apr 02 13:37:57 crc kubenswrapper[4732]: I0402 13:37:57.543414 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:37:57Z is after 2026-02-23T05:33:13Z
Apr 02 13:37:58 crc kubenswrapper[4732]: I0402 13:37:58.543016 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:37:58Z is after 2026-02-23T05:33:13Z
Apr 02 13:37:59 crc kubenswrapper[4732]: I0402 13:37:59.545862 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:37:59Z is after 2026-02-23T05:33:13Z
Apr 02 13:38:00 crc kubenswrapper[4732]: I0402 13:38:00.541500 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:38:00Z is after 2026-02-23T05:33:13Z
Apr 02 13:38:00 crc kubenswrapper[4732]: W0402 13:38:00.836117 4732 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:38:00Z is after 2026-02-23T05:33:13Z
Apr 02 13:38:00 crc kubenswrapper[4732]: E0402 13:38:00.836805 4732 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:38:00Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Apr 02 13:38:01 crc kubenswrapper[4732]: W0402 13:38:01.091071 4732 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:38:01Z is after 2026-02-23T05:33:13Z
Apr 02 13:38:01 crc kubenswrapper[4732]: E0402 13:38:01.091133 4732 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:38:01Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Apr 02 13:38:01 crc kubenswrapper[4732]: E0402 13:38:01.163432 4732 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:38:01Z is after 2026-02-23T05:33:13Z" interval="7s"
Apr 02 13:38:01 crc kubenswrapper[4732]: I0402 13:38:01.186941 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Apr 02 13:38:01 crc kubenswrapper[4732]: I0402 13:38:01.188522 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Apr 02 13:38:01 crc kubenswrapper[4732]: I0402 13:38:01.188567 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Apr 02 13:38:01 crc kubenswrapper[4732]: I0402 13:38:01.188581 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Apr 02 13:38:01 crc kubenswrapper[4732]: I0402 13:38:01.188606 4732 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Apr 02 13:38:01 crc kubenswrapper[4732]: E0402 13:38:01.191327 4732 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:38:01Z is after 2026-02-23T05:33:13Z" node="crc"
Apr 02 13:38:01 crc kubenswrapper[4732]: I0402 13:38:01.542785 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:38:01Z is after 2026-02-23T05:33:13Z
Apr 02 13:38:02 crc kubenswrapper[4732]: I0402 13:38:02.545234 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:38:02Z is after 2026-02-23T05:33:13Z
Apr 02 13:38:03 crc kubenswrapper[4732]: I0402 13:38:03.544933 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:38:03Z is after 2026-02-23T05:33:13Z
Apr 02 13:38:03 crc kubenswrapper[4732]: I0402 13:38:03.614282 4732 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:41776->192.168.126.11:10357: read: connection reset by peer" start-of-body=
Apr 02 13:38:03 crc kubenswrapper[4732]: I0402 13:38:03.614361 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:41776->192.168.126.11:10357: read: connection reset by peer"
Apr 02 13:38:03 crc kubenswrapper[4732]: I0402 13:38:03.614417 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Apr 02 13:38:03 crc kubenswrapper[4732]: I0402 13:38:03.614548 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Apr 02 13:38:03 crc kubenswrapper[4732]: I0402 13:38:03.616426 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Apr 02 13:38:03 crc kubenswrapper[4732]: I0402 13:38:03.616462 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Apr 02 13:38:03 crc kubenswrapper[4732]: I0402 13:38:03.616470 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Apr 02 13:38:03 crc kubenswrapper[4732]: I0402 13:38:03.616999 4732 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"ece1f4c1e4f39f8c7ee97788bc59b96de13c07e7bbbf06423016e4510a40fc62"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted"
Apr 02 13:38:03 crc kubenswrapper[4732]: I0402 13:38:03.617140 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://ece1f4c1e4f39f8c7ee97788bc59b96de13c07e7bbbf06423016e4510a40fc62" gracePeriod=30
Apr 02 13:38:03 crc kubenswrapper[4732]: I0402 13:38:03.865488 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Apr 02 13:38:03 crc kubenswrapper[4732]: I0402 13:38:03.866095 4732 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="ece1f4c1e4f39f8c7ee97788bc59b96de13c07e7bbbf06423016e4510a40fc62" exitCode=255
Apr 02 13:38:03 crc kubenswrapper[4732]: I0402 13:38:03.866132 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"ece1f4c1e4f39f8c7ee97788bc59b96de13c07e7bbbf06423016e4510a40fc62"}
Apr 02 13:38:04 crc kubenswrapper[4732]: I0402 13:38:04.543458 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:38:04Z is after 2026-02-23T05:33:13Z
Apr 02 13:38:04 crc kubenswrapper[4732]: I0402 13:38:04.749803 4732 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Apr 02 13:38:04 crc kubenswrapper[4732]: E0402 13:38:04.753174 4732 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:38:04Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Apr 02 13:38:04 crc kubenswrapper[4732]: E0402 13:38:04.755124 4732 certificate_manager.go:440] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Reached backoff limit, still unable to rotate certs: timed out waiting for the condition" logger="UnhandledError"
Apr 02 13:38:04 crc kubenswrapper[4732]: I0402 13:38:04.870556 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Apr 02 13:38:04 crc kubenswrapper[4732]: I0402 13:38:04.871324 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8320a4ebc7edb8ed6702aa82a986ae487f9903f3f52ca9e7388135c88a1a8dea"}
Apr 02 13:38:04 crc kubenswrapper[4732]: I0402 13:38:04.871449 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Apr 02 13:38:04 crc kubenswrapper[4732]: I0402 13:38:04.872703 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Apr 02 13:38:04 crc kubenswrapper[4732]: I0402 13:38:04.872733 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Apr 02 13:38:04 crc kubenswrapper[4732]: I0402 13:38:04.872787 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Apr 02 13:38:04 crc kubenswrapper[4732]: E0402 13:38:04.908467 4732 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Apr 02 13:38:05 crc kubenswrapper[4732]: I0402 13:38:05.544701 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:38:05Z is after 2026-02-23T05:33:13Z
Apr 02 13:38:05 crc kubenswrapper[4732]: I0402 13:38:05.879767 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Apr 02 13:38:05 crc kubenswrapper[4732]: I0402 13:38:05.880949 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Apr 02 13:38:05 crc kubenswrapper[4732]: I0402 13:38:05.880994 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Apr 02 13:38:05 crc kubenswrapper[4732]: I0402 13:38:05.881018 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Apr 02 13:38:06 crc kubenswrapper[4732]: I0402 13:38:06.545300 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:38:06Z is after 2026-02-23T05:33:13Z
Apr 02 13:38:07 crc kubenswrapper[4732]: E0402 13:38:07.220674 4732 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:38:07Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.18a28dbca7652dff default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:24.539190783 +0000 UTC m=+1.443598346,LastTimestamp:2026-04-02 13:37:24.539190783 +0000 UTC m=+1.443598346,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Apr 02 13:38:07 crc kubenswrapper[4732]: I0402 13:38:07.541888 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:38:07Z is after 2026-02-23T05:33:13Z
Apr 02 13:38:07 crc kubenswrapper[4732]: I0402 13:38:07.680826 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Apr 02 13:38:07 crc kubenswrapper[4732]: I0402 13:38:07.682019 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Apr 02 13:38:07 crc kubenswrapper[4732]: I0402 13:38:07.682069 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Apr 02 13:38:07 crc kubenswrapper[4732]: I0402 13:38:07.682080 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Apr 02 13:38:07 crc kubenswrapper[4732]: I0402 13:38:07.682717 4732 scope.go:117] "RemoveContainer" containerID="37d0290f801221a672f561ed91a3879043deb997a53252f263faa4b544bec066"
Apr 02 13:38:08 crc kubenswrapper[4732]: E0402 13:38:08.168318 4732 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:38:08Z is after 2026-02-23T05:33:13Z" interval="7s"
Apr 02 13:38:08 crc kubenswrapper[4732]: I0402 13:38:08.191820 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Apr 02 13:38:08 crc kubenswrapper[4732]: I0402 13:38:08.194161 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Apr 02 13:38:08 crc kubenswrapper[4732]: I0402 13:38:08.194242 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Apr 02 13:38:08 crc kubenswrapper[4732]: I0402 13:38:08.194266 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Apr 02 13:38:08 crc kubenswrapper[4732]: I0402 13:38:08.194314 4732 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Apr 02 13:38:08 crc kubenswrapper[4732]: E0402 13:38:08.200798 4732 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:38:08Z is after 2026-02-23T05:33:13Z" node="crc"
Apr 02 13:38:08 crc kubenswrapper[4732]: I0402 13:38:08.542944 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:38:08Z is after 2026-02-23T05:33:13Z
Apr 02 13:38:08 crc kubenswrapper[4732]: I0402 13:38:08.889324 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Apr 02 13:38:08 crc kubenswrapper[4732]: I0402 13:38:08.889894 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Apr 02 13:38:08 crc kubenswrapper[4732]: I0402 13:38:08.892135 4732 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b68bfa0db91a1c077fbe5e3c2695eea525f21b149667d23845099c3eb2e7c21d" exitCode=255
Apr 02 13:38:08 crc kubenswrapper[4732]: I0402 13:38:08.892183 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"b68bfa0db91a1c077fbe5e3c2695eea525f21b149667d23845099c3eb2e7c21d"}
Apr 02 13:38:08 crc kubenswrapper[4732]: I0402 13:38:08.892256 4732 scope.go:117] "RemoveContainer" containerID="37d0290f801221a672f561ed91a3879043deb997a53252f263faa4b544bec066"
Apr 02 13:38:08 crc kubenswrapper[4732]: I0402 13:38:08.892447 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Apr 02 13:38:08 crc kubenswrapper[4732]: I0402 13:38:08.893304 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Apr 02 13:38:08 crc kubenswrapper[4732]: I0402 13:38:08.893333 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Apr 02 13:38:08 crc kubenswrapper[4732]: I0402 13:38:08.893345 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Apr 02 13:38:08 crc kubenswrapper[4732]: I0402 13:38:08.893924 4732 scope.go:117] "RemoveContainer" containerID="b68bfa0db91a1c077fbe5e3c2695eea525f21b149667d23845099c3eb2e7c21d"
Apr 02 13:38:08 crc kubenswrapper[4732]: E0402 13:38:08.894176 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Apr 02 13:38:08 crc kubenswrapper[4732]: W0402 13:38:08.909227 4732 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:38:08Z is after 2026-02-23T05:33:13Z
Apr 02 13:38:08 crc kubenswrapper[4732]: E0402 13:38:08.909279 4732 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:38:08Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Apr 02 13:38:09 crc kubenswrapper[4732]: I0402 13:38:09.541886 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:38:09Z is after 2026-02-23T05:33:13Z
Apr 02 13:38:09 crc kubenswrapper[4732]: I0402 13:38:09.896005 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Apr 02 13:38:10 crc kubenswrapper[4732]: I0402 13:38:10.114712 4732 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Apr 02 13:38:10 crc kubenswrapper[4732]: I0402 13:38:10.114948 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Apr 02 13:38:10 crc kubenswrapper[4732]: I0402 13:38:10.116306 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Apr 02 13:38:10 crc kubenswrapper[4732]: I0402 13:38:10.116337 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Apr 02 13:38:10 crc kubenswrapper[4732]: I0402 13:38:10.116346 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Apr 02 13:38:10 crc kubenswrapper[4732]: I0402 13:38:10.116806 4732 scope.go:117] "RemoveContainer" containerID="b68bfa0db91a1c077fbe5e3c2695eea525f21b149667d23845099c3eb2e7c21d"
Apr 02 13:38:10 crc kubenswrapper[4732]: E0402 13:38:10.116978 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Apr 02 13:38:10 crc kubenswrapper[4732]: I0402 13:38:10.542380 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:38:10Z is after 2026-02-23T05:33:13Z
Apr 02 13:38:11 crc kubenswrapper[4732]: W0402 13:38:11.100806 4732 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:38:11Z is after 2026-02-23T05:33:13Z
Apr 02 13:38:11 crc kubenswrapper[4732]: E0402 13:38:11.100882 4732 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:38:11Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Apr 02 13:38:11 crc kubenswrapper[4732]: I0402 13:38:11.189628 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Apr 02 13:38:11 crc kubenswrapper[4732]: I0402 13:38:11.189784 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Apr 02 13:38:11 crc kubenswrapper[4732]: I0402 13:38:11.190948 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Apr 02 13:38:11 crc kubenswrapper[4732]: I0402 13:38:11.190991 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Apr 02 13:38:11 crc kubenswrapper[4732]: I0402 13:38:11.191006 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Apr 02 13:38:11 crc kubenswrapper[4732]: I0402 13:38:11.542433 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Apr 02 13:38:11 crc kubenswrapper[4732]: I0402 13:38:11.542837 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Apr 02 13:38:11 crc kubenswrapper[4732]: I0402 13:38:11.544208 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Apr 02 13:38:11 crc kubenswrapper[4732]: I0402 13:38:11.544258 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Apr 02 13:38:11 crc kubenswrapper[4732]: I0402 13:38:11.544272 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Apr 02 13:38:11 crc kubenswrapper[4732]: I0402 13:38:11.544941 4732 scope.go:117] "RemoveContainer" containerID="b68bfa0db91a1c077fbe5e3c2695eea525f21b149667d23845099c3eb2e7c21d"
Apr 02 13:38:11 crc kubenswrapper[4732]: E0402 13:38:11.545132 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Apr 02 13:38:11 crc kubenswrapper[4732]: I0402 13:38:11.545181 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:38:11Z is after 2026-02-23T05:33:13Z
Apr 02 13:38:12 crc kubenswrapper[4732]: I0402 13:38:12.544755 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:38:12Z is after 2026-02-23T05:33:13Z
Apr 02 13:38:12 crc kubenswrapper[4732]: I0402 13:38:12.670563 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Apr 02 13:38:12 crc kubenswrapper[4732]: I0402 13:38:12.670728 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Apr 02 13:38:12 crc kubenswrapper[4732]: I0402 13:38:12.671681 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Apr 02 13:38:12 crc kubenswrapper[4732]: I0402 13:38:12.671721 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Apr 02 13:38:12 crc kubenswrapper[4732]: I0402 13:38:12.671732 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Apr 02 13:38:13 crc kubenswrapper[4732]: I0402 13:38:13.545225 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:38:13Z is after 2026-02-23T05:33:13Z
Apr 02 13:38:14 crc kubenswrapper[4732]: I0402 13:38:14.190297 4732 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Apr 02 13:38:14 crc kubenswrapper[4732]: I0402 13:38:14.190401 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Apr 02 13:38:14 crc kubenswrapper[4732]: I0402 13:38:14.543009 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:38:14Z is after 2026-02-23T05:33:13Z
Apr 02 13:38:14 crc kubenswrapper[4732]: E0402 13:38:14.909460 4732 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Apr 02 13:38:15 crc kubenswrapper[4732]: E0402 13:38:15.172898 4732 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:38:15Z is after 2026-02-23T05:33:13Z" interval="7s"
Apr 02 13:38:15 crc kubenswrapper[4732]: I0402 13:38:15.201947 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Apr 02 13:38:15 crc kubenswrapper[4732]: I0402 13:38:15.203484 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Apr 02 13:38:15 crc kubenswrapper[4732]: I0402 13:38:15.203551 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Apr 02 13:38:15 crc kubenswrapper[4732]: I0402 13:38:15.203574 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Apr 02 13:38:15 crc kubenswrapper[4732]: I0402 13:38:15.203647 4732 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Apr 02 13:38:15 crc kubenswrapper[4732]: E0402 13:38:15.207446 4732 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:38:15Z is after 2026-02-23T05:33:13Z" node="crc"
Apr 02 13:38:15 crc kubenswrapper[4732]: I0402 13:38:15.544353 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:38:15Z is after 2026-02-23T05:33:13Z
Apr 02 13:38:16 crc kubenswrapper[4732]: I0402 13:38:16.544921 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:38:16Z is after 2026-02-23T05:33:13Z
Apr 02 13:38:17 crc kubenswrapper[4732]: E0402 13:38:17.225125 4732 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:38:17Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.18a28dbca7652dff default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:24.539190783 +0000 UTC m=+1.443598346,LastTimestamp:2026-04-02 
13:37:24.539190783 +0000 UTC m=+1.443598346,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:17 crc kubenswrapper[4732]: I0402 13:38:17.542673 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:38:17Z is after 2026-02-23T05:33:13Z Apr 02 13:38:18 crc kubenswrapper[4732]: I0402 13:38:18.543542 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:38:18Z is after 2026-02-23T05:33:13Z Apr 02 13:38:19 crc kubenswrapper[4732]: I0402 13:38:19.546784 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:38:19Z is after 2026-02-23T05:33:13Z Apr 02 13:38:20 crc kubenswrapper[4732]: I0402 13:38:20.543032 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:38:20Z is after 2026-02-23T05:33:13Z Apr 02 13:38:21 crc kubenswrapper[4732]: I0402 13:38:21.543946 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:38:21Z is after 2026-02-23T05:33:13Z Apr 02 13:38:21 crc kubenswrapper[4732]: I0402 13:38:21.727878 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Apr 02 13:38:21 crc kubenswrapper[4732]: I0402 13:38:21.728147 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 02 13:38:21 crc kubenswrapper[4732]: I0402 13:38:21.729534 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:38:21 crc kubenswrapper[4732]: I0402 13:38:21.729591 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:38:21 crc kubenswrapper[4732]: I0402 13:38:21.729647 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:38:22 crc kubenswrapper[4732]: E0402 13:38:22.179948 4732 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:38:22Z is after 2026-02-23T05:33:13Z" interval="7s" Apr 02 13:38:22 crc kubenswrapper[4732]: I0402 13:38:22.208242 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 02 13:38:22 crc kubenswrapper[4732]: I0402 13:38:22.209848 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:38:22 crc kubenswrapper[4732]: I0402 13:38:22.210102 4732 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:38:22 crc kubenswrapper[4732]: I0402 13:38:22.210282 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:38:22 crc kubenswrapper[4732]: I0402 13:38:22.210459 4732 kubelet_node_status.go:76] "Attempting to register node" node="crc" Apr 02 13:38:22 crc kubenswrapper[4732]: E0402 13:38:22.214222 4732 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:38:22Z is after 2026-02-23T05:33:13Z" node="crc" Apr 02 13:38:22 crc kubenswrapper[4732]: I0402 13:38:22.545191 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:38:22Z is after 2026-02-23T05:33:13Z Apr 02 13:38:23 crc kubenswrapper[4732]: I0402 13:38:23.544918 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:38:23Z is after 2026-02-23T05:33:13Z Apr 02 13:38:23 crc kubenswrapper[4732]: I0402 13:38:23.679865 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 02 13:38:23 crc kubenswrapper[4732]: I0402 13:38:23.681259 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:38:23 crc kubenswrapper[4732]: I0402 13:38:23.681289 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Apr 02 13:38:23 crc kubenswrapper[4732]: I0402 13:38:23.681299 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:38:23 crc kubenswrapper[4732]: I0402 13:38:23.681877 4732 scope.go:117] "RemoveContainer" containerID="b68bfa0db91a1c077fbe5e3c2695eea525f21b149667d23845099c3eb2e7c21d" Apr 02 13:38:23 crc kubenswrapper[4732]: E0402 13:38:23.682057 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Apr 02 13:38:24 crc kubenswrapper[4732]: I0402 13:38:24.190845 4732 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Apr 02 13:38:24 crc kubenswrapper[4732]: I0402 13:38:24.190941 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Apr 02 13:38:24 crc kubenswrapper[4732]: I0402 13:38:24.544986 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-04-02T13:38:24Z is after 2026-02-23T05:33:13Z Apr 02 13:38:24 crc kubenswrapper[4732]: E0402 13:38:24.910379 4732 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Apr 02 13:38:25 crc kubenswrapper[4732]: I0402 13:38:25.544426 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:38:25Z is after 2026-02-23T05:33:13Z Apr 02 13:38:26 crc kubenswrapper[4732]: I0402 13:38:26.547804 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:38:26Z is after 2026-02-23T05:33:13Z Apr 02 13:38:27 crc kubenswrapper[4732]: E0402 13:38:27.231440 4732 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:38:27Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.18a28dbca7652dff default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:24.539190783 +0000 UTC m=+1.443598346,LastTimestamp:2026-04-02 13:37:24.539190783 +0000 UTC m=+1.443598346,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:27 crc kubenswrapper[4732]: I0402 13:38:27.545638 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:38:27Z is after 2026-02-23T05:33:13Z Apr 02 13:38:28 crc kubenswrapper[4732]: I0402 13:38:28.551295 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 02 13:38:29 crc kubenswrapper[4732]: E0402 13:38:29.188686 4732 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Apr 02 13:38:29 crc kubenswrapper[4732]: I0402 13:38:29.214811 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 02 13:38:29 crc kubenswrapper[4732]: I0402 13:38:29.215943 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:38:29 crc kubenswrapper[4732]: I0402 13:38:29.215974 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:38:29 crc kubenswrapper[4732]: I0402 13:38:29.215985 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:38:29 crc kubenswrapper[4732]: I0402 13:38:29.216006 4732 kubelet_node_status.go:76] "Attempting to register node" node="crc" Apr 02 13:38:29 crc kubenswrapper[4732]: E0402 13:38:29.223291 
4732 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Apr 02 13:38:29 crc kubenswrapper[4732]: I0402 13:38:29.547703 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 02 13:38:30 crc kubenswrapper[4732]: I0402 13:38:30.543447 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 02 13:38:31 crc kubenswrapper[4732]: I0402 13:38:31.543734 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 02 13:38:32 crc kubenswrapper[4732]: I0402 13:38:32.545500 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 02 13:38:33 crc kubenswrapper[4732]: I0402 13:38:33.544385 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 02 13:38:34 crc kubenswrapper[4732]: I0402 13:38:34.189854 4732 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get 
\"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Apr 02 13:38:34 crc kubenswrapper[4732]: I0402 13:38:34.190007 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Apr 02 13:38:34 crc kubenswrapper[4732]: I0402 13:38:34.190093 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 02 13:38:34 crc kubenswrapper[4732]: I0402 13:38:34.190306 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 02 13:38:34 crc kubenswrapper[4732]: I0402 13:38:34.191703 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:38:34 crc kubenswrapper[4732]: I0402 13:38:34.191754 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:38:34 crc kubenswrapper[4732]: I0402 13:38:34.191763 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:38:34 crc kubenswrapper[4732]: I0402 13:38:34.192223 4732 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"8320a4ebc7edb8ed6702aa82a986ae487f9903f3f52ca9e7388135c88a1a8dea"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Apr 02 13:38:34 crc kubenswrapper[4732]: I0402 
13:38:34.192316 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://8320a4ebc7edb8ed6702aa82a986ae487f9903f3f52ca9e7388135c88a1a8dea" gracePeriod=30 Apr 02 13:38:34 crc kubenswrapper[4732]: I0402 13:38:34.543365 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 02 13:38:34 crc kubenswrapper[4732]: E0402 13:38:34.911773 4732 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Apr 02 13:38:34 crc kubenswrapper[4732]: I0402 13:38:34.964522 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Apr 02 13:38:34 crc kubenswrapper[4732]: I0402 13:38:34.966070 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Apr 02 13:38:34 crc kubenswrapper[4732]: I0402 13:38:34.966561 4732 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="8320a4ebc7edb8ed6702aa82a986ae487f9903f3f52ca9e7388135c88a1a8dea" exitCode=255 Apr 02 13:38:34 crc kubenswrapper[4732]: I0402 13:38:34.966658 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"8320a4ebc7edb8ed6702aa82a986ae487f9903f3f52ca9e7388135c88a1a8dea"} Apr 02 13:38:34 crc kubenswrapper[4732]: I0402 
13:38:34.966848 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e597b0db2773d2bb9d7e673759dee264ef036a174be679e2ccfa8ba03cf5c6c9"} Apr 02 13:38:34 crc kubenswrapper[4732]: I0402 13:38:34.966907 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 02 13:38:34 crc kubenswrapper[4732]: I0402 13:38:34.966928 4732 scope.go:117] "RemoveContainer" containerID="ece1f4c1e4f39f8c7ee97788bc59b96de13c07e7bbbf06423016e4510a40fc62" Apr 02 13:38:34 crc kubenswrapper[4732]: I0402 13:38:34.967973 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:38:34 crc kubenswrapper[4732]: I0402 13:38:34.968007 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:38:34 crc kubenswrapper[4732]: I0402 13:38:34.968018 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:38:35 crc kubenswrapper[4732]: I0402 13:38:35.544939 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 02 13:38:35 crc kubenswrapper[4732]: I0402 13:38:35.679668 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 02 13:38:35 crc kubenswrapper[4732]: I0402 13:38:35.680853 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:38:35 crc kubenswrapper[4732]: I0402 13:38:35.680904 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:38:35 crc 
kubenswrapper[4732]: I0402 13:38:35.680924 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:38:35 crc kubenswrapper[4732]: I0402 13:38:35.970579 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Apr 02 13:38:35 crc kubenswrapper[4732]: I0402 13:38:35.971654 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 02 13:38:35 crc kubenswrapper[4732]: I0402 13:38:35.972465 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:38:35 crc kubenswrapper[4732]: I0402 13:38:35.972510 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:38:35 crc kubenswrapper[4732]: I0402 13:38:35.972519 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:38:36 crc kubenswrapper[4732]: E0402 13:38:36.193525 4732 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Apr 02 13:38:36 crc kubenswrapper[4732]: I0402 13:38:36.223601 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 02 13:38:36 crc kubenswrapper[4732]: I0402 13:38:36.225586 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:38:36 crc kubenswrapper[4732]: I0402 13:38:36.225704 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:38:36 crc kubenswrapper[4732]: I0402 13:38:36.225732 4732 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:38:36 crc kubenswrapper[4732]: I0402 13:38:36.225812 4732 kubelet_node_status.go:76] "Attempting to register node" node="crc" Apr 02 13:38:36 crc kubenswrapper[4732]: E0402 13:38:36.233125 4732 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Apr 02 13:38:36 crc kubenswrapper[4732]: I0402 13:38:36.544938 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 02 13:38:36 crc kubenswrapper[4732]: I0402 13:38:36.757393 4732 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Apr 02 13:38:36 crc kubenswrapper[4732]: I0402 13:38:36.773265 4732 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.237504 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18a28dbca7652dff default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:24.539190783 +0000 UTC m=+1.443598346,LastTimestamp:2026-04-02 13:37:24.539190783 +0000 UTC m=+1.443598346,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.244339 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18a28dbcaacbc3db default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:24.596245467 +0000 UTC m=+1.500653020,LastTimestamp:2026-04-02 13:37:24.596245467 +0000 UTC m=+1.500653020,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.250831 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18a28dbcaacc0b6b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:24.596263787 +0000 UTC m=+1.500671340,LastTimestamp:2026-04-02 13:37:24.596263787 +0000 UTC m=+1.500671340,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.257685 4732 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18a28dbcaacc2bd7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:24.596272087 +0000 UTC m=+1.500679640,LastTimestamp:2026-04-02 13:37:24.596272087 +0000 UTC m=+1.500679640,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.261726 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18a28dbcbd15f5fe default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:24.903097854 +0000 UTC m=+1.807505417,LastTimestamp:2026-04-02 13:37:24.903097854 +0000 UTC m=+1.807505417,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.267440 4732 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18a28dbcaacbc3db\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18a28dbcaacbc3db default 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:24.596245467 +0000 UTC m=+1.500653020,LastTimestamp:2026-04-02 13:37:24.981676848 +0000 UTC m=+1.886084411,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.274570 4732 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18a28dbcaacc0b6b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18a28dbcaacc0b6b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:24.596263787 +0000 UTC m=+1.500671340,LastTimestamp:2026-04-02 13:37:24.981697858 +0000 UTC m=+1.886105431,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.281017 4732 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18a28dbcaacc2bd7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18a28dbcaacc2bd7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:24.596272087 +0000 UTC m=+1.500679640,LastTimestamp:2026-04-02 13:37:24.981712619 +0000 UTC m=+1.886120182,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.288216 4732 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18a28dbcaacbc3db\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18a28dbcaacbc3db default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:24.596245467 +0000 UTC m=+1.500653020,LastTimestamp:2026-04-02 13:37:24.983465993 +0000 UTC m=+1.887873546,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.295048 4732 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18a28dbcaacc0b6b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18a28dbcaacc0b6b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc 
status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:24.596263787 +0000 UTC m=+1.500671340,LastTimestamp:2026-04-02 13:37:24.983498973 +0000 UTC m=+1.887906526,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.300086 4732 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18a28dbcaacc2bd7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18a28dbcaacc2bd7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:24.596272087 +0000 UTC m=+1.500679640,LastTimestamp:2026-04-02 13:37:24.983506214 +0000 UTC m=+1.887913767,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.309838 4732 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18a28dbcaacbc3db\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18a28dbcaacbc3db default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:24.596245467 +0000 UTC 
m=+1.500653020,LastTimestamp:2026-04-02 13:37:24.984208981 +0000 UTC m=+1.888616554,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.314658 4732 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18a28dbcaacc0b6b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18a28dbcaacc0b6b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:24.596263787 +0000 UTC m=+1.500671340,LastTimestamp:2026-04-02 13:37:24.984224662 +0000 UTC m=+1.888632225,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.318985 4732 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18a28dbcaacc2bd7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18a28dbcaacc2bd7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:24.596272087 +0000 UTC m=+1.500679640,LastTimestamp:2026-04-02 13:37:24.984237562 +0000 UTC m=+1.888645125,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.325387 4732 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18a28dbcaacbc3db\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18a28dbcaacbc3db default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:24.596245467 +0000 UTC m=+1.500653020,LastTimestamp:2026-04-02 13:37:24.985158315 +0000 UTC m=+1.889565868,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.331671 4732 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18a28dbcaacc0b6b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18a28dbcaacc0b6b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:24.596263787 +0000 UTC m=+1.500671340,LastTimestamp:2026-04-02 13:37:24.985173526 +0000 UTC m=+1.889581079,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.335804 4732 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18a28dbcaacc2bd7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18a28dbcaacc2bd7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:24.596272087 +0000 UTC m=+1.500679640,LastTimestamp:2026-04-02 13:37:24.985182926 +0000 UTC m=+1.889590479,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.339729 4732 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18a28dbcaacbc3db\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18a28dbcaacbc3db default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:24.596245467 +0000 UTC m=+1.500653020,LastTimestamp:2026-04-02 13:37:24.985263878 +0000 UTC m=+1.889671441,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.341985 4732 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18a28dbcaacc0b6b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" 
in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18a28dbcaacc0b6b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:24.596263787 +0000 UTC m=+1.500671340,LastTimestamp:2026-04-02 13:37:24.985282838 +0000 UTC m=+1.889690401,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.343653 4732 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18a28dbcaacc2bd7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18a28dbcaacc2bd7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:24.596272087 +0000 UTC m=+1.500679640,LastTimestamp:2026-04-02 13:37:24.985294989 +0000 UTC m=+1.889702552,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.347445 4732 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18a28dbcaacbc3db\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18a28dbcaacbc3db default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:24.596245467 +0000 UTC m=+1.500653020,LastTimestamp:2026-04-02 13:37:24.986437197 +0000 UTC m=+1.890844750,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.353385 4732 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18a28dbcaacc0b6b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18a28dbcaacc0b6b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:24.596263787 +0000 UTC m=+1.500671340,LastTimestamp:2026-04-02 13:37:24.986454148 +0000 UTC m=+1.890861701,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.358994 4732 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18a28dbcaacc2bd7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18a28dbcaacc2bd7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc 
status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:24.596272087 +0000 UTC m=+1.500679640,LastTimestamp:2026-04-02 13:37:24.986463788 +0000 UTC m=+1.890871341,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.364240 4732 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18a28dbcaacbc3db\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18a28dbcaacbc3db default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:24.596245467 +0000 UTC m=+1.500653020,LastTimestamp:2026-04-02 13:37:24.986476628 +0000 UTC m=+1.890884181,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.368289 4732 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18a28dbcaacc0b6b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18a28dbcaacc0b6b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:24.596263787 +0000 UTC 
m=+1.500671340,LastTimestamp:2026-04-02 13:37:24.986491559 +0000 UTC m=+1.890899112,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.373297 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18a28dbce0ec1925 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:25.504334117 +0000 UTC m=+2.408741700,LastTimestamp:2026-04-02 13:37:25.504334117 +0000 UTC m=+2.408741700,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.377322 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18a28dbce0ed8fce openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:25.50443003 +0000 UTC m=+2.408837623,LastTimestamp:2026-04-02 13:37:25.50443003 +0000 UTC m=+2.408837623,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.381099 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.18a28dbce0eddd95 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:25.504449941 +0000 UTC m=+2.408857534,LastTimestamp:2026-04-02 13:37:25.504449941 +0000 UTC m=+2.408857534,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 
13:38:37.384932 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18a28dbce0f4a68d openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:25.504894605 +0000 UTC m=+2.409302198,LastTimestamp:2026-04-02 13:37:25.504894605 +0000 UTC m=+2.409302198,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.388352 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18a28dbce618896c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:25.591132524 +0000 UTC m=+2.495540117,LastTimestamp:2026-04-02 13:37:25.591132524 +0000 UTC m=+2.495540117,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.392106 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18a28dbd6a99ad4a openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:27.814188362 +0000 UTC m=+4.718595915,LastTimestamp:2026-04-02 13:37:27.814188362 +0000 UTC m=+4.718595915,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.396471 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18a28dbd6aa370dc openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:27.814828252 +0000 UTC m=+4.719235835,LastTimestamp:2026-04-02 13:37:27.814828252 +0000 UTC m=+4.719235835,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.399778 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.18a28dbd6ab62965 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:27.816055141 +0000 UTC m=+4.720462734,LastTimestamp:2026-04-02 13:37:27.816055141 +0000 UTC m=+4.720462734,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.403147 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18a28dbd6abd2e7b 
openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:27.816515195 +0000 UTC m=+4.720922788,LastTimestamp:2026-04-02 13:37:27.816515195 +0000 UTC m=+4.720922788,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.407474 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18a28dbd6ac25d0c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:27.816854796 +0000 UTC m=+4.721262379,LastTimestamp:2026-04-02 13:37:27.816854796 +0000 UTC m=+4.721262379,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.411026 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" 
event="&Event{ObjectMeta:{kube-controller-manager-crc.18a28dbd6f17f2b2 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:27.88957253 +0000 UTC m=+4.793980123,LastTimestamp:2026-04-02 13:37:27.88957253 +0000 UTC m=+4.793980123,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.415590 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18a28dbd6fad2415 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:27.899350037 +0000 UTC m=+4.803757590,LastTimestamp:2026-04-02 13:37:27.899350037 +0000 UTC m=+4.803757590,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 
13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.420146 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18a28dbd745f0eaf openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:27.978118831 +0000 UTC m=+4.882526404,LastTimestamp:2026-04-02 13:37:27.978118831 +0000 UTC m=+4.882526404,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.424086 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18a28dbd7fc0173c openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:28.169027388 +0000 UTC m=+5.073434941,LastTimestamp:2026-04-02 13:37:28.169027388 +0000 UTC m=+5.073434941,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 
13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.431137 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.18a28dbd809bc23e openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:28.18342355 +0000 UTC m=+5.087831093,LastTimestamp:2026-04-02 13:37:28.18342355 +0000 UTC m=+5.087831093,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.434978 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18a28dbd809bc28e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:28.18342363 +0000 UTC m=+5.087831223,LastTimestamp:2026-04-02 13:37:28.18342363 +0000 UTC m=+5.087831223,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.438732 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18a28dbd98449a1d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:28.580364829 +0000 UTC m=+5.484772402,LastTimestamp:2026-04-02 13:37:28.580364829 +0000 UTC m=+5.484772402,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.443597 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18a28dbd9cb5ef12 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 
13:37:28.65490101 +0000 UTC m=+5.559308573,LastTimestamp:2026-04-02 13:37:28.65490101 +0000 UTC m=+5.559308573,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.446934 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18a28dbd9ccde5d5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:28.656471509 +0000 UTC m=+5.560879102,LastTimestamp:2026-04-02 13:37:28.656471509 +0000 UTC m=+5.560879102,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.451427 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18a28dbd9fcd9362 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:28.70678205 +0000 UTC m=+5.611189653,LastTimestamp:2026-04-02 13:37:28.70678205 +0000 UTC m=+5.611189653,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.455996 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18a28dbd9fce10d1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:28.706814161 +0000 UTC m=+5.611221714,LastTimestamp:2026-04-02 13:37:28.706814161 +0000 UTC m=+5.611221714,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.461524 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.18a28dbd9fd60003 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:28.707334147 +0000 UTC m=+5.611741700,LastTimestamp:2026-04-02 13:37:28.707334147 +0000 UTC m=+5.611741700,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.466466 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18a28dbda000243e openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:28.710095934 +0000 UTC 
m=+5.614503487,LastTimestamp:2026-04-02 13:37:28.710095934 +0000 UTC m=+5.614503487,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.470582 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18a28dbdb9506322 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:29.134785314 +0000 UTC m=+6.039192907,LastTimestamp:2026-04-02 13:37:29.134785314 +0000 UTC m=+6.039192907,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.474971 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18a28dbdbb46fd8e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container 
kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:29.167723918 +0000 UTC m=+6.072131481,LastTimestamp:2026-04-02 13:37:29.167723918 +0000 UTC m=+6.072131481,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.480931 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18a28dbdbbeff5d6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:29.178797526 +0000 UTC m=+6.083205089,LastTimestamp:2026-04-02 13:37:29.178797526 +0000 UTC m=+6.083205089,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.484762 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18a28dbdbbf26ded openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:29.178959341 +0000 UTC m=+6.083366904,LastTimestamp:2026-04-02 13:37:29.178959341 +0000 UTC m=+6.083366904,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.488492 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.18a28dbdbbf3b728 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:29.179043624 +0000 UTC m=+6.083451187,LastTimestamp:2026-04-02 13:37:29.179043624 +0000 UTC m=+6.083451187,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.492605 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18a28dbdbedc3552 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:29.227834706 +0000 UTC m=+6.132242299,LastTimestamp:2026-04-02 13:37:29.227834706 +0000 UTC m=+6.132242299,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.497224 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18a28dbdbf0bde7d openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:29.230958205 +0000 UTC m=+6.135365788,LastTimestamp:2026-04-02 13:37:29.230958205 +0000 UTC m=+6.135365788,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc 
kubenswrapper[4732]: E0402 13:38:37.501045 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18a28dbdd0be8829 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:29.527879721 +0000 UTC m=+6.432287304,LastTimestamp:2026-04-02 13:37:29.527879721 +0000 UTC m=+6.432287304,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.504770 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18a28dbdd0d5ee9f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:29.529413279 +0000 UTC m=+6.433820872,LastTimestamp:2026-04-02 13:37:29.529413279 
+0000 UTC m=+6.433820872,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.509174 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18a28dbdd93e271a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:29.67046121 +0000 UTC m=+6.574868793,LastTimestamp:2026-04-02 13:37:29.67046121 +0000 UTC m=+6.574868793,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.513220 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18a28dbdd9d102b2 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:29.680085682 +0000 UTC m=+6.584493275,LastTimestamp:2026-04-02 13:37:29.680085682 +0000 UTC m=+6.584493275,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.517511 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.18a28dbde6885411 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:29.893426193 +0000 UTC m=+6.797833746,LastTimestamp:2026-04-02 13:37:29.893426193 +0000 UTC m=+6.797833746,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: W0402 13:38:37.517708 4732 reflector.go:561] k8s.io/client-go/informers/factory.go:160: 
failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.517802 4732 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.524118 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18a28dbdfc103971 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:30.254653809 +0000 UTC m=+7.159061372,LastTimestamp:2026-04-02 13:37:30.254653809 +0000 UTC m=+7.159061372,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.528404 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18a28dbdfdf1c602 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:30.28621261 +0000 UTC m=+7.190620193,LastTimestamp:2026-04-02 13:37:30.28621261 +0000 UTC m=+7.190620193,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.532741 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18a28dbe18df554e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:30.737988942 +0000 UTC m=+7.642396535,LastTimestamp:2026-04-02 13:37:30.737988942 +0000 UTC m=+7.642396535,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.537154 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" 
event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18a28dbe503001ee openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:31.666022894 +0000 UTC m=+8.570430457,LastTimestamp:2026-04-02 13:37:31.666022894 +0000 UTC m=+8.570430457,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: I0402 13:38:37.543586 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.543762 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18a28dbe5045f319 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:31.667460889 +0000 UTC m=+8.571868442,LastTimestamp:2026-04-02 13:37:31.667460889 +0000 UTC m=+8.571868442,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.547439 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18a28dbe51589695 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:31.685459605 +0000 UTC m=+8.589867158,LastTimestamp:2026-04-02 13:37:31.685459605 +0000 UTC m=+8.589867158,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.550783 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18a28dbe5168cd24 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:31.686522148 +0000 UTC m=+8.590929701,LastTimestamp:2026-04-02 13:37:31.686522148 +0000 UTC m=+8.590929701,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.554925 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18a28dbe635a066f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:31.987543663 +0000 UTC m=+8.891951216,LastTimestamp:2026-04-02 13:37:31.987543663 +0000 UTC m=+8.891951216,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.560326 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18a28dbe67e0d98f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:32.063488399 +0000 UTC m=+8.967895942,LastTimestamp:2026-04-02 13:37:32.063488399 +0000 UTC m=+8.967895942,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.564005 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18a28dbe67f50c94 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:32.06481218 +0000 UTC m=+8.969219723,LastTimestamp:2026-04-02 13:37:32.06481218 +0000 UTC m=+8.969219723,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc 
kubenswrapper[4732]: E0402 13:38:37.568143 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18a28dbe6ac6ad02 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:32.112104706 +0000 UTC m=+9.016512259,LastTimestamp:2026-04-02 13:37:32.112104706 +0000 UTC m=+9.016512259,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.572369 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18a28dbe737b03af openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:32.258141103 +0000 UTC m=+9.162548656,LastTimestamp:2026-04-02 13:37:32.258141103 +0000 UTC m=+9.162548656,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.576108 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18a28dbe73e453c0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:32.26504288 +0000 UTC m=+9.169450433,LastTimestamp:2026-04-02 13:37:32.26504288 +0000 UTC m=+9.169450433,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.580363 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18a28dbe7bf8cc56 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:32.400602198 +0000 UTC m=+9.305009751,LastTimestamp:2026-04-02 13:37:32.400602198 +0000 UTC 
m=+9.305009751,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.584816 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18a28dbe7d5aeb5f openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:32.423809887 +0000 UTC m=+9.328217440,LastTimestamp:2026-04-02 13:37:32.423809887 +0000 UTC m=+9.328217440,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.589481 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18a28dbe7e495d2c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container 
kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:32.439436588 +0000 UTC m=+9.343844181,LastTimestamp:2026-04-02 13:37:32.439436588 +0000 UTC m=+9.343844181,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.594057 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18a28dbe7e5b8ee5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:32.440628965 +0000 UTC m=+9.345036518,LastTimestamp:2026-04-02 13:37:32.440628965 +0000 UTC m=+9.345036518,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.598355 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18a28dbe8e950359 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:32.712829785 +0000 UTC m=+9.617237338,LastTimestamp:2026-04-02 13:37:32.712829785 +0000 UTC m=+9.617237338,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.602531 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18a28dbe90eeff6f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:32.752281455 +0000 UTC m=+9.656689008,LastTimestamp:2026-04-02 13:37:32.752281455 +0000 UTC m=+9.656689008,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.606163 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.18a28dbe9391d07a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:32.796506234 +0000 UTC m=+9.700913787,LastTimestamp:2026-04-02 13:37:32.796506234 +0000 UTC m=+9.700913787,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.610115 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18a28dbe93a18116 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:32.797534486 +0000 UTC m=+9.701942039,LastTimestamp:2026-04-02 13:37:32.797534486 +0000 UTC m=+9.701942039,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 
13:38:37.614516 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18a28dbea2cbd40e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:33.051966478 +0000 UTC m=+9.956374031,LastTimestamp:2026-04-02 13:37:33.051966478 +0000 UTC m=+9.956374031,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.623775 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18a28dbea8dbe889 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:33.153683593 +0000 UTC m=+10.058091166,LastTimestamp:2026-04-02 13:37:33.153683593 +0000 UTC m=+10.058091166,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 
13:38:37.627746 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18a28dbead57d4b6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:33.228913846 +0000 UTC m=+10.133321399,LastTimestamp:2026-04-02 13:37:33.228913846 +0000 UTC m=+10.133321399,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.631245 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18a28dbead6572b8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:33.229806264 +0000 UTC m=+10.134213817,LastTimestamp:2026-04-02 13:37:33.229806264 +0000 UTC m=+10.134213817,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 
13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.634405 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18a28dbeb46f69ba openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:33.347899834 +0000 UTC m=+10.252307387,LastTimestamp:2026-04-02 13:37:33.347899834 +0000 UTC m=+10.252307387,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.637471 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18a28dbecbc2e3dc openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:33.739246556 +0000 UTC m=+10.643654109,LastTimestamp:2026-04-02 13:37:33.739246556 +0000 UTC m=+10.643654109,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 
13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.641110 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18a28dbed3089328 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:33.861253928 +0000 UTC m=+10.765661491,LastTimestamp:2026-04-02 13:37:33.861253928 +0000 UTC m=+10.765661491,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.644340 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18a28dbed318c2b9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:33.862314681 +0000 UTC m=+10.766722234,LastTimestamp:2026-04-02 13:37:33.862314681 +0000 UTC m=+10.766722234,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.647739 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18a28dbee7f57cd6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:34.212324566 +0000 UTC m=+11.116732119,LastTimestamp:2026-04-02 13:37:34.212324566 +0000 UTC m=+11.116732119,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.651970 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18a28dbefbee67df openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:34.547404767 +0000 UTC m=+11.451812320,LastTimestamp:2026-04-02 13:37:34.547404767 +0000 UTC m=+11.451812320,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.654990 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18a28dbefc03cdf2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:34.548807154 +0000 UTC m=+11.453214707,LastTimestamp:2026-04-02 13:37:34.548807154 +0000 UTC m=+11.453214707,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.658243 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18a28dbf1285e0ae openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:34.926430382 +0000 UTC m=+11.830837975,LastTimestamp:2026-04-02 13:37:34.926430382 +0000 UTC 
m=+11.830837975,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.661151 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18a28dbf18febf49 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:35.035014985 +0000 UTC m=+11.939422578,LastTimestamp:2026-04-02 13:37:35.035014985 +0000 UTC m=+11.939422578,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.664229 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18a28dbf1914b545 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:35.036454213 +0000 UTC 
m=+11.940861806,LastTimestamp:2026-04-02 13:37:35.036454213 +0000 UTC m=+11.940861806,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.667224 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18a28dbf2b9bfe2b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:35.347310123 +0000 UTC m=+12.251717676,LastTimestamp:2026-04-02 13:37:35.347310123 +0000 UTC m=+12.251717676,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.670196 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18a28dbf2d9d86a5 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:35.380965029 +0000 UTC m=+12.285372592,LastTimestamp:2026-04-02 13:37:35.380965029 +0000 UTC 
m=+12.285372592,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.673428 4732 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.18a28dbe93a18116\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18a28dbe93a18116 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:32.797534486 +0000 UTC m=+9.701942039,LastTimestamp:2026-04-02 13:37:35.776427443 +0000 UTC m=+12.680835036,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.676735 4732 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.18a28dbea8dbe889\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18a28dbea8dbe889 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:33.153683593 +0000 UTC m=+10.058091166,LastTimestamp:2026-04-02 13:37:36.211384399 +0000 UTC m=+13.115791952,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.679884 4732 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.18a28dbeb46f69ba\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18a28dbeb46f69ba openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:33.347899834 +0000 UTC m=+10.252307387,LastTimestamp:2026-04-02 13:37:36.262316135 +0000 UTC m=+13.166723728,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.685595 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-controller-manager\"" event=< Apr 02 13:38:37 crc kubenswrapper[4732]: &Event{ObjectMeta:{kube-controller-manager-crc.18a28dc13abb0ea7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Apr 02 13:38:37 crc kubenswrapper[4732]: body: Apr 02 13:38:37 crc kubenswrapper[4732]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:44.190938791 +0000 UTC m=+21.095346364,LastTimestamp:2026-04-02 13:37:44.190938791 +0000 UTC m=+21.095346364,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Apr 02 13:38:37 crc kubenswrapper[4732]: > Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.688558 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18a28dc13abc23d9 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded 
while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:44.191009753 +0000 UTC m=+21.095417316,LastTimestamp:2026-04-02 13:37:44.191009753 +0000 UTC m=+21.095417316,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.692388 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Apr 02 13:38:37 crc kubenswrapper[4732]: &Event{ObjectMeta:{kube-apiserver-crc.18a28dc1ee95538d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Apr 02 13:38:37 crc kubenswrapper[4732]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Apr 02 13:38:37 crc kubenswrapper[4732]: Apr 02 13:38:37 crc kubenswrapper[4732]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:47.208364941 +0000 UTC m=+24.112772524,LastTimestamp:2026-04-02 13:37:47.208364941 +0000 UTC m=+24.112772524,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Apr 02 13:38:37 crc kubenswrapper[4732]: > Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.695375 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18a28dc1ee961a31 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:47.208415793 +0000 UTC m=+24.112823386,LastTimestamp:2026-04-02 13:37:47.208415793 +0000 UTC m=+24.112823386,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.698401 4732 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.18a28dc1ee95538d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Apr 02 13:38:37 crc kubenswrapper[4732]: &Event{ObjectMeta:{kube-apiserver-crc.18a28dc1ee95538d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Apr 02 13:38:37 crc kubenswrapper[4732]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Apr 02 13:38:37 crc kubenswrapper[4732]: Apr 02 13:38:37 crc kubenswrapper[4732]: 
,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:47.208364941 +0000 UTC m=+24.112772524,LastTimestamp:2026-04-02 13:37:47.212744918 +0000 UTC m=+24.117152511,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Apr 02 13:38:37 crc kubenswrapper[4732]: > Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.701445 4732 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.18a28dc1ee961a31\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18a28dc1ee961a31 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:47.208415793 +0000 UTC m=+24.112823386,LastTimestamp:2026-04-02 13:37:47.212794389 +0000 UTC m=+24.117201972,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.705732 4732 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.18a28dc13abb0ea7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Apr 02 13:38:37 crc kubenswrapper[4732]: &Event{ObjectMeta:{kube-controller-manager-crc.18a28dc13abb0ea7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Apr 02 13:38:37 crc kubenswrapper[4732]: body: Apr 02 13:38:37 crc kubenswrapper[4732]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:44.190938791 +0000 UTC m=+21.095346364,LastTimestamp:2026-04-02 13:37:54.190768893 +0000 UTC m=+31.095176486,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Apr 02 13:38:37 crc kubenswrapper[4732]: > Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.710163 4732 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.18a28dc13abc23d9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18a28dc13abc23d9 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:44.191009753 +0000 UTC m=+21.095417316,LastTimestamp:2026-04-02 13:37:54.190819784 
+0000 UTC m=+31.095227377,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.714093 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Apr 02 13:38:37 crc kubenswrapper[4732]: &Event{ObjectMeta:{kube-controller-manager-crc.18a28dc5c074b91f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": read tcp 192.168.126.11:41776->192.168.126.11:10357: read: connection reset by peer Apr 02 13:38:37 crc kubenswrapper[4732]: body: Apr 02 13:38:37 crc kubenswrapper[4732]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:38:03.614345503 +0000 UTC m=+40.518753056,LastTimestamp:2026-04-02 13:38:03.614345503 +0000 UTC m=+40.518753056,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Apr 02 13:38:37 crc kubenswrapper[4732]: > Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.717633 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18a28dc5c0755e02 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:41776->192.168.126.11:10357: read: connection reset by peer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:38:03.614387714 +0000 UTC m=+40.518795257,LastTimestamp:2026-04-02 13:38:03.614387714 +0000 UTC m=+40.518795257,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.723255 4732 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18a28dc5c09f16e1 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:38:03.617122017 +0000 UTC m=+40.521529570,LastTimestamp:2026-04-02 13:38:03.617122017 +0000 UTC m=+40.521529570,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.727136 4732 event.go:359] "Server rejected event (will not retry!)" 
err="events \"kube-controller-manager-crc.18a28dbd6fad2415\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18a28dbd6fad2415 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:27.899350037 +0000 UTC m=+4.803757590,LastTimestamp:2026-04-02 13:38:04.132529025 +0000 UTC m=+41.036936598,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.735576 4732 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.18a28dbd98449a1d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18a28dbd98449a1d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 
13:37:28.580364829 +0000 UTC m=+5.484772402,LastTimestamp:2026-04-02 13:38:04.4788583 +0000 UTC m=+41.383265863,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.740135 4732 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.18a28dbd9cb5ef12\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18a28dbd9cb5ef12 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:28.65490101 +0000 UTC m=+5.559308573,LastTimestamp:2026-04-02 13:38:04.492986167 +0000 UTC m=+41.397393720,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.749198 4732 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.18a28dc13abb0ea7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Apr 02 13:38:37 crc kubenswrapper[4732]: &Event{ObjectMeta:{kube-controller-manager-crc.18a28dc13abb0ea7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Apr 02 13:38:37 crc kubenswrapper[4732]: body: Apr 02 13:38:37 crc kubenswrapper[4732]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:44.190938791 +0000 UTC m=+21.095346364,LastTimestamp:2026-04-02 13:38:14.190375763 +0000 UTC m=+51.094783356,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Apr 02 13:38:37 crc kubenswrapper[4732]: > Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.753604 4732 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.18a28dc13abc23d9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18a28dc13abc23d9 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:44.191009753 +0000 UTC m=+21.095417316,LastTimestamp:2026-04-02 13:38:14.190438345 
+0000 UTC m=+51.094845938,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:38:37 crc kubenswrapper[4732]: E0402 13:38:37.758112 4732 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.18a28dc13abb0ea7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Apr 02 13:38:37 crc kubenswrapper[4732]: &Event{ObjectMeta:{kube-controller-manager-crc.18a28dc13abb0ea7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Apr 02 13:38:37 crc kubenswrapper[4732]: body: Apr 02 13:38:37 crc kubenswrapper[4732]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:37:44.190938791 +0000 UTC m=+21.095346364,LastTimestamp:2026-04-02 13:38:24.190920186 +0000 UTC m=+61.095327779,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Apr 02 13:38:37 crc kubenswrapper[4732]: > Apr 02 13:38:38 crc kubenswrapper[4732]: I0402 13:38:38.544155 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 02 13:38:38 crc kubenswrapper[4732]: I0402 13:38:38.679490 4732 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 02 13:38:38 crc kubenswrapper[4732]: I0402 13:38:38.681174 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:38:38 crc kubenswrapper[4732]: I0402 13:38:38.681227 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:38:38 crc kubenswrapper[4732]: I0402 13:38:38.681237 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:38:38 crc kubenswrapper[4732]: I0402 13:38:38.681946 4732 scope.go:117] "RemoveContainer" containerID="b68bfa0db91a1c077fbe5e3c2695eea525f21b149667d23845099c3eb2e7c21d" Apr 02 13:38:38 crc kubenswrapper[4732]: I0402 13:38:38.983195 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Apr 02 13:38:38 crc kubenswrapper[4732]: I0402 13:38:38.985997 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8675a5215a5954016d0e75f25067050d5de0f1e5f52271df26f9d7ba5a04d5b0"} Apr 02 13:38:38 crc kubenswrapper[4732]: I0402 13:38:38.986270 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 02 13:38:38 crc kubenswrapper[4732]: I0402 13:38:38.987832 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:38:38 crc kubenswrapper[4732]: I0402 13:38:38.987886 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:38:38 crc kubenswrapper[4732]: I0402 13:38:38.987897 4732 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Apr 02 13:38:39 crc kubenswrapper[4732]: I0402 13:38:39.546240 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 02 13:38:39 crc kubenswrapper[4732]: I0402 13:38:39.996469 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Apr 02 13:38:39 crc kubenswrapper[4732]: I0402 13:38:39.997392 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Apr 02 13:38:39 crc kubenswrapper[4732]: I0402 13:38:39.999309 4732 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8675a5215a5954016d0e75f25067050d5de0f1e5f52271df26f9d7ba5a04d5b0" exitCode=255 Apr 02 13:38:39 crc kubenswrapper[4732]: I0402 13:38:39.999326 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"8675a5215a5954016d0e75f25067050d5de0f1e5f52271df26f9d7ba5a04d5b0"} Apr 02 13:38:39 crc kubenswrapper[4732]: I0402 13:38:39.999402 4732 scope.go:117] "RemoveContainer" containerID="b68bfa0db91a1c077fbe5e3c2695eea525f21b149667d23845099c3eb2e7c21d" Apr 02 13:38:39 crc kubenswrapper[4732]: I0402 13:38:39.999559 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 02 13:38:40 crc kubenswrapper[4732]: I0402 13:38:40.000552 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:38:40 crc kubenswrapper[4732]: I0402 13:38:40.000582 4732 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:38:40 crc kubenswrapper[4732]: I0402 13:38:40.000591 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:38:40 crc kubenswrapper[4732]: I0402 13:38:40.001041 4732 scope.go:117] "RemoveContainer" containerID="8675a5215a5954016d0e75f25067050d5de0f1e5f52271df26f9d7ba5a04d5b0" Apr 02 13:38:40 crc kubenswrapper[4732]: E0402 13:38:40.001201 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Apr 02 13:38:40 crc kubenswrapper[4732]: I0402 13:38:40.114276 4732 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 02 13:38:40 crc kubenswrapper[4732]: I0402 13:38:40.273832 4732 csr.go:261] certificate signing request csr-4vp5f is approved, waiting to be issued Apr 02 13:38:40 crc kubenswrapper[4732]: I0402 13:38:40.413236 4732 csr.go:257] certificate signing request csr-4vp5f is issued Apr 02 13:38:40 crc kubenswrapper[4732]: I0402 13:38:40.433970 4732 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Apr 02 13:38:40 crc kubenswrapper[4732]: I0402 13:38:40.697072 4732 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Apr 02 13:38:41 crc kubenswrapper[4732]: I0402 13:38:41.003080 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Apr 02 13:38:41 crc kubenswrapper[4732]: I0402 13:38:41.005249 
4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 02 13:38:41 crc kubenswrapper[4732]: I0402 13:38:41.006151 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:38:41 crc kubenswrapper[4732]: I0402 13:38:41.006186 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:38:41 crc kubenswrapper[4732]: I0402 13:38:41.006201 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:38:41 crc kubenswrapper[4732]: I0402 13:38:41.006835 4732 scope.go:117] "RemoveContainer" containerID="8675a5215a5954016d0e75f25067050d5de0f1e5f52271df26f9d7ba5a04d5b0" Apr 02 13:38:41 crc kubenswrapper[4732]: E0402 13:38:41.007019 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Apr 02 13:38:41 crc kubenswrapper[4732]: I0402 13:38:41.171700 4732 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 02 13:38:41 crc kubenswrapper[4732]: W0402 13:38:41.171888 4732 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.Service ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Apr 02 13:38:41 crc kubenswrapper[4732]: I0402 13:38:41.190028 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 02 13:38:41 crc kubenswrapper[4732]: 
I0402 13:38:41.190212 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 02 13:38:41 crc kubenswrapper[4732]: I0402 13:38:41.191173 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:38:41 crc kubenswrapper[4732]: I0402 13:38:41.191288 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:38:41 crc kubenswrapper[4732]: I0402 13:38:41.191371 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:38:41 crc kubenswrapper[4732]: I0402 13:38:41.205340 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 02 13:38:41 crc kubenswrapper[4732]: I0402 13:38:41.372659 4732 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Apr 02 13:38:41 crc kubenswrapper[4732]: I0402 13:38:41.414755 4732 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-02 01:54:44.863195323 +0000 UTC Apr 02 13:38:41 crc kubenswrapper[4732]: I0402 13:38:41.414791 4732 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 5844h16m3.448407025s for next certificate rotation Apr 02 13:38:41 crc kubenswrapper[4732]: I0402 13:38:41.542948 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 02 13:38:42 crc kubenswrapper[4732]: I0402 13:38:42.007523 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 02 13:38:42 crc kubenswrapper[4732]: I0402 13:38:42.007567 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 02 13:38:42 crc kubenswrapper[4732]: I0402 13:38:42.007523 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 02 13:38:42 crc kubenswrapper[4732]: I0402 13:38:42.012324 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:38:42 crc kubenswrapper[4732]: I0402 13:38:42.012545 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:38:42 crc kubenswrapper[4732]: I0402 13:38:42.012644 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:38:42 crc kubenswrapper[4732]: I0402 13:38:42.012823 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:38:42 crc kubenswrapper[4732]: I0402 13:38:42.013340 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:38:42 crc kubenswrapper[4732]: I0402 13:38:42.013434 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:38:42 crc kubenswrapper[4732]: I0402 13:38:42.014510 4732 scope.go:117] "RemoveContainer" containerID="8675a5215a5954016d0e75f25067050d5de0f1e5f52271df26f9d7ba5a04d5b0" Apr 02 13:38:42 crc kubenswrapper[4732]: E0402 13:38:42.014803 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Apr 02 13:38:43 crc kubenswrapper[4732]: I0402 13:38:43.009564 4732 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 02 13:38:43 crc kubenswrapper[4732]: I0402 13:38:43.010522 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:38:43 crc kubenswrapper[4732]: I0402 13:38:43.010563 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:38:43 crc kubenswrapper[4732]: I0402 13:38:43.010575 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:38:43 crc kubenswrapper[4732]: I0402 13:38:43.233266 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 02 13:38:43 crc kubenswrapper[4732]: I0402 13:38:43.234298 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:38:43 crc kubenswrapper[4732]: I0402 13:38:43.234335 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:38:43 crc kubenswrapper[4732]: I0402 13:38:43.234349 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:38:43 crc kubenswrapper[4732]: I0402 13:38:43.234462 4732 kubelet_node_status.go:76] "Attempting to register node" node="crc" Apr 02 13:38:43 crc kubenswrapper[4732]: I0402 13:38:43.241464 4732 kubelet_node_status.go:115] "Node was previously registered" node="crc" Apr 02 13:38:43 crc kubenswrapper[4732]: I0402 13:38:43.241814 4732 kubelet_node_status.go:79] "Successfully registered node" node="crc" Apr 02 13:38:43 crc kubenswrapper[4732]: E0402 13:38:43.241848 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Apr 02 13:38:43 crc kubenswrapper[4732]: I0402 13:38:43.245243 4732 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:38:43 crc kubenswrapper[4732]: I0402 13:38:43.245359 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:38:43 crc kubenswrapper[4732]: I0402 13:38:43.245463 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:38:43 crc kubenswrapper[4732]: I0402 13:38:43.245532 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 02 13:38:43 crc kubenswrapper[4732]: I0402 13:38:43.245593 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-02T13:38:43Z","lastTransitionTime":"2026-04-02T13:38:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 02 13:38:43 crc kubenswrapper[4732]: E0402 13:38:43.269006 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:38:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:38:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:38:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:38:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:38:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:38:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:38:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:38:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69443537-6792-4af0-92ca-9d3f256e3009\\\",\\\"systemUUID\\\":\\\"3be5867b-5df6-4c65-8d4b-c54c471927ff\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 02 13:38:43 crc kubenswrapper[4732]: I0402 13:38:43.272552 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:38:43 crc kubenswrapper[4732]: I0402 13:38:43.272593 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:38:43 crc kubenswrapper[4732]: I0402 13:38:43.272605 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:38:43 crc kubenswrapper[4732]: I0402 13:38:43.272637 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 02 13:38:43 crc kubenswrapper[4732]: I0402 13:38:43.272649 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-02T13:38:43Z","lastTransitionTime":"2026-04-02T13:38:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 02 13:38:43 crc kubenswrapper[4732]: E0402 13:38:43.280504 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:38:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:38:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:38:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:38:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:38:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:38:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:38:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:38:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69443537-6792-4af0-92ca-9d3f256e3009\\\",\\\"systemUUID\\\":\\\"3be5867b-5df6-4c65-8d4b-c54c471927ff\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 02 13:38:43 crc kubenswrapper[4732]: I0402 13:38:43.284153 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:38:43 crc kubenswrapper[4732]: I0402 13:38:43.284198 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:38:43 crc kubenswrapper[4732]: I0402 13:38:43.284236 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:38:43 crc kubenswrapper[4732]: I0402 13:38:43.284254 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 02 13:38:43 crc kubenswrapper[4732]: I0402 13:38:43.284265 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-02T13:38:43Z","lastTransitionTime":"2026-04-02T13:38:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 02 13:38:43 crc kubenswrapper[4732]: E0402 13:38:43.292554 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:38:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:38:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:38:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:38:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:38:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:38:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:38:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:38:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69443537-6792-4af0-92ca-9d3f256e3009\\\",\\\"systemUUID\\\":\\\"3be5867b-5df6-4c65-8d4b-c54c471927ff\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 02 13:38:43 crc kubenswrapper[4732]: I0402 13:38:43.295757 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:38:43 crc kubenswrapper[4732]: I0402 13:38:43.295789 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:38:43 crc kubenswrapper[4732]: I0402 13:38:43.295802 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:38:43 crc kubenswrapper[4732]: I0402 13:38:43.295818 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 02 13:38:43 crc kubenswrapper[4732]: I0402 13:38:43.295831 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-02T13:38:43Z","lastTransitionTime":"2026-04-02T13:38:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 02 13:38:43 crc kubenswrapper[4732]: E0402 13:38:43.307244 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:38:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:38:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:38:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:38:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:38:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:38:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:38:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:38:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69443537-6792-4af0-92ca-9d3f256e3009\\\",\\\"systemUUID\\\":\\\"3be5867b-5df6-4c65-8d4b-c54c471927ff\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 02 13:38:43 crc kubenswrapper[4732]: E0402 13:38:43.307392 4732 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Apr 02 13:38:43 crc kubenswrapper[4732]: E0402 13:38:43.307441 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:43 crc kubenswrapper[4732]: E0402 13:38:43.407576 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:43 crc kubenswrapper[4732]: E0402 13:38:43.508303 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:43 crc kubenswrapper[4732]: E0402 13:38:43.609410 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:43 crc kubenswrapper[4732]: E0402 13:38:43.710260 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:43 crc kubenswrapper[4732]: E0402 13:38:43.811188 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:43 crc kubenswrapper[4732]: E0402 13:38:43.911955 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:44 crc kubenswrapper[4732]: E0402 13:38:44.012650 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:44 crc kubenswrapper[4732]: E0402 13:38:44.113444 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:44 crc kubenswrapper[4732]: E0402 13:38:44.213840 4732 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:44 crc kubenswrapper[4732]: E0402 13:38:44.313958 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:44 crc kubenswrapper[4732]: E0402 13:38:44.414710 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:44 crc kubenswrapper[4732]: E0402 13:38:44.515420 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:44 crc kubenswrapper[4732]: E0402 13:38:44.616022 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:44 crc kubenswrapper[4732]: E0402 13:38:44.717131 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:44 crc kubenswrapper[4732]: E0402 13:38:44.817911 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:44 crc kubenswrapper[4732]: E0402 13:38:44.912834 4732 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Apr 02 13:38:44 crc kubenswrapper[4732]: E0402 13:38:44.918025 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:45 crc kubenswrapper[4732]: E0402 13:38:45.018742 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:45 crc kubenswrapper[4732]: E0402 13:38:45.119701 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:45 crc kubenswrapper[4732]: E0402 13:38:45.220528 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" 
Apr 02 13:38:45 crc kubenswrapper[4732]: E0402 13:38:45.321052 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:45 crc kubenswrapper[4732]: E0402 13:38:45.421797 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:45 crc kubenswrapper[4732]: E0402 13:38:45.522847 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:45 crc kubenswrapper[4732]: E0402 13:38:45.623941 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:45 crc kubenswrapper[4732]: E0402 13:38:45.724775 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:45 crc kubenswrapper[4732]: E0402 13:38:45.825370 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:45 crc kubenswrapper[4732]: E0402 13:38:45.926109 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:46 crc kubenswrapper[4732]: E0402 13:38:46.026676 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:46 crc kubenswrapper[4732]: E0402 13:38:46.127459 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:46 crc kubenswrapper[4732]: E0402 13:38:46.228191 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:46 crc kubenswrapper[4732]: E0402 13:38:46.328912 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:46 crc kubenswrapper[4732]: E0402 13:38:46.429655 4732 kubelet_node_status.go:503] "Error getting the current node 
from lister" err="node \"crc\" not found" Apr 02 13:38:46 crc kubenswrapper[4732]: E0402 13:38:46.530346 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:46 crc kubenswrapper[4732]: E0402 13:38:46.630594 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:46 crc kubenswrapper[4732]: E0402 13:38:46.731450 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:46 crc kubenswrapper[4732]: E0402 13:38:46.832493 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:46 crc kubenswrapper[4732]: E0402 13:38:46.933389 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:47 crc kubenswrapper[4732]: E0402 13:38:47.033766 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:47 crc kubenswrapper[4732]: E0402 13:38:47.134206 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:47 crc kubenswrapper[4732]: E0402 13:38:47.235289 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:47 crc kubenswrapper[4732]: E0402 13:38:47.336310 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:47 crc kubenswrapper[4732]: E0402 13:38:47.437120 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:47 crc kubenswrapper[4732]: E0402 13:38:47.537234 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:47 crc kubenswrapper[4732]: E0402 13:38:47.638361 4732 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:47 crc kubenswrapper[4732]: E0402 13:38:47.739508 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:47 crc kubenswrapper[4732]: E0402 13:38:47.840488 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:47 crc kubenswrapper[4732]: E0402 13:38:47.941282 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:48 crc kubenswrapper[4732]: E0402 13:38:48.041978 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:48 crc kubenswrapper[4732]: E0402 13:38:48.142837 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:48 crc kubenswrapper[4732]: E0402 13:38:48.243951 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:48 crc kubenswrapper[4732]: E0402 13:38:48.344786 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:48 crc kubenswrapper[4732]: E0402 13:38:48.445764 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:48 crc kubenswrapper[4732]: E0402 13:38:48.546898 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:48 crc kubenswrapper[4732]: E0402 13:38:48.647551 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:48 crc kubenswrapper[4732]: E0402 13:38:48.748718 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:48 crc 
kubenswrapper[4732]: E0402 13:38:48.849486 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:48 crc kubenswrapper[4732]: E0402 13:38:48.950548 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:49 crc kubenswrapper[4732]: E0402 13:38:49.051678 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:49 crc kubenswrapper[4732]: E0402 13:38:49.151888 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:49 crc kubenswrapper[4732]: E0402 13:38:49.252436 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:49 crc kubenswrapper[4732]: E0402 13:38:49.353198 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:49 crc kubenswrapper[4732]: E0402 13:38:49.453888 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:49 crc kubenswrapper[4732]: E0402 13:38:49.554436 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:49 crc kubenswrapper[4732]: E0402 13:38:49.654686 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:49 crc kubenswrapper[4732]: E0402 13:38:49.754883 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:49 crc kubenswrapper[4732]: E0402 13:38:49.855147 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:49 crc kubenswrapper[4732]: E0402 13:38:49.955885 4732 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Apr 02 13:38:50 crc kubenswrapper[4732]: E0402 13:38:50.056690 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:50 crc kubenswrapper[4732]: E0402 13:38:50.156871 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:50 crc kubenswrapper[4732]: E0402 13:38:50.257653 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:50 crc kubenswrapper[4732]: E0402 13:38:50.358460 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:50 crc kubenswrapper[4732]: E0402 13:38:50.458816 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:50 crc kubenswrapper[4732]: E0402 13:38:50.559553 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:50 crc kubenswrapper[4732]: E0402 13:38:50.660012 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:50 crc kubenswrapper[4732]: E0402 13:38:50.760108 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:50 crc kubenswrapper[4732]: E0402 13:38:50.861110 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:50 crc kubenswrapper[4732]: E0402 13:38:50.961904 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:51 crc kubenswrapper[4732]: E0402 13:38:51.062428 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:51 crc kubenswrapper[4732]: E0402 13:38:51.163595 4732 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:51 crc kubenswrapper[4732]: E0402 13:38:51.264530 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:51 crc kubenswrapper[4732]: E0402 13:38:51.365307 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:51 crc kubenswrapper[4732]: E0402 13:38:51.465477 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:51 crc kubenswrapper[4732]: E0402 13:38:51.565705 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:51 crc kubenswrapper[4732]: E0402 13:38:51.666192 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:51 crc kubenswrapper[4732]: E0402 13:38:51.766921 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:51 crc kubenswrapper[4732]: E0402 13:38:51.867753 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:51 crc kubenswrapper[4732]: E0402 13:38:51.968803 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:52 crc kubenswrapper[4732]: E0402 13:38:52.069927 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:52 crc kubenswrapper[4732]: E0402 13:38:52.171050 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:52 crc kubenswrapper[4732]: E0402 13:38:52.272079 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:52 crc kubenswrapper[4732]: E0402 
13:38:52.372863 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:52 crc kubenswrapper[4732]: E0402 13:38:52.473583 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:52 crc kubenswrapper[4732]: E0402 13:38:52.574672 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:52 crc kubenswrapper[4732]: E0402 13:38:52.674801 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:52 crc kubenswrapper[4732]: I0402 13:38:52.674921 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 02 13:38:52 crc kubenswrapper[4732]: I0402 13:38:52.675077 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 02 13:38:52 crc kubenswrapper[4732]: I0402 13:38:52.676278 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:38:52 crc kubenswrapper[4732]: I0402 13:38:52.676331 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:38:52 crc kubenswrapper[4732]: I0402 13:38:52.676348 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:38:52 crc kubenswrapper[4732]: E0402 13:38:52.775836 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:52 crc kubenswrapper[4732]: E0402 13:38:52.876975 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:52 crc kubenswrapper[4732]: E0402 13:38:52.978008 4732 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Apr 02 13:38:53 crc kubenswrapper[4732]: E0402 13:38:53.079210 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:53 crc kubenswrapper[4732]: E0402 13:38:53.180169 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:53 crc kubenswrapper[4732]: E0402 13:38:53.280696 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:53 crc kubenswrapper[4732]: E0402 13:38:53.380881 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:53 crc kubenswrapper[4732]: E0402 13:38:53.482034 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:53 crc kubenswrapper[4732]: E0402 13:38:53.521494 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Apr 02 13:38:53 crc kubenswrapper[4732]: I0402 13:38:53.527096 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:38:53 crc kubenswrapper[4732]: I0402 13:38:53.527140 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:38:53 crc kubenswrapper[4732]: I0402 13:38:53.527151 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:38:53 crc kubenswrapper[4732]: I0402 13:38:53.527169 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 02 13:38:53 crc kubenswrapper[4732]: I0402 13:38:53.527182 4732 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-02T13:38:53Z","lastTransitionTime":"2026-04-02T13:38:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Apr 02 13:38:53 crc kubenswrapper[4732]: E0402 13:38:53.536292 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:38:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:38:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:38:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:38:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:38:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:38:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:38:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:38:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69443537-6792-4af0-92ca-9d3f256e3009\\\",\\\"systemUUID\\\":\\\"3be5867b-5df6-4c65-8d4b-c54c471927ff\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 02 13:38:53 crc kubenswrapper[4732]: I0402 13:38:53.540545 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:38:53 crc kubenswrapper[4732]: I0402 13:38:53.540585 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:38:53 crc kubenswrapper[4732]: I0402 13:38:53.540595 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:38:53 crc kubenswrapper[4732]: I0402 13:38:53.540641 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 02 13:38:53 crc kubenswrapper[4732]: I0402 13:38:53.540656 4732 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-02T13:38:53Z","lastTransitionTime":"2026-04-02T13:38:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Apr 02 13:38:53 crc kubenswrapper[4732]: E0402 13:38:53.554603 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:38:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:38:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:38:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:38:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:38:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:38:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:38:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:38:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69443537-6792-4af0-92ca-9d3f256e3009\\\",\\\"systemUUID\\\":\\\"3be5867b-5df6-4c65-8d4b-c54c471927ff\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 02 13:38:53 crc kubenswrapper[4732]: I0402 13:38:53.559406 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:38:53 crc kubenswrapper[4732]: I0402 13:38:53.559445 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:38:53 crc kubenswrapper[4732]: I0402 13:38:53.559457 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:38:53 crc kubenswrapper[4732]: I0402 13:38:53.559476 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 02 13:38:53 crc kubenswrapper[4732]: I0402 13:38:53.559489 4732 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-02T13:38:53Z","lastTransitionTime":"2026-04-02T13:38:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Apr 02 13:38:53 crc kubenswrapper[4732]: E0402 13:38:53.577376 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:38:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:38:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:38:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:38:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:38:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:38:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:38:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:38:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69443537-6792-4af0-92ca-9d3f256e3009\\\",\\\"systemUUID\\\":\\\"3be5867b-5df6-4c65-8d4b-c54c471927ff\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 02 13:38:53 crc kubenswrapper[4732]: I0402 13:38:53.581465 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:38:53 crc kubenswrapper[4732]: I0402 13:38:53.581531 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:38:53 crc kubenswrapper[4732]: I0402 13:38:53.581546 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:38:53 crc kubenswrapper[4732]: I0402 13:38:53.581563 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 02 13:38:53 crc kubenswrapper[4732]: I0402 13:38:53.581575 4732 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-02T13:38:53Z","lastTransitionTime":"2026-04-02T13:38:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Apr 02 13:38:53 crc kubenswrapper[4732]: E0402 13:38:53.599776 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:38:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:38:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:38:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:38:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:38:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:38:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:38:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:38:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69443537-6792-4af0-92ca-9d3f256e3009\\\",\\\"systemUUID\\\":\\\"3be5867b-5df6-4c65-8d4b-c54c471927ff\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 02 13:38:53 crc kubenswrapper[4732]: E0402 13:38:53.600107 4732 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Apr 02 13:38:53 crc kubenswrapper[4732]: E0402 13:38:53.600162 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:53 crc kubenswrapper[4732]: E0402 13:38:53.701236 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:53 crc kubenswrapper[4732]: E0402 13:38:53.802310 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:53 crc kubenswrapper[4732]: E0402 13:38:53.903171 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:54 crc kubenswrapper[4732]: E0402 13:38:54.003818 4732 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"crc\" not found" Apr 02 13:38:54 crc kubenswrapper[4732]: E0402 13:38:54.104954 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:54 crc kubenswrapper[4732]: E0402 13:38:54.205463 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:54 crc kubenswrapper[4732]: E0402 13:38:54.305849 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:54 crc kubenswrapper[4732]: E0402 13:38:54.406529 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:54 crc kubenswrapper[4732]: E0402 13:38:54.507288 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:54 crc kubenswrapper[4732]: E0402 13:38:54.608068 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:54 crc kubenswrapper[4732]: E0402 13:38:54.709252 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:54 crc kubenswrapper[4732]: E0402 13:38:54.810098 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:54 crc kubenswrapper[4732]: E0402 13:38:54.910527 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:54 crc kubenswrapper[4732]: E0402 13:38:54.913766 4732 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Apr 02 13:38:55 crc kubenswrapper[4732]: E0402 13:38:55.011682 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:38:55 crc kubenswrapper[4732]: E0402 
13:38:55.112568 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:38:55 crc kubenswrapper[4732]: E0402 13:38:55.213326 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:38:55 crc kubenswrapper[4732]: E0402 13:38:55.314450 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:38:55 crc kubenswrapper[4732]: E0402 13:38:55.415742 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:38:55 crc kubenswrapper[4732]: E0402 13:38:55.515891 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:38:55 crc kubenswrapper[4732]: E0402 13:38:55.617083 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:38:55 crc kubenswrapper[4732]: I0402 13:38:55.679890 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Apr 02 13:38:55 crc kubenswrapper[4732]: I0402 13:38:55.680991 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Apr 02 13:38:55 crc kubenswrapper[4732]: I0402 13:38:55.681052 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Apr 02 13:38:55 crc kubenswrapper[4732]: I0402 13:38:55.681064 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Apr 02 13:38:55 crc kubenswrapper[4732]: I0402 13:38:55.681727 4732 scope.go:117] "RemoveContainer" containerID="8675a5215a5954016d0e75f25067050d5de0f1e5f52271df26f9d7ba5a04d5b0"
Apr 02 13:38:55 crc kubenswrapper[4732]: E0402 13:38:55.681888 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Apr 02 13:38:55 crc kubenswrapper[4732]: E0402 13:38:55.717669 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:38:55 crc kubenswrapper[4732]: E0402 13:38:55.817951 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:38:55 crc kubenswrapper[4732]: E0402 13:38:55.918805 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:38:56 crc kubenswrapper[4732]: E0402 13:38:56.019931 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:38:56 crc kubenswrapper[4732]: E0402 13:38:56.120843 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:38:56 crc kubenswrapper[4732]: E0402 13:38:56.221917 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:38:56 crc kubenswrapper[4732]: E0402 13:38:56.322663 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:38:56 crc kubenswrapper[4732]: E0402 13:38:56.423331 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:38:56 crc kubenswrapper[4732]: E0402 13:38:56.524434 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:38:56 crc kubenswrapper[4732]: E0402 13:38:56.624570 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:38:56 crc kubenswrapper[4732]: E0402 13:38:56.725236 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:38:56 crc kubenswrapper[4732]: E0402 13:38:56.825892 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:38:56 crc kubenswrapper[4732]: E0402 13:38:56.926723 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:38:57 crc kubenswrapper[4732]: E0402 13:38:57.027758 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:38:57 crc kubenswrapper[4732]: E0402 13:38:57.128064 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:38:57 crc kubenswrapper[4732]: E0402 13:38:57.228666 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:38:57 crc kubenswrapper[4732]: E0402 13:38:57.329751 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:38:57 crc kubenswrapper[4732]: E0402 13:38:57.430643 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:38:57 crc kubenswrapper[4732]: E0402 13:38:57.531372 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:38:57 crc kubenswrapper[4732]: E0402 13:38:57.632063 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:38:57 crc kubenswrapper[4732]: E0402 13:38:57.732482 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:38:57 crc kubenswrapper[4732]: E0402 13:38:57.833226 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:38:57 crc kubenswrapper[4732]: I0402 13:38:57.920794 4732 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Apr 02 13:38:57 crc kubenswrapper[4732]: E0402 13:38:57.933339 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:38:58 crc kubenswrapper[4732]: E0402 13:38:58.033738 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:38:58 crc kubenswrapper[4732]: E0402 13:38:58.134867 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:38:58 crc kubenswrapper[4732]: E0402 13:38:58.235697 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:38:58 crc kubenswrapper[4732]: E0402 13:38:58.335834 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:38:58 crc kubenswrapper[4732]: E0402 13:38:58.437036 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:38:58 crc kubenswrapper[4732]: E0402 13:38:58.537770 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:38:58 crc kubenswrapper[4732]: E0402 13:38:58.638268 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:38:58 crc kubenswrapper[4732]: E0402 13:38:58.738856 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:38:58 crc kubenswrapper[4732]: E0402 13:38:58.839334 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:38:58 crc kubenswrapper[4732]: E0402 13:38:58.939880 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:38:59 crc kubenswrapper[4732]: E0402 13:38:59.040719 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:38:59 crc kubenswrapper[4732]: E0402 13:38:59.141511 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:38:59 crc kubenswrapper[4732]: E0402 13:38:59.242213 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:38:59 crc kubenswrapper[4732]: E0402 13:38:59.343125 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:38:59 crc kubenswrapper[4732]: E0402 13:38:59.444103 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:38:59 crc kubenswrapper[4732]: E0402 13:38:59.544742 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:38:59 crc kubenswrapper[4732]: E0402 13:38:59.645510 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:38:59 crc kubenswrapper[4732]: E0402 13:38:59.745653 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:38:59 crc kubenswrapper[4732]: E0402 13:38:59.846298 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:38:59 crc kubenswrapper[4732]: E0402 13:38:59.946739 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:39:00 crc kubenswrapper[4732]: E0402 13:39:00.047121 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:39:00 crc kubenswrapper[4732]: E0402 13:39:00.147854 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:39:00 crc kubenswrapper[4732]: E0402 13:39:00.248490 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:39:00 crc kubenswrapper[4732]: E0402 13:39:00.348918 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:39:00 crc kubenswrapper[4732]: E0402 13:39:00.449865 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:39:00 crc kubenswrapper[4732]: E0402 13:39:00.550257 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:39:00 crc kubenswrapper[4732]: E0402 13:39:00.651271 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:39:00 crc kubenswrapper[4732]: E0402 13:39:00.752407 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:39:00 crc kubenswrapper[4732]: E0402 13:39:00.853422 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:39:00 crc kubenswrapper[4732]: E0402 13:39:00.953821 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:39:01 crc kubenswrapper[4732]: E0402 13:39:01.053915 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:39:01 crc kubenswrapper[4732]: E0402 13:39:01.154032 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:39:01 crc kubenswrapper[4732]: E0402 13:39:01.254380 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:39:01 crc kubenswrapper[4732]: E0402 13:39:01.355499 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:39:01 crc kubenswrapper[4732]: E0402 13:39:01.455909 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:39:01 crc kubenswrapper[4732]: E0402 13:39:01.556565 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:39:01 crc kubenswrapper[4732]: E0402 13:39:01.657267 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:39:01 crc kubenswrapper[4732]: E0402 13:39:01.757822 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:39:01 crc kubenswrapper[4732]: E0402 13:39:01.858249 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:39:01 crc kubenswrapper[4732]: E0402 13:39:01.959008 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:39:02 crc kubenswrapper[4732]: E0402 13:39:02.059507 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:39:02 crc kubenswrapper[4732]: E0402 13:39:02.160418 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:39:02 crc kubenswrapper[4732]: E0402 13:39:02.261074 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:39:02 crc kubenswrapper[4732]: E0402 13:39:02.362238 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:39:02 crc kubenswrapper[4732]: E0402 13:39:02.463450 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:39:02 crc kubenswrapper[4732]: E0402 13:39:02.564373 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:39:02 crc kubenswrapper[4732]: E0402 13:39:02.664912 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:39:02 crc kubenswrapper[4732]: E0402 13:39:02.765962 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:39:02 crc kubenswrapper[4732]: E0402 13:39:02.867054 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:39:02 crc kubenswrapper[4732]: E0402 13:39:02.968072 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:39:03 crc kubenswrapper[4732]: E0402 13:39:03.068722 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:39:03 crc kubenswrapper[4732]: E0402 13:39:03.169808 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:39:03 crc kubenswrapper[4732]: E0402 13:39:03.270913 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:39:03 crc kubenswrapper[4732]: E0402 13:39:03.371075 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:39:03 crc kubenswrapper[4732]: E0402 13:39:03.472170 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:39:03 crc kubenswrapper[4732]: E0402 13:39:03.572529 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:39:03 crc kubenswrapper[4732]: E0402 13:39:03.673144 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 02 13:39:03 crc kubenswrapper[4732]: E0402 13:39:03.762490 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found"
Apr 02 13:39:04 crc kubenswrapper[4732]: I0402 13:39:04.107508 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Apr 02 13:39:04 crc kubenswrapper[4732]: I0402 13:39:04.107555 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Apr 02 13:39:04 crc kubenswrapper[4732]: I0402 13:39:04.107572 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Apr 02 13:39:04 crc kubenswrapper[4732]: I0402 13:39:04.107592 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Apr 02 13:39:04 crc kubenswrapper[4732]: I0402 13:39:04.107608 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-02T13:39:04Z","lastTransitionTime":"2026-04-02T13:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 02 13:39:04 crc kubenswrapper[4732]: E0402 13:39:04.117103 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69443537-6792-4af0-92ca-9d3f256e3009\\\",\\\"systemUUID\\\":\\\"3be5867b-5df6-4c65-8d4b-c54c471927ff\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Apr 02 13:39:04 crc kubenswrapper[4732]: I0402 13:39:04.121011 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Apr 02 13:39:04 crc kubenswrapper[4732]: I0402 13:39:04.121041 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Apr 02 13:39:04 crc kubenswrapper[4732]: I0402 13:39:04.121052 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Apr 02 13:39:04 crc kubenswrapper[4732]: I0402 13:39:04.121068 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Apr 02 13:39:04 crc kubenswrapper[4732]: I0402 13:39:04.121082 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-02T13:39:04Z","lastTransitionTime":"2026-04-02T13:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 02 13:39:04 crc kubenswrapper[4732]: E0402 13:39:04.134578 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69443537-6792-4af0-92ca-9d3f256e3009\\\",\\\"systemUUID\\\":\\\"3be5867b-5df6-4c65-8d4b-c54c471927ff\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 02 13:39:04 crc kubenswrapper[4732]: I0402 13:39:04.138941 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:39:04 crc kubenswrapper[4732]: I0402 13:39:04.138978 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:39:04 crc kubenswrapper[4732]: I0402 13:39:04.138990 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:39:04 crc kubenswrapper[4732]: I0402 13:39:04.139006 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 02 13:39:04 crc kubenswrapper[4732]: I0402 13:39:04.139018 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-02T13:39:04Z","lastTransitionTime":"2026-04-02T13:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 02 13:39:04 crc kubenswrapper[4732]: E0402 13:39:04.151429 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69443537-6792-4af0-92ca-9d3f256e3009\\\",\\\"systemUUID\\\":\\\"3be5867b-5df6-4c65-8d4b-c54c471927ff\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 02 13:39:04 crc kubenswrapper[4732]: I0402 13:39:04.155489 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:39:04 crc kubenswrapper[4732]: I0402 13:39:04.155537 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:39:04 crc kubenswrapper[4732]: I0402 13:39:04.155555 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:39:04 crc kubenswrapper[4732]: I0402 13:39:04.155575 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 02 13:39:04 crc kubenswrapper[4732]: I0402 13:39:04.155588 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-02T13:39:04Z","lastTransitionTime":"2026-04-02T13:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 02 13:39:04 crc kubenswrapper[4732]: E0402 13:39:04.168438 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69443537-6792-4af0-92ca-9d3f256e3009\\\",\\\"systemUUID\\\":\\\"3be5867b-5df6-4c65-8d4b-c54c471927ff\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 02 13:39:04 crc kubenswrapper[4732]: E0402 13:39:04.168545 4732 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Apr 02 13:39:04 crc kubenswrapper[4732]: E0402 13:39:04.168569 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:04 crc kubenswrapper[4732]: E0402 13:39:04.269557 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:04 crc kubenswrapper[4732]: E0402 13:39:04.370515 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:04 crc kubenswrapper[4732]: E0402 13:39:04.471529 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:04 crc kubenswrapper[4732]: E0402 13:39:04.571989 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:04 crc kubenswrapper[4732]: E0402 13:39:04.672337 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:04 crc kubenswrapper[4732]: E0402 13:39:04.772930 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:04 crc kubenswrapper[4732]: E0402 13:39:04.873846 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:04 crc kubenswrapper[4732]: E0402 13:39:04.914586 4732 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Apr 02 13:39:04 crc kubenswrapper[4732]: 
E0402 13:39:04.975092 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:05 crc kubenswrapper[4732]: E0402 13:39:05.075687 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:05 crc kubenswrapper[4732]: E0402 13:39:05.176677 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:05 crc kubenswrapper[4732]: E0402 13:39:05.277816 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:05 crc kubenswrapper[4732]: E0402 13:39:05.377935 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:05 crc kubenswrapper[4732]: E0402 13:39:05.478090 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:05 crc kubenswrapper[4732]: E0402 13:39:05.578966 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:05 crc kubenswrapper[4732]: E0402 13:39:05.679322 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:05 crc kubenswrapper[4732]: E0402 13:39:05.779804 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:05 crc kubenswrapper[4732]: E0402 13:39:05.880997 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:05 crc kubenswrapper[4732]: E0402 13:39:05.986275 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:06 crc kubenswrapper[4732]: E0402 13:39:06.087400 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" 
Apr 02 13:39:06 crc kubenswrapper[4732]: E0402 13:39:06.187905 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:06 crc kubenswrapper[4732]: E0402 13:39:06.288483 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:06 crc kubenswrapper[4732]: E0402 13:39:06.389553 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:06 crc kubenswrapper[4732]: E0402 13:39:06.490420 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:06 crc kubenswrapper[4732]: E0402 13:39:06.591337 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:06 crc kubenswrapper[4732]: E0402 13:39:06.691906 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:06 crc kubenswrapper[4732]: E0402 13:39:06.792745 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:06 crc kubenswrapper[4732]: E0402 13:39:06.893803 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:06 crc kubenswrapper[4732]: E0402 13:39:06.994958 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:07 crc kubenswrapper[4732]: E0402 13:39:07.096119 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:07 crc kubenswrapper[4732]: E0402 13:39:07.196328 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:07 crc kubenswrapper[4732]: E0402 13:39:07.296722 4732 kubelet_node_status.go:503] "Error getting the current node 
from lister" err="node \"crc\" not found" Apr 02 13:39:07 crc kubenswrapper[4732]: E0402 13:39:07.397055 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:07 crc kubenswrapper[4732]: E0402 13:39:07.497350 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:07 crc kubenswrapper[4732]: E0402 13:39:07.597784 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:07 crc kubenswrapper[4732]: E0402 13:39:07.698664 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:07 crc kubenswrapper[4732]: E0402 13:39:07.799106 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:07 crc kubenswrapper[4732]: E0402 13:39:07.900131 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:08 crc kubenswrapper[4732]: E0402 13:39:08.000678 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:08 crc kubenswrapper[4732]: E0402 13:39:08.101861 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:08 crc kubenswrapper[4732]: E0402 13:39:08.202705 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:08 crc kubenswrapper[4732]: E0402 13:39:08.303046 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:08 crc kubenswrapper[4732]: E0402 13:39:08.404103 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:08 crc kubenswrapper[4732]: E0402 13:39:08.504217 4732 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:08 crc kubenswrapper[4732]: E0402 13:39:08.604885 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:08 crc kubenswrapper[4732]: I0402 13:39:08.680299 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 02 13:39:08 crc kubenswrapper[4732]: I0402 13:39:08.682006 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:39:08 crc kubenswrapper[4732]: I0402 13:39:08.682095 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:39:08 crc kubenswrapper[4732]: I0402 13:39:08.682126 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:39:08 crc kubenswrapper[4732]: I0402 13:39:08.683064 4732 scope.go:117] "RemoveContainer" containerID="8675a5215a5954016d0e75f25067050d5de0f1e5f52271df26f9d7ba5a04d5b0" Apr 02 13:39:08 crc kubenswrapper[4732]: E0402 13:39:08.683335 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Apr 02 13:39:08 crc kubenswrapper[4732]: E0402 13:39:08.705693 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:08 crc kubenswrapper[4732]: E0402 13:39:08.805803 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:08 crc kubenswrapper[4732]: E0402 
13:39:08.906283 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:09 crc kubenswrapper[4732]: E0402 13:39:09.006831 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:09 crc kubenswrapper[4732]: E0402 13:39:09.107552 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:09 crc kubenswrapper[4732]: E0402 13:39:09.208686 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:09 crc kubenswrapper[4732]: E0402 13:39:09.309725 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:09 crc kubenswrapper[4732]: E0402 13:39:09.409900 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:09 crc kubenswrapper[4732]: E0402 13:39:09.511047 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:09 crc kubenswrapper[4732]: E0402 13:39:09.611532 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:09 crc kubenswrapper[4732]: I0402 13:39:09.679642 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 02 13:39:09 crc kubenswrapper[4732]: I0402 13:39:09.681123 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:39:09 crc kubenswrapper[4732]: I0402 13:39:09.681163 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:39:09 crc kubenswrapper[4732]: I0402 13:39:09.681178 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 
02 13:39:09 crc kubenswrapper[4732]: E0402 13:39:09.711794 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:09 crc kubenswrapper[4732]: E0402 13:39:09.812303 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:09 crc kubenswrapper[4732]: E0402 13:39:09.913117 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:10 crc kubenswrapper[4732]: E0402 13:39:10.013510 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:10 crc kubenswrapper[4732]: E0402 13:39:10.114100 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:10 crc kubenswrapper[4732]: E0402 13:39:10.215047 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:10 crc kubenswrapper[4732]: E0402 13:39:10.315995 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:10 crc kubenswrapper[4732]: E0402 13:39:10.416479 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:10 crc kubenswrapper[4732]: E0402 13:39:10.517540 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:10 crc kubenswrapper[4732]: E0402 13:39:10.618680 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:10 crc kubenswrapper[4732]: E0402 13:39:10.719695 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:10 crc kubenswrapper[4732]: E0402 13:39:10.820808 4732 kubelet_node_status.go:503] "Error getting the current node from 
lister" err="node \"crc\" not found" Apr 02 13:39:10 crc kubenswrapper[4732]: E0402 13:39:10.921820 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:11 crc kubenswrapper[4732]: E0402 13:39:11.022976 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:11 crc kubenswrapper[4732]: E0402 13:39:11.123368 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:11 crc kubenswrapper[4732]: E0402 13:39:11.223739 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:11 crc kubenswrapper[4732]: E0402 13:39:11.324089 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:11 crc kubenswrapper[4732]: E0402 13:39:11.424235 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:11 crc kubenswrapper[4732]: E0402 13:39:11.524773 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:11 crc kubenswrapper[4732]: E0402 13:39:11.624979 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:11 crc kubenswrapper[4732]: E0402 13:39:11.725170 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:11 crc kubenswrapper[4732]: E0402 13:39:11.825577 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:11 crc kubenswrapper[4732]: E0402 13:39:11.926292 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:12 crc kubenswrapper[4732]: E0402 13:39:12.026902 4732 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:12 crc kubenswrapper[4732]: E0402 13:39:12.127792 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:12 crc kubenswrapper[4732]: E0402 13:39:12.228463 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:12 crc kubenswrapper[4732]: E0402 13:39:12.329127 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:12 crc kubenswrapper[4732]: E0402 13:39:12.429951 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:12 crc kubenswrapper[4732]: E0402 13:39:12.530815 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:12 crc kubenswrapper[4732]: E0402 13:39:12.631189 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:12 crc kubenswrapper[4732]: E0402 13:39:12.731885 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:12 crc kubenswrapper[4732]: E0402 13:39:12.832902 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:12 crc kubenswrapper[4732]: E0402 13:39:12.933691 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:13 crc kubenswrapper[4732]: E0402 13:39:13.033826 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:13 crc kubenswrapper[4732]: E0402 13:39:13.134912 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:13 crc 
kubenswrapper[4732]: E0402 13:39:13.235527 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:13 crc kubenswrapper[4732]: E0402 13:39:13.336457 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:13 crc kubenswrapper[4732]: E0402 13:39:13.437218 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:13 crc kubenswrapper[4732]: E0402 13:39:13.538081 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:13 crc kubenswrapper[4732]: E0402 13:39:13.638651 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:13 crc kubenswrapper[4732]: E0402 13:39:13.739399 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:13 crc kubenswrapper[4732]: E0402 13:39:13.840440 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:13 crc kubenswrapper[4732]: E0402 13:39:13.941296 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:14 crc kubenswrapper[4732]: E0402 13:39:14.041469 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:14 crc kubenswrapper[4732]: E0402 13:39:14.142421 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:14 crc kubenswrapper[4732]: E0402 13:39:14.243367 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:14 crc kubenswrapper[4732]: E0402 13:39:14.344048 4732 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Apr 02 13:39:14 crc kubenswrapper[4732]: E0402 13:39:14.444975 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:14 crc kubenswrapper[4732]: E0402 13:39:14.489806 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Apr 02 13:39:14 crc kubenswrapper[4732]: I0402 13:39:14.494141 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:39:14 crc kubenswrapper[4732]: I0402 13:39:14.494181 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:39:14 crc kubenswrapper[4732]: I0402 13:39:14.494189 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:39:14 crc kubenswrapper[4732]: I0402 13:39:14.494204 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 02 13:39:14 crc kubenswrapper[4732]: I0402 13:39:14.494212 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-02T13:39:14Z","lastTransitionTime":"2026-04-02T13:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 02 13:39:14 crc kubenswrapper[4732]: E0402 13:39:14.504900 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69443537-6792-4af0-92ca-9d3f256e3009\\\",\\\"systemUUID\\\":\\\"3be5867b-5df6-4c65-8d4b-c54c471927ff\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 02 13:39:14 crc kubenswrapper[4732]: I0402 13:39:14.508672 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:39:14 crc kubenswrapper[4732]: I0402 13:39:14.508721 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:39:14 crc kubenswrapper[4732]: I0402 13:39:14.508732 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:39:14 crc kubenswrapper[4732]: I0402 13:39:14.508751 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 02 13:39:14 crc kubenswrapper[4732]: I0402 13:39:14.508762 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-02T13:39:14Z","lastTransitionTime":"2026-04-02T13:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 02 13:39:14 crc kubenswrapper[4732]: E0402 13:39:14.520022 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69443537-6792-4af0-92ca-9d3f256e3009\\\",\\\"systemUUID\\\":\\\"3be5867b-5df6-4c65-8d4b-c54c471927ff\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 02 13:39:14 crc kubenswrapper[4732]: I0402 13:39:14.524068 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:39:14 crc kubenswrapper[4732]: I0402 13:39:14.524118 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:39:14 crc kubenswrapper[4732]: I0402 13:39:14.524129 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:39:14 crc kubenswrapper[4732]: I0402 13:39:14.524145 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 02 13:39:14 crc kubenswrapper[4732]: I0402 13:39:14.524156 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-02T13:39:14Z","lastTransitionTime":"2026-04-02T13:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 02 13:39:14 crc kubenswrapper[4732]: E0402 13:39:14.535023 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69443537-6792-4af0-92ca-9d3f256e3009\\\",\\\"systemUUID\\\":\\\"3be5867b-5df6-4c65-8d4b-c54c471927ff\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 02 13:39:14 crc kubenswrapper[4732]: I0402 13:39:14.538575 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:39:14 crc kubenswrapper[4732]: I0402 13:39:14.538631 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:39:14 crc kubenswrapper[4732]: I0402 13:39:14.538641 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:39:14 crc kubenswrapper[4732]: I0402 13:39:14.538656 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 02 13:39:14 crc kubenswrapper[4732]: I0402 13:39:14.538666 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-02T13:39:14Z","lastTransitionTime":"2026-04-02T13:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 02 13:39:14 crc kubenswrapper[4732]: E0402 13:39:14.549238 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69443537-6792-4af0-92ca-9d3f256e3009\\\",\\\"systemUUID\\\":\\\"3be5867b-5df6-4c65-8d4b-c54c471927ff\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 02 13:39:14 crc kubenswrapper[4732]: E0402 13:39:14.549373 4732 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Apr 02 13:39:14 crc kubenswrapper[4732]: E0402 13:39:14.549395 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:14 crc kubenswrapper[4732]: E0402 13:39:14.650470 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:14 crc kubenswrapper[4732]: E0402 13:39:14.750593 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:14 crc kubenswrapper[4732]: E0402 13:39:14.851150 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:14 crc kubenswrapper[4732]: E0402 13:39:14.915705 4732 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Apr 02 13:39:14 crc kubenswrapper[4732]: E0402 13:39:14.951951 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:15 crc kubenswrapper[4732]: E0402 13:39:15.053030 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:15 crc kubenswrapper[4732]: E0402 13:39:15.153274 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:15 crc kubenswrapper[4732]: E0402 13:39:15.253648 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:15 crc kubenswrapper[4732]: 
E0402 13:39:15.354380 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:15 crc kubenswrapper[4732]: E0402 13:39:15.455140 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:15 crc kubenswrapper[4732]: E0402 13:39:15.555424 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:15 crc kubenswrapper[4732]: E0402 13:39:15.655935 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:15 crc kubenswrapper[4732]: E0402 13:39:15.757074 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:15 crc kubenswrapper[4732]: E0402 13:39:15.857878 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:15 crc kubenswrapper[4732]: E0402 13:39:15.958001 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:16 crc kubenswrapper[4732]: E0402 13:39:16.058960 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:16 crc kubenswrapper[4732]: E0402 13:39:16.159142 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:16 crc kubenswrapper[4732]: E0402 13:39:16.260004 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:16 crc kubenswrapper[4732]: E0402 13:39:16.360666 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:16 crc kubenswrapper[4732]: E0402 13:39:16.461359 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" 
Apr 02 13:39:16 crc kubenswrapper[4732]: E0402 13:39:16.562504 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:16 crc kubenswrapper[4732]: E0402 13:39:16.663439 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:16 crc kubenswrapper[4732]: E0402 13:39:16.764537 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:16 crc kubenswrapper[4732]: E0402 13:39:16.865199 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:16 crc kubenswrapper[4732]: E0402 13:39:16.966128 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:17 crc kubenswrapper[4732]: E0402 13:39:17.066435 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:17 crc kubenswrapper[4732]: E0402 13:39:17.166924 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:17 crc kubenswrapper[4732]: E0402 13:39:17.267869 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:17 crc kubenswrapper[4732]: E0402 13:39:17.368470 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:17 crc kubenswrapper[4732]: E0402 13:39:17.469108 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:17 crc kubenswrapper[4732]: E0402 13:39:17.570259 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:17 crc kubenswrapper[4732]: E0402 13:39:17.670392 4732 kubelet_node_status.go:503] "Error getting the current node 
from lister" err="node \"crc\" not found" Apr 02 13:39:17 crc kubenswrapper[4732]: E0402 13:39:17.770786 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:17 crc kubenswrapper[4732]: E0402 13:39:17.871494 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:17 crc kubenswrapper[4732]: E0402 13:39:17.972656 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:18 crc kubenswrapper[4732]: E0402 13:39:18.073368 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:18 crc kubenswrapper[4732]: E0402 13:39:18.174360 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:18 crc kubenswrapper[4732]: E0402 13:39:18.275047 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:18 crc kubenswrapper[4732]: E0402 13:39:18.375424 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:18 crc kubenswrapper[4732]: E0402 13:39:18.476143 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:18 crc kubenswrapper[4732]: E0402 13:39:18.577197 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:18 crc kubenswrapper[4732]: E0402 13:39:18.678254 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:18 crc kubenswrapper[4732]: E0402 13:39:18.778354 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:18 crc kubenswrapper[4732]: E0402 13:39:18.879081 4732 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:18 crc kubenswrapper[4732]: E0402 13:39:18.979817 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:19 crc kubenswrapper[4732]: E0402 13:39:19.080786 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:19 crc kubenswrapper[4732]: E0402 13:39:19.181234 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:19 crc kubenswrapper[4732]: E0402 13:39:19.282013 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:19 crc kubenswrapper[4732]: E0402 13:39:19.382509 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:19 crc kubenswrapper[4732]: E0402 13:39:19.482663 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:19 crc kubenswrapper[4732]: E0402 13:39:19.582975 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:19 crc kubenswrapper[4732]: E0402 13:39:19.683854 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:19 crc kubenswrapper[4732]: E0402 13:39:19.784521 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:19 crc kubenswrapper[4732]: E0402 13:39:19.885217 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:19 crc kubenswrapper[4732]: E0402 13:39:19.985546 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:20 crc 
kubenswrapper[4732]: E0402 13:39:20.086220 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:20 crc kubenswrapper[4732]: E0402 13:39:20.186695 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:20 crc kubenswrapper[4732]: E0402 13:39:20.287531 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:20 crc kubenswrapper[4732]: E0402 13:39:20.388305 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:20 crc kubenswrapper[4732]: E0402 13:39:20.488570 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:20 crc kubenswrapper[4732]: E0402 13:39:20.589449 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:20 crc kubenswrapper[4732]: E0402 13:39:20.689651 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:20 crc kubenswrapper[4732]: E0402 13:39:20.790392 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:20 crc kubenswrapper[4732]: E0402 13:39:20.891229 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:20 crc kubenswrapper[4732]: E0402 13:39:20.991327 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:21 crc kubenswrapper[4732]: E0402 13:39:21.092099 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:21 crc kubenswrapper[4732]: E0402 13:39:21.192686 4732 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Apr 02 13:39:21 crc kubenswrapper[4732]: E0402 13:39:21.293441 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:21 crc kubenswrapper[4732]: E0402 13:39:21.393969 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:21 crc kubenswrapper[4732]: E0402 13:39:21.495034 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:21 crc kubenswrapper[4732]: E0402 13:39:21.595894 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:21 crc kubenswrapper[4732]: E0402 13:39:21.696101 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:21 crc kubenswrapper[4732]: E0402 13:39:21.796782 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:21 crc kubenswrapper[4732]: E0402 13:39:21.897730 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:21 crc kubenswrapper[4732]: E0402 13:39:21.998144 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:22 crc kubenswrapper[4732]: E0402 13:39:22.099164 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:22 crc kubenswrapper[4732]: E0402 13:39:22.200031 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:22 crc kubenswrapper[4732]: E0402 13:39:22.301143 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:22 crc kubenswrapper[4732]: E0402 13:39:22.401988 4732 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:22 crc kubenswrapper[4732]: E0402 13:39:22.502667 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:22 crc kubenswrapper[4732]: E0402 13:39:22.603796 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:22 crc kubenswrapper[4732]: I0402 13:39:22.680248 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 02 13:39:22 crc kubenswrapper[4732]: I0402 13:39:22.681191 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:39:22 crc kubenswrapper[4732]: I0402 13:39:22.681231 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:39:22 crc kubenswrapper[4732]: I0402 13:39:22.681240 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:39:22 crc kubenswrapper[4732]: I0402 13:39:22.681791 4732 scope.go:117] "RemoveContainer" containerID="8675a5215a5954016d0e75f25067050d5de0f1e5f52271df26f9d7ba5a04d5b0" Apr 02 13:39:22 crc kubenswrapper[4732]: E0402 13:39:22.704689 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:22 crc kubenswrapper[4732]: E0402 13:39:22.805261 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:22 crc kubenswrapper[4732]: E0402 13:39:22.907594 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:23 crc kubenswrapper[4732]: E0402 13:39:23.008685 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:23 crc kubenswrapper[4732]: 
I0402 13:39:23.108144 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Apr 02 13:39:23 crc kubenswrapper[4732]: E0402 13:39:23.108776 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:23 crc kubenswrapper[4732]: I0402 13:39:23.110333 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a"} Apr 02 13:39:23 crc kubenswrapper[4732]: I0402 13:39:23.110553 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 02 13:39:23 crc kubenswrapper[4732]: I0402 13:39:23.111499 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:39:23 crc kubenswrapper[4732]: I0402 13:39:23.111532 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:39:23 crc kubenswrapper[4732]: I0402 13:39:23.111571 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:39:23 crc kubenswrapper[4732]: E0402 13:39:23.209461 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:23 crc kubenswrapper[4732]: E0402 13:39:23.310364 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:23 crc kubenswrapper[4732]: E0402 13:39:23.412118 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:23 crc kubenswrapper[4732]: E0402 13:39:23.513213 4732 kubelet_node_status.go:503] "Error getting the 
current node from lister" err="node \"crc\" not found" Apr 02 13:39:23 crc kubenswrapper[4732]: E0402 13:39:23.613665 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:23 crc kubenswrapper[4732]: E0402 13:39:23.714754 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:23 crc kubenswrapper[4732]: E0402 13:39:23.815512 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:23 crc kubenswrapper[4732]: E0402 13:39:23.916234 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:24 crc kubenswrapper[4732]: E0402 13:39:24.016343 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:24 crc kubenswrapper[4732]: E0402 13:39:24.116518 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:24 crc kubenswrapper[4732]: I0402 13:39:24.120348 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/4.log" Apr 02 13:39:24 crc kubenswrapper[4732]: I0402 13:39:24.120855 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Apr 02 13:39:24 crc kubenswrapper[4732]: I0402 13:39:24.122236 4732 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a" exitCode=255 Apr 02 13:39:24 crc kubenswrapper[4732]: I0402 13:39:24.122274 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a"} Apr 02 13:39:24 crc kubenswrapper[4732]: I0402 13:39:24.122305 4732 scope.go:117] "RemoveContainer" containerID="8675a5215a5954016d0e75f25067050d5de0f1e5f52271df26f9d7ba5a04d5b0" Apr 02 13:39:24 crc kubenswrapper[4732]: I0402 13:39:24.122433 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 02 13:39:24 crc kubenswrapper[4732]: I0402 13:39:24.123311 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:39:24 crc kubenswrapper[4732]: I0402 13:39:24.123360 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:39:24 crc kubenswrapper[4732]: I0402 13:39:24.123374 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:39:24 crc kubenswrapper[4732]: I0402 13:39:24.124116 4732 scope.go:117] "RemoveContainer" containerID="5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a" Apr 02 13:39:24 crc kubenswrapper[4732]: E0402 13:39:24.124358 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Apr 02 13:39:24 crc kubenswrapper[4732]: E0402 13:39:24.217534 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:24 crc kubenswrapper[4732]: E0402 13:39:24.318699 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 
13:39:24 crc kubenswrapper[4732]: E0402 13:39:24.419413 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:24 crc kubenswrapper[4732]: E0402 13:39:24.519825 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 02 13:39:24 crc kubenswrapper[4732]: E0402 13:39:24.620517 4732 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Apr 02 13:39:24 crc kubenswrapper[4732]: E0402 13:39:24.916897 4732 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Apr 02 13:39:24 crc kubenswrapper[4732]: E0402 13:39:24.929048 4732 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Apr 02 13:39:24 crc kubenswrapper[4732]: E0402 13:39:24.936695 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Apr 02 13:39:24 crc kubenswrapper[4732]: I0402 13:39:24.940755 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:39:24 crc kubenswrapper[4732]: I0402 13:39:24.940826 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:39:24 crc kubenswrapper[4732]: I0402 13:39:24.940839 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:39:24 crc kubenswrapper[4732]: I0402 13:39:24.940856 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 02 13:39:24 crc kubenswrapper[4732]: I0402 13:39:24.940888 4732 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-02T13:39:24Z","lastTransitionTime":"2026-04-02T13:39:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Apr 02 13:39:24 crc kubenswrapper[4732]: E0402 13:39:24.950679 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69443537-6792-4af0-92ca-9d3f256e3009\\\",\\\"systemUUID\\\":\\\"3be5867b-5df6-4c65-8d4b-c54c471927ff\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 02 13:39:24 crc kubenswrapper[4732]: I0402 13:39:24.953591 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:39:24 crc kubenswrapper[4732]: I0402 13:39:24.953664 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:39:24 crc kubenswrapper[4732]: I0402 13:39:24.953675 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:39:24 crc kubenswrapper[4732]: I0402 13:39:24.953694 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 02 13:39:24 crc kubenswrapper[4732]: I0402 13:39:24.953727 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-02T13:39:24Z","lastTransitionTime":"2026-04-02T13:39:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 02 13:39:24 crc kubenswrapper[4732]: E0402 13:39:24.963255 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69443537-6792-4af0-92ca-9d3f256e3009\\\",\\\"systemUUID\\\":\\\"3be5867b-5df6-4c65-8d4b-c54c471927ff\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 02 13:39:24 crc kubenswrapper[4732]: I0402 13:39:24.965903 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:39:24 crc kubenswrapper[4732]: I0402 13:39:24.965935 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:39:24 crc kubenswrapper[4732]: I0402 13:39:24.965943 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:39:24 crc kubenswrapper[4732]: I0402 13:39:24.965959 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 02 13:39:24 crc kubenswrapper[4732]: I0402 13:39:24.965984 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-02T13:39:24Z","lastTransitionTime":"2026-04-02T13:39:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 02 13:39:24 crc kubenswrapper[4732]: E0402 13:39:24.973768 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69443537-6792-4af0-92ca-9d3f256e3009\\\",\\\"systemUUID\\\":\\\"3be5867b-5df6-4c65-8d4b-c54c471927ff\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 02 13:39:24 crc kubenswrapper[4732]: I0402 13:39:24.976650 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:39:24 crc kubenswrapper[4732]: I0402 13:39:24.976694 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:39:24 crc kubenswrapper[4732]: I0402 13:39:24.976707 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:39:24 crc kubenswrapper[4732]: I0402 13:39:24.976723 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 02 13:39:24 crc kubenswrapper[4732]: I0402 13:39:24.976732 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-02T13:39:24Z","lastTransitionTime":"2026-04-02T13:39:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 02 13:39:24 crc kubenswrapper[4732]: E0402 13:39:24.985586 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69443537-6792-4af0-92ca-9d3f256e3009\\\",\\\"systemUUID\\\":\\\"3be5867b-5df6-4c65-8d4b-c54c471927ff\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 02 13:39:24 crc kubenswrapper[4732]: E0402 13:39:24.985718 4732 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Apr 02 13:39:25 crc kubenswrapper[4732]: I0402 13:39:25.125739 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/4.log" Apr 02 13:39:27 crc kubenswrapper[4732]: I0402 13:39:27.367814 4732 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Apr 02 13:39:29 crc kubenswrapper[4732]: I0402 13:39:29.680253 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 02 13:39:29 crc kubenswrapper[4732]: I0402 13:39:29.681409 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:39:29 crc kubenswrapper[4732]: I0402 13:39:29.681442 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:39:29 crc kubenswrapper[4732]: I0402 13:39:29.681453 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:39:29 crc kubenswrapper[4732]: E0402 13:39:29.931228 4732 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Apr 02 13:39:30 crc kubenswrapper[4732]: I0402 13:39:30.114307 4732 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 02 13:39:30 crc kubenswrapper[4732]: I0402 13:39:30.114520 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 02 13:39:30 crc kubenswrapper[4732]: I0402 13:39:30.115587 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:39:30 crc kubenswrapper[4732]: I0402 13:39:30.115655 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:39:30 crc kubenswrapper[4732]: I0402 13:39:30.115667 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:39:30 crc kubenswrapper[4732]: I0402 13:39:30.116368 4732 scope.go:117] "RemoveContainer" containerID="5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a" Apr 02 13:39:30 crc kubenswrapper[4732]: E0402 13:39:30.116597 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Apr 02 13:39:30 crc kubenswrapper[4732]: I0402 13:39:30.664372 4732 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.542712 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.559972 4732 apiserver.go:52] "Watching apiserver" Apr 02 13:39:31 crc 
kubenswrapper[4732]: I0402 13:39:31.562449 4732 scope.go:117] "RemoveContainer" containerID="5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a" Apr 02 13:39:31 crc kubenswrapper[4732]: E0402 13:39:31.562682 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.563197 4732 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.563634 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-glqvf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-machine-config-operator/machine-config-daemon-6vtmw","openshift-multus/multus-s52gj","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b2l5q","openshift-ovn-kubernetes/ovnkube-node-8qmgp","openshift-kube-apiserver/kube-apiserver-crc","openshift-multus/multus-additional-cni-plugins-nqhwm","openshift-dns/node-resolver-tg9vx","openshift-multus/network-metrics-daemon-crx2z","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.567350 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.567437 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.567471 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-s52gj" Apr 02 13:39:31 crc kubenswrapper[4732]: E0402 13:39:31.567546 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-crx2z" podUID="386bd92b-c67e-4cc6-8a47-6f8d6e799bc7" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.567597 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.567665 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.567669 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-nqhwm" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.567716 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.567734 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-glqvf" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.567678 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Apr 02 13:39:31 crc kubenswrapper[4732]: E0402 13:39:31.567706 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.567805 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.567845 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-tg9vx" Apr 02 13:39:31 crc kubenswrapper[4732]: E0402 13:39:31.568111 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.568217 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.568234 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b2l5q" Apr 02 13:39:31 crc kubenswrapper[4732]: E0402 13:39:31.568766 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.568834 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.575907 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.575984 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.576169 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.576190 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.576286 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.576303 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.576410 4732 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.576441 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.576417 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.576415 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.576557 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.576765 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.576791 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.576874 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.576986 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.577013 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.577089 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.577143 
4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.577183 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.577328 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.577517 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.577596 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.577579 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.577663 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.577681 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.577664 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.577742 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.577763 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Apr 02 13:39:31 crc 
kubenswrapper[4732]: I0402 13:39:31.577774 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.577805 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.577805 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.577811 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.577836 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.577837 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.577863 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.577883 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.583836 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.590938 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-crx2z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"386bd92b-c67e-4cc6-8a47-6f8d6e799bc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-crx2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.601223 4732 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.613049 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.622961 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.630220 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-glqvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57616858-4140-40f0-83e5-388787b685b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqn9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-glqvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.640752 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nqhwm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"658a9576-efdc-4e4b-937e-bd63032cbee6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nqhwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.645024 4732 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.650466 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.660561 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1efc9e3-71c0-47e0-bcf8-4ec41e8e25af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf50344e79adec991160632c244c40543c468db050f176949b5840a16d1777a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e000359211728ef1d437757b0246d85c224dae170131ba8bd221fa84dafa026c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf6191436462297532ffdc93e43468bbe37e850115ddcbcd0265c07359ac845a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-04-02T13:39:23Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0402 13:39:23.616086 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0402 13:39:23.616554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0402 13:39:23.618421 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-69827246/tls.crt::/tmp/serving-cert-69827246/tls.key\\\\\\\"\\\\nI0402 13:39:23.816116 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0402 13:39:23.819962 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0402 13:39:23.820008 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0402 13:39:23.820046 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0402 13:39:23.820061 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0402 13:39:23.827938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0402 13:39:23.827970 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827984 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0402 13:39:23.827988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0402 13:39:23.828005 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0402 13:39:23.828009 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0402 13:39:23.828004 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0402 13:39:23.830934 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b46da8b2cda4252f897adfca92275c92924a998d5b1ad57e6897ddea29b4571\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.671449 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.677840 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.677922 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.677942 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.677957 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.677993 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.678011 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Apr 02 13:39:31 crc kubenswrapper[4732]: 
I0402 13:39:31.678030 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.678044 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.678060 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.678077 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.678093 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.678108 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod 
\"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.678122 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.678138 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.678153 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.678167 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.678182 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.678197 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.678212 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.678226 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.678236 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.678244 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.678290 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.678310 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.678317 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.678339 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.678368 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.678391 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.678413 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.678434 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.678454 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.678474 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.678495 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.678515 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.678537 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.678558 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.678578 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.678600 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.678645 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.678667 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.678687 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.678709 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.678765 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.678795 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.678865 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.678891 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.678914 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.678937 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.679004 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.679011 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.679036 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.679185 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.679192 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.679216 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.679256 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.679352 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.679686 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.679749 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.679775 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.679800 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.679851 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.679876 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.679883 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.679901 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.679946 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.679992 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.680021 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.680042 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.680091 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.680139 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.680163 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.680186 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.680233 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.680282 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.680327 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.680392 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.680441 4732 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.680601 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.680696 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.680744 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.680791 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.680838 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.680887 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.680932 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.680977 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.681023 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.681070 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.680236 4732 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.680269 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.680257 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.680388 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.680392 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.680642 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.680670 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.680680 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.680629 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.681095 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.681066 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.681251 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.681209 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.681275 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.681561 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.681656 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.681810 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.681846 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.681827 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.681183 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38409e5e-4545-49da-8f6c-4bfb30582878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6vtmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.681877 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.681911 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.681964 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.682281 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.682268 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.682291 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.682471 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.682493 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.682748 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: E0402 13:39:31.682858 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-02 13:39:32.182839275 +0000 UTC m=+129.087246908 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.683014 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.683098 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.683101 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.683122 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.682999 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.683250 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.683441 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.683497 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.682735 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.683557 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.683840 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.683962 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.683977 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.683902 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.684339 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.681115 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.684596 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.684641 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod 
\"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.684650 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.684665 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.684686 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.684738 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.684773 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.684736 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.684731 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.684758 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.684806 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.684832 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.684871 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.684773 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.684815 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.685002 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.684893 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.685109 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.685146 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.685178 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.685219 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.685258 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.685283 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.685308 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.685333 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.685368 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.685401 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.685422 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.685443 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.685466 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.685489 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.685512 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.685536 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.685575 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.685602 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.685649 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.685671 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.685703 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.685724 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.685745 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.685767 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.685788 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.685810 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.685854 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.685877 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.685899 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.685945 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.685977 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.686009 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.685189 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.686023 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.685244 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.685395 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.685421 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.685559 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.685772 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.685934 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.686047 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.686035 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.686167 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.686188 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.686216 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.686240 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.686253 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.686265 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.686290 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.686312 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.686337 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.686361 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.686387 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.686410 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.686425 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.686436 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.686461 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.686483 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.686504 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.686531 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.686551 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.686573 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.686572 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.686596 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.686733 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.686753 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.686594 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.686814 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.686855 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.686885 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.686922 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.686948 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.686971 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.686998 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.687021 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.687046 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.687069 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.687116 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.687149 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.687176 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.687201 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.687224 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.687259 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.687282 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.687305 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.687327 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.687349 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.687373 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.687396 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.687417 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.687448 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.687764 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.687802 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.687829 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.687852 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.687875 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.687899 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.687921 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName:
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.687945 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.687969 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.687991 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.688014 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.688037 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Apr 02 13:39:31 crc 
kubenswrapper[4732]: I0402 13:39:31.688060 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.688096 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.688120 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.688145 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.688171 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.688196 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: 
\"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.688217 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.688240 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.688267 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.688305 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.688328 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Apr 02 13:39:31 crc 
kubenswrapper[4732]: I0402 13:39:31.688351 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.688374 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.688396 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.688419 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.688440 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.688463 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod 
\"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.688485 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.688508 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.688532 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.688559 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.688583 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.688606 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.688650 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.688738 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.688767 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ad206957-df5c-4b3e-bd35-e798a07d2f4e-host-var-lib-cni-multus\") pod \"multus-s52gj\" (UID: \"ad206957-df5c-4b3e-bd35-e798a07d2f4e\") " pod="openshift-multus/multus-s52gj" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.688797 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ad206957-df5c-4b3e-bd35-e798a07d2f4e-multus-daemon-config\") pod \"multus-s52gj\" (UID: \"ad206957-df5c-4b3e-bd35-e798a07d2f4e\") " pod="openshift-multus/multus-s52gj" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.688831 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/658a9576-efdc-4e4b-937e-bd63032cbee6-cnibin\") pod \"multus-additional-cni-plugins-nqhwm\" (UID: \"658a9576-efdc-4e4b-937e-bd63032cbee6\") " pod="openshift-multus/multus-additional-cni-plugins-nqhwm" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.688866 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.688900 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.688936 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ad206957-df5c-4b3e-bd35-e798a07d2f4e-os-release\") pod \"multus-s52gj\" (UID: \"ad206957-df5c-4b3e-bd35-e798a07d2f4e\") " pod="openshift-multus/multus-s52gj" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.688962 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-var-lib-openvswitch\") pod \"ovnkube-node-8qmgp\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.688989 4732 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-etc-openvswitch\") pod \"ovnkube-node-8qmgp\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.689020 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/658a9576-efdc-4e4b-937e-bd63032cbee6-os-release\") pod \"multus-additional-cni-plugins-nqhwm\" (UID: \"658a9576-efdc-4e4b-937e-bd63032cbee6\") " pod="openshift-multus/multus-additional-cni-plugins-nqhwm" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.689055 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/38409e5e-4545-49da-8f6c-4bfb30582878-rootfs\") pod \"machine-config-daemon-6vtmw\" (UID: \"38409e5e-4545-49da-8f6c-4bfb30582878\") " pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.689082 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-run-openvswitch\") pod \"ovnkube-node-8qmgp\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.689115 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/658a9576-efdc-4e4b-937e-bd63032cbee6-system-cni-dir\") pod \"multus-additional-cni-plugins-nqhwm\" (UID: \"658a9576-efdc-4e4b-937e-bd63032cbee6\") " pod="openshift-multus/multus-additional-cni-plugins-nqhwm" Apr 02 13:39:31 crc 
kubenswrapper[4732]: I0402 13:39:31.689151 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzv9m\" (UniqueName: \"kubernetes.io/projected/658a9576-efdc-4e4b-937e-bd63032cbee6-kube-api-access-pzv9m\") pod \"multus-additional-cni-plugins-nqhwm\" (UID: \"658a9576-efdc-4e4b-937e-bd63032cbee6\") " pod="openshift-multus/multus-additional-cni-plugins-nqhwm" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.689191 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.689223 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ad206957-df5c-4b3e-bd35-e798a07d2f4e-host-run-netns\") pod \"multus-s52gj\" (UID: \"ad206957-df5c-4b3e-bd35-e798a07d2f4e\") " pod="openshift-multus/multus-s52gj" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.689254 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ad206957-df5c-4b3e-bd35-e798a07d2f4e-hostroot\") pod \"multus-s52gj\" (UID: \"ad206957-df5c-4b3e-bd35-e798a07d2f4e\") " pod="openshift-multus/multus-s52gj" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.689283 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/57616858-4140-40f0-83e5-388787b685b5-host\") pod \"node-ca-glqvf\" (UID: \"57616858-4140-40f0-83e5-388787b685b5\") " pod="openshift-image-registry/node-ca-glqvf" Apr 02 13:39:31 crc kubenswrapper[4732]: 
I0402 13:39:31.689317 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ad206957-df5c-4b3e-bd35-e798a07d2f4e-host-var-lib-cni-bin\") pod \"multus-s52gj\" (UID: \"ad206957-df5c-4b3e-bd35-e798a07d2f4e\") " pod="openshift-multus/multus-s52gj" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.689347 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ad206957-df5c-4b3e-bd35-e798a07d2f4e-host-run-multus-certs\") pod \"multus-s52gj\" (UID: \"ad206957-df5c-4b3e-bd35-e798a07d2f4e\") " pod="openshift-multus/multus-s52gj" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.689378 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-host-cni-netd\") pod \"ovnkube-node-8qmgp\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.689417 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a6f5e483-7d6b-4d6d-be84-303d8f07643e-ovnkube-script-lib\") pod \"ovnkube-node-8qmgp\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.689452 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b7185760-c057-4c47-8da2-60572500a472-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-b2l5q\" (UID: \"b7185760-c057-4c47-8da2-60572500a472\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b2l5q" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.689483 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vz4rn\" (UniqueName: \"kubernetes.io/projected/386bd92b-c67e-4cc6-8a47-6f8d6e799bc7-kube-api-access-vz4rn\") pod \"network-metrics-daemon-crx2z\" (UID: \"386bd92b-c67e-4cc6-8a47-6f8d6e799bc7\") " pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.689515 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-systemd-units\") pod \"ovnkube-node-8qmgp\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.689548 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-host-run-ovn-kubernetes\") pod \"ovnkube-node-8qmgp\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.689581 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8qmgp\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.689658 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.689681 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ad206957-df5c-4b3e-bd35-e798a07d2f4e-host-var-lib-kubelet\") pod \"multus-s52gj\" (UID: \"ad206957-df5c-4b3e-bd35-e798a07d2f4e\") " pod="openshift-multus/multus-s52gj" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.689708 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ad206957-df5c-4b3e-bd35-e798a07d2f4e-cnibin\") pod \"multus-s52gj\" (UID: \"ad206957-df5c-4b3e-bd35-e798a07d2f4e\") " pod="openshift-multus/multus-s52gj" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.689731 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-host-kubelet\") pod \"ovnkube-node-8qmgp\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.689751 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-host-run-netns\") pod \"ovnkube-node-8qmgp\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.689772 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/a6f5e483-7d6b-4d6d-be84-303d8f07643e-ovn-node-metrics-cert\") pod \"ovnkube-node-8qmgp\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.689796 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/658a9576-efdc-4e4b-937e-bd63032cbee6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-nqhwm\" (UID: \"658a9576-efdc-4e4b-937e-bd63032cbee6\") " pod="openshift-multus/multus-additional-cni-plugins-nqhwm" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.689820 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b7185760-c057-4c47-8da2-60572500a472-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-b2l5q\" (UID: \"b7185760-c057-4c47-8da2-60572500a472\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b2l5q" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.689842 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.689869 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Apr 02 
13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.689894 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.689917 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-log-socket\") pod \"ovnkube-node-8qmgp\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp"
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.689940 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-run-systemd\") pod \"ovnkube-node-8qmgp\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp"
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.689965 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/38409e5e-4545-49da-8f6c-4bfb30582878-proxy-tls\") pod \"machine-config-daemon-6vtmw\" (UID: \"38409e5e-4545-49da-8f6c-4bfb30582878\") " pod="openshift-machine-config-operator/machine-config-daemon-6vtmw"
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.689988 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dssdt\" (UniqueName: \"kubernetes.io/projected/ad206957-df5c-4b3e-bd35-e798a07d2f4e-kube-api-access-dssdt\") pod \"multus-s52gj\" (UID: \"ad206957-df5c-4b3e-bd35-e798a07d2f4e\") " pod="openshift-multus/multus-s52gj"
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.690011 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/57616858-4140-40f0-83e5-388787b685b5-serviceca\") pod \"node-ca-glqvf\" (UID: \"57616858-4140-40f0-83e5-388787b685b5\") " pod="openshift-image-registry/node-ca-glqvf"
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.690034 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ad206957-df5c-4b3e-bd35-e798a07d2f4e-etc-kubernetes\") pod \"multus-s52gj\" (UID: \"ad206957-df5c-4b3e-bd35-e798a07d2f4e\") " pod="openshift-multus/multus-s52gj"
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.690056 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/386bd92b-c67e-4cc6-8a47-6f8d6e799bc7-metrics-certs\") pod \"network-metrics-daemon-crx2z\" (UID: \"386bd92b-c67e-4cc6-8a47-6f8d6e799bc7\") " pod="openshift-multus/network-metrics-daemon-crx2z"
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.690082 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-host-cni-bin\") pod \"ovnkube-node-8qmgp\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp"
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.690105 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74dmp\" (UniqueName: \"kubernetes.io/projected/38409e5e-4545-49da-8f6c-4bfb30582878-kube-api-access-74dmp\") pod \"machine-config-daemon-6vtmw\" (UID: \"38409e5e-4545-49da-8f6c-4bfb30582878\") " pod="openshift-machine-config-operator/machine-config-daemon-6vtmw"
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.690127 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b7185760-c057-4c47-8da2-60572500a472-env-overrides\") pod \"ovnkube-control-plane-749d76644c-b2l5q\" (UID: \"b7185760-c057-4c47-8da2-60572500a472\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b2l5q"
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.690148 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ad206957-df5c-4b3e-bd35-e798a07d2f4e-multus-socket-dir-parent\") pod \"multus-s52gj\" (UID: \"ad206957-df5c-4b3e-bd35-e798a07d2f4e\") " pod="openshift-multus/multus-s52gj"
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.690171 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a6f5e483-7d6b-4d6d-be84-303d8f07643e-ovnkube-config\") pod \"ovnkube-node-8qmgp\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp"
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.690196 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/658a9576-efdc-4e4b-937e-bd63032cbee6-cni-binary-copy\") pod \"multus-additional-cni-plugins-nqhwm\" (UID: \"658a9576-efdc-4e4b-937e-bd63032cbee6\") " pod="openshift-multus/multus-additional-cni-plugins-nqhwm"
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.690218 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/658a9576-efdc-4e4b-937e-bd63032cbee6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-nqhwm\" (UID: \"658a9576-efdc-4e4b-937e-bd63032cbee6\") " pod="openshift-multus/multus-additional-cni-plugins-nqhwm"
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.690239 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ad206957-df5c-4b3e-bd35-e798a07d2f4e-system-cni-dir\") pod \"multus-s52gj\" (UID: \"ad206957-df5c-4b3e-bd35-e798a07d2f4e\") " pod="openshift-multus/multus-s52gj"
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.690260 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-host-slash\") pod \"ovnkube-node-8qmgp\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp"
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.690283 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a6f5e483-7d6b-4d6d-be84-303d8f07643e-env-overrides\") pod \"ovnkube-node-8qmgp\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp"
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.690308 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.690331 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ad206957-df5c-4b3e-bd35-e798a07d2f4e-cni-binary-copy\") pod \"multus-s52gj\" (UID: \"ad206957-df5c-4b3e-bd35-e798a07d2f4e\") " pod="openshift-multus/multus-s52gj"
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.690354 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9vgd\" (UniqueName: \"kubernetes.io/projected/9fcb1965-4cef-41a4-8894-3eb24e0ff80c-kube-api-access-w9vgd\") pod \"node-resolver-tg9vx\" (UID: \"9fcb1965-4cef-41a4-8894-3eb24e0ff80c\") " pod="openshift-dns/node-resolver-tg9vx"
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.690376 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-node-log\") pod \"ovnkube-node-8qmgp\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp"
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.690375 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.690398 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/38409e5e-4545-49da-8f6c-4bfb30582878-mcd-auth-proxy-config\") pod \"machine-config-daemon-6vtmw\" (UID: \"38409e5e-4545-49da-8f6c-4bfb30582878\") " pod="openshift-machine-config-operator/machine-config-daemon-6vtmw"
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.690424 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.690449 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ad206957-df5c-4b3e-bd35-e798a07d2f4e-host-run-k8s-cni-cncf-io\") pod \"multus-s52gj\" (UID: \"ad206957-df5c-4b3e-bd35-e798a07d2f4e\") " pod="openshift-multus/multus-s52gj"
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.690475 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.690505 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.690530 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-run-ovn\") pod \"ovnkube-node-8qmgp\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp"
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.690561 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.690587 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.691638 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ad206957-df5c-4b3e-bd35-e798a07d2f4e-multus-cni-dir\") pod \"multus-s52gj\" (UID: \"ad206957-df5c-4b3e-bd35-e798a07d2f4e\") " pod="openshift-multus/multus-s52gj"
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.691667 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9fcb1965-4cef-41a4-8894-3eb24e0ff80c-hosts-file\") pod \"node-resolver-tg9vx\" (UID: \"9fcb1965-4cef-41a4-8894-3eb24e0ff80c\") " pod="openshift-dns/node-resolver-tg9vx"
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.691686 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqn9v\" (UniqueName: \"kubernetes.io/projected/57616858-4140-40f0-83e5-388787b685b5-kube-api-access-hqn9v\") pod \"node-ca-glqvf\" (UID: \"57616858-4140-40f0-83e5-388787b685b5\") " pod="openshift-image-registry/node-ca-glqvf"
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.691703 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jljzm\" (UniqueName: \"kubernetes.io/projected/a6f5e483-7d6b-4d6d-be84-303d8f07643e-kube-api-access-jljzm\") pod \"ovnkube-node-8qmgp\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp"
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.691724 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c275t\" (UniqueName: \"kubernetes.io/projected/b7185760-c057-4c47-8da2-60572500a472-kube-api-access-c275t\") pod \"ovnkube-control-plane-749d76644c-b2l5q\" (UID: \"b7185760-c057-4c47-8da2-60572500a472\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b2l5q"
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.691741 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ad206957-df5c-4b3e-bd35-e798a07d2f4e-multus-conf-dir\") pod \"multus-s52gj\" (UID: \"ad206957-df5c-4b3e-bd35-e798a07d2f4e\") " pod="openshift-multus/multus-s52gj"
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.694053 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.694699 4732 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.686848 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.686862 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.686959 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.686956 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.686985 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.687300 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.687343 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.687509 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.687723 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.687754 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.687782 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.687825 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.688069 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.688092 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.688237 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.688460 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.688649 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.688653 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.689372 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.689452 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.696058 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.689555 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.689687 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.689770 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.689917 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.689949 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.689981 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.690034 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.696334 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.690239 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.690760 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.690780 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.690801 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.690977 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.691210 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.691213 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.691359 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.691733 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.691796 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.692052 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.692107 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.692332 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.692361 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.692466 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.692472 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.692597 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.693071 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.693137 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.693196 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.693150 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.693376 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.693538 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.693545 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.693555 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.693697 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.693814 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.693824 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.693844 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.693956 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.694176 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.694314 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.694989 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.695129 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.695525 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.695823 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: E0402 13:39:31.695927 4732 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.694942 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s52gj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad206957-df5c-4b3e-bd35-e798a07d2f4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dssdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s52gj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.697088 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.697505 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.697686 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: E0402 13:39:31.697787 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-04-02 13:39:32.197707936 +0000 UTC m=+129.102115549 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.697997 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.698295 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.698679 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). 
InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.698738 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.699025 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.699151 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.699257 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.699334 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.699363 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.699781 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.696345 4732 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.700243 4732 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.700262 4732 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.700277 4732 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.700293 4732 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.700308 4732 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.700322 4732 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: 
I0402 13:39:31.700336 4732 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.700350 4732 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.700362 4732 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.700376 4732 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.700388 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.700401 4732 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.700414 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.700428 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: 
\"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.700441 4732 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.700454 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.700468 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.700481 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.700493 4732 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.700509 4732 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.700523 4732 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" 
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.700540 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.700553 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.700567 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.700580 4732 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.700593 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.700606 4732 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.700637 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.700650 4732 reconciler_common.go:293] "Volume 
detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.700664 4732 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.700676 4732 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.700690 4732 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.700703 4732 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.700717 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.700730 4732 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.700743 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.700756 4732 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.700769 4732 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.700783 4732 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.700798 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.700811 4732 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.700871 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.700884 4732 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.700897 4732 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.700911 4732 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.700924 4732 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.700938 4732 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.700951 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.700966 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.700979 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node 
\"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.700994 4732 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.701008 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.701021 4732 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.701033 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.701047 4732 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.701062 4732 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.701075 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc 
kubenswrapper[4732]: I0402 13:39:31.701088 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.701101 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.701114 4732 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.701127 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.701142 4732 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.701155 4732 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.701168 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.701181 4732 reconciler_common.go:293] "Volume 
detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.701194 4732 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.701207 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.701221 4732 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.701233 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.701246 4732 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.701259 4732 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.701272 4732 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc 
kubenswrapper[4732]: I0402 13:39:31.701285 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.701298 4732 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.701310 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.701324 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.701338 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.701351 4732 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.701363 4732 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.701811 4732 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: E0402 13:39:31.696031 4732 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.696041 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.696334 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: E0402 13:39:31.702058 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-04-02 13:39:32.202045623 +0000 UTC m=+129.106453176 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.702071 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.702327 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.702545 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.705705 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.705744 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.705806 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: E0402 13:39:31.705941 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 02 13:39:31 crc kubenswrapper[4732]: E0402 13:39:31.705956 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 02 13:39:31 crc kubenswrapper[4732]: E0402 13:39:31.705968 4732 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 02 13:39:31 crc kubenswrapper[4732]: E0402 13:39:31.706007 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-04-02 13:39:32.205996159 +0000 UTC m=+129.110403712 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.706023 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.706176 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.696349 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.708443 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.708834 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: E0402 13:39:31.709056 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 02 13:39:31 crc kubenswrapper[4732]: E0402 13:39:31.709069 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 02 13:39:31 crc kubenswrapper[4732]: E0402 13:39:31.709080 4732 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.709070 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" 
(OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: E0402 13:39:31.709113 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-04-02 13:39:32.209100073 +0000 UTC m=+129.113507626 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.709554 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.710178 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.710309 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.712053 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.712886 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.713264 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: 
"43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.717424 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6f5e483-7d6b-4d6d-be84-303d8f07643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8qmgp\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.725664 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b2l5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7185760-c057-4c47-8da2-60572500a472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-b2l5q\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.727747 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.728062 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.728083 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.728136 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.728531 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.728882 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.728939 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.728987 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.729024 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.729442 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.729804 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.730544 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.733825 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.730621 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.730698 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.731054 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.733163 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.734778 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-crx2z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"386bd92b-c67e-4cc6-8a47-6f8d6e799bc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-crx2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.743463 4732 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.745188 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.746956 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.749874 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.751230 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.757090 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tg9vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fcb1965-4cef-41a4-8894-3eb24e0ff80c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9vgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tg9vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.757273 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.802636 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b7185760-c057-4c47-8da2-60572500a472-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-b2l5q\" (UID: \"b7185760-c057-4c47-8da2-60572500a472\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b2l5q" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.802681 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vz4rn\" (UniqueName: \"kubernetes.io/projected/386bd92b-c67e-4cc6-8a47-6f8d6e799bc7-kube-api-access-vz4rn\") pod \"network-metrics-daemon-crx2z\" (UID: \"386bd92b-c67e-4cc6-8a47-6f8d6e799bc7\") " pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.802698 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ad206957-df5c-4b3e-bd35-e798a07d2f4e-host-var-lib-kubelet\") pod \"multus-s52gj\" (UID: \"ad206957-df5c-4b3e-bd35-e798a07d2f4e\") " pod="openshift-multus/multus-s52gj" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.802721 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-systemd-units\") pod \"ovnkube-node-8qmgp\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.802736 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-host-run-ovn-kubernetes\") pod \"ovnkube-node-8qmgp\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.802757 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8qmgp\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.802778 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/658a9576-efdc-4e4b-937e-bd63032cbee6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-nqhwm\" (UID: \"658a9576-efdc-4e4b-937e-bd63032cbee6\") " pod="openshift-multus/multus-additional-cni-plugins-nqhwm" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.802799 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b7185760-c057-4c47-8da2-60572500a472-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-b2l5q\" (UID: \"b7185760-c057-4c47-8da2-60572500a472\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b2l5q" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.802821 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.802839 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/ad206957-df5c-4b3e-bd35-e798a07d2f4e-cnibin\") pod \"multus-s52gj\" (UID: \"ad206957-df5c-4b3e-bd35-e798a07d2f4e\") " pod="openshift-multus/multus-s52gj" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.802854 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-host-kubelet\") pod \"ovnkube-node-8qmgp\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.802867 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-host-run-netns\") pod \"ovnkube-node-8qmgp\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.802888 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a6f5e483-7d6b-4d6d-be84-303d8f07643e-ovn-node-metrics-cert\") pod \"ovnkube-node-8qmgp\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.802902 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-log-socket\") pod \"ovnkube-node-8qmgp\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.802916 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/38409e5e-4545-49da-8f6c-4bfb30582878-proxy-tls\") pod \"machine-config-daemon-6vtmw\" 
(UID: \"38409e5e-4545-49da-8f6c-4bfb30582878\") " pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.802930 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dssdt\" (UniqueName: \"kubernetes.io/projected/ad206957-df5c-4b3e-bd35-e798a07d2f4e-kube-api-access-dssdt\") pod \"multus-s52gj\" (UID: \"ad206957-df5c-4b3e-bd35-e798a07d2f4e\") " pod="openshift-multus/multus-s52gj" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.802944 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/57616858-4140-40f0-83e5-388787b685b5-serviceca\") pod \"node-ca-glqvf\" (UID: \"57616858-4140-40f0-83e5-388787b685b5\") " pod="openshift-image-registry/node-ca-glqvf" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.802957 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-run-systemd\") pod \"ovnkube-node-8qmgp\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.802972 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74dmp\" (UniqueName: \"kubernetes.io/projected/38409e5e-4545-49da-8f6c-4bfb30582878-kube-api-access-74dmp\") pod \"machine-config-daemon-6vtmw\" (UID: \"38409e5e-4545-49da-8f6c-4bfb30582878\") " pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.802988 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b7185760-c057-4c47-8da2-60572500a472-env-overrides\") pod \"ovnkube-control-plane-749d76644c-b2l5q\" (UID: 
\"b7185760-c057-4c47-8da2-60572500a472\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b2l5q" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.803004 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ad206957-df5c-4b3e-bd35-e798a07d2f4e-multus-socket-dir-parent\") pod \"multus-s52gj\" (UID: \"ad206957-df5c-4b3e-bd35-e798a07d2f4e\") " pod="openshift-multus/multus-s52gj" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.803018 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ad206957-df5c-4b3e-bd35-e798a07d2f4e-etc-kubernetes\") pod \"multus-s52gj\" (UID: \"ad206957-df5c-4b3e-bd35-e798a07d2f4e\") " pod="openshift-multus/multus-s52gj" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.803033 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/386bd92b-c67e-4cc6-8a47-6f8d6e799bc7-metrics-certs\") pod \"network-metrics-daemon-crx2z\" (UID: \"386bd92b-c67e-4cc6-8a47-6f8d6e799bc7\") " pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.803050 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-host-cni-bin\") pod \"ovnkube-node-8qmgp\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.803065 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a6f5e483-7d6b-4d6d-be84-303d8f07643e-ovnkube-config\") pod \"ovnkube-node-8qmgp\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.803081 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ad206957-df5c-4b3e-bd35-e798a07d2f4e-system-cni-dir\") pod \"multus-s52gj\" (UID: \"ad206957-df5c-4b3e-bd35-e798a07d2f4e\") " pod="openshift-multus/multus-s52gj" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.803095 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-host-slash\") pod \"ovnkube-node-8qmgp\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.803131 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a6f5e483-7d6b-4d6d-be84-303d8f07643e-env-overrides\") pod \"ovnkube-node-8qmgp\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.803146 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/658a9576-efdc-4e4b-937e-bd63032cbee6-cni-binary-copy\") pod \"multus-additional-cni-plugins-nqhwm\" (UID: \"658a9576-efdc-4e4b-937e-bd63032cbee6\") " pod="openshift-multus/multus-additional-cni-plugins-nqhwm" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.803161 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/658a9576-efdc-4e4b-937e-bd63032cbee6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-nqhwm\" (UID: \"658a9576-efdc-4e4b-937e-bd63032cbee6\") " pod="openshift-multus/multus-additional-cni-plugins-nqhwm" Apr 
02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.803175 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/38409e5e-4545-49da-8f6c-4bfb30582878-mcd-auth-proxy-config\") pod \"machine-config-daemon-6vtmw\" (UID: \"38409e5e-4545-49da-8f6c-4bfb30582878\") " pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.803191 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ad206957-df5c-4b3e-bd35-e798a07d2f4e-cni-binary-copy\") pod \"multus-s52gj\" (UID: \"ad206957-df5c-4b3e-bd35-e798a07d2f4e\") " pod="openshift-multus/multus-s52gj" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.803204 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9vgd\" (UniqueName: \"kubernetes.io/projected/9fcb1965-4cef-41a4-8894-3eb24e0ff80c-kube-api-access-w9vgd\") pod \"node-resolver-tg9vx\" (UID: \"9fcb1965-4cef-41a4-8894-3eb24e0ff80c\") " pod="openshift-dns/node-resolver-tg9vx" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.803217 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-node-log\") pod \"ovnkube-node-8qmgp\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.803241 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ad206957-df5c-4b3e-bd35-e798a07d2f4e-multus-cni-dir\") pod \"multus-s52gj\" (UID: \"ad206957-df5c-4b3e-bd35-e798a07d2f4e\") " pod="openshift-multus/multus-s52gj" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.803254 4732 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ad206957-df5c-4b3e-bd35-e798a07d2f4e-host-run-k8s-cni-cncf-io\") pod \"multus-s52gj\" (UID: \"ad206957-df5c-4b3e-bd35-e798a07d2f4e\") " pod="openshift-multus/multus-s52gj" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.803279 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-run-ovn\") pod \"ovnkube-node-8qmgp\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.803298 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c275t\" (UniqueName: \"kubernetes.io/projected/b7185760-c057-4c47-8da2-60572500a472-kube-api-access-c275t\") pod \"ovnkube-control-plane-749d76644c-b2l5q\" (UID: \"b7185760-c057-4c47-8da2-60572500a472\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b2l5q" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.803319 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ad206957-df5c-4b3e-bd35-e798a07d2f4e-multus-conf-dir\") pod \"multus-s52gj\" (UID: \"ad206957-df5c-4b3e-bd35-e798a07d2f4e\") " pod="openshift-multus/multus-s52gj" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.803338 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9fcb1965-4cef-41a4-8894-3eb24e0ff80c-hosts-file\") pod \"node-resolver-tg9vx\" (UID: \"9fcb1965-4cef-41a4-8894-3eb24e0ff80c\") " pod="openshift-dns/node-resolver-tg9vx" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.803359 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-hqn9v\" (UniqueName: \"kubernetes.io/projected/57616858-4140-40f0-83e5-388787b685b5-kube-api-access-hqn9v\") pod \"node-ca-glqvf\" (UID: \"57616858-4140-40f0-83e5-388787b685b5\") " pod="openshift-image-registry/node-ca-glqvf" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.803413 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jljzm\" (UniqueName: \"kubernetes.io/projected/a6f5e483-7d6b-4d6d-be84-303d8f07643e-kube-api-access-jljzm\") pod \"ovnkube-node-8qmgp\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.803436 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.803456 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ad206957-df5c-4b3e-bd35-e798a07d2f4e-host-var-lib-cni-multus\") pod \"multus-s52gj\" (UID: \"ad206957-df5c-4b3e-bd35-e798a07d2f4e\") " pod="openshift-multus/multus-s52gj" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.803477 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ad206957-df5c-4b3e-bd35-e798a07d2f4e-multus-daemon-config\") pod \"multus-s52gj\" (UID: \"ad206957-df5c-4b3e-bd35-e798a07d2f4e\") " pod="openshift-multus/multus-s52gj" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.803496 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/658a9576-efdc-4e4b-937e-bd63032cbee6-cnibin\") pod \"multus-additional-cni-plugins-nqhwm\" (UID: \"658a9576-efdc-4e4b-937e-bd63032cbee6\") " pod="openshift-multus/multus-additional-cni-plugins-nqhwm" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.803515 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/658a9576-efdc-4e4b-937e-bd63032cbee6-os-release\") pod \"multus-additional-cni-plugins-nqhwm\" (UID: \"658a9576-efdc-4e4b-937e-bd63032cbee6\") " pod="openshift-multus/multus-additional-cni-plugins-nqhwm" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.803537 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/38409e5e-4545-49da-8f6c-4bfb30582878-rootfs\") pod \"machine-config-daemon-6vtmw\" (UID: \"38409e5e-4545-49da-8f6c-4bfb30582878\") " pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.803552 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-host-cni-bin\") pod \"ovnkube-node-8qmgp\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.803566 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ad206957-df5c-4b3e-bd35-e798a07d2f4e-os-release\") pod \"multus-s52gj\" (UID: \"ad206957-df5c-4b3e-bd35-e798a07d2f4e\") " pod="openshift-multus/multus-s52gj" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.803588 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-var-lib-openvswitch\") pod \"ovnkube-node-8qmgp\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.803634 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-etc-openvswitch\") pod \"ovnkube-node-8qmgp\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.803657 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-run-openvswitch\") pod \"ovnkube-node-8qmgp\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.803678 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/658a9576-efdc-4e4b-937e-bd63032cbee6-system-cni-dir\") pod \"multus-additional-cni-plugins-nqhwm\" (UID: \"658a9576-efdc-4e4b-937e-bd63032cbee6\") " pod="openshift-multus/multus-additional-cni-plugins-nqhwm" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.803700 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzv9m\" (UniqueName: \"kubernetes.io/projected/658a9576-efdc-4e4b-937e-bd63032cbee6-kube-api-access-pzv9m\") pod \"multus-additional-cni-plugins-nqhwm\" (UID: \"658a9576-efdc-4e4b-937e-bd63032cbee6\") " pod="openshift-multus/multus-additional-cni-plugins-nqhwm" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.803720 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/ad206957-df5c-4b3e-bd35-e798a07d2f4e-host-run-netns\") pod \"multus-s52gj\" (UID: \"ad206957-df5c-4b3e-bd35-e798a07d2f4e\") " pod="openshift-multus/multus-s52gj" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.803741 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ad206957-df5c-4b3e-bd35-e798a07d2f4e-hostroot\") pod \"multus-s52gj\" (UID: \"ad206957-df5c-4b3e-bd35-e798a07d2f4e\") " pod="openshift-multus/multus-s52gj" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.803762 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/57616858-4140-40f0-83e5-388787b685b5-host\") pod \"node-ca-glqvf\" (UID: \"57616858-4140-40f0-83e5-388787b685b5\") " pod="openshift-image-registry/node-ca-glqvf" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.803512 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b7185760-c057-4c47-8da2-60572500a472-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-b2l5q\" (UID: \"b7185760-c057-4c47-8da2-60572500a472\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b2l5q" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.803803 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ad206957-df5c-4b3e-bd35-e798a07d2f4e-host-var-lib-cni-bin\") pod \"multus-s52gj\" (UID: \"ad206957-df5c-4b3e-bd35-e798a07d2f4e\") " pod="openshift-multus/multus-s52gj" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.803825 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ad206957-df5c-4b3e-bd35-e798a07d2f4e-host-run-multus-certs\") pod \"multus-s52gj\" (UID: 
\"ad206957-df5c-4b3e-bd35-e798a07d2f4e\") " pod="openshift-multus/multus-s52gj" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.803874 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ad206957-df5c-4b3e-bd35-e798a07d2f4e-host-run-multus-certs\") pod \"multus-s52gj\" (UID: \"ad206957-df5c-4b3e-bd35-e798a07d2f4e\") " pod="openshift-multus/multus-s52gj" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.803882 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-run-openvswitch\") pod \"ovnkube-node-8qmgp\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.803907 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ad206957-df5c-4b3e-bd35-e798a07d2f4e-host-var-lib-cni-multus\") pod \"multus-s52gj\" (UID: \"ad206957-df5c-4b3e-bd35-e798a07d2f4e\") " pod="openshift-multus/multus-s52gj" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.803900 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-host-cni-netd\") pod \"ovnkube-node-8qmgp\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.803915 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ad206957-df5c-4b3e-bd35-e798a07d2f4e-host-var-lib-kubelet\") pod \"multus-s52gj\" (UID: \"ad206957-df5c-4b3e-bd35-e798a07d2f4e\") " pod="openshift-multus/multus-s52gj" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 
13:39:31.803945 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/57616858-4140-40f0-83e5-388787b685b5-host\") pod \"node-ca-glqvf\" (UID: \"57616858-4140-40f0-83e5-388787b685b5\") " pod="openshift-image-registry/node-ca-glqvf" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.803949 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-host-cni-netd\") pod \"ovnkube-node-8qmgp\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.803978 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ad206957-df5c-4b3e-bd35-e798a07d2f4e-host-var-lib-cni-bin\") pod \"multus-s52gj\" (UID: \"ad206957-df5c-4b3e-bd35-e798a07d2f4e\") " pod="openshift-multus/multus-s52gj" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.804010 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8qmgp\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.804041 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/658a9576-efdc-4e4b-937e-bd63032cbee6-os-release\") pod \"multus-additional-cni-plugins-nqhwm\" (UID: \"658a9576-efdc-4e4b-937e-bd63032cbee6\") " pod="openshift-multus/multus-additional-cni-plugins-nqhwm" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.804066 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-systemd-units\") pod \"ovnkube-node-8qmgp\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.804233 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/658a9576-efdc-4e4b-937e-bd63032cbee6-cnibin\") pod \"multus-additional-cni-plugins-nqhwm\" (UID: \"658a9576-efdc-4e4b-937e-bd63032cbee6\") " pod="openshift-multus/multus-additional-cni-plugins-nqhwm" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.804255 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-host-run-ovn-kubernetes\") pod \"ovnkube-node-8qmgp\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.804339 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.804497 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a6f5e483-7d6b-4d6d-be84-303d8f07643e-env-overrides\") pod \"ovnkube-node-8qmgp\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.804545 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-host-slash\") pod \"ovnkube-node-8qmgp\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.804564 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ad206957-df5c-4b3e-bd35-e798a07d2f4e-system-cni-dir\") pod \"multus-s52gj\" (UID: \"ad206957-df5c-4b3e-bd35-e798a07d2f4e\") " pod="openshift-multus/multus-s52gj" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.804578 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/38409e5e-4545-49da-8f6c-4bfb30582878-rootfs\") pod \"machine-config-daemon-6vtmw\" (UID: \"38409e5e-4545-49da-8f6c-4bfb30582878\") " pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.804716 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ad206957-df5c-4b3e-bd35-e798a07d2f4e-host-run-k8s-cni-cncf-io\") pod \"multus-s52gj\" (UID: \"ad206957-df5c-4b3e-bd35-e798a07d2f4e\") " pod="openshift-multus/multus-s52gj" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.804862 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ad206957-df5c-4b3e-bd35-e798a07d2f4e-multus-daemon-config\") pod \"multus-s52gj\" (UID: \"ad206957-df5c-4b3e-bd35-e798a07d2f4e\") " pod="openshift-multus/multus-s52gj" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.804892 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/658a9576-efdc-4e4b-937e-bd63032cbee6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-nqhwm\" (UID: 
\"658a9576-efdc-4e4b-937e-bd63032cbee6\") " pod="openshift-multus/multus-additional-cni-plugins-nqhwm" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.804978 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/38409e5e-4545-49da-8f6c-4bfb30582878-mcd-auth-proxy-config\") pod \"machine-config-daemon-6vtmw\" (UID: \"38409e5e-4545-49da-8f6c-4bfb30582878\") " pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.804969 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-host-kubelet\") pod \"ovnkube-node-8qmgp\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.805018 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-log-socket\") pod \"ovnkube-node-8qmgp\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.805045 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-run-ovn\") pod \"ovnkube-node-8qmgp\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.805064 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-etc-openvswitch\") pod \"ovnkube-node-8qmgp\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" 
Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.805087 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ad206957-df5c-4b3e-bd35-e798a07d2f4e-os-release\") pod \"multus-s52gj\" (UID: \"ad206957-df5c-4b3e-bd35-e798a07d2f4e\") " pod="openshift-multus/multus-s52gj" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.805105 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/658a9576-efdc-4e4b-937e-bd63032cbee6-cni-binary-copy\") pod \"multus-additional-cni-plugins-nqhwm\" (UID: \"658a9576-efdc-4e4b-937e-bd63032cbee6\") " pod="openshift-multus/multus-additional-cni-plugins-nqhwm" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.805136 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ad206957-df5c-4b3e-bd35-e798a07d2f4e-multus-conf-dir\") pod \"multus-s52gj\" (UID: \"ad206957-df5c-4b3e-bd35-e798a07d2f4e\") " pod="openshift-multus/multus-s52gj" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.805167 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a6f5e483-7d6b-4d6d-be84-303d8f07643e-ovnkube-config\") pod \"ovnkube-node-8qmgp\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.805300 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ad206957-df5c-4b3e-bd35-e798a07d2f4e-multus-socket-dir-parent\") pod \"multus-s52gj\" (UID: \"ad206957-df5c-4b3e-bd35-e798a07d2f4e\") " pod="openshift-multus/multus-s52gj" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.805315 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ad206957-df5c-4b3e-bd35-e798a07d2f4e-etc-kubernetes\") pod \"multus-s52gj\" (UID: \"ad206957-df5c-4b3e-bd35-e798a07d2f4e\") " pod="openshift-multus/multus-s52gj" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.805355 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-run-systemd\") pod \"ovnkube-node-8qmgp\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.805521 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ad206957-df5c-4b3e-bd35-e798a07d2f4e-host-run-netns\") pod \"multus-s52gj\" (UID: \"ad206957-df5c-4b3e-bd35-e798a07d2f4e\") " pod="openshift-multus/multus-s52gj" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.805553 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-host-run-netns\") pod \"ovnkube-node-8qmgp\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.805600 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ad206957-df5c-4b3e-bd35-e798a07d2f4e-cnibin\") pod \"multus-s52gj\" (UID: \"ad206957-df5c-4b3e-bd35-e798a07d2f4e\") " pod="openshift-multus/multus-s52gj" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.805679 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ad206957-df5c-4b3e-bd35-e798a07d2f4e-hostroot\") pod \"multus-s52gj\" (UID: \"ad206957-df5c-4b3e-bd35-e798a07d2f4e\") " 
pod="openshift-multus/multus-s52gj" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.805709 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-var-lib-openvswitch\") pod \"ovnkube-node-8qmgp\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.803964 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a6f5e483-7d6b-4d6d-be84-303d8f07643e-ovnkube-script-lib\") pod \"ovnkube-node-8qmgp\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.805913 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ad206957-df5c-4b3e-bd35-e798a07d2f4e-multus-cni-dir\") pod \"multus-s52gj\" (UID: \"ad206957-df5c-4b3e-bd35-e798a07d2f4e\") " pod="openshift-multus/multus-s52gj" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.805945 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/658a9576-efdc-4e4b-937e-bd63032cbee6-system-cni-dir\") pod \"multus-additional-cni-plugins-nqhwm\" (UID: \"658a9576-efdc-4e4b-937e-bd63032cbee6\") " pod="openshift-multus/multus-additional-cni-plugins-nqhwm" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.807918 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.807939 4732 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" 
(UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.806340 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.806526 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/658a9576-efdc-4e4b-937e-bd63032cbee6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-nqhwm\" (UID: \"658a9576-efdc-4e4b-937e-bd63032cbee6\") " pod="openshift-multus/multus-additional-cni-plugins-nqhwm" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.806723 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a6f5e483-7d6b-4d6d-be84-303d8f07643e-ovnkube-script-lib\") pod \"ovnkube-node-8qmgp\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.807506 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9fcb1965-4cef-41a4-8894-3eb24e0ff80c-hosts-file\") pod \"node-resolver-tg9vx\" (UID: \"9fcb1965-4cef-41a4-8894-3eb24e0ff80c\") " pod="openshift-dns/node-resolver-tg9vx" Apr 02 13:39:31 crc kubenswrapper[4732]: E0402 13:39:31.806306 4732 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 02 13:39:31 crc kubenswrapper[4732]: E0402 13:39:31.808135 4732 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/386bd92b-c67e-4cc6-8a47-6f8d6e799bc7-metrics-certs podName:386bd92b-c67e-4cc6-8a47-6f8d6e799bc7 nodeName:}" failed. No retries permitted until 2026-04-02 13:39:32.308112902 +0000 UTC m=+129.212520545 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/386bd92b-c67e-4cc6-8a47-6f8d6e799bc7-metrics-certs") pod "network-metrics-daemon-crx2z" (UID: "386bd92b-c67e-4cc6-8a47-6f8d6e799bc7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.806358 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-node-log\") pod \"ovnkube-node-8qmgp\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.807982 4732 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.808194 4732 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.808212 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.808224 4732 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath 
\"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.808235 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.808247 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.808258 4732 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.808269 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.808281 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.808293 4732 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.808305 4732 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 
13:39:31.808326 4732 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.808339 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.808351 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.808363 4732 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.808375 4732 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.808385 4732 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.808397 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.808408 4732 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.808421 4732 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.808433 4732 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.808446 4732 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.808458 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.808469 4732 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.808481 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.808491 4732 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.808500 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.808511 4732 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.808520 4732 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.808530 4732 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.808542 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.808555 4732 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.808463 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/ad206957-df5c-4b3e-bd35-e798a07d2f4e-cni-binary-copy\") pod \"multus-s52gj\" (UID: \"ad206957-df5c-4b3e-bd35-e798a07d2f4e\") " pod="openshift-multus/multus-s52gj" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.808566 4732 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.808579 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.808592 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.808608 4732 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.808646 4732 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.808657 4732 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.808671 4732 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.808681 4732 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.808692 4732 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.808703 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.808714 4732 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.808724 4732 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.808740 4732 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.808750 4732 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.808748 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b7185760-c057-4c47-8da2-60572500a472-env-overrides\") pod \"ovnkube-control-plane-749d76644c-b2l5q\" (UID: \"b7185760-c057-4c47-8da2-60572500a472\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b2l5q" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.808761 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.808842 4732 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.808869 4732 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.808895 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.808936 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.808961 4732 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.808986 4732 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.809012 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.809035 4732 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.809068 4732 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.809092 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.809116 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.809154 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.809178 4732 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.809203 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.809226 4732 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.809250 4732 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.809274 4732 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.809299 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.809323 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node 
\"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.809364 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.809389 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.809420 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.809444 4732 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.809469 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.809502 4732 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.809526 4732 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 
13:39:31.809549 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.809573 4732 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.809597 4732 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.809657 4732 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.809683 4732 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.809707 4732 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.809730 4732 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.809753 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.809776 4732 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.809806 4732 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.809830 4732 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.809854 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.809877 4732 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.809900 4732 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.809923 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc 
kubenswrapper[4732]: I0402 13:39:31.809962 4732 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.809989 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.810019 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.810042 4732 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.810066 4732 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.810089 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.810123 4732 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.810147 4732 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" 
(UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.810171 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.810199 4732 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.810220 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.810242 4732 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.810265 4732 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.810288 4732 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.810311 4732 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node 
\"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.810334 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.810531 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.810557 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.810582 4732 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.810644 4732 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.810683 4732 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.810706 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 
13:39:31.810730 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.810759 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.816187 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/57616858-4140-40f0-83e5-388787b685b5-serviceca\") pod \"node-ca-glqvf\" (UID: \"57616858-4140-40f0-83e5-388787b685b5\") " pod="openshift-image-registry/node-ca-glqvf" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.822208 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b7185760-c057-4c47-8da2-60572500a472-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-b2l5q\" (UID: \"b7185760-c057-4c47-8da2-60572500a472\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b2l5q" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.822224 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a6f5e483-7d6b-4d6d-be84-303d8f07643e-ovn-node-metrics-cert\") pod \"ovnkube-node-8qmgp\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.822519 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/38409e5e-4545-49da-8f6c-4bfb30582878-proxy-tls\") pod \"machine-config-daemon-6vtmw\" (UID: 
\"38409e5e-4545-49da-8f6c-4bfb30582878\") " pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.826895 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9vgd\" (UniqueName: \"kubernetes.io/projected/9fcb1965-4cef-41a4-8894-3eb24e0ff80c-kube-api-access-w9vgd\") pod \"node-resolver-tg9vx\" (UID: \"9fcb1965-4cef-41a4-8894-3eb24e0ff80c\") " pod="openshift-dns/node-resolver-tg9vx" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.828308 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c275t\" (UniqueName: \"kubernetes.io/projected/b7185760-c057-4c47-8da2-60572500a472-kube-api-access-c275t\") pod \"ovnkube-control-plane-749d76644c-b2l5q\" (UID: \"b7185760-c057-4c47-8da2-60572500a472\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b2l5q" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.829299 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqn9v\" (UniqueName: \"kubernetes.io/projected/57616858-4140-40f0-83e5-388787b685b5-kube-api-access-hqn9v\") pod \"node-ca-glqvf\" (UID: \"57616858-4140-40f0-83e5-388787b685b5\") " pod="openshift-image-registry/node-ca-glqvf" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.829731 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jljzm\" (UniqueName: \"kubernetes.io/projected/a6f5e483-7d6b-4d6d-be84-303d8f07643e-kube-api-access-jljzm\") pod \"ovnkube-node-8qmgp\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.830263 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vz4rn\" (UniqueName: \"kubernetes.io/projected/386bd92b-c67e-4cc6-8a47-6f8d6e799bc7-kube-api-access-vz4rn\") pod \"network-metrics-daemon-crx2z\" (UID: 
\"386bd92b-c67e-4cc6-8a47-6f8d6e799bc7\") " pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.830308 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dssdt\" (UniqueName: \"kubernetes.io/projected/ad206957-df5c-4b3e-bd35-e798a07d2f4e-kube-api-access-dssdt\") pod \"multus-s52gj\" (UID: \"ad206957-df5c-4b3e-bd35-e798a07d2f4e\") " pod="openshift-multus/multus-s52gj" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.835850 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74dmp\" (UniqueName: \"kubernetes.io/projected/38409e5e-4545-49da-8f6c-4bfb30582878-kube-api-access-74dmp\") pod \"machine-config-daemon-6vtmw\" (UID: \"38409e5e-4545-49da-8f6c-4bfb30582878\") " pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.837327 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzv9m\" (UniqueName: \"kubernetes.io/projected/658a9576-efdc-4e4b-937e-bd63032cbee6-kube-api-access-pzv9m\") pod \"multus-additional-cni-plugins-nqhwm\" (UID: \"658a9576-efdc-4e4b-937e-bd63032cbee6\") " pod="openshift-multus/multus-additional-cni-plugins-nqhwm" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.887922 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.895954 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.906599 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-glqvf" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.915723 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-nqhwm" Apr 02 13:39:31 crc kubenswrapper[4732]: W0402 13:39:31.918277 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57616858_4140_40f0_83e5_388787b685b5.slice/crio-c3ec0200cdf85960c2045eaa9e6d1db4244ad1f6f72c90d63a4393f675a85e29 WatchSource:0}: Error finding container c3ec0200cdf85960c2045eaa9e6d1db4244ad1f6f72c90d63a4393f675a85e29: Status 404 returned error can't find the container with id c3ec0200cdf85960c2045eaa9e6d1db4244ad1f6f72c90d63a4393f675a85e29 Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.923515 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.932588 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-s52gj" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.941019 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.947964 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-tg9vx" Apr 02 13:39:31 crc kubenswrapper[4732]: W0402 13:39:31.948495 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38409e5e_4545_49da_8f6c_4bfb30582878.slice/crio-bea834dc35fa2161596a1695dfb2f82a4f58aa82b0c7dd3b39d6835624778be2 WatchSource:0}: Error finding container bea834dc35fa2161596a1695dfb2f82a4f58aa82b0c7dd3b39d6835624778be2: Status 404 returned error can't find the container with id bea834dc35fa2161596a1695dfb2f82a4f58aa82b0c7dd3b39d6835624778be2 Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.955537 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" Apr 02 13:39:31 crc kubenswrapper[4732]: W0402 13:39:31.958402 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod658a9576_efdc_4e4b_937e_bd63032cbee6.slice/crio-3493674f06e843c98386736ecb11448b1e5f635fb8951f3bd283d4240bc53fd3 WatchSource:0}: Error finding container 3493674f06e843c98386736ecb11448b1e5f635fb8951f3bd283d4240bc53fd3: Status 404 returned error can't find the container with id 3493674f06e843c98386736ecb11448b1e5f635fb8951f3bd283d4240bc53fd3 Apr 02 13:39:31 crc kubenswrapper[4732]: W0402 13:39:31.959389 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad206957_df5c_4b3e_bd35_e798a07d2f4e.slice/crio-9186218f76e14259da23a90e768595464a27bfaac058ee5ffc113bf2f1129809 WatchSource:0}: Error finding container 9186218f76e14259da23a90e768595464a27bfaac058ee5ffc113bf2f1129809: Status 404 returned error can't find the container with id 9186218f76e14259da23a90e768595464a27bfaac058ee5ffc113bf2f1129809 Apr 02 13:39:31 crc kubenswrapper[4732]: I0402 13:39:31.962424 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b2l5q" Apr 02 13:39:31 crc kubenswrapper[4732]: W0402 13:39:31.974807 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-68b7ce8c18d9d57e634878037dfdfb2bb7721dd65ec795e65de30f01adba1b77 WatchSource:0}: Error finding container 68b7ce8c18d9d57e634878037dfdfb2bb7721dd65ec795e65de30f01adba1b77: Status 404 returned error can't find the container with id 68b7ce8c18d9d57e634878037dfdfb2bb7721dd65ec795e65de30f01adba1b77 Apr 02 13:39:31 crc kubenswrapper[4732]: W0402 13:39:31.994043 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9fcb1965_4cef_41a4_8894_3eb24e0ff80c.slice/crio-21956227ae05ed4a4dbdd4e55f1e86311f9229daed53f9efb36f155ebccec5c2 WatchSource:0}: Error finding container 21956227ae05ed4a4dbdd4e55f1e86311f9229daed53f9efb36f155ebccec5c2: Status 404 returned error can't find the container with id 21956227ae05ed4a4dbdd4e55f1e86311f9229daed53f9efb36f155ebccec5c2 Apr 02 13:39:31 crc kubenswrapper[4732]: W0402 13:39:31.995709 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6f5e483_7d6b_4d6d_be84_303d8f07643e.slice/crio-e6254ca05250a47e6db0dc5189460fd0b2288dbb8109713f6c0c6d1c1d07b06a WatchSource:0}: Error finding container e6254ca05250a47e6db0dc5189460fd0b2288dbb8109713f6c0c6d1c1d07b06a: Status 404 returned error can't find the container with id e6254ca05250a47e6db0dc5189460fd0b2288dbb8109713f6c0c6d1c1d07b06a Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.145175 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" 
event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"68b7ce8c18d9d57e634878037dfdfb2bb7721dd65ec795e65de30f01adba1b77"} Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.147013 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" event={"ID":"38409e5e-4545-49da-8f6c-4bfb30582878","Type":"ContainerStarted","Data":"09dd989931f88ecf396194a22057cfe8c07809db42ff1c2b861bf3aa5db1673f"} Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.147045 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" event={"ID":"38409e5e-4545-49da-8f6c-4bfb30582878","Type":"ContainerStarted","Data":"bea834dc35fa2161596a1695dfb2f82a4f58aa82b0c7dd3b39d6835624778be2"} Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.149678 4732 generic.go:334] "Generic (PLEG): container finished" podID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" containerID="b90e2dbff4d4b955f0d12a454b3463c55541ffa21937cda423517d31435683bd" exitCode=0 Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.149755 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" event={"ID":"a6f5e483-7d6b-4d6d-be84-303d8f07643e","Type":"ContainerDied","Data":"b90e2dbff4d4b955f0d12a454b3463c55541ffa21937cda423517d31435683bd"} Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.149798 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" event={"ID":"a6f5e483-7d6b-4d6d-be84-303d8f07643e","Type":"ContainerStarted","Data":"e6254ca05250a47e6db0dc5189460fd0b2288dbb8109713f6c0c6d1c1d07b06a"} Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.161835 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nqhwm" 
event={"ID":"658a9576-efdc-4e4b-937e-bd63032cbee6","Type":"ContainerStarted","Data":"3493674f06e843c98386736ecb11448b1e5f635fb8951f3bd283d4240bc53fd3"} Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.161902 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1efc9e3-71c0-47e0-bcf8-4ec41e8e25af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf50344e79adec991160632c244c40543c468db050f176949b5840a16d1777a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e000359211728ef1d437757b0246d85c224dae170131ba8bd221fa84dafa026c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://bf6191436462297532ffdc93e43468bbe37e850115ddcbcd0265c07359ac845a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-04-02T13:39:23Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0402 13:39:23.616086 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0402 13:39:23.616554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0402 13:39:23.618421 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-69827246/tls.crt::/tmp/serving-cert-69827246/tls.key\\\\\\\"\\\\nI0402 13:39:23.816116 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0402 13:39:23.819962 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0402 13:39:23.820008 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0402 13:39:23.820046 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0402 13:39:23.820061 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0402 13:39:23.827938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0402 13:39:23.827970 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827984 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0402 13:39:23.827988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0402 13:39:23.828005 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0402 13:39:23.828009 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0402 13:39:23.828004 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0402 13:39:23.830934 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b46da8b2cda4252f897adfca92275c92924a998d5b1ad57e6897ddea29b4571\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.163416 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-glqvf" event={"ID":"57616858-4140-40f0-83e5-388787b685b5","Type":"ContainerStarted","Data":"c3ec0200cdf85960c2045eaa9e6d1db4244ad1f6f72c90d63a4393f675a85e29"} Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.167975 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-tg9vx" event={"ID":"9fcb1965-4cef-41a4-8894-3eb24e0ff80c","Type":"ContainerStarted","Data":"21956227ae05ed4a4dbdd4e55f1e86311f9229daed53f9efb36f155ebccec5c2"} Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.170640 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"0abfadcbcfff56cc9a6bf0c4068f03e943dea5db96705cd3fba2aea8cbd39111"} Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.170676 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"300a8ad5dc7133467ec08da7f27c2cc8d7f1135f40e1fefb14dc7b026fdc9eaf"} Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.172598 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-s52gj" 
event={"ID":"ad206957-df5c-4b3e-bd35-e798a07d2f4e","Type":"ContainerStarted","Data":"169383eed1546547fe6f683a3607c929733ee7049497b740db106a44ce7d1c64"} Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.172711 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-s52gj" event={"ID":"ad206957-df5c-4b3e-bd35-e798a07d2f4e","Type":"ContainerStarted","Data":"9186218f76e14259da23a90e768595464a27bfaac058ee5ffc113bf2f1129809"} Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.173404 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.174189 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b2l5q" event={"ID":"b7185760-c057-4c47-8da2-60572500a472","Type":"ContainerStarted","Data":"b820270af5cc29e673b14d0f9b0c64f828c52be717d557c3e6e2faba6c37258d"} Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.178013 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"960aab148ced3ecc2648e0b866cb9330dc955e8ab123f3afb95d377e3efa5a9e"} Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.178060 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"d109dc2d4341596b3e0912152b16c31c4b634d49070ea023d030d933627e4084"} Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.178564 4732 scope.go:117] "RemoveContainer" containerID="5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a" Apr 02 13:39:32 crc kubenswrapper[4732]: E0402 13:39:32.178798 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.183901 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38409e5e-4545-49da-8f6c-4bfb30582878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6vtmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.193698 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s52gj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad206957-df5c-4b3e-bd35-e798a07d2f4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dssdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s52gj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.209738 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6f5e483-7d6b-4d6d-be84-303d8f07643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919
d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCou
nt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-con
troller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0d12a454b3463c55541ffa21937cda423517d31435683bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0d12a454b3463c55541ffa21937cda423517d31435683bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8qmgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.214470 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 02 13:39:32 crc kubenswrapper[4732]: E0402 13:39:32.214820 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-02 13:39:33.214794378 +0000 UTC m=+130.119201971 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.215176 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.215269 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.215353 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:39:32 crc kubenswrapper[4732]: E0402 13:39:32.215404 4732 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Apr 
02 13:39:32 crc kubenswrapper[4732]: E0402 13:39:32.215499 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-04-02 13:39:33.215486917 +0000 UTC m=+130.119894540 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Apr 02 13:39:32 crc kubenswrapper[4732]: E0402 13:39:32.215679 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 02 13:39:32 crc kubenswrapper[4732]: E0402 13:39:32.215768 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 02 13:39:32 crc kubenswrapper[4732]: E0402 13:39:32.215829 4732 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.215450 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 
13:39:32 crc kubenswrapper[4732]: E0402 13:39:32.215900 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 02 13:39:32 crc kubenswrapper[4732]: E0402 13:39:32.215968 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 02 13:39:32 crc kubenswrapper[4732]: E0402 13:39:32.215983 4732 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 02 13:39:32 crc kubenswrapper[4732]: E0402 13:39:32.215439 4732 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Apr 02 13:39:32 crc kubenswrapper[4732]: E0402 13:39:32.216092 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-04-02 13:39:33.215903058 +0000 UTC m=+130.120310611 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 02 13:39:32 crc kubenswrapper[4732]: E0402 13:39:32.216159 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-04-02 13:39:33.216150425 +0000 UTC m=+130.120557978 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 02 13:39:32 crc kubenswrapper[4732]: E0402 13:39:32.216220 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-04-02 13:39:33.216212886 +0000 UTC m=+130.120620429 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.219594 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b2l5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7185760-c057-4c47-8da2-60572500a472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-b2l5q\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.228193 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-crx2z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"386bd92b-c67e-4cc6-8a47-6f8d6e799bc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-crx2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.241971 4732 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.251124 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.261063 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tg9vx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fcb1965-4cef-41a4-8894-3eb24e0ff80c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9vgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tg9vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.274122 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.285996 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.296819 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-glqvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57616858-4140-40f0-83e5-388787b685b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers 
with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqn9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-glqvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.309696 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nqhwm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"658a9576-efdc-4e4b-937e-bd63032cbee6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nqhwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.316291 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/386bd92b-c67e-4cc6-8a47-6f8d6e799bc7-metrics-certs\") pod \"network-metrics-daemon-crx2z\" (UID: \"386bd92b-c67e-4cc6-8a47-6f8d6e799bc7\") " pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:39:32 crc kubenswrapper[4732]: E0402 13:39:32.317288 4732 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 02 13:39:32 crc kubenswrapper[4732]: E0402 13:39:32.317356 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/386bd92b-c67e-4cc6-8a47-6f8d6e799bc7-metrics-certs podName:386bd92b-c67e-4cc6-8a47-6f8d6e799bc7 nodeName:}" failed. No retries permitted until 2026-04-02 13:39:33.317340183 +0000 UTC m=+130.221747736 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/386bd92b-c67e-4cc6-8a47-6f8d6e799bc7-metrics-certs") pod "network-metrics-daemon-crx2z" (UID: "386bd92b-c67e-4cc6-8a47-6f8d6e799bc7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.319118 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.330197 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1efc9e3-71c0-47e0-bcf8-4ec41e8e25af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf50344e79adec991160632c244c40543c468db050f176949b5840a16d1777a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e000359211728ef1d437757b0246d85c224dae170131ba8bd221fa84dafa026c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://bf6191436462297532ffdc93e43468bbe37e850115ddcbcd0265c07359ac845a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-04-02T13:39:23Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0402 13:39:23.616086 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0402 13:39:23.616554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0402 13:39:23.618421 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-69827246/tls.crt::/tmp/serving-cert-69827246/tls.key\\\\\\\"\\\\nI0402 13:39:23.816116 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0402 13:39:23.819962 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0402 13:39:23.820008 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0402 13:39:23.820046 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0402 13:39:23.820061 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0402 13:39:23.827938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0402 13:39:23.827970 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827984 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0402 13:39:23.827988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0402 13:39:23.828005 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0402 13:39:23.828009 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0402 13:39:23.828004 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0402 13:39:23.830934 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b46da8b2cda4252f897adfca92275c92924a998d5b1ad57e6897ddea29b4571\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.342306 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://960aab148ced3ecc2648e0b866cb9330dc955e8ab123f3afb95d377e3efa5a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"m
ountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.350680 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38409e5e-4545-49da-8f6c-4bfb30582878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6vtmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.360580 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s52gj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad206957-df5c-4b3e-bd35-e798a07d2f4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169383eed1546547fe6f683a3607c929733ee7049497b740db106a44ce7d1c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"
os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dssdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s52gj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 
02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.375694 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6f5e483-7d6b-4d6d-be84-303d8f07643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0d12a454b3463c55541ffa21937cda423517d31435683bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0d12a454b3463c55541ffa21937cda423517d31435683bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8qmgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.383945 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b2l5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7185760-c057-4c47-8da2-60572500a472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-b2l5q\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.392765 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-crx2z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"386bd92b-c67e-4cc6-8a47-6f8d6e799bc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-crx2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.402747 4732 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.410107 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.417654 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tg9vx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fcb1965-4cef-41a4-8894-3eb24e0ff80c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9vgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tg9vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.426388 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.436022 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.443215 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-glqvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57616858-4140-40f0-83e5-388787b685b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers 
with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqn9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-glqvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.455542 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nqhwm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"658a9576-efdc-4e4b-937e-bd63032cbee6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nqhwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.473325 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.679475 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:39:32 crc kubenswrapper[4732]: E0402 13:39:32.679646 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-crx2z" podUID="386bd92b-c67e-4cc6-8a47-6f8d6e799bc7" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.685987 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.687086 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.688502 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.689376 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.691148 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.692054 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.692856 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.694268 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.695561 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.696978 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.697854 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.699224 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.700077 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.700903 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.702113 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.702860 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.704509 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.706320 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.708141 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.709604 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.710368 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.712032 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.712759 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.714319 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.715225 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.716982 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.718458 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.719172 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.720249 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.720841 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.721951 4732 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.722124 4732 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.724115 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.725464 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.725959 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.727830 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.728857 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.730175 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.731221 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.733246 4732 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.733949 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.735456 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.736588 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.738112 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.738736 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.739736 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.740743 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.741712 4732 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.742777 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.744017 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.744593 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.745781 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.746648 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Apr 02 13:39:32 crc kubenswrapper[4732]: I0402 13:39:32.747341 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Apr 02 13:39:33 crc kubenswrapper[4732]: I0402 13:39:33.182200 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-glqvf" event={"ID":"57616858-4140-40f0-83e5-388787b685b5","Type":"ContainerStarted","Data":"af55257a6baf7f9bcba13eb0d7aae49bd0671f182aeb11018802d4712df0ac0f"} 
Apr 02 13:39:33 crc kubenswrapper[4732]: I0402 13:39:33.183447 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-tg9vx" event={"ID":"9fcb1965-4cef-41a4-8894-3eb24e0ff80c","Type":"ContainerStarted","Data":"9e38eea7442762874b0db44fa8fcd1ffc4cbe0d149d992139a1fc0bc92dc386d"} Apr 02 13:39:33 crc kubenswrapper[4732]: I0402 13:39:33.184891 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" event={"ID":"38409e5e-4545-49da-8f6c-4bfb30582878","Type":"ContainerStarted","Data":"bd8b639c60ac006909c664208b4a5aa24df48983728ace3f6722514164e330a3"} Apr 02 13:39:33 crc kubenswrapper[4732]: I0402 13:39:33.186124 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"37b84e7e2a23e3e100ff5bc979d4126bb3044a27f802ccc21c171516554dee78"} Apr 02 13:39:33 crc kubenswrapper[4732]: I0402 13:39:33.189825 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" event={"ID":"a6f5e483-7d6b-4d6d-be84-303d8f07643e","Type":"ContainerStarted","Data":"9f988d8b8bd1da89ec27da36eafa964c8d1b06874110b4791aeadc05cb297b04"} Apr 02 13:39:33 crc kubenswrapper[4732]: I0402 13:39:33.189852 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" event={"ID":"a6f5e483-7d6b-4d6d-be84-303d8f07643e","Type":"ContainerStarted","Data":"09f4ea085c7a422d9d9378925b89d18cd60661c399eb45f69d1e2d39c3aa2191"} Apr 02 13:39:33 crc kubenswrapper[4732]: I0402 13:39:33.189861 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" event={"ID":"a6f5e483-7d6b-4d6d-be84-303d8f07643e","Type":"ContainerStarted","Data":"975b73da57fe61d6945d87d2a8cb9502c67a6d8fe43a80722342271e2a251f39"} Apr 02 13:39:33 crc kubenswrapper[4732]: I0402 
13:39:33.189870 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" event={"ID":"a6f5e483-7d6b-4d6d-be84-303d8f07643e","Type":"ContainerStarted","Data":"dde53c6f70593ab376e01989eadf8b52b7010c91ca4be564e5e36a781f1333f7"} Apr 02 13:39:33 crc kubenswrapper[4732]: I0402 13:39:33.189879 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" event={"ID":"a6f5e483-7d6b-4d6d-be84-303d8f07643e","Type":"ContainerStarted","Data":"39950e10da9295ee76088f08ae9d4977f5ff9f8d8440188f777855f40fb8049e"} Apr 02 13:39:33 crc kubenswrapper[4732]: I0402 13:39:33.189888 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" event={"ID":"a6f5e483-7d6b-4d6d-be84-303d8f07643e","Type":"ContainerStarted","Data":"4e0b635201003f7f1a0fc544667e8376b4283c202dde45398518f99b96ce2c4f"} Apr 02 13:39:33 crc kubenswrapper[4732]: I0402 13:39:33.191127 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b2l5q" event={"ID":"b7185760-c057-4c47-8da2-60572500a472","Type":"ContainerStarted","Data":"9bec3eb12ab864fa88c229cc24bf8ab853d0dc249a18430163a45fbe02e09c23"} Apr 02 13:39:33 crc kubenswrapper[4732]: I0402 13:39:33.191154 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b2l5q" event={"ID":"b7185760-c057-4c47-8da2-60572500a472","Type":"ContainerStarted","Data":"a76490beb8e54fc846562aaf130168383c45dfe45e9582e3d87c36c38a69b8fc"} Apr 02 13:39:33 crc kubenswrapper[4732]: I0402 13:39:33.192605 4732 generic.go:334] "Generic (PLEG): container finished" podID="658a9576-efdc-4e4b-937e-bd63032cbee6" containerID="de69c4d183706f6729301a7859d6c312fdedb379c0c9b4aac3d4061f89b82067" exitCode=0 Apr 02 13:39:33 crc kubenswrapper[4732]: I0402 13:39:33.192661 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-nqhwm" event={"ID":"658a9576-efdc-4e4b-937e-bd63032cbee6","Type":"ContainerDied","Data":"de69c4d183706f6729301a7859d6c312fdedb379c0c9b4aac3d4061f89b82067"} Apr 02 13:39:33 crc kubenswrapper[4732]: I0402 13:39:33.197876 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-crx2z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"386bd92b-c67e-4cc6-8a47-6f8d6e799bc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-crx2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:33Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:33 crc 
kubenswrapper[4732]: I0402 13:39:33.209288 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:33Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:33 crc kubenswrapper[4732]: I0402 13:39:33.223276 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 02 13:39:33 crc kubenswrapper[4732]: I0402 13:39:33.223381 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:39:33 crc kubenswrapper[4732]: I0402 13:39:33.223401 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:39:33 crc kubenswrapper[4732]: I0402 13:39:33.223418 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:39:33 crc kubenswrapper[4732]: I0402 13:39:33.223440 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 13:39:33 crc kubenswrapper[4732]: E0402 13:39:33.223547 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 02 13:39:33 crc kubenswrapper[4732]: E0402 13:39:33.223561 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 02 13:39:33 crc kubenswrapper[4732]: E0402 13:39:33.223571 4732 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 02 
13:39:33 crc kubenswrapper[4732]: E0402 13:39:33.223643 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-04-02 13:39:35.223597929 +0000 UTC m=+132.128005482 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 02 13:39:33 crc kubenswrapper[4732]: E0402 13:39:33.223930 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-02 13:39:35.223921928 +0000 UTC m=+132.128329481 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:39:33 crc kubenswrapper[4732]: E0402 13:39:33.224162 4732 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Apr 02 13:39:33 crc kubenswrapper[4732]: E0402 13:39:33.224985 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 02 13:39:33 crc kubenswrapper[4732]: E0402 13:39:33.225003 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 02 13:39:33 crc kubenswrapper[4732]: E0402 13:39:33.225015 4732 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 02 13:39:33 crc kubenswrapper[4732]: E0402 13:39:33.225056 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-04-02 13:39:35.225026027 +0000 UTC m=+132.129433570 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Apr 02 13:39:33 crc kubenswrapper[4732]: I0402 13:39:33.224194 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:33Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:33 crc kubenswrapper[4732]: E0402 13:39:33.225114 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-04-02 13:39:35.225099279 +0000 UTC m=+132.129506912 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 02 13:39:33 crc kubenswrapper[4732]: E0402 13:39:33.225071 4732 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Apr 02 13:39:33 crc kubenswrapper[4732]: E0402 13:39:33.225159 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-04-02 13:39:35.225150951 +0000 UTC m=+132.129558604 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Apr 02 13:39:33 crc kubenswrapper[4732]: I0402 13:39:33.236156 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tg9vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fcb1965-4cef-41a4-8894-3eb24e0ff80c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9vgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tg9vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:33Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:33 crc kubenswrapper[4732]: I0402 13:39:33.251216 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:33Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:33 crc kubenswrapper[4732]: I0402 13:39:33.263866 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-glqvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57616858-4140-40f0-83e5-388787b685b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af55257a6baf7f9bcba13eb0d7aae49bd0671f182aeb11018802d4712df0ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/dock
er/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqn9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-glqvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:33Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:33 crc kubenswrapper[4732]: I0402 13:39:33.289665 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nqhwm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658a9576-efdc-4e4b-937e-bd63032cbee6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nqhwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:33Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:33 crc kubenswrapper[4732]: I0402 13:39:33.306735 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:33Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:33 crc kubenswrapper[4732]: I0402 13:39:33.323016 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:33Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:33 crc kubenswrapper[4732]: I0402 13:39:33.324587 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/386bd92b-c67e-4cc6-8a47-6f8d6e799bc7-metrics-certs\") pod \"network-metrics-daemon-crx2z\" (UID: \"386bd92b-c67e-4cc6-8a47-6f8d6e799bc7\") " pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:39:33 crc kubenswrapper[4732]: E0402 13:39:33.325792 4732 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 02 13:39:33 crc kubenswrapper[4732]: E0402 13:39:33.325862 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/386bd92b-c67e-4cc6-8a47-6f8d6e799bc7-metrics-certs podName:386bd92b-c67e-4cc6-8a47-6f8d6e799bc7 nodeName:}" failed. 
No retries permitted until 2026-04-02 13:39:35.325842296 +0000 UTC m=+132.230249849 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/386bd92b-c67e-4cc6-8a47-6f8d6e799bc7-metrics-certs") pod "network-metrics-daemon-crx2z" (UID: "386bd92b-c67e-4cc6-8a47-6f8d6e799bc7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 02 13:39:33 crc kubenswrapper[4732]: I0402 13:39:33.340449 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38409e5e-4545-49da-8f6c-4bfb30582878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6vtmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:33Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:33 crc kubenswrapper[4732]: I0402 13:39:33.353326 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s52gj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad206957-df5c-4b3e-bd35-e798a07d2f4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169383eed1546547fe6f683a3607c929733ee7049497b740db106a44ce7d1c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"na
me\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dssdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s52gj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:33Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:33 crc kubenswrapper[4732]: I0402 13:39:33.375676 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6f5e483-7d6b-4d6d-be84-303d8f07643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0d12a454b3463c55541ffa21937cda423517d31435683bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0d12a454b3463c55541ffa21937cda423517d31435683bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8qmgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:33Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:33 crc kubenswrapper[4732]: I0402 13:39:33.387065 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b2l5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7185760-c057-4c47-8da2-60572500a472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-b2l5q\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:33Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:33 crc kubenswrapper[4732]: I0402 13:39:33.402031 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1efc9e3-71c0-47e0-bcf8-4ec41e8e25af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf50344e79adec991160632c244c40543c468db050f176949b5840a16d1777a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e000359211728ef1d437757b0246d85c224dae170131ba8bd221fa84dafa026c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://bf6191436462297532ffdc93e43468bbe37e850115ddcbcd0265c07359ac845a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-04-02T13:39:23Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0402 13:39:23.616086 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0402 13:39:23.616554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0402 13:39:23.618421 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-69827246/tls.crt::/tmp/serving-cert-69827246/tls.key\\\\\\\"\\\\nI0402 13:39:23.816116 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0402 13:39:23.819962 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0402 13:39:23.820008 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0402 13:39:23.820046 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0402 13:39:23.820061 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0402 13:39:23.827938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0402 13:39:23.827970 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827984 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0402 13:39:23.827988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0402 13:39:23.828005 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0402 13:39:23.828009 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0402 13:39:23.828004 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0402 13:39:23.830934 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b46da8b2cda4252f897adfca92275c92924a998d5b1ad57e6897ddea29b4571\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:33Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:33 crc kubenswrapper[4732]: I0402 13:39:33.421225 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://960aab148ced3ecc2648e0b866cb9330dc955e8ab123f3afb95d377e3efa5a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:33Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:33 crc kubenswrapper[4732]: I0402 13:39:33.431379 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-crx2z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"386bd92b-c67e-4cc6-8a47-6f8d6e799bc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-crx2z\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:33Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:33 crc kubenswrapper[4732]: I0402 13:39:33.447520 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:33Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:33 crc kubenswrapper[4732]: I0402 13:39:33.459149 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:33Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:33 crc kubenswrapper[4732]: I0402 13:39:33.471461 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tg9vx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fcb1965-4cef-41a4-8894-3eb24e0ff80c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e38eea7442762874b0db44fa8fcd1ffc4cbe0d149d992139a1fc0bc92dc386d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9vgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tg9vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:33Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:33 crc kubenswrapper[4732]: I0402 13:39:33.500544 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37b84e7e2a23e3e100ff5bc979d4126bb3044a27f802ccc21c171516554dee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abfadcbcfff56cc9a6bf0c4068f03e943dea5db96705cd3fba2aea8cbd39111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:33Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:33 crc kubenswrapper[4732]: I0402 13:39:33.509731 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-glqvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57616858-4140-40f0-83e5-388787b685b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af55257a6baf7f9bcba13eb0d7aae49bd0671f182aeb11018802d4712df0ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqn9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-glqvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:33Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:33 crc kubenswrapper[4732]: I0402 13:39:33.527652 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nqhwm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658a9576-efdc-4e4b-937e-bd63032cbee6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de69c4d183706f6729301a7859d6c312fdedb379c0c9b4aac3d4061f89b82067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de69c4d183706f6729301a7859d6c312fdedb379c0c9b4aac3d4061f89b82067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\
\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nqhwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:33Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:33 crc kubenswrapper[4732]: I0402 13:39:33.541739 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:33Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:33 crc kubenswrapper[4732]: I0402 13:39:33.555175 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:33Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:33 crc kubenswrapper[4732]: I0402 13:39:33.570591 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38409e5e-4545-49da-8f6c-4bfb30582878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8b639c60ac006909c664208b4a5aa24df48983728ace3f6722514164e330a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09dd989931f88ecf396194a22057cfe8c07809db
42ff1c2b861bf3aa5db1673f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6vtmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:33Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:33 crc kubenswrapper[4732]: I0402 13:39:33.590708 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s52gj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad206957-df5c-4b3e-bd35-e798a07d2f4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169383eed1546547fe6f683a3607c929733ee7049497b740db106a44ce7d1c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dssdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s52gj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:33Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:33 crc kubenswrapper[4732]: I0402 13:39:33.618017 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6f5e483-7d6b-4d6d-be84-303d8f07643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0d12a454b3463c55541ffa21937cda423517d31435683bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0d12a454b3463c55541ffa21937cda423517d31435683bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8qmgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:33Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:33 crc kubenswrapper[4732]: I0402 13:39:33.635976 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b2l5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7185760-c057-4c47-8da2-60572500a472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76490beb8e54fc846562aaf130168383c45dfe45e9582e3d87c36c38a69b8fc\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bec3eb12ab864fa88c229cc24bf8ab853d0dc249a18430163a45fbe02e09c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.1
68.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-b2l5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:33Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:33 crc kubenswrapper[4732]: I0402 13:39:33.651196 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1efc9e3-71c0-47e0-bcf8-4ec41e8e25af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf50344e79adec991160632c244c40543c468db050f176949b5840a16d1777a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e000359211728ef1d437757b0246d85c224dae170131ba8bd221fa84dafa026c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://bf6191436462297532ffdc93e43468bbe37e850115ddcbcd0265c07359ac845a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-04-02T13:39:23Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0402 13:39:23.616086 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0402 13:39:23.616554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0402 13:39:23.618421 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-69827246/tls.crt::/tmp/serving-cert-69827246/tls.key\\\\\\\"\\\\nI0402 13:39:23.816116 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0402 13:39:23.819962 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0402 13:39:23.820008 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0402 13:39:23.820046 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0402 13:39:23.820061 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0402 13:39:23.827938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0402 13:39:23.827970 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827984 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0402 13:39:23.827988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0402 13:39:23.828005 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0402 13:39:23.828009 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0402 13:39:23.828004 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0402 13:39:23.830934 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b46da8b2cda4252f897adfca92275c92924a998d5b1ad57e6897ddea29b4571\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:33Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:33 crc kubenswrapper[4732]: I0402 13:39:33.677579 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://960aab148ced3ecc2648e0b866cb9330dc955e8ab123f3afb95d377e3efa5a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:33Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:33 crc kubenswrapper[4732]: I0402 13:39:33.679803 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:39:33 crc kubenswrapper[4732]: I0402 13:39:33.679836 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 13:39:33 crc kubenswrapper[4732]: I0402 13:39:33.679846 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:39:33 crc kubenswrapper[4732]: E0402 13:39:33.679927 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Apr 02 13:39:33 crc kubenswrapper[4732]: E0402 13:39:33.680092 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Apr 02 13:39:33 crc kubenswrapper[4732]: E0402 13:39:33.680191 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Apr 02 13:39:34 crc kubenswrapper[4732]: I0402 13:39:34.197777 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nqhwm" event={"ID":"658a9576-efdc-4e4b-937e-bd63032cbee6","Type":"ContainerStarted","Data":"d68d8de71d99082a6cf12e05e279f97cf4b0eccd9defa756ac083fcb2c412945"} Apr 02 13:39:34 crc kubenswrapper[4732]: I0402 13:39:34.211048 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1efc9e3-71c0-47e0-bcf8-4ec41e8e25af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf50344e79adec991160632c244c40543c468db050f176949b5840a16d1777a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e000359211728ef1d437757b0246d85c224dae170131ba8bd221fa84dafa026c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf6191436462297532ffdc93e43468bbe37e850115ddcbcd0265c07359ac845a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-04-02T13:39:23Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0402 13:39:23.616086 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0402 13:39:23.616554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0402 13:39:23.618421 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-69827246/tls.crt::/tmp/serving-cert-69827246/tls.key\\\\\\\"\\\\nI0402 13:39:23.816116 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0402 13:39:23.819962 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0402 13:39:23.820008 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0402 13:39:23.820046 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0402 13:39:23.820061 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0402 13:39:23.827938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0402 13:39:23.827970 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827984 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0402 13:39:23.827988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0402 13:39:23.828005 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0402 13:39:23.828009 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0402 13:39:23.828004 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0402 13:39:23.830934 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b46da8b2cda4252f897adfca92275c92924a998d5b1ad57e6897ddea29b4571\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:34Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:34 crc kubenswrapper[4732]: I0402 13:39:34.245591 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://960aab148ced3ecc2648e0b866cb9330dc955e8ab123f3afb95d377e3efa5a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:34Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:34 crc kubenswrapper[4732]: I0402 13:39:34.266484 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38409e5e-4545-49da-8f6c-4bfb30582878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8b639c60ac006909c664208b4a5aa24df48983728ace3f6722514164e330a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09dd989931f88ecf396194a22057cfe8c07809db42ff1c2b861bf3aa5db1673f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6vtmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:34Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:34 crc kubenswrapper[4732]: I0402 13:39:34.281554 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s52gj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad206957-df5c-4b3e-bd35-e798a07d2f4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169383eed1546547fe6f683a3607c929733ee7049497b740db106a44ce7d1c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dssdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s52gj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:34Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:34 crc kubenswrapper[4732]: I0402 13:39:34.298219 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6f5e483-7d6b-4d6d-be84-303d8f07643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0d12a454b3463c55541ffa21937cda423517d31435683bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0d12a454b3463c55541ffa21937cda423517d31435683bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8qmgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:34Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:34 crc kubenswrapper[4732]: I0402 13:39:34.309427 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b2l5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7185760-c057-4c47-8da2-60572500a472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76490beb8e54fc846562aaf130168383c45dfe45e9582e3d87c36c38a69b8fc\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bec3eb12ab864fa88c229cc24bf8ab853d0dc249a18430163a45fbe02e09c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.1
68.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-b2l5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:34Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:34 crc kubenswrapper[4732]: I0402 13:39:34.318725 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-crx2z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"386bd92b-c67e-4cc6-8a47-6f8d6e799bc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-crx2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:34Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:34 crc 
kubenswrapper[4732]: I0402 13:39:34.328298 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:34Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:34 crc kubenswrapper[4732]: I0402 13:39:34.338913 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:34Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:34 crc kubenswrapper[4732]: I0402 13:39:34.352030 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tg9vx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fcb1965-4cef-41a4-8894-3eb24e0ff80c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e38eea7442762874b0db44fa8fcd1ffc4cbe0d149d992139a1fc0bc92dc386d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9vgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tg9vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:34Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:34 crc kubenswrapper[4732]: I0402 13:39:34.363825 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:34Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:34 crc kubenswrapper[4732]: I0402 13:39:34.374219 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37b84e7e2a23e3e100ff5bc979d4126bb3044a27f802ccc21c171516554dee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abfadcbcfff56cc9a6bf0c4068f03e943dea5db96705cd3fba2aea8cbd39111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:34Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:34 crc kubenswrapper[4732]: I0402 13:39:34.383363 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-glqvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57616858-4140-40f0-83e5-388787b685b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af55257a6baf7f9bcba13eb0d7aae49bd0671f182aeb11018802d4712df0ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqn9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-glqvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:34Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:34 crc kubenswrapper[4732]: I0402 13:39:34.397045 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nqhwm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658a9576-efdc-4e4b-937e-bd63032cbee6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de69c4d183706f6729301a7859d6c312fdedb379c0c9b4aac3d4061f89b82067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de69c4d183706f6729301a7859d6c312fdedb379c0c9b4aac3d4061f89b82067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d68d8de71d99082a6cf12e05e279f97cf4b0eccd9defa756ac083fcb2c412945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"m
ountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nqhwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:34Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:34 crc kubenswrapper[4732]: I0402 13:39:34.415692 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:34Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:34 crc kubenswrapper[4732]: I0402 13:39:34.680096 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:39:34 crc kubenswrapper[4732]: E0402 13:39:34.680227 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-crx2z" podUID="386bd92b-c67e-4cc6-8a47-6f8d6e799bc7" Apr 02 13:39:34 crc kubenswrapper[4732]: I0402 13:39:34.696560 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:34Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:34 crc kubenswrapper[4732]: I0402 13:39:34.710973 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:34Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:34 crc kubenswrapper[4732]: I0402 13:39:34.723809 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37b84e7e2a23e3e100ff5bc979d4126bb3044a27f802ccc21c171516554dee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abfadcbcfff56cc9a6bf0c4068f03e943dea5db96705cd3fba2aea8cbd39111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:34Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:34 crc kubenswrapper[4732]: I0402 13:39:34.736604 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-glqvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57616858-4140-40f0-83e5-388787b685b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af55257a6baf7f9bcba13eb0d7aae49bd0671f182aeb11018802d4712df0ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqn9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-glqvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:34Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:34 crc kubenswrapper[4732]: I0402 13:39:34.753952 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nqhwm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658a9576-efdc-4e4b-937e-bd63032cbee6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de69c4d183706f6729301a7859d6c312fdedb379c0c9b4aac3d4061f89b82067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de69c4d183706f6729301a7859d6c312fdedb379c0c9b4aac3d4061f89b82067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d68d8de71d99082a6cf12e05e279f97cf4b0eccd9defa756ac083fcb2c412945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"m
ountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nqhwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:34Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:34 crc kubenswrapper[4732]: I0402 13:39:34.771806 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b2l5q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7185760-c057-4c47-8da2-60572500a472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76490beb8e54fc846562aaf130168383c45dfe45e9582e3d87c36c38a69b8fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bec3eb12ab864fa88c229cc24bf8ab853d0d
c249a18430163a45fbe02e09c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-b2l5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:34Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:34 crc kubenswrapper[4732]: I0402 13:39:34.787387 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1efc9e3-71c0-47e0-bcf8-4ec41e8e25af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf50344e79adec991160632c244c40543c468db050f176949b5840a16d1777a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e000359211728ef1d437757b0246d85c224dae170131ba8bd221fa84dafa026c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf6191436462297532ffdc93e43468bbe37e850115ddcbcd0265c07359ac845a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-04-02T13:39:23Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0402 13:39:23.616086 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0402 13:39:23.616554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0402 13:39:23.618421 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-69827246/tls.crt::/tmp/serving-cert-69827246/tls.key\\\\\\\"\\\\nI0402 13:39:23.816116 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0402 13:39:23.819962 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0402 13:39:23.820008 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0402 13:39:23.820046 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0402 13:39:23.820061 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0402 13:39:23.827938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0402 13:39:23.827970 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827984 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0402 13:39:23.827988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0402 13:39:23.828005 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0402 13:39:23.828009 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0402 13:39:23.828004 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0402 13:39:23.830934 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b46da8b2cda4252f897adfca92275c92924a998d5b1ad57e6897ddea29b4571\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:34Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:34 crc kubenswrapper[4732]: I0402 13:39:34.806119 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://960aab148ced3ecc2648e0b866cb9330dc955e8ab123f3afb95d377e3efa5a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:34Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:34 crc kubenswrapper[4732]: I0402 13:39:34.818821 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38409e5e-4545-49da-8f6c-4bfb30582878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8b639c60ac006909c664208b4a5aa24df48983728ace3f6722514164e330a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09dd989931f88ecf396194a22057cfe8c07809db42ff1c2b861bf3aa5db1673f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6vtmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:34Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:34 crc kubenswrapper[4732]: I0402 13:39:34.833955 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s52gj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad206957-df5c-4b3e-bd35-e798a07d2f4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169383eed1546547fe6f683a3607c929733ee7049497b740db106a44ce7d1c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dssdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s52gj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:34Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:34 crc kubenswrapper[4732]: I0402 13:39:34.852697 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6f5e483-7d6b-4d6d-be84-303d8f07643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0d12a454b3463c55541ffa21937cda423517d31435683bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0d12a454b3463c55541ffa21937cda423517d31435683bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8qmgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:34Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:34 crc kubenswrapper[4732]: I0402 13:39:34.863180 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tg9vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fcb1965-4cef-41a4-8894-3eb24e0ff80c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e38eea7442762874b0db44fa8fcd1ffc4cbe0d149d992139a1fc0bc92dc386d\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9vgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tg9vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:34Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:34 crc kubenswrapper[4732]: I0402 13:39:34.874170 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-crx2z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"386bd92b-c67e-4cc6-8a47-6f8d6e799bc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-crx2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:34Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:34 crc 
kubenswrapper[4732]: I0402 13:39:34.886867 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:34Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:34 crc kubenswrapper[4732]: I0402 13:39:34.899749 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:34Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:34 crc kubenswrapper[4732]: E0402 13:39:34.931701 4732 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Apr 02 13:39:35 crc kubenswrapper[4732]: I0402 13:39:35.113700 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:39:35 crc kubenswrapper[4732]: I0402 13:39:35.113898 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:39:35 crc kubenswrapper[4732]: I0402 13:39:35.113907 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:39:35 crc kubenswrapper[4732]: I0402 13:39:35.113921 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 02 13:39:35 crc kubenswrapper[4732]: I0402 13:39:35.113930 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-02T13:39:35Z","lastTransitionTime":"2026-04-02T13:39:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 02 13:39:35 crc kubenswrapper[4732]: E0402 13:39:35.127521 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69443537-6792-4af0-92ca-9d3f256e3009\\\",\\\"systemUUID\\\":\\\"3be5867b-5df6-4c65-8d4b-c54c471927ff\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:35Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:35 crc kubenswrapper[4732]: I0402 13:39:35.131184 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:39:35 crc kubenswrapper[4732]: I0402 13:39:35.131224 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:39:35 crc kubenswrapper[4732]: I0402 13:39:35.131235 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:39:35 crc kubenswrapper[4732]: I0402 13:39:35.131304 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 02 13:39:35 crc kubenswrapper[4732]: I0402 13:39:35.131321 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-02T13:39:35Z","lastTransitionTime":"2026-04-02T13:39:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 02 13:39:35 crc kubenswrapper[4732]: E0402 13:39:35.151016 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69443537-6792-4af0-92ca-9d3f256e3009\\\",\\\"systemUUID\\\":\\\"3be5867b-5df6-4c65-8d4b-c54c471927ff\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:35Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:35 crc kubenswrapper[4732]: I0402 13:39:35.154636 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:39:35 crc kubenswrapper[4732]: I0402 13:39:35.154691 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:39:35 crc kubenswrapper[4732]: I0402 13:39:35.154703 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:39:35 crc kubenswrapper[4732]: I0402 13:39:35.154727 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 02 13:39:35 crc kubenswrapper[4732]: I0402 13:39:35.154739 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-02T13:39:35Z","lastTransitionTime":"2026-04-02T13:39:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 02 13:39:35 crc kubenswrapper[4732]: E0402 13:39:35.167242 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69443537-6792-4af0-92ca-9d3f256e3009\\\",\\\"systemUUID\\\":\\\"3be5867b-5df6-4c65-8d4b-c54c471927ff\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:35Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:35 crc kubenswrapper[4732]: I0402 13:39:35.170782 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:39:35 crc kubenswrapper[4732]: I0402 13:39:35.170830 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:39:35 crc kubenswrapper[4732]: I0402 13:39:35.170843 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:39:35 crc kubenswrapper[4732]: I0402 13:39:35.170859 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 02 13:39:35 crc kubenswrapper[4732]: I0402 13:39:35.170870 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-02T13:39:35Z","lastTransitionTime":"2026-04-02T13:39:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 02 13:39:35 crc kubenswrapper[4732]: E0402 13:39:35.183725 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69443537-6792-4af0-92ca-9d3f256e3009\\\",\\\"systemUUID\\\":\\\"3be5867b-5df6-4c65-8d4b-c54c471927ff\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:35Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:35 crc kubenswrapper[4732]: I0402 13:39:35.186982 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:39:35 crc kubenswrapper[4732]: I0402 13:39:35.187020 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:39:35 crc kubenswrapper[4732]: I0402 13:39:35.187032 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:39:35 crc kubenswrapper[4732]: I0402 13:39:35.187051 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 02 13:39:35 crc kubenswrapper[4732]: I0402 13:39:35.187067 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-02T13:39:35Z","lastTransitionTime":"2026-04-02T13:39:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 02 13:39:35 crc kubenswrapper[4732]: E0402 13:39:35.198050 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69443537-6792-4af0-92ca-9d3f256e3009\\\",\\\"systemUUID\\\":\\\"3be5867b-5df6-4c65-8d4b-c54c471927ff\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:35Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:35 crc kubenswrapper[4732]: E0402 13:39:35.198290 4732 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Apr 02 13:39:35 crc kubenswrapper[4732]: I0402 13:39:35.201362 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"0e388cf7666048a2a06ac7572afee8a8a73493dc8b315897b9118b66dfe72b69"} Apr 02 13:39:35 crc kubenswrapper[4732]: I0402 13:39:35.203242 4732 generic.go:334] "Generic (PLEG): container finished" podID="658a9576-efdc-4e4b-937e-bd63032cbee6" containerID="d68d8de71d99082a6cf12e05e279f97cf4b0eccd9defa756ac083fcb2c412945" exitCode=0 Apr 02 13:39:35 crc kubenswrapper[4732]: I0402 13:39:35.203295 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nqhwm" event={"ID":"658a9576-efdc-4e4b-937e-bd63032cbee6","Type":"ContainerDied","Data":"d68d8de71d99082a6cf12e05e279f97cf4b0eccd9defa756ac083fcb2c412945"} Apr 02 13:39:35 crc kubenswrapper[4732]: I0402 13:39:35.217760 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:35Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:35 crc kubenswrapper[4732]: I0402 13:39:35.230184 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37b84e7e2a23e3e100ff5bc979d4126bb3044a27f802ccc21c171516554dee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abfadcbcfff56cc9a6bf0c4068f03e943dea5db96705cd3fba2aea8cbd39111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:35Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:35 crc kubenswrapper[4732]: I0402 13:39:35.241717 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-glqvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57616858-4140-40f0-83e5-388787b685b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af55257a6baf7f9bcba13eb0d7aae49bd0671f182aeb11018802d4712df0ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqn9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-glqvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:35Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:35 crc kubenswrapper[4732]: I0402 13:39:35.243249 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 02 13:39:35 crc kubenswrapper[4732]: I0402 13:39:35.243346 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:39:35 crc kubenswrapper[4732]: I0402 13:39:35.243374 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:39:35 crc kubenswrapper[4732]: I0402 13:39:35.243394 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" 
(UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:39:35 crc kubenswrapper[4732]: I0402 13:39:35.243419 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 13:39:35 crc kubenswrapper[4732]: E0402 13:39:35.243554 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 02 13:39:35 crc kubenswrapper[4732]: E0402 13:39:35.243574 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 02 13:39:35 crc kubenswrapper[4732]: E0402 13:39:35.243586 4732 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 02 13:39:35 crc kubenswrapper[4732]: E0402 13:39:35.243696 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-04-02 13:39:39.243633288 +0000 UTC m=+136.148040841 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 02 13:39:35 crc kubenswrapper[4732]: E0402 13:39:35.243971 4732 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Apr 02 13:39:35 crc kubenswrapper[4732]: E0402 13:39:35.244028 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-04-02 13:39:39.244013178 +0000 UTC m=+136.148420731 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Apr 02 13:39:35 crc kubenswrapper[4732]: E0402 13:39:35.244082 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-02 13:39:39.244076319 +0000 UTC m=+136.148483872 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:39:35 crc kubenswrapper[4732]: E0402 13:39:35.244117 4732 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Apr 02 13:39:35 crc kubenswrapper[4732]: E0402 13:39:35.244136 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-04-02 13:39:39.244130731 +0000 UTC m=+136.148538284 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Apr 02 13:39:35 crc kubenswrapper[4732]: E0402 13:39:35.244237 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 02 13:39:35 crc kubenswrapper[4732]: E0402 13:39:35.244264 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 02 13:39:35 crc kubenswrapper[4732]: E0402 13:39:35.244275 4732 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 02 13:39:35 crc kubenswrapper[4732]: E0402 13:39:35.244323 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-04-02 13:39:39.244309135 +0000 UTC m=+136.148716678 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 02 13:39:35 crc kubenswrapper[4732]: I0402 13:39:35.258466 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nqhwm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658a9576-efdc-4e4b-937e-bd63032cbee6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de69c4d183706f6729301a7859d6c312fdedb379c0c9b4aac3d4061f89b82067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de69c4d183706f6729301a7859d6c312fdedb379c0c9b4aac3d4061f89b82067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d68d8de71d99082a6cf12e05e279f97cf4b0eccd9defa756ac083fcb2c412945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nqhwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:35Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:35 crc kubenswrapper[4732]: I0402 13:39:35.271994 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:35Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:35 crc kubenswrapper[4732]: I0402 13:39:35.285102 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1efc9e3-71c0-47e0-bcf8-4ec41e8e25af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf50344e79adec991160632c244c40543c468db050f176949b5840a16d1777a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e000359211728ef1d437757b0246d85c224dae170131ba8bd221fa84dafa026c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://bf6191436462297532ffdc93e43468bbe37e850115ddcbcd0265c07359ac845a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-04-02T13:39:23Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0402 13:39:23.616086 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0402 13:39:23.616554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0402 13:39:23.618421 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-69827246/tls.crt::/tmp/serving-cert-69827246/tls.key\\\\\\\"\\\\nI0402 13:39:23.816116 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0402 13:39:23.819962 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0402 13:39:23.820008 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0402 13:39:23.820046 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0402 13:39:23.820061 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0402 13:39:23.827938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0402 13:39:23.827970 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827984 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0402 13:39:23.827988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0402 13:39:23.828005 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0402 13:39:23.828009 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0402 13:39:23.828004 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0402 13:39:23.830934 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b46da8b2cda4252f897adfca92275c92924a998d5b1ad57e6897ddea29b4571\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:35Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:35 crc kubenswrapper[4732]: I0402 13:39:35.296445 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://960aab148ced3ecc2648e0b866cb9330dc955e8ab123f3afb95d377e3efa5a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:35Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:35 crc kubenswrapper[4732]: I0402 13:39:35.309211 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38409e5e-4545-49da-8f6c-4bfb30582878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8b639c60ac006909c664208b4a5aa24df48983728ace3f6722514164e330a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09dd989931f88ecf396194a22057cfe8c07809db
42ff1c2b861bf3aa5db1673f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6vtmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:35Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:35 crc kubenswrapper[4732]: I0402 13:39:35.321370 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s52gj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad206957-df5c-4b3e-bd35-e798a07d2f4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169383eed1546547fe6f683a3607c929733ee7049497b740db106a44ce7d1c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dssdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s52gj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:35Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:35 crc kubenswrapper[4732]: I0402 13:39:35.339972 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6f5e483-7d6b-4d6d-be84-303d8f07643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0d12a454b3463c55541ffa21937cda423517d31435683bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0d12a454b3463c55541ffa21937cda423517d31435683bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8qmgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:35Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:35 crc kubenswrapper[4732]: I0402 13:39:35.344288 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/386bd92b-c67e-4cc6-8a47-6f8d6e799bc7-metrics-certs\") pod \"network-metrics-daemon-crx2z\" (UID: \"386bd92b-c67e-4cc6-8a47-6f8d6e799bc7\") " pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:39:35 crc kubenswrapper[4732]: E0402 13:39:35.344501 4732 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 02 13:39:35 crc kubenswrapper[4732]: E0402 13:39:35.344641 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/386bd92b-c67e-4cc6-8a47-6f8d6e799bc7-metrics-certs podName:386bd92b-c67e-4cc6-8a47-6f8d6e799bc7 nodeName:}" failed. No retries permitted until 2026-04-02 13:39:39.344602996 +0000 UTC m=+136.249010549 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/386bd92b-c67e-4cc6-8a47-6f8d6e799bc7-metrics-certs") pod "network-metrics-daemon-crx2z" (UID: "386bd92b-c67e-4cc6-8a47-6f8d6e799bc7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 02 13:39:35 crc kubenswrapper[4732]: I0402 13:39:35.353121 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b2l5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7185760-c057-4c47-8da2-60572500a472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76490beb8e54fc846562aaf130168383c45dfe45e9582e3d87c36c38a69b8fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bec3eb12ab864fa88c229cc24bf8ab853d0dc249a18430163a45fbe02e09c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-b2l5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-04-02T13:39:35Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:35 crc kubenswrapper[4732]: I0402 13:39:35.363350 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-crx2z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"386bd92b-c67e-4cc6-8a47-6f8d6e799bc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-crx2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:35Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:35 crc 
kubenswrapper[4732]: I0402 13:39:35.399755 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:35Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:35 crc kubenswrapper[4732]: I0402 13:39:35.438277 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e388cf7666048a2a06ac7572afee8a8a73493dc8b315897b9118b66dfe72b69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-04-02T13:39:35Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:35 crc kubenswrapper[4732]: I0402 13:39:35.474581 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tg9vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fcb1965-4cef-41a4-8894-3eb24e0ff80c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e38eea7442762874b0db44fa8fcd1ffc4cbe0d149d992139a1fc0bc92dc386d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-w9vgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tg9vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:35Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:35 crc kubenswrapper[4732]: I0402 13:39:35.514513 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-crx2z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"386bd92b-c67e-4cc6-8a47-6f8d6e799bc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-crx2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:35Z is after 
2025-08-24T17:21:41Z" Apr 02 13:39:35 crc kubenswrapper[4732]: I0402 13:39:35.555851 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:35Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:35 crc kubenswrapper[4732]: I0402 13:39:35.597212 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e388cf7666048a2a06ac7572afee8a8a73493dc8b315897b9118b66dfe72b69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-04-02T13:39:35Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:35 crc kubenswrapper[4732]: I0402 13:39:35.633185 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tg9vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fcb1965-4cef-41a4-8894-3eb24e0ff80c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e38eea7442762874b0db44fa8fcd1ffc4cbe0d149d992139a1fc0bc92dc386d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-w9vgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tg9vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:35Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:35 crc kubenswrapper[4732]: I0402 13:39:35.674975 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:35Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:35 crc kubenswrapper[4732]: I0402 13:39:35.679311 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:39:35 crc kubenswrapper[4732]: I0402 13:39:35.679401 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 13:39:35 crc kubenswrapper[4732]: E0402 13:39:35.679472 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Apr 02 13:39:35 crc kubenswrapper[4732]: I0402 13:39:35.679538 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:39:35 crc kubenswrapper[4732]: E0402 13:39:35.679598 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Apr 02 13:39:35 crc kubenswrapper[4732]: E0402 13:39:35.679851 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Apr 02 13:39:35 crc kubenswrapper[4732]: I0402 13:39:35.716064 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37b84e7e2a23e3e100ff5bc979d4126bb3044a27f802ccc21c171516554dee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":
\\\"cri-o://0abfadcbcfff56cc9a6bf0c4068f03e943dea5db96705cd3fba2aea8cbd39111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:35Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:35 crc kubenswrapper[4732]: I0402 13:39:35.759332 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-glqvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57616858-4140-40f0-83e5-388787b685b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af55257a6baf7f9bcba13eb0d7aae49bd0671f182aeb11018802d4712df0ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqn9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-glqvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:35Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:35 crc kubenswrapper[4732]: I0402 13:39:35.798716 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nqhwm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658a9576-efdc-4e4b-937e-bd63032cbee6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de69c4d183706f6729301a7859d6c312fdedb379c0c9b4aac3d4061f89b82067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de69c4d183706f6729301a7859d6c312fdedb379c0c9b4aac3d4061f89b82067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d68d8de71d99082a6cf12e05e279f97cf4b0eccd9defa756ac083fcb2c412945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d68d8de71d99082a6cf12e05e279f97cf4b0eccd9defa756ac083fcb2c412945\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nqhwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:35Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:35 crc kubenswrapper[4732]: I0402 13:39:35.836375 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:35Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:35 crc kubenswrapper[4732]: I0402 13:39:35.877163 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1efc9e3-71c0-47e0-bcf8-4ec41e8e25af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf50344e79adec991160632c244c40543c468db050f176949b5840a16d1777a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e000359211728ef1d437757b0246d85c224dae170131ba8bd221fa84dafa026c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://bf6191436462297532ffdc93e43468bbe37e850115ddcbcd0265c07359ac845a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-04-02T13:39:23Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0402 13:39:23.616086 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0402 13:39:23.616554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0402 13:39:23.618421 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-69827246/tls.crt::/tmp/serving-cert-69827246/tls.key\\\\\\\"\\\\nI0402 13:39:23.816116 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0402 13:39:23.819962 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0402 13:39:23.820008 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0402 13:39:23.820046 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0402 13:39:23.820061 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0402 13:39:23.827938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0402 13:39:23.827970 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827984 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0402 13:39:23.827988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0402 13:39:23.828005 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0402 13:39:23.828009 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0402 13:39:23.828004 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0402 13:39:23.830934 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b46da8b2cda4252f897adfca92275c92924a998d5b1ad57e6897ddea29b4571\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:35Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:35 crc kubenswrapper[4732]: I0402 13:39:35.915345 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://960aab148ced3ecc2648e0b866cb9330dc955e8ab123f3afb95d377e3efa5a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:35Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:35 crc kubenswrapper[4732]: I0402 13:39:35.955159 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38409e5e-4545-49da-8f6c-4bfb30582878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8b639c60ac006909c664208b4a5aa24df48983728ace3f6722514164e330a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09dd989931f88ecf396194a22057cfe8c07809db
42ff1c2b861bf3aa5db1673f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6vtmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:35Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:35 crc kubenswrapper[4732]: I0402 13:39:35.995985 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s52gj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad206957-df5c-4b3e-bd35-e798a07d2f4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169383eed1546547fe6f683a3607c929733ee7049497b740db106a44ce7d1c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dssdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s52gj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:35Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:36 crc kubenswrapper[4732]: I0402 13:39:36.038518 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6f5e483-7d6b-4d6d-be84-303d8f07643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0d12a454b3463c55541ffa21937cda423517d31435683bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0d12a454b3463c55541ffa21937cda423517d31435683bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8qmgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:36Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:36 crc kubenswrapper[4732]: I0402 13:39:36.076984 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b2l5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7185760-c057-4c47-8da2-60572500a472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76490beb8e54fc846562aaf130168383c45dfe45e9582e3d87c36c38a69b8fc\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bec3eb12ab864fa88c229cc24bf8ab853d0dc249a18430163a45fbe02e09c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.1
68.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-b2l5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:36Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:36 crc kubenswrapper[4732]: I0402 13:39:36.208200 4732 generic.go:334] "Generic (PLEG): container finished" podID="658a9576-efdc-4e4b-937e-bd63032cbee6" containerID="19d1db75c9d54c6d8d304e28faa91270a4d70096596100fb97d5a6ea74f7ad8a" exitCode=0 Apr 02 13:39:36 crc kubenswrapper[4732]: I0402 13:39:36.208267 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nqhwm" event={"ID":"658a9576-efdc-4e4b-937e-bd63032cbee6","Type":"ContainerDied","Data":"19d1db75c9d54c6d8d304e28faa91270a4d70096596100fb97d5a6ea74f7ad8a"} Apr 02 13:39:36 crc kubenswrapper[4732]: I0402 13:39:36.212880 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" event={"ID":"a6f5e483-7d6b-4d6d-be84-303d8f07643e","Type":"ContainerStarted","Data":"2b9f2f9d419baeac3be35c4c0c40c012b721c99c2f89fb2638c635c41c5e93d7"} Apr 02 13:39:36 crc kubenswrapper[4732]: I0402 13:39:36.220590 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-crx2z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"386bd92b-c67e-4cc6-8a47-6f8d6e799bc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-crx2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:36Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:36 crc 
kubenswrapper[4732]: I0402 13:39:36.233427 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:36Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:36 crc kubenswrapper[4732]: I0402 13:39:36.244683 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e388cf7666048a2a06ac7572afee8a8a73493dc8b315897b9118b66dfe72b69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-04-02T13:39:36Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:36 crc kubenswrapper[4732]: I0402 13:39:36.263694 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tg9vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fcb1965-4cef-41a4-8894-3eb24e0ff80c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e38eea7442762874b0db44fa8fcd1ffc4cbe0d149d992139a1fc0bc92dc386d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-w9vgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tg9vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:36Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:36 crc kubenswrapper[4732]: I0402 13:39:36.275193 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:36Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:36 crc kubenswrapper[4732]: I0402 13:39:36.316718 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37b84e7e2a23e3e100ff5bc979d4126bb3044a27f802ccc21c171516554dee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abfadcbcfff56cc9a6bf0c4068f03e943dea5db96705cd3fba2aea8cbd39111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:36Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:36 crc kubenswrapper[4732]: I0402 13:39:36.354855 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-glqvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57616858-4140-40f0-83e5-388787b685b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af55257a6baf7f9bcba13eb0d7aae49bd0671f182aeb11018802d4712df0ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqn9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-glqvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:36Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:36 crc kubenswrapper[4732]: I0402 13:39:36.397592 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nqhwm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658a9576-efdc-4e4b-937e-bd63032cbee6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de69c4d183706f6729301a7859d6c312fdedb379c0c9b4aac3d4061f89b82067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de69c4d183706f6729301a7859d6c312fdedb379c0c9b4aac3d4061f89b82067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d68d8de71d99082a6cf12e05e279f97cf4b0eccd9defa756ac083fcb2c412945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d68d8de71d99082a6cf12e05e279f97cf4b0eccd9defa756ac083fcb2c412945\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d1db75c9d54c6d8d304e28faa91270a4d70096596100fb97d5a6ea74f7ad8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19d1db75c9d54c6d8d304e28faa91270a4d70096596100fb97d5a6ea74f7ad8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nqhwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:36Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:36 crc kubenswrapper[4732]: I0402 
13:39:36.436381 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:36Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:36 crc kubenswrapper[4732]: I0402 13:39:36.475297 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1efc9e3-71c0-47e0-bcf8-4ec41e8e25af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf50344e79adec991160632c244c40543c468db050f176949b5840a16d1777a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e000359211728ef1d437757b0246d85c224dae170131ba8bd221fa84dafa026c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://bf6191436462297532ffdc93e43468bbe37e850115ddcbcd0265c07359ac845a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-04-02T13:39:23Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0402 13:39:23.616086 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0402 13:39:23.616554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0402 13:39:23.618421 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-69827246/tls.crt::/tmp/serving-cert-69827246/tls.key\\\\\\\"\\\\nI0402 13:39:23.816116 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0402 13:39:23.819962 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0402 13:39:23.820008 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0402 13:39:23.820046 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0402 13:39:23.820061 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0402 13:39:23.827938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0402 13:39:23.827970 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827984 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0402 13:39:23.827988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0402 13:39:23.828005 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0402 13:39:23.828009 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0402 13:39:23.828004 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0402 13:39:23.830934 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b46da8b2cda4252f897adfca92275c92924a998d5b1ad57e6897ddea29b4571\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:36Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:36 crc kubenswrapper[4732]: I0402 13:39:36.519007 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://960aab148ced3ecc2648e0b866cb9330dc955e8ab123f3afb95d377e3efa5a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:36Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:36 crc kubenswrapper[4732]: I0402 13:39:36.556398 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38409e5e-4545-49da-8f6c-4bfb30582878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8b639c60ac006909c664208b4a5aa24df48983728ace3f6722514164e330a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09dd989931f88ecf396194a22057cfe8c07809db
42ff1c2b861bf3aa5db1673f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6vtmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:36Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:36 crc kubenswrapper[4732]: I0402 13:39:36.596188 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s52gj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad206957-df5c-4b3e-bd35-e798a07d2f4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169383eed1546547fe6f683a3607c929733ee7049497b740db106a44ce7d1c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dssdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s52gj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:36Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:36 crc kubenswrapper[4732]: I0402 13:39:36.639410 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6f5e483-7d6b-4d6d-be84-303d8f07643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0d12a454b3463c55541ffa21937cda423517d31435683bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0d12a454b3463c55541ffa21937cda423517d31435683bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8qmgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:36Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:36 crc kubenswrapper[4732]: I0402 13:39:36.673295 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b2l5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7185760-c057-4c47-8da2-60572500a472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76490beb8e54fc846562aaf130168383c45dfe45e9582e3d87c36c38a69b8fc\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bec3eb12ab864fa88c229cc24bf8ab853d0dc249a18430163a45fbe02e09c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.1
68.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-b2l5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:36Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:36 crc kubenswrapper[4732]: I0402 13:39:36.679558 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:39:36 crc kubenswrapper[4732]: E0402 13:39:36.679701 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-crx2z" podUID="386bd92b-c67e-4cc6-8a47-6f8d6e799bc7" Apr 02 13:39:37 crc kubenswrapper[4732]: I0402 13:39:37.218448 4732 generic.go:334] "Generic (PLEG): container finished" podID="658a9576-efdc-4e4b-937e-bd63032cbee6" containerID="bbddc88e12f323552ea813b6440e7770a844e187f892c11778a4a82241db38ce" exitCode=0 Apr 02 13:39:37 crc kubenswrapper[4732]: I0402 13:39:37.218507 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nqhwm" event={"ID":"658a9576-efdc-4e4b-937e-bd63032cbee6","Type":"ContainerDied","Data":"bbddc88e12f323552ea813b6440e7770a844e187f892c11778a4a82241db38ce"} Apr 02 13:39:37 crc kubenswrapper[4732]: I0402 13:39:37.240214 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:37Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:37 crc kubenswrapper[4732]: I0402 13:39:37.255554 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:37Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:37 crc kubenswrapper[4732]: I0402 13:39:37.275891 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37b84e7e2a23e3e100ff5bc979d4126bb3044a27f802ccc21c171516554dee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abfadcbcfff56cc9a6bf0c4068f03e943dea5db96705cd3fba2aea8cbd39111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:37Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:37 crc kubenswrapper[4732]: I0402 13:39:37.289346 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-glqvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57616858-4140-40f0-83e5-388787b685b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af55257a6baf7f9bcba13eb0d7aae49bd0671f182aeb11018802d4712df0ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqn9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-glqvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:37Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:37 crc kubenswrapper[4732]: I0402 13:39:37.302830 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nqhwm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658a9576-efdc-4e4b-937e-bd63032cbee6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de69c4d183706f6729301a7859d6c312fdedb379c0c9b4aac3d4061f89b82067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de69c4d183706f6729301a7859d6c312fdedb379c0c9b4aac3d4061f89b82067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d68d8de71d99082a6cf12e05e279f97cf4b0eccd9defa756ac083fcb2c412945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d68d8de71d99082a6cf12e05e279f97cf4b0eccd9defa756ac083fcb2c412945\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d1db75c9d54c6d8d304e28faa91270a4d70096596100fb97d5a6ea74f7ad8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19d1db75c9d54c6d8d304e28faa91270a4d70096596100fb97d5a6ea74f7ad8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbddc88e12f323552ea813b6440e7770a844e187f892c11778a4a82241db38ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbddc88e12f323552ea813b6440e7770a844e187f892c11778a4a82241db38ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-nqhwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:37Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:37 crc kubenswrapper[4732]: I0402 13:39:37.314701 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b2l5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7185760-c057-4c47-8da2-60572500a472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76490beb8e54fc846562aaf130168383c45dfe45e9582e3d87c36c38a69b8fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-
proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bec3eb12ab864fa88c229cc24bf8ab853d0dc249a18430163a45fbe02e09c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-b2l5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:37Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:37 crc kubenswrapper[4732]: I0402 13:39:37.331690 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1efc9e3-71c0-47e0-bcf8-4ec41e8e25af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf50344e79adec991160632c244c40543c468db050f176949b5840a16d1777a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e000359211728ef1d437757b0246d85c224dae170131ba8bd221fa84dafa026c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://bf6191436462297532ffdc93e43468bbe37e850115ddcbcd0265c07359ac845a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-04-02T13:39:23Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0402 13:39:23.616086 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0402 13:39:23.616554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0402 13:39:23.618421 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-69827246/tls.crt::/tmp/serving-cert-69827246/tls.key\\\\\\\"\\\\nI0402 13:39:23.816116 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0402 13:39:23.819962 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0402 13:39:23.820008 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0402 13:39:23.820046 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0402 13:39:23.820061 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0402 13:39:23.827938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0402 13:39:23.827970 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827984 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0402 13:39:23.827988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0402 13:39:23.828005 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0402 13:39:23.828009 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0402 13:39:23.828004 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0402 13:39:23.830934 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b46da8b2cda4252f897adfca92275c92924a998d5b1ad57e6897ddea29b4571\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:37Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:37 crc kubenswrapper[4732]: I0402 13:39:37.354698 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://960aab148ced3ecc2648e0b866cb9330dc955e8ab123f3afb95d377e3efa5a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:37Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:37 crc kubenswrapper[4732]: I0402 13:39:37.375379 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38409e5e-4545-49da-8f6c-4bfb30582878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8b639c60ac006909c664208b4a5aa24df48983728ace3f6722514164e330a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09dd989931f88ecf396194a22057cfe8c07809db
42ff1c2b861bf3aa5db1673f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6vtmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:37Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:37 crc kubenswrapper[4732]: I0402 13:39:37.392295 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s52gj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad206957-df5c-4b3e-bd35-e798a07d2f4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169383eed1546547fe6f683a3607c929733ee7049497b740db106a44ce7d1c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dssdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s52gj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:37Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:37 crc kubenswrapper[4732]: I0402 13:39:37.408911 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6f5e483-7d6b-4d6d-be84-303d8f07643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0d12a454b3463c55541ffa21937cda423517d31435683bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0d12a454b3463c55541ffa21937cda423517d31435683bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8qmgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:37Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:37 crc kubenswrapper[4732]: I0402 13:39:37.418975 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tg9vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fcb1965-4cef-41a4-8894-3eb24e0ff80c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e38eea7442762874b0db44fa8fcd1ffc4cbe0d149d992139a1fc0bc92dc386d\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9vgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tg9vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:37Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:37 crc kubenswrapper[4732]: I0402 13:39:37.428491 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-crx2z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"386bd92b-c67e-4cc6-8a47-6f8d6e799bc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-crx2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:37Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:37 crc 
kubenswrapper[4732]: I0402 13:39:37.438962 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:37Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:37 crc kubenswrapper[4732]: I0402 13:39:37.448240 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e388cf7666048a2a06ac7572afee8a8a73493dc8b315897b9118b66dfe72b69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-04-02T13:39:37Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:38 crc kubenswrapper[4732]: I0402 13:39:37.679360 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 13:39:38 crc kubenswrapper[4732]: E0402 13:39:37.679503 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Apr 02 13:39:38 crc kubenswrapper[4732]: I0402 13:39:37.679572 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:39:38 crc kubenswrapper[4732]: I0402 13:39:37.679598 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:39:38 crc kubenswrapper[4732]: E0402 13:39:37.679744 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Apr 02 13:39:38 crc kubenswrapper[4732]: E0402 13:39:37.679800 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Apr 02 13:39:38 crc kubenswrapper[4732]: I0402 13:39:38.680184 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:39:38 crc kubenswrapper[4732]: E0402 13:39:38.681039 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-crx2z" podUID="386bd92b-c67e-4cc6-8a47-6f8d6e799bc7" Apr 02 13:39:39 crc kubenswrapper[4732]: I0402 13:39:39.312883 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 02 13:39:39 crc kubenswrapper[4732]: I0402 13:39:39.313021 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:39:39 crc kubenswrapper[4732]: E0402 13:39:39.313060 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-02 13:39:47.313040611 +0000 UTC m=+144.217448164 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:39:39 crc kubenswrapper[4732]: I0402 13:39:39.313087 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:39:39 crc kubenswrapper[4732]: I0402 13:39:39.313118 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:39:39 crc kubenswrapper[4732]: I0402 13:39:39.313152 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 13:39:39 crc kubenswrapper[4732]: E0402 13:39:39.313093 4732 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not 
registered Apr 02 13:39:39 crc kubenswrapper[4732]: E0402 13:39:39.313231 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 02 13:39:39 crc kubenswrapper[4732]: E0402 13:39:39.313242 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-04-02 13:39:47.313232726 +0000 UTC m=+144.217640279 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Apr 02 13:39:39 crc kubenswrapper[4732]: E0402 13:39:39.313249 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 02 13:39:39 crc kubenswrapper[4732]: E0402 13:39:39.313262 4732 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 02 13:39:39 crc kubenswrapper[4732]: E0402 13:39:39.313285 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 02 13:39:39 crc kubenswrapper[4732]: E0402 13:39:39.313316 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 02 13:39:39 crc kubenswrapper[4732]: E0402 13:39:39.313328 4732 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 02 13:39:39 crc kubenswrapper[4732]: E0402 13:39:39.313295 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-04-02 13:39:47.313283737 +0000 UTC m=+144.217691290 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 02 13:39:39 crc kubenswrapper[4732]: E0402 13:39:39.313400 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-04-02 13:39:47.31337815 +0000 UTC m=+144.217785793 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 02 13:39:39 crc kubenswrapper[4732]: E0402 13:39:39.313154 4732 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Apr 02 13:39:39 crc kubenswrapper[4732]: E0402 13:39:39.313442 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-04-02 13:39:47.313435081 +0000 UTC m=+144.217842634 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Apr 02 13:39:39 crc kubenswrapper[4732]: I0402 13:39:39.413664 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/386bd92b-c67e-4cc6-8a47-6f8d6e799bc7-metrics-certs\") pod \"network-metrics-daemon-crx2z\" (UID: \"386bd92b-c67e-4cc6-8a47-6f8d6e799bc7\") " pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:39:39 crc kubenswrapper[4732]: E0402 13:39:39.413919 4732 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 02 13:39:39 crc kubenswrapper[4732]: E0402 13:39:39.414053 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/386bd92b-c67e-4cc6-8a47-6f8d6e799bc7-metrics-certs podName:386bd92b-c67e-4cc6-8a47-6f8d6e799bc7 nodeName:}" failed. No retries permitted until 2026-04-02 13:39:47.414022609 +0000 UTC m=+144.318430202 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/386bd92b-c67e-4cc6-8a47-6f8d6e799bc7-metrics-certs") pod "network-metrics-daemon-crx2z" (UID: "386bd92b-c67e-4cc6-8a47-6f8d6e799bc7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 02 13:39:39 crc kubenswrapper[4732]: I0402 13:39:39.679554 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:39:39 crc kubenswrapper[4732]: I0402 13:39:39.679575 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:39:39 crc kubenswrapper[4732]: E0402 13:39:39.680044 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Apr 02 13:39:39 crc kubenswrapper[4732]: I0402 13:39:39.679647 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 13:39:39 crc kubenswrapper[4732]: E0402 13:39:39.680177 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Apr 02 13:39:39 crc kubenswrapper[4732]: E0402 13:39:39.680274 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Apr 02 13:39:39 crc kubenswrapper[4732]: E0402 13:39:39.933527 4732 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Apr 02 13:39:40 crc kubenswrapper[4732]: I0402 13:39:40.421111 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" event={"ID":"a6f5e483-7d6b-4d6d-be84-303d8f07643e","Type":"ContainerStarted","Data":"28d7b0869af498c3b2a34b70d9bf3e997eb93afc6c7ad7ab005b8cf1f08bc32a"} Apr 02 13:39:40 crc kubenswrapper[4732]: I0402 13:39:40.425955 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nqhwm" event={"ID":"658a9576-efdc-4e4b-937e-bd63032cbee6","Type":"ContainerStarted","Data":"4b9df3b256d9a633666a3195e4fd0c9f409cfeb941932f46356e1a8e2e98d706"} Apr 02 13:39:40 crc kubenswrapper[4732]: I0402 13:39:40.440967 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://960aab148ced3ecc2648e0b866cb9330dc955e8ab123f3afb95d377e3efa5a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:40Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:40 crc kubenswrapper[4732]: I0402 13:39:40.452704 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38409e5e-4545-49da-8f6c-4bfb30582878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8b639c60ac006909c664208b4a5aa24df48983728ace3f6722514164e330a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09dd989931f88ecf396194a22057cfe8c07809db42ff1c2b861bf3aa5db1673f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6vtmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:40Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:40 crc kubenswrapper[4732]: I0402 13:39:40.470773 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s52gj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad206957-df5c-4b3e-bd35-e798a07d2f4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169383eed1546547fe6f683a3607c929733ee7049497b740db106a44ce7d1c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dssdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s52gj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:40Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:40 crc kubenswrapper[4732]: I0402 13:39:40.493685 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6f5e483-7d6b-4d6d-be84-303d8f07643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0d12a454b3463c55541ffa21937cda423517d31435683bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0d12a454b3463c55541ffa21937cda423517d31435683bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8qmgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:40Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:40 crc kubenswrapper[4732]: I0402 13:39:40.507184 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b2l5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7185760-c057-4c47-8da2-60572500a472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76490beb8e54fc846562aaf130168383c45dfe45e9582e3d87c36c38a69b8fc\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bec3eb12ab864fa88c229cc24bf8ab853d0dc249a18430163a45fbe02e09c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.1
68.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-b2l5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:40Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:40 crc kubenswrapper[4732]: I0402 13:39:40.520567 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1efc9e3-71c0-47e0-bcf8-4ec41e8e25af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf50344e79adec991160632c244c40543c468db050f176949b5840a16d1777a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e000359211728ef1d437757b0246d85c224dae170131ba8bd221fa84dafa026c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://bf6191436462297532ffdc93e43468bbe37e850115ddcbcd0265c07359ac845a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-04-02T13:39:23Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0402 13:39:23.616086 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0402 13:39:23.616554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0402 13:39:23.618421 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-69827246/tls.crt::/tmp/serving-cert-69827246/tls.key\\\\\\\"\\\\nI0402 13:39:23.816116 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0402 13:39:23.819962 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0402 13:39:23.820008 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0402 13:39:23.820046 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0402 13:39:23.820061 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0402 13:39:23.827938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0402 13:39:23.827970 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827984 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0402 13:39:23.827988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0402 13:39:23.828005 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0402 13:39:23.828009 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0402 13:39:23.828004 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0402 13:39:23.830934 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b46da8b2cda4252f897adfca92275c92924a998d5b1ad57e6897ddea29b4571\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:40Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:40 crc kubenswrapper[4732]: I0402 13:39:40.532417 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-crx2z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"386bd92b-c67e-4cc6-8a47-6f8d6e799bc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-crx2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:40Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:40 crc 
kubenswrapper[4732]: I0402 13:39:40.545022 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:40Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:40 crc kubenswrapper[4732]: I0402 13:39:40.557383 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e388cf7666048a2a06ac7572afee8a8a73493dc8b315897b9118b66dfe72b69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-04-02T13:39:40Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:40 crc kubenswrapper[4732]: I0402 13:39:40.568502 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tg9vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fcb1965-4cef-41a4-8894-3eb24e0ff80c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e38eea7442762874b0db44fa8fcd1ffc4cbe0d149d992139a1fc0bc92dc386d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-w9vgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tg9vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:40Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:40 crc kubenswrapper[4732]: I0402 13:39:40.581775 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:40Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:40 crc kubenswrapper[4732]: I0402 13:39:40.596102 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37b84e7e2a23e3e100ff5bc979d4126bb3044a27f802ccc21c171516554dee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abfadcbcfff56cc9a6bf0c4068f03e943dea5db96705cd3fba2aea8cbd39111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:40Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:40 crc kubenswrapper[4732]: I0402 13:39:40.605177 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-glqvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57616858-4140-40f0-83e5-388787b685b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af55257a6baf7f9bcba13eb0d7aae49bd0671f182aeb11018802d4712df0ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqn9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-glqvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:40Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:40 crc kubenswrapper[4732]: I0402 13:39:40.618641 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nqhwm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658a9576-efdc-4e4b-937e-bd63032cbee6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de69c4d183706f6729301a7859d6c312fdedb379c0c9b4aac3d4061f89b82067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de69c4d183706f6729301a7859d6c312fdedb379c0c9b4aac3d4061f89b82067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d68d8de71d99082a6cf12e05e279f97cf4b0eccd9defa756ac083fcb2c412945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d68d8de71d99082a6cf12e05e279f97cf4b0eccd9defa756ac083fcb2c412945\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d1db75c9d54c6d8d304e28faa91270a4d70096596100fb97d5a6ea74f7ad8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19d1db75c9d54c6d8d304e28faa91270a4d70096596100fb97d5a6ea74f7ad8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbddc88e12f323552ea813b6440e7770a844e187f892c11778a4a82241db38ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbddc88e12f323552ea813b6440e7770a844e187f892c11778a4a82241db38ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9df3b256d9a633666a3195e4fd0c9f409cfeb941932f46356e1a8e2e98d706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv
9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nqhwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:40Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:40 crc kubenswrapper[4732]: I0402 13:39:40.630386 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:40Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:40 crc kubenswrapper[4732]: I0402 13:39:40.679737 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:39:40 crc kubenswrapper[4732]: E0402 13:39:40.679877 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-crx2z" podUID="386bd92b-c67e-4cc6-8a47-6f8d6e799bc7" Apr 02 13:39:41 crc kubenswrapper[4732]: I0402 13:39:41.451647 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:41Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:41 crc kubenswrapper[4732]: I0402 13:39:41.470308 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37b84e7e2a23e3e100ff5bc979d4126bb3044a27f802ccc21c171516554dee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abfadcbcfff56cc9a6bf0c4068f03e943dea5db96705cd3fba2aea8cbd39111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:41Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:41 crc kubenswrapper[4732]: I0402 13:39:41.489141 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-glqvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57616858-4140-40f0-83e5-388787b685b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af55257a6baf7f9bcba13eb0d7aae49bd0671f182aeb11018802d4712df0ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqn9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-glqvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:41Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:41 crc kubenswrapper[4732]: I0402 13:39:41.513465 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nqhwm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658a9576-efdc-4e4b-937e-bd63032cbee6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de69c4d183706f6729301a7859d6c312fdedb379c0c9b4aac3d4061f89b82067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de69c4d183706f6729301a7859d6c312fdedb379c0c9b4aac3d4061f89b82067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d68d8de71d99082a6cf12e05e279f97cf4b0eccd9defa756ac083fcb2c412945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d68d8de71d99082a6cf12e05e279f97cf4b0eccd9defa756ac083fcb2c412945\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d1db75c9d54c6d8d304e28faa91270a4d70096596100fb97d5a6ea74f7ad8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19d1db75c9d54c6d8d304e28faa91270a4d70096596100fb97d5a6ea74f7ad8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbddc88e12f323552ea813b6440e7770a844e187f892c11778a4a82241db38ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbddc88e12f323552ea813b6440e7770a844e187f892c11778a4a82241db38ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9df3b256d9a633666a3195e4fd0c9f409cfeb941932f46356e1a8e2e98d706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv
9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nqhwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:41Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:41 crc kubenswrapper[4732]: I0402 13:39:41.536042 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:41Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:41 crc kubenswrapper[4732]: I0402 13:39:41.554129 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1efc9e3-71c0-47e0-bcf8-4ec41e8e25af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf50344e79adec991160632c244c40543c468db050f176949b5840a16d1777a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e000359211728ef1d437757b0246d85c224dae170131ba8bd221fa84dafa026c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf6191436462297532ffdc93e43468bbe37e850115ddcbcd0265c07359ac845a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-04-02T13:39:23Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0402 13:39:23.616086 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0402 13:39:23.616554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0402 13:39:23.618421 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-69827246/tls.crt::/tmp/serving-cert-69827246/tls.key\\\\\\\"\\\\nI0402 13:39:23.816116 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0402 13:39:23.819962 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0402 13:39:23.820008 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0402 13:39:23.820046 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0402 13:39:23.820061 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0402 13:39:23.827938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0402 13:39:23.827970 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827984 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0402 13:39:23.827988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0402 13:39:23.828005 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0402 13:39:23.828009 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0402 13:39:23.828004 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0402 13:39:23.830934 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b46da8b2cda4252f897adfca92275c92924a998d5b1ad57e6897ddea29b4571\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:41Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:41 crc kubenswrapper[4732]: I0402 13:39:41.573257 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://960aab148ced3ecc2648e0b866cb9330dc955e8ab123f3afb95d377e3efa5a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:41Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:41 crc kubenswrapper[4732]: I0402 13:39:41.585585 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38409e5e-4545-49da-8f6c-4bfb30582878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8b639c60ac006909c664208b4a5aa24df48983728ace3f6722514164e330a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09dd989931f88ecf396194a22057cfe8c07809db42ff1c2b861bf3aa5db1673f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6vtmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:41Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:41 crc kubenswrapper[4732]: I0402 13:39:41.601025 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s52gj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad206957-df5c-4b3e-bd35-e798a07d2f4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169383eed1546547fe6f683a3607c929733ee7049497b740db106a44ce7d1c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dssdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s52gj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:41Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:41 crc kubenswrapper[4732]: I0402 13:39:41.620243 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6f5e483-7d6b-4d6d-be84-303d8f07643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde53c6f70593ab376e01989eadf8b52b7010c91ca4be564e5e36a781f1333f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://975b73da57fe61d6945d87d2a8cb9502c67a6d8fe43a80722342271e2a251f39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f988d8b8bd1da89ec27da36eafa964c8d1b06874110b4791aeadc05cb297b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09f4ea085c7a422d9d9378925b89d18cd60661c399eb45f69d1e2d39c3aa2191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39950e10da9295ee76088f08ae9d4977f5ff9f8d8440188f777855f40fb8049e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0b635201003f7f1a0fc544667e8376b4283c202dde45398518f99b96ce2c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28d7b0869af498c3b2a34b70d9bf3e997eb93afc6c7ad7ab005b8cf1f08bc32a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b9f2f9d419baeac3be35c4c0c40c012b721c99c2f89fb2638c635c41c5e93d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0d12a454b3463c55541ffa21937cda423517d31435683bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0d12a454b3463c55541ffa21937cda423517d31435683bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8qmgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:41Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:41 crc kubenswrapper[4732]: I0402 13:39:41.631321 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b2l5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7185760-c057-4c47-8da2-60572500a472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76490beb8e54fc846562aaf130168383c45dfe45e9582e3d87c36c38a69b8fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bec3eb12ab864fa88c229cc24bf8ab853d0dc249a18430163a45fbe02e09c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-b2l5q\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:41Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:41 crc kubenswrapper[4732]: I0402 13:39:41.641451 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-crx2z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"386bd92b-c67e-4cc6-8a47-6f8d6e799bc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-crx2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:41Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:41 crc 
kubenswrapper[4732]: I0402 13:39:41.653950 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:41Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:41 crc kubenswrapper[4732]: I0402 13:39:41.668019 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e388cf7666048a2a06ac7572afee8a8a73493dc8b315897b9118b66dfe72b69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-04-02T13:39:41Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:41 crc kubenswrapper[4732]: I0402 13:39:41.679831 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 13:39:41 crc kubenswrapper[4732]: I0402 13:39:41.679876 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:39:41 crc kubenswrapper[4732]: I0402 13:39:41.679945 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:39:41 crc kubenswrapper[4732]: E0402 13:39:41.680251 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Apr 02 13:39:41 crc kubenswrapper[4732]: E0402 13:39:41.680410 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Apr 02 13:39:41 crc kubenswrapper[4732]: E0402 13:39:41.680480 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Apr 02 13:39:41 crc kubenswrapper[4732]: I0402 13:39:41.681256 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tg9vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fcb1965-4cef-41a4-8894-3eb24e0ff80c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e38eea7442762874b0db44fa8fcd1ffc4cbe0d149d992139a1fc0bc92dc386d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"ima
geID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9vgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tg9vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:41Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:41 crc kubenswrapper[4732]: I0402 13:39:41.691347 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Apr 02 13:39:41 crc kubenswrapper[4732]: I0402 13:39:41.956660 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" Apr 02 13:39:41 crc kubenswrapper[4732]: I0402 13:39:41.956726 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" Apr 02 13:39:41 crc kubenswrapper[4732]: I0402 13:39:41.984681 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" Apr 02 13:39:41 crc 
kubenswrapper[4732]: I0402 13:39:41.993039 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" Apr 02 13:39:42 crc kubenswrapper[4732]: I0402 13:39:42.003082 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-crx2z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"386bd92b-c67e-4cc6-8a47-6f8d6e799bc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-crx2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:42Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:42 crc 
kubenswrapper[4732]: I0402 13:39:42.020514 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:42Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:42 crc kubenswrapper[4732]: I0402 13:39:42.034138 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e388cf7666048a2a06ac7572afee8a8a73493dc8b315897b9118b66dfe72b69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-04-02T13:39:42Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:42 crc kubenswrapper[4732]: I0402 13:39:42.046470 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tg9vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fcb1965-4cef-41a4-8894-3eb24e0ff80c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e38eea7442762874b0db44fa8fcd1ffc4cbe0d149d992139a1fc0bc92dc386d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-w9vgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tg9vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:42Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:42 crc kubenswrapper[4732]: I0402 13:39:42.061439 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:42Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:42 crc kubenswrapper[4732]: I0402 13:39:42.074970 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37b84e7e2a23e3e100ff5bc979d4126bb3044a27f802ccc21c171516554dee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abfadcbcfff56cc9a6bf0c4068f03e943dea5db96705cd3fba2aea8cbd39111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:42Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:42 crc kubenswrapper[4732]: I0402 13:39:42.085949 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-glqvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57616858-4140-40f0-83e5-388787b685b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af55257a6baf7f9bcba13eb0d7aae49bd0671f182aeb11018802d4712df0ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqn9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-glqvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:42Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:42 crc kubenswrapper[4732]: I0402 13:39:42.102747 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nqhwm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658a9576-efdc-4e4b-937e-bd63032cbee6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de69c4d183706f6729301a7859d6c312fdedb379c0c9b4aac3d4061f89b82067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de69c4d183706f6729301a7859d6c312fdedb379c0c9b4aac3d4061f89b82067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d68d8de71d99082a6cf12e05e279f97cf4b0eccd9defa756ac083fcb2c412945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d68d8de71d99082a6cf12e05e279f97cf4b0eccd9defa756ac083fcb2c412945\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d1db75c9d54c6d8d304e28faa91270a4d70096596100fb97d5a6ea74f7ad8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19d1db75c9d54c6d8d304e28faa91270a4d70096596100fb97d5a6ea74f7ad8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbddc88e12f323552ea813b6440e7770a844e187f892c11778a4a82241db38ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbddc88e12f323552ea813b6440e7770a844e187f892c11778a4a82241db38ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9df3b256d9a633666a3195e4fd0c9f409cfeb941932f46356e1a8e2e98d706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv
9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nqhwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:42Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:42 crc kubenswrapper[4732]: I0402 13:39:42.114488 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:42Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:42 crc kubenswrapper[4732]: I0402 13:39:42.124270 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"afb9093a-f17e-45d5-887a-6b693bbae3f4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d431bea9c2c5dcec52d15f7c9d0806d50d74ece8f3ba59562a6f3fd59e50a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ca33771abc8b4070c51fd6fcb9a5f88bf7097b87e6762d829b293bb628588a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ca33771abc8b4070c51fd6fcb9a5f88bf7097b87e6762d829b293bb628588a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:42Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:42 crc kubenswrapper[4732]: I0402 13:39:42.136540 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://960aab148ced3ecc2648e0b866cb9330dc955e8ab123f3afb95d377e3efa5a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:42Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:42 crc kubenswrapper[4732]: I0402 13:39:42.147247 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38409e5e-4545-49da-8f6c-4bfb30582878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8b639c60ac006909c664208b4a5aa24df48983728ace3f6722514164e330a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09dd989931f88ecf396194a22057cfe8c07809db42ff1c2b861bf3aa5db1673f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6vtmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:42Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:42 crc kubenswrapper[4732]: I0402 13:39:42.168813 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s52gj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad206957-df5c-4b3e-bd35-e798a07d2f4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169383eed1546547fe6f683a3607c929733ee7049497b740db106a44ce7d1c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dssdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s52gj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:42Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:42 crc kubenswrapper[4732]: I0402 13:39:42.189941 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6f5e483-7d6b-4d6d-be84-303d8f07643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde53c6f70593ab376e01989eadf8b52b7010c91ca4be564e5e36a781f1333f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://975b73da57fe61d6945d87d2a8cb9502c67a6d8fe43a80722342271e2a251f39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f988d8b8bd1da89ec27da36eafa964c8d1b06874110b4791aeadc05cb297b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09f4ea085c7a422d9d9378925b89d18cd60661c399eb45f69d1e2d39c3aa2191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39950e10da9295ee76088f08ae9d4977f5ff9f8d8440188f777855f40fb8049e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0b635201003f7f1a0fc544667e8376b4283c202dde45398518f99b96ce2c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28d7b0869af498c3b2a34b70d9bf3e997eb93afc6c7ad7ab005b8cf1f08bc32a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b9f2f9d419baeac3be35c4c0c40c012b721c99c2f89fb2638c635c41c5e93d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0d12a454b3463c55541ffa21937cda423517d31435683bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0d12a454b3463c55541ffa21937cda423517d31435683bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8qmgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:42Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:42 crc kubenswrapper[4732]: I0402 13:39:42.200978 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b2l5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7185760-c057-4c47-8da2-60572500a472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76490beb8e54fc846562aaf130168383c45dfe45e9582e3d87c36c38a69b8fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bec3eb12ab864fa88c229cc24bf8ab853d0dc249a18430163a45fbe02e09c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-b2l5q\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:42Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:42 crc kubenswrapper[4732]: I0402 13:39:42.213577 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1efc9e3-71c0-47e0-bcf8-4ec41e8e25af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf50344e79adec991160632c244c40543c468db050f176949b5840a16d1777a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e000359211728ef1d437757b0246d85c224dae170131ba8bd221fa84dafa026c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://bf6191436462297532ffdc93e43468bbe37e850115ddcbcd0265c07359ac845a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-04-02T13:39:23Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0402 13:39:23.616086 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0402 13:39:23.616554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0402 13:39:23.618421 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-69827246/tls.crt::/tmp/serving-cert-69827246/tls.key\\\\\\\"\\\\nI0402 13:39:23.816116 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0402 13:39:23.819962 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0402 13:39:23.820008 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0402 13:39:23.820046 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0402 13:39:23.820061 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0402 13:39:23.827938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0402 13:39:23.827970 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827984 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0402 13:39:23.827988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0402 13:39:23.828005 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0402 13:39:23.828009 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0402 13:39:23.828004 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0402 13:39:23.830934 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b46da8b2cda4252f897adfca92275c92924a998d5b1ad57e6897ddea29b4571\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:42Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:42 crc kubenswrapper[4732]: I0402 13:39:42.225964 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:42Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:42 crc kubenswrapper[4732]: I0402 13:39:42.236746 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e388cf7666048a2a06ac7572afee8a8a73493dc8b315897b9118b66dfe72b69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-04-02T13:39:42Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:42 crc kubenswrapper[4732]: I0402 13:39:42.245968 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tg9vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fcb1965-4cef-41a4-8894-3eb24e0ff80c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e38eea7442762874b0db44fa8fcd1ffc4cbe0d149d992139a1fc0bc92dc386d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-w9vgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tg9vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:42Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:42 crc kubenswrapper[4732]: I0402 13:39:42.255761 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-crx2z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"386bd92b-c67e-4cc6-8a47-6f8d6e799bc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-crx2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:42Z is after 
2025-08-24T17:21:41Z" Apr 02 13:39:42 crc kubenswrapper[4732]: I0402 13:39:42.264949 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-glqvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57616858-4140-40f0-83e5-388787b685b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af55257a6baf7f9bcba13eb0d7aae49bd0671f182aeb11018802d4712df0ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-hqn9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-glqvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:42Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:42 crc kubenswrapper[4732]: I0402 13:39:42.281145 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nqhwm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658a9576-efdc-4e4b-937e-bd63032cbee6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de69c4d183706f6729301a7859d6c312fdedb379c0c9b4aac3d4061f89b82067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de69c4d183706f6729301a7859d6c312fdedb379c0c9b4aac3d4061f89b82067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d68d8de71d99082a6cf12e05e279f97cf4b0eccd9defa756ac083fcb2c412945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d68d8de71d99082a6cf12e05e279f97cf4b0eccd9defa756ac083fcb2c412945\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d1db75c9d54c6d8d304e28faa91270a4d70
096596100fb97d5a6ea74f7ad8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19d1db75c9d54c6d8d304e28faa91270a4d70096596100fb97d5a6ea74f7ad8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbddc88e12f323552ea813b6440e7770a844e187f892c11778a4a82241db38ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbddc88e12f323552ea813b6440e7770a844e187f892c11778a4a82241db38ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
4-02T13:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9df3b256d9a633666a3195e4fd0c9f409cfeb941932f46356e1a8e2e98d706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMou
nts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nqhwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:42Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:42 crc kubenswrapper[4732]: I0402 13:39:42.293523 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:42Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:42 crc kubenswrapper[4732]: I0402 13:39:42.301697 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"afb9093a-f17e-45d5-887a-6b693bbae3f4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d431bea9c2c5dcec52d15f7c9d0806d50d74ece8f3ba59562a6f3fd59e50a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ca33771abc8b4070c51fd6fcb9a5f88bf7097b87e6762d829b293bb628588a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ca33771abc8b4070c51fd6fcb9a5f88bf7097b87e6762d829b293bb628588a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:42Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:42 crc kubenswrapper[4732]: I0402 13:39:42.313436 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:42Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:42 crc kubenswrapper[4732]: I0402 13:39:42.325802 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37b84e7e2a23e3e100ff5bc979d4126bb3044a27f802ccc21c171516554dee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abfadcbcfff56cc9a6bf0c4068f03e943dea5db96705cd3fba2aea8cbd39111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:42Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:42 crc kubenswrapper[4732]: I0402 13:39:42.336461 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s52gj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad206957-df5c-4b3e-bd35-e798a07d2f4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169383eed1546547fe6f683a3607c929733ee7049497b740db106a44ce7d1c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dssdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s52gj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:42Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:42 crc kubenswrapper[4732]: I0402 13:39:42.353526 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6f5e483-7d6b-4d6d-be84-303d8f07643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde53c6f70593ab376e01989eadf8b52b7010c91ca4be564e5e36a781f1333f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://975b73da57fe61d6945d87d2a8cb9502c67a6d8fe43a80722342271e2a251f39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f988d8b8bd1da89ec27da36eafa964c8d1b06874110b4791aeadc05cb297b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09f4ea085c7a422d9d9378925b89d18cd60661c399eb45f69d1e2d39c3aa2191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39950e10da9295ee76088f08ae9d4977f5ff9f8d8440188f777855f40fb8049e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0b635201003f7f1a0fc544667e8376b4283c202dde45398518f99b96ce2c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28d7b0869af498c3b2a34b70d9bf3e997eb93afc6c7ad7ab005b8cf1f08bc32a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b9f2f9d419baeac3be35c4c0c40c012b721c99c2f89fb2638c635c41c5e93d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0d12a454b3463c55541ffa21937cda423517d31435683bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0d12a454b3463c55541ffa21937cda423517d31435683bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8qmgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:42Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:42 crc kubenswrapper[4732]: I0402 13:39:42.364828 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b2l5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7185760-c057-4c47-8da2-60572500a472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76490beb8e54fc846562aaf130168383c45dfe45e9582e3d87c36c38a69b8fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bec3eb12ab864fa88c229cc24bf8ab853d0dc249a18430163a45fbe02e09c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-b2l5q\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:42Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:42 crc kubenswrapper[4732]: I0402 13:39:42.380640 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1efc9e3-71c0-47e0-bcf8-4ec41e8e25af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf50344e79adec991160632c244c40543c468db050f176949b5840a16d1777a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e000359211728ef1d437757b0246d85c224dae170131ba8bd221fa84dafa026c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://bf6191436462297532ffdc93e43468bbe37e850115ddcbcd0265c07359ac845a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-04-02T13:39:23Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0402 13:39:23.616086 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0402 13:39:23.616554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0402 13:39:23.618421 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-69827246/tls.crt::/tmp/serving-cert-69827246/tls.key\\\\\\\"\\\\nI0402 13:39:23.816116 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0402 13:39:23.819962 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0402 13:39:23.820008 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0402 13:39:23.820046 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0402 13:39:23.820061 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0402 13:39:23.827938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0402 13:39:23.827970 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827984 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0402 13:39:23.827988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0402 13:39:23.828005 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0402 13:39:23.828009 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0402 13:39:23.828004 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0402 13:39:23.830934 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b46da8b2cda4252f897adfca92275c92924a998d5b1ad57e6897ddea29b4571\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:42Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:42 crc kubenswrapper[4732]: I0402 13:39:42.393024 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://960aab148ced3ecc2648e0b866cb9330dc955e8ab123f3afb95d377e3efa5a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:42Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:42 crc kubenswrapper[4732]: I0402 13:39:42.403256 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38409e5e-4545-49da-8f6c-4bfb30582878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8b639c60ac006909c664208b4a5aa24df48983728ace3f6722514164e330a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09dd989931f88ecf396194a22057cfe8c07809db
42ff1c2b861bf3aa5db1673f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6vtmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:42Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:42 crc kubenswrapper[4732]: I0402 13:39:42.433955 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" Apr 02 13:39:42 crc kubenswrapper[4732]: I0402 13:39:42.680193 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:39:42 crc kubenswrapper[4732]: E0402 13:39:42.680397 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-crx2z" podUID="386bd92b-c67e-4cc6-8a47-6f8d6e799bc7" Apr 02 13:39:43 crc kubenswrapper[4732]: I0402 13:39:43.439972 4732 generic.go:334] "Generic (PLEG): container finished" podID="658a9576-efdc-4e4b-937e-bd63032cbee6" containerID="4b9df3b256d9a633666a3195e4fd0c9f409cfeb941932f46356e1a8e2e98d706" exitCode=0 Apr 02 13:39:43 crc kubenswrapper[4732]: I0402 13:39:43.440037 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nqhwm" event={"ID":"658a9576-efdc-4e4b-937e-bd63032cbee6","Type":"ContainerDied","Data":"4b9df3b256d9a633666a3195e4fd0c9f409cfeb941932f46356e1a8e2e98d706"} Apr 02 13:39:43 crc kubenswrapper[4732]: I0402 13:39:43.467009 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-crx2z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"386bd92b-c67e-4cc6-8a47-6f8d6e799bc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-crx2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:43Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:43 crc 
kubenswrapper[4732]: I0402 13:39:43.484642 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:43Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:43 crc kubenswrapper[4732]: I0402 13:39:43.498770 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e388cf7666048a2a06ac7572afee8a8a73493dc8b315897b9118b66dfe72b69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-04-02T13:39:43Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:43 crc kubenswrapper[4732]: I0402 13:39:43.509979 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tg9vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fcb1965-4cef-41a4-8894-3eb24e0ff80c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e38eea7442762874b0db44fa8fcd1ffc4cbe0d149d992139a1fc0bc92dc386d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-w9vgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tg9vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:43Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:43 crc kubenswrapper[4732]: I0402 13:39:43.521549 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:43Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:43 crc kubenswrapper[4732]: I0402 13:39:43.533690 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37b84e7e2a23e3e100ff5bc979d4126bb3044a27f802ccc21c171516554dee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abfadcbcfff56cc9a6bf0c4068f03e943dea5db96705cd3fba2aea8cbd39111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:43Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:43 crc kubenswrapper[4732]: I0402 13:39:43.542766 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-glqvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57616858-4140-40f0-83e5-388787b685b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af55257a6baf7f9bcba13eb0d7aae49bd0671f182aeb11018802d4712df0ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqn9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-glqvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:43Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:43 crc kubenswrapper[4732]: I0402 13:39:43.555019 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nqhwm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658a9576-efdc-4e4b-937e-bd63032cbee6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de69c4d183706f6729301a7859d6c312fdedb379c0c9b4aac3d4061f89b82067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de69c4d183706f6729301a7859d6c312fdedb379c0c9b4aac3d4061f89b82067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d68d8de71d99082a6cf12e05e279f97cf4b0eccd9defa756ac083fcb2c412945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d68d8de71d99082a6cf12e05e279f97cf4b0eccd9defa756ac083fcb2c412945\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d1db75c9d54c6d8d304e28faa91270a4d70096596100fb97d5a6ea74f7ad8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19d1db75c9d54c6d8d304e28faa91270a4d70096596100fb97d5a6ea74f7ad8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbddc88e12f323552ea813b6440e7770a844e187f892c11778a4a82241db38ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbddc88e12f323552ea813b6440e7770a844e187f892c11778a4a82241db38ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9df3b256d9a633666a3195e4fd0c9f409cfeb941932f46356e1a8e2e98d706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b9df3b256d9a633666a3195e4fd0c9f409cfeb941932f46356e1a8e2e98d706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nqhwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:43Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:43 crc kubenswrapper[4732]: I0402 13:39:43.566179 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:43Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:43 crc kubenswrapper[4732]: I0402 13:39:43.576415 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"afb9093a-f17e-45d5-887a-6b693bbae3f4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d431bea9c2c5dcec52d15f7c9d0806d50d74ece8f3ba59562a6f3fd59e50a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ca33771abc8b4070c51fd6fcb9a5f88bf7097b87e6762d829b293bb628588a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ca33771abc8b4070c51fd6fcb9a5f88bf7097b87e6762d829b293bb628588a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:43Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:43 crc kubenswrapper[4732]: I0402 13:39:43.589138 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://960aab148ced3ecc2648e0b866cb9330dc955e8ab123f3afb95d377e3efa5a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:43Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:43 crc kubenswrapper[4732]: I0402 13:39:43.598792 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38409e5e-4545-49da-8f6c-4bfb30582878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8b639c60ac006909c664208b4a5aa24df48983728ace3f6722514164e330a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09dd989931f88ecf396194a22057cfe8c07809db42ff1c2b861bf3aa5db1673f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6vtmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:43Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:43 crc kubenswrapper[4732]: I0402 13:39:43.610051 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s52gj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad206957-df5c-4b3e-bd35-e798a07d2f4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169383eed1546547fe6f683a3607c929733ee7049497b740db106a44ce7d1c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dssdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s52gj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:43Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:43 crc kubenswrapper[4732]: I0402 13:39:43.626166 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6f5e483-7d6b-4d6d-be84-303d8f07643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde53c6f70593ab376e01989eadf8b52b7010c91ca4be564e5e36a781f1333f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://975b73da57fe61d6945d87d2a8cb9502c67a6d8fe43a80722342271e2a251f39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f988d8b8bd1da89ec27da36eafa964c8d1b06874110b4791aeadc05cb297b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09f4ea085c7a422d9d9378925b89d18cd60661c399eb45f69d1e2d39c3aa2191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39950e10da9295ee76088f08ae9d4977f5ff9f8d8440188f777855f40fb8049e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0b635201003f7f1a0fc544667e8376b4283c202dde45398518f99b96ce2c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28d7b0869af498c3b2a34b70d9bf3e997eb93afc6c7ad7ab005b8cf1f08bc32a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b9f2f9d419baeac3be35c4c0c40c012b721c99c2f89fb2638c635c41c5e93d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0d12a454b3463c55541ffa21937cda423517d31435683bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0d12a454b3463c55541ffa21937cda423517d31435683bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8qmgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:43Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:43 crc kubenswrapper[4732]: I0402 13:39:43.641126 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b2l5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7185760-c057-4c47-8da2-60572500a472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76490beb8e54fc846562aaf130168383c45dfe45e9582e3d87c36c38a69b8fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bec3eb12ab864fa88c229cc24bf8ab853d0dc249a18430163a45fbe02e09c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-b2l5q\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:43Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:43 crc kubenswrapper[4732]: I0402 13:39:43.654042 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1efc9e3-71c0-47e0-bcf8-4ec41e8e25af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf50344e79adec991160632c244c40543c468db050f176949b5840a16d1777a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e000359211728ef1d437757b0246d85c224dae170131ba8bd221fa84dafa026c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://bf6191436462297532ffdc93e43468bbe37e850115ddcbcd0265c07359ac845a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-04-02T13:39:23Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0402 13:39:23.616086 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0402 13:39:23.616554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0402 13:39:23.618421 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-69827246/tls.crt::/tmp/serving-cert-69827246/tls.key\\\\\\\"\\\\nI0402 13:39:23.816116 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0402 13:39:23.819962 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0402 13:39:23.820008 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0402 13:39:23.820046 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0402 13:39:23.820061 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0402 13:39:23.827938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0402 13:39:23.827970 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827984 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0402 13:39:23.827988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0402 13:39:23.828005 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0402 13:39:23.828009 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0402 13:39:23.828004 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0402 13:39:23.830934 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b46da8b2cda4252f897adfca92275c92924a998d5b1ad57e6897ddea29b4571\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:43Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:43 crc kubenswrapper[4732]: I0402 13:39:43.679260 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:39:43 crc kubenswrapper[4732]: I0402 13:39:43.679260 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:39:43 crc kubenswrapper[4732]: E0402 13:39:43.679394 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Apr 02 13:39:43 crc kubenswrapper[4732]: E0402 13:39:43.679471 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Apr 02 13:39:43 crc kubenswrapper[4732]: I0402 13:39:43.680335 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 13:39:43 crc kubenswrapper[4732]: E0402 13:39:43.680447 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Apr 02 13:39:43 crc kubenswrapper[4732]: I0402 13:39:43.681019 4732 scope.go:117] "RemoveContainer" containerID="5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a" Apr 02 13:39:43 crc kubenswrapper[4732]: E0402 13:39:43.681178 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Apr 02 13:39:44 crc kubenswrapper[4732]: I0402 13:39:44.444749 4732 generic.go:334] "Generic (PLEG): container finished" podID="658a9576-efdc-4e4b-937e-bd63032cbee6" containerID="b27d70a0f82b84c5ed47e2f3287c0faf9c449ed235ec78ef71fe769485f3e67f" exitCode=0 Apr 02 13:39:44 crc kubenswrapper[4732]: I0402 13:39:44.444797 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nqhwm" 
event={"ID":"658a9576-efdc-4e4b-937e-bd63032cbee6","Type":"ContainerDied","Data":"b27d70a0f82b84c5ed47e2f3287c0faf9c449ed235ec78ef71fe769485f3e67f"} Apr 02 13:39:44 crc kubenswrapper[4732]: I0402 13:39:44.467659 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e388cf7666048a2a06ac7572afee8a8a73493dc8b315897b9118b66dfe72b69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnl
y\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:44Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:44 crc kubenswrapper[4732]: I0402 13:39:44.490052 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tg9vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fcb1965-4cef-41a4-8894-3eb24e0ff80c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e38eea7442762874b0db44fa8fcd1ffc4cbe0d149d992139a1fc0bc92dc386d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastSta
te\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9vgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tg9vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:44Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:44 crc kubenswrapper[4732]: I0402 13:39:44.503395 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-crx2z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"386bd92b-c67e-4cc6-8a47-6f8d6e799bc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-crx2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:44Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:44 crc 
kubenswrapper[4732]: I0402 13:39:44.519469 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:44Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:44 crc kubenswrapper[4732]: I0402 13:39:44.535921 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nqhwm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658a9576-efdc-4e4b-937e-bd63032cbee6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de69c4d183706f6729301a7859d6c312fdedb379c0c9b4aac3d4061f89b82067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de69c4d183706f6729301a7859d6c312fdedb379c0c9b4aac3d4061f89b82067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d68d8de71d99082a6cf12e05e279f97cf4b0eccd9defa756ac083fcb2c412945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d68d8de71d99082a6cf12e05e279f97cf4b0eccd9defa756ac083fcb2c412945\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d1db75c9d54c6d8d304e28faa91270a4d70
096596100fb97d5a6ea74f7ad8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19d1db75c9d54c6d8d304e28faa91270a4d70096596100fb97d5a6ea74f7ad8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbddc88e12f323552ea813b6440e7770a844e187f892c11778a4a82241db38ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbddc88e12f323552ea813b6440e7770a844e187f892c11778a4a82241db38ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
4-02T13:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9df3b256d9a633666a3195e4fd0c9f409cfeb941932f46356e1a8e2e98d706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b9df3b256d9a633666a3195e4fd0c9f409cfeb941932f46356e1a8e2e98d706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b27d70a0f82b84c5ed47e2f3287c0faf9c449ed235ec78ef71fe769485f3e67f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b27d70a0f82b84c5ed47e2f3287c0faf9c449ed235ec78ef71fe769485f3e67f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nqhwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:44Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:44 crc kubenswrapper[4732]: I0402 13:39:44.548797 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:44Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:44 crc kubenswrapper[4732]: I0402 13:39:44.561114 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"afb9093a-f17e-45d5-887a-6b693bbae3f4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d431bea9c2c5dcec52d15f7c9d0806d50d74ece8f3ba59562a6f3fd59e50a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ca33771abc8b4070c51fd6fcb9a5f88bf7097b87e6762d829b293bb628588a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ca33771abc8b4070c51fd6fcb9a5f88bf7097b87e6762d829b293bb628588a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:44Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:44 crc kubenswrapper[4732]: I0402 13:39:44.572660 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:44Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:44 crc kubenswrapper[4732]: I0402 13:39:44.586721 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37b84e7e2a23e3e100ff5bc979d4126bb3044a27f802ccc21c171516554dee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abfadcbcfff56cc9a6bf0c4068f03e943dea5db96705cd3fba2aea8cbd39111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:44Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:44 crc kubenswrapper[4732]: I0402 13:39:44.608799 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-glqvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57616858-4140-40f0-83e5-388787b685b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af55257a6baf7f9bcba13eb0d7aae49bd0671f182aeb11018802d4712df0ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqn9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-glqvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:44Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:44 crc kubenswrapper[4732]: I0402 13:39:44.630422 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6f5e483-7d6b-4d6d-be84-303d8f07643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde53c6f70593ab376e01989eadf8b52b7010c91ca4be564e5e36a781f1333f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://975b73da57fe61d6945d87d2a8cb9502c67a6d8fe43a80722342271e2a251f39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f988d8b8bd1da89ec27da36eafa964c8d1b06874110b4791aeadc05cb297b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09f4ea085c7a422d9d9378925b89d18cd60661c399eb45f69d1e2d39c3aa2191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39950e10da9295ee76088f08ae9d4977f5ff9f8d8440188f777855f40fb8049e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0b635201003f7f1a0fc544667e8376b4283c202dde45398518f99b96ce2c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28d7b0869af498c3b2a34b70d9bf3e997eb93afc6c7ad7ab005b8cf1f08bc32a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b9f2f9d419baeac3be35c4c0c40c012b721c99c2f89fb2638c635c41c5e93d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0d12a454b3463c55541ffa21937cda423517d31435683bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0d12a454b3463c55541ffa21937cda423517d31435683bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8qmgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:44Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:44 crc kubenswrapper[4732]: I0402 13:39:44.642393 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b2l5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7185760-c057-4c47-8da2-60572500a472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76490beb8e54fc846562aaf130168383c45dfe45e9582e3d87c36c38a69b8fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bec3eb12ab864fa88c229cc24bf8ab853d0dc249a18430163a45fbe02e09c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-b2l5q\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:44Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:44 crc kubenswrapper[4732]: I0402 13:39:44.655758 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1efc9e3-71c0-47e0-bcf8-4ec41e8e25af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf50344e79adec991160632c244c40543c468db050f176949b5840a16d1777a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e000359211728ef1d437757b0246d85c224dae170131ba8bd221fa84dafa026c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://bf6191436462297532ffdc93e43468bbe37e850115ddcbcd0265c07359ac845a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-04-02T13:39:23Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0402 13:39:23.616086 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0402 13:39:23.616554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0402 13:39:23.618421 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-69827246/tls.crt::/tmp/serving-cert-69827246/tls.key\\\\\\\"\\\\nI0402 13:39:23.816116 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0402 13:39:23.819962 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0402 13:39:23.820008 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0402 13:39:23.820046 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0402 13:39:23.820061 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0402 13:39:23.827938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0402 13:39:23.827970 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827984 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0402 13:39:23.827988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0402 13:39:23.828005 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0402 13:39:23.828009 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0402 13:39:23.828004 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0402 13:39:23.830934 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b46da8b2cda4252f897adfca92275c92924a998d5b1ad57e6897ddea29b4571\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:44Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:44 crc kubenswrapper[4732]: I0402 13:39:44.671769 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://960aab148ced3ecc2648e0b866cb9330dc955e8ab123f3afb95d377e3efa5a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:44Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:44 crc kubenswrapper[4732]: I0402 13:39:44.679938 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:39:44 crc kubenswrapper[4732]: E0402 13:39:44.680052 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-crx2z" podUID="386bd92b-c67e-4cc6-8a47-6f8d6e799bc7" Apr 02 13:39:44 crc kubenswrapper[4732]: I0402 13:39:44.684030 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38409e5e-4545-49da-8f6c-4bfb30582878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8b639c60ac006909c664208b4a5aa24df48983728ace3f6722514164e330a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/ku
be-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09dd989931f88ecf396194a22057cfe8c07809db42ff1c2b861bf3aa5db1673f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6vtmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:44Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:44 crc kubenswrapper[4732]: I0402 13:39:44.696116 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s52gj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad206957-df5c-4b3e-bd35-e798a07d2f4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169383eed1546547fe6f683a3607c929733ee7049497b740db106a44ce7d1c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dssdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s52gj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:44Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:44 crc kubenswrapper[4732]: I0402 13:39:44.711070 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37b84e7e2a23e3e100ff5bc979d4126bb3044a27f802ccc21c171516554dee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abfadcbcfff56cc9a6bf0c4068f03e943dea5db96705cd3fba2aea8cbd39111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:44Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:44 crc kubenswrapper[4732]: I0402 13:39:44.723593 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-glqvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57616858-4140-40f0-83e5-388787b685b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af55257a6baf7f9bcba13eb0d7aae49bd0671f182aeb11018802d4712df0ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqn9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-glqvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:44Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:44 crc kubenswrapper[4732]: I0402 13:39:44.746338 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nqhwm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658a9576-efdc-4e4b-937e-bd63032cbee6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de69c4d183706f6729301a7859d6c312fdedb379c0c9b4aac3d4061f89b82067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de69c4d183706f6729301a7859d6c312fdedb379c0c9b4aac3d4061f89b82067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d68d8de71d99082a6cf12e05e279f97cf4b0eccd9defa756ac083fcb2c412945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d68d8de71d99082a6cf12e05e279f97cf4b0eccd9defa756ac083fcb2c412945\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d1db75c9d54c6d8d304e28faa91270a4d70096596100fb97d5a6ea74f7ad8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19d1db75c9d54c6d8d304e28faa91270a4d70096596100fb97d5a6ea74f7ad8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbddc88e12f323552ea813b6440e7770a844e187f892c11778a4a82241db38ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbddc88e12f323552ea813b6440e7770a844e187f892c11778a4a82241db38ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9df3b256d9a633666a3195e4fd0c9f409cfeb941932f46356e1a8e2e98d706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b9df3b256d9a633666a3195e4fd0c9f409cfeb941932f46356e1a8e2e98d706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b27d70a0f82b84c5ed47e2f3287c0faf9c449ed235ec78ef71fe769485f3e67f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b27d70a0f82b84c5ed47e2f3287c0faf9c449ed235ec78ef71fe769485f3e67f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nqhwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:44Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:44 crc kubenswrapper[4732]: I0402 13:39:44.761072 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:44Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:44 crc kubenswrapper[4732]: I0402 13:39:44.770425 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"afb9093a-f17e-45d5-887a-6b693bbae3f4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d431bea9c2c5dcec52d15f7c9d0806d50d74ece8f3ba59562a6f3fd59e50a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ca33771abc8b4070c51fd6fcb9a5f88bf7097b87e6762d829b293bb628588a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ca33771abc8b4070c51fd6fcb9a5f88bf7097b87e6762d829b293bb628588a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:44Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:44 crc kubenswrapper[4732]: I0402 13:39:44.782097 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:44Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:44 crc kubenswrapper[4732]: I0402 13:39:44.794696 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38409e5e-4545-49da-8f6c-4bfb30582878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8b639c60ac006909c664208b4a5aa24df48983728ace3f6722514164e330a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09dd989931f88ecf396194a22057cfe8c07809db
42ff1c2b861bf3aa5db1673f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6vtmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:44Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:44 crc kubenswrapper[4732]: I0402 13:39:44.808640 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s52gj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad206957-df5c-4b3e-bd35-e798a07d2f4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169383eed1546547fe6f683a3607c929733ee7049497b740db106a44ce7d1c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dssdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s52gj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:44Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:44 crc kubenswrapper[4732]: I0402 13:39:44.828147 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6f5e483-7d6b-4d6d-be84-303d8f07643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde53c6f70593ab376e01989eadf8b52b7010c91ca4be564e5e36a781f1333f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://975b73da57fe61d6945d87d2a8cb9502c67a6d8fe43a80722342271e2a251f39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f988d8b8bd1da89ec27da36eafa964c8d1b06874110b4791aeadc05cb297b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09f4ea085c7a422d9d9378925b89d18cd60661c399eb45f69d1e2d39c3aa2191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39950e10da9295ee76088f08ae9d4977f5ff9f8d8440188f777855f40fb8049e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0b635201003f7f1a0fc544667e8376b4283c202dde45398518f99b96ce2c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28d7b0869af498c3b2a34b70d9bf3e997eb93afc6c7ad7ab005b8cf1f08bc32a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b9f2f9d419baeac3be35c4c0c40c012b721c99c2f89fb2638c635c41c5e93d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0d12a454b3463c55541ffa21937cda423517d31435683bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0d12a454b3463c55541ffa21937cda423517d31435683bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8qmgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:44Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:44 crc kubenswrapper[4732]: I0402 13:39:44.857262 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b2l5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7185760-c057-4c47-8da2-60572500a472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76490beb8e54fc846562aaf130168383c45dfe45e9582e3d87c36c38a69b8fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bec3eb12ab864fa88c229cc24bf8ab853d0dc249a18430163a45fbe02e09c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-b2l5q\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:44Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:44 crc kubenswrapper[4732]: I0402 13:39:44.890759 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1efc9e3-71c0-47e0-bcf8-4ec41e8e25af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf50344e79adec991160632c244c40543c468db050f176949b5840a16d1777a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e000359211728ef1d437757b0246d85c224dae170131ba8bd221fa84dafa026c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://bf6191436462297532ffdc93e43468bbe37e850115ddcbcd0265c07359ac845a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-04-02T13:39:23Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0402 13:39:23.616086 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0402 13:39:23.616554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0402 13:39:23.618421 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-69827246/tls.crt::/tmp/serving-cert-69827246/tls.key\\\\\\\"\\\\nI0402 13:39:23.816116 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0402 13:39:23.819962 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0402 13:39:23.820008 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0402 13:39:23.820046 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0402 13:39:23.820061 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0402 13:39:23.827938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0402 13:39:23.827970 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827984 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0402 13:39:23.827988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0402 13:39:23.828005 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0402 13:39:23.828009 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0402 13:39:23.828004 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0402 13:39:23.830934 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b46da8b2cda4252f897adfca92275c92924a998d5b1ad57e6897ddea29b4571\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:44Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:44 crc kubenswrapper[4732]: I0402 13:39:44.904898 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://960aab148ced3ecc2648e0b866cb9330dc955e8ab123f3afb95d377e3efa5a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:44Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:44 crc kubenswrapper[4732]: I0402 13:39:44.914235 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-crx2z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"386bd92b-c67e-4cc6-8a47-6f8d6e799bc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-crx2z\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:44Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:44 crc kubenswrapper[4732]: I0402 13:39:44.924387 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:44Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:44 crc kubenswrapper[4732]: E0402 13:39:44.934671 4732 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Apr 02 13:39:44 crc kubenswrapper[4732]: I0402 13:39:44.935195 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e388cf7666048a2a06ac7572afee8a8a73493dc8b315897b9118b66dfe72b69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:44Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:44 crc kubenswrapper[4732]: I0402 13:39:44.945158 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tg9vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fcb1965-4cef-41a4-8894-3eb24e0ff80c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e38eea7442762874b0db44fa8fcd1ffc4cbe0d149d992139a1fc0bc92dc386d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{
\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9vgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tg9vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:44Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:45 crc kubenswrapper[4732]: I0402 13:39:45.451720 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8qmgp_a6f5e483-7d6b-4d6d-be84-303d8f07643e/ovnkube-controller/0.log" Apr 02 13:39:45 crc kubenswrapper[4732]: I0402 13:39:45.454242 4732 generic.go:334] "Generic (PLEG): container finished" podID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" containerID="28d7b0869af498c3b2a34b70d9bf3e997eb93afc6c7ad7ab005b8cf1f08bc32a" exitCode=1 Apr 02 13:39:45 crc kubenswrapper[4732]: I0402 13:39:45.454310 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" event={"ID":"a6f5e483-7d6b-4d6d-be84-303d8f07643e","Type":"ContainerDied","Data":"28d7b0869af498c3b2a34b70d9bf3e997eb93afc6c7ad7ab005b8cf1f08bc32a"} Apr 02 13:39:45 crc kubenswrapper[4732]: I0402 13:39:45.455097 4732 scope.go:117] "RemoveContainer" containerID="28d7b0869af498c3b2a34b70d9bf3e997eb93afc6c7ad7ab005b8cf1f08bc32a" Apr 02 13:39:45 crc kubenswrapper[4732]: I0402 
13:39:45.459870 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nqhwm" event={"ID":"658a9576-efdc-4e4b-937e-bd63032cbee6","Type":"ContainerStarted","Data":"b89438bcc57ca4588c00c0e9024fd7423eb6839c4c435c5c9bfca56d0fb1c33e"} Apr 02 13:39:45 crc kubenswrapper[4732]: I0402 13:39:45.471356 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:45Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:45 crc kubenswrapper[4732]: I0402 13:39:45.484089 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e388cf7666048a2a06ac7572afee8a8a73493dc8b315897b9118b66dfe72b69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-04-02T13:39:45Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:45 crc kubenswrapper[4732]: I0402 13:39:45.494543 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tg9vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fcb1965-4cef-41a4-8894-3eb24e0ff80c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e38eea7442762874b0db44fa8fcd1ffc4cbe0d149d992139a1fc0bc92dc386d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-w9vgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tg9vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:45Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:45 crc kubenswrapper[4732]: I0402 13:39:45.504139 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-crx2z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"386bd92b-c67e-4cc6-8a47-6f8d6e799bc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-crx2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:45Z is after 
2025-08-24T17:21:41Z" Apr 02 13:39:45 crc kubenswrapper[4732]: I0402 13:39:45.514037 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-glqvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57616858-4140-40f0-83e5-388787b685b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af55257a6baf7f9bcba13eb0d7aae49bd0671f182aeb11018802d4712df0ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-hqn9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-glqvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:45Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:45 crc kubenswrapper[4732]: I0402 13:39:45.528300 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nqhwm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658a9576-efdc-4e4b-937e-bd63032cbee6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de69c4d183706f6729301a7859d6c312fdedb379c0c9b4aac3d4061f89b82067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de69c4d183706f6729301a7859d6c312fdedb379c0c9b4aac3d4061f89b82067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d68d8de71d99082a6cf12e05e279f97cf4b0eccd9defa756ac083fcb2c412945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d68d8de71d99082a6cf12e05e279f97cf4b0eccd9defa756ac083fcb2c412945\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d1db75c9d54c6d8d304e28faa91270a4d70
096596100fb97d5a6ea74f7ad8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19d1db75c9d54c6d8d304e28faa91270a4d70096596100fb97d5a6ea74f7ad8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbddc88e12f323552ea813b6440e7770a844e187f892c11778a4a82241db38ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbddc88e12f323552ea813b6440e7770a844e187f892c11778a4a82241db38ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
4-02T13:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9df3b256d9a633666a3195e4fd0c9f409cfeb941932f46356e1a8e2e98d706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b9df3b256d9a633666a3195e4fd0c9f409cfeb941932f46356e1a8e2e98d706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b27d70a0f82b84c5ed47e2f3287c0faf9c449ed235ec78ef71fe769485f3e67f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b27d70a0f82b84c5ed47e2f3287c0faf9c449ed235ec78ef71fe769485f3e67f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nqhwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:45Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:45 crc kubenswrapper[4732]: I0402 13:39:45.540507 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:45Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:45 crc kubenswrapper[4732]: I0402 13:39:45.549716 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:39:45 crc kubenswrapper[4732]: I0402 13:39:45.549756 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:39:45 crc kubenswrapper[4732]: I0402 13:39:45.549764 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:39:45 crc kubenswrapper[4732]: I0402 13:39:45.549778 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 02 13:39:45 crc kubenswrapper[4732]: I0402 13:39:45.549789 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-02T13:39:45Z","lastTransitionTime":"2026-04-02T13:39:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Apr 02 13:39:45 crc kubenswrapper[4732]: I0402 13:39:45.551326 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"afb9093a-f17e-45d5-887a-6b693bbae3f4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d431bea9c2c5dcec52d15f7c9d0806d50d74ece8f3ba59562a6f3fd59e50a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubele
t\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ca33771abc8b4070c51fd6fcb9a5f88bf7097b87e6762d829b293bb628588a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ca33771abc8b4070c51fd6fcb9a5f88bf7097b87e6762d829b293bb628588a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:45Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:45 crc kubenswrapper[4732]: E0402 13:39:45.563136 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69443537-6792-4af0-92ca-9d3f256e3009\\\",\\\"systemUUID\\\":\\\"3be5867b-5df6-4c65-8d4b-c54c471927ff\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:45Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:45 crc kubenswrapper[4732]: I0402 13:39:45.564061 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:45Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:45 crc kubenswrapper[4732]: I0402 13:39:45.566941 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:39:45 crc kubenswrapper[4732]: I0402 13:39:45.567025 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:39:45 crc kubenswrapper[4732]: I0402 13:39:45.567038 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:39:45 crc kubenswrapper[4732]: I0402 13:39:45.567055 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 02 13:39:45 crc kubenswrapper[4732]: I0402 13:39:45.567066 4732 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-02T13:39:45Z","lastTransitionTime":"2026-04-02T13:39:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Apr 02 13:39:45 crc kubenswrapper[4732]: I0402 13:39:45.578460 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37b84e7e2a23e3e100ff5bc979d4126bb3044a27f802ccc21c171516554dee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abfadcbcfff56cc9a6bf0c4068f03e943dea5db96705cd3fba2aea8cbd39111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:45Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:45 crc kubenswrapper[4732]: E0402 13:39:45.579906 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69443537-6792-4af0-92ca-9d3f256e3009\\\",\\\"systemUUID\\\":\\\"3be5867b-5df6-4c65-8d4b-c54c471927ff\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:45Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:45 crc kubenswrapper[4732]: I0402 13:39:45.583804 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:39:45 crc kubenswrapper[4732]: I0402 13:39:45.583852 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:39:45 crc kubenswrapper[4732]: I0402 13:39:45.583866 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:39:45 crc kubenswrapper[4732]: I0402 13:39:45.583883 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 02 13:39:45 crc kubenswrapper[4732]: I0402 13:39:45.583895 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-02T13:39:45Z","lastTransitionTime":"2026-04-02T13:39:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 02 13:39:45 crc kubenswrapper[4732]: I0402 13:39:45.592893 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s52gj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad206957-df5c-4b3e-bd35-e798a07d2f4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169383eed1546547fe6f683a3607c929733ee7049497b740db106a44ce7d1c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dssdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s52gj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:45Z 
is after 2025-08-24T17:21:41Z" Apr 02 13:39:45 crc kubenswrapper[4732]: E0402 13:39:45.596708 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69443537-6792-4af0-92ca-9d3f256e3009\\\",\\\"systemUUID\\\":\\\"3be5867b-5df6-4c65-8d4b-c54c471927ff\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:45Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:45 crc kubenswrapper[4732]: I0402 13:39:45.600468 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:39:45 crc kubenswrapper[4732]: I0402 13:39:45.600523 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:39:45 crc kubenswrapper[4732]: I0402 13:39:45.600532 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:39:45 crc kubenswrapper[4732]: I0402 13:39:45.600546 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 02 13:39:45 crc kubenswrapper[4732]: I0402 13:39:45.600562 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-02T13:39:45Z","lastTransitionTime":"2026-04-02T13:39:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 02 13:39:45 crc kubenswrapper[4732]: E0402 13:39:45.614153 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69443537-6792-4af0-92ca-9d3f256e3009\\\",\\\"systemUUID\\\":\\\"3be5867b-5df6-4c65-8d4b-c54c471927ff\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:45Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:45 crc kubenswrapper[4732]: I0402 13:39:45.616582 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6f5e483-7d6b-4d6d-be84-303d8f07643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde53c6f70593ab376e01989eadf8b52b7010c91ca4be564e5e36a781f1333f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://975b73da57fe61d6945d87d2a8cb9502c67a6d8fe43a80722342271e2a251f39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f988d8b8bd1da89ec27da36eafa964c8d1b06874110b4791aeadc05cb297b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09f4ea085c7a422d9d9378925b89d18cd60661c399eb45f69d1e2d39c3aa2191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39950e10da9295ee76088f08ae9d4977f5ff9f8d8440188f777855f40fb8049e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0b635201003f7f1a0fc544667e8376b4283c202dde45398518f99b96ce2c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28d7b0869af498c3b2a34b70d9bf3e997eb93afc6c7ad7ab005b8cf1f08bc32a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28d7b0869af498c3b2a34b70d9bf3e997eb93afc6c7ad7ab005b8cf1f08bc32a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-04-02T13:39:44Z\\\",\\\"message\\\":\\\"lpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0402 13:39:44.595872 6680 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0402 13:39:44.595947 6680 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0402 13:39:44.596357 6680 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0402 13:39:44.597485 6680 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0402 13:39:44.597521 6680 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0402 13:39:44.597536 6680 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0402 13:39:44.597546 6680 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0402 13:39:44.597598 6680 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0402 13:39:44.597607 6680 factory.go:656] Stopping watch factory\\\\nI0402 13:39:44.597640 6680 ovnkube.go:599] Stopped ovnkube\\\\nI0402 13:39:44.597661 6680 handler.go:208] Removed *v1.Node event handler 7\\\\nI0402 13:39:44.597670 6680 handler.go:208] Removed *v1.EgressIP event handler 
8\\\\nI04\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b9f2f9d419baeac3be35c4c0c40c012b721c99c2f89fb2638c635c41c5e93d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0d12a454b3463c55541ffa21937cda423517d31435683bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0d12a454b3463c55541ffa21937cda423517d3143
5683bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8qmgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:45Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:45 crc kubenswrapper[4732]: I0402 13:39:45.619129 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:39:45 crc kubenswrapper[4732]: I0402 13:39:45.619171 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:39:45 crc kubenswrapper[4732]: I0402 13:39:45.619184 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:39:45 crc kubenswrapper[4732]: I0402 13:39:45.619201 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 02 13:39:45 crc kubenswrapper[4732]: I0402 13:39:45.619215 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-02T13:39:45Z","lastTransitionTime":"2026-04-02T13:39:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Apr 02 13:39:45 crc kubenswrapper[4732]: I0402 13:39:45.629531 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b2l5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7185760-c057-4c47-8da2-60572500a472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76490beb8e54fc846562aaf130168383c45dfe45e9582e3d87c36c38a69b8fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\
"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bec3eb12ab864fa88c229cc24bf8ab853d0dc249a18430163a45fbe02e09c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-b2l5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:45Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:45 crc kubenswrapper[4732]: E0402 
13:39:45.632079 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69443537-6792-4af0-92ca-9d3f256e3009\\\",\\\"systemUUID\\\":\\\"3be5867b-5df6-4c65-8d4b-c54c471927ff\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:45Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:45 crc kubenswrapper[4732]: E0402 13:39:45.632198 4732 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Apr 02 13:39:45 crc kubenswrapper[4732]: I0402 13:39:45.643135 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1efc9e3-71c0-47e0-bcf8-4ec41e8e25af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf50344e79adec991160632c244c40543c468db050f176949b5840a16d1777a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e000359211728ef1d437757b0246d85c224dae170131ba8bd221fa84dafa026c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://bf6191436462297532ffdc93e43468bbe37e850115ddcbcd0265c07359ac845a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-04-02T13:39:23Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0402 13:39:23.616086 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0402 13:39:23.616554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0402 13:39:23.618421 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-69827246/tls.crt::/tmp/serving-cert-69827246/tls.key\\\\\\\"\\\\nI0402 13:39:23.816116 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0402 13:39:23.819962 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0402 13:39:23.820008 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0402 13:39:23.820046 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0402 13:39:23.820061 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0402 13:39:23.827938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0402 13:39:23.827970 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827984 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0402 13:39:23.827988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0402 13:39:23.828005 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0402 13:39:23.828009 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0402 13:39:23.828004 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0402 13:39:23.830934 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b46da8b2cda4252f897adfca92275c92924a998d5b1ad57e6897ddea29b4571\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:45Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:45 crc kubenswrapper[4732]: I0402 13:39:45.663293 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://960aab148ced3ecc2648e0b866cb9330dc955e8ab123f3afb95d377e3efa5a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:45Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:45 crc kubenswrapper[4732]: I0402 13:39:45.673378 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38409e5e-4545-49da-8f6c-4bfb30582878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8b639c60ac006909c664208b4a5aa24df48983728ace3f6722514164e330a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09dd989931f88ecf396194a22057cfe8c07809db
42ff1c2b861bf3aa5db1673f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6vtmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:45Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:45 crc kubenswrapper[4732]: I0402 13:39:45.679782 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:39:45 crc kubenswrapper[4732]: E0402 13:39:45.679874 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Apr 02 13:39:45 crc kubenswrapper[4732]: I0402 13:39:45.680142 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:39:45 crc kubenswrapper[4732]: E0402 13:39:45.680201 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Apr 02 13:39:45 crc kubenswrapper[4732]: I0402 13:39:45.680248 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 13:39:45 crc kubenswrapper[4732]: E0402 13:39:45.680297 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Apr 02 13:39:45 crc kubenswrapper[4732]: I0402 13:39:45.686369 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1efc9e3-71c0-47e0-bcf8-4ec41e8e25af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf50344e79adec991160632c244c40543c468db050f176949b5840a16d1777a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e000359211728ef1d437757b0246d85c224dae170131ba8bd221fa84dafa026c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://bf6191436462297532ffdc93e43468bbe37e850115ddcbcd0265c07359ac845a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-04-02T13:39:23Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0402 13:39:23.616086 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0402 13:39:23.616554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0402 13:39:23.618421 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-69827246/tls.crt::/tmp/serving-cert-69827246/tls.key\\\\\\\"\\\\nI0402 13:39:23.816116 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0402 13:39:23.819962 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0402 13:39:23.820008 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0402 13:39:23.820046 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0402 13:39:23.820061 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0402 13:39:23.827938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0402 13:39:23.827970 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827984 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0402 13:39:23.827988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0402 13:39:23.828005 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0402 13:39:23.828009 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0402 13:39:23.828004 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0402 13:39:23.830934 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b46da8b2cda4252f897adfca92275c92924a998d5b1ad57e6897ddea29b4571\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:45Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:45 crc kubenswrapper[4732]: I0402 13:39:45.701825 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://960aab148ced3ecc2648e0b866cb9330dc955e8ab123f3afb95d377e3efa5a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:45Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:45 crc kubenswrapper[4732]: I0402 13:39:45.715471 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38409e5e-4545-49da-8f6c-4bfb30582878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8b639c60ac006909c664208b4a5aa24df48983728ace3f6722514164e330a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09dd989931f88ecf396194a22057cfe8c07809db
42ff1c2b861bf3aa5db1673f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6vtmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:45Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:45 crc kubenswrapper[4732]: I0402 13:39:45.728726 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s52gj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad206957-df5c-4b3e-bd35-e798a07d2f4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169383eed1546547fe6f683a3607c929733ee7049497b740db106a44ce7d1c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dssdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s52gj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:45Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:45 crc kubenswrapper[4732]: I0402 13:39:45.750325 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6f5e483-7d6b-4d6d-be84-303d8f07643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde53c6f70593ab376e01989eadf8b52b7010c91ca4be564e5e36a781f1333f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://975b73da57fe61d6945d87d2a8cb9502c67a6d8fe43a80722342271e2a251f39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f988d8b8bd1da89ec27da36eafa964c8d1b06874110b4791aeadc05cb297b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09f4ea085c7a422d9d9378925b89d18cd60661c399eb45f69d1e2d39c3aa2191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39950e10da9295ee76088f08ae9d4977f5ff9f8d8440188f777855f40fb8049e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0b635201003f7f1a0fc544667e8376b4283c202dde45398518f99b96ce2c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28d7b0869af498c3b2a34b70d9bf3e997eb93afc6c7ad7ab005b8cf1f08bc32a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28d7b0869af498c3b2a34b70d9bf3e997eb93afc6c7ad7ab005b8cf1f08bc32a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-04-02T13:39:44Z\\\",\\\"message\\\":\\\"lpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0402 13:39:44.595872 6680 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0402 13:39:44.595947 6680 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0402 13:39:44.596357 6680 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0402 13:39:44.597485 6680 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0402 13:39:44.597521 6680 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0402 13:39:44.597536 6680 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0402 13:39:44.597546 6680 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0402 13:39:44.597598 6680 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0402 13:39:44.597607 6680 factory.go:656] Stopping watch factory\\\\nI0402 13:39:44.597640 6680 ovnkube.go:599] Stopped ovnkube\\\\nI0402 13:39:44.597661 6680 handler.go:208] Removed *v1.Node event handler 7\\\\nI0402 13:39:44.597670 6680 handler.go:208] Removed *v1.EgressIP event handler 
8\\\\nI04\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b9f2f9d419baeac3be35c4c0c40c012b721c99c2f89fb2638c635c41c5e93d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0d12a454b3463c55541ffa21937cda423517d31435683bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0d12a454b3463c55541ffa21937cda423517d3143
5683bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8qmgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:45Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:45 crc kubenswrapper[4732]: I0402 13:39:45.762173 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b2l5q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7185760-c057-4c47-8da2-60572500a472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76490beb8e54fc846562aaf130168383c45dfe45e9582e3d87c36c38a69b8fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bec3eb12ab864fa88c229cc24bf8ab853d0d
c249a18430163a45fbe02e09c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-b2l5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:45Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:45 crc kubenswrapper[4732]: I0402 13:39:45.775127 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-crx2z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"386bd92b-c67e-4cc6-8a47-6f8d6e799bc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-crx2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:45Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:45 crc 
kubenswrapper[4732]: I0402 13:39:45.790525 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:45Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:45 crc kubenswrapper[4732]: I0402 13:39:45.806219 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e388cf7666048a2a06ac7572afee8a8a73493dc8b315897b9118b66dfe72b69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-04-02T13:39:45Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:45 crc kubenswrapper[4732]: I0402 13:39:45.818518 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tg9vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fcb1965-4cef-41a4-8894-3eb24e0ff80c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e38eea7442762874b0db44fa8fcd1ffc4cbe0d149d992139a1fc0bc92dc386d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-w9vgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tg9vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:45Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:45 crc kubenswrapper[4732]: I0402 13:39:45.839521 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"afb9093a-f17e-45d5-887a-6b693bbae3f4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d431bea9c2c5dcec52d15f7c9d0806d50d74ece8f3ba59562a6f3fd59e50a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ca33771abc8b4070c51fd6fcb9a5f88bf7097b87e6762d829b293bb628588a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ca33771abc8b4070c51fd6fcb9a5f88bf7097b87e6762d829b293bb628588a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:45Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:45 crc kubenswrapper[4732]: I0402 13:39:45.858461 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:45Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:45 crc kubenswrapper[4732]: I0402 13:39:45.874313 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37b84e7e2a23e3e100ff5bc979d4126bb3044a27f802ccc21c171516554dee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abfadcbcfff56cc9a6bf0c4068f03e943dea5db96705cd3fba2aea8cbd39111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:45Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:45 crc kubenswrapper[4732]: I0402 13:39:45.885650 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-glqvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57616858-4140-40f0-83e5-388787b685b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af55257a6baf7f9bcba13eb0d7aae49bd0671f182aeb11018802d4712df0ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqn9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-glqvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:45Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:45 crc kubenswrapper[4732]: I0402 13:39:45.900852 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nqhwm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658a9576-efdc-4e4b-937e-bd63032cbee6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b89438bcc57ca4588c00c0e9024fd7423eb6839c4c435c5c9bfca56d0fb1c33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de69c4d183706f6729301a7859d6c312fdedb379c0c9b4aac3d4061f89b82067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de69c4d183706f6729301a7859d6c312fdedb379c0c9b4aac3d4061f89b82067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d68d8de71d99
082a6cf12e05e279f97cf4b0eccd9defa756ac083fcb2c412945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d68d8de71d99082a6cf12e05e279f97cf4b0eccd9defa756ac083fcb2c412945\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d1db75c9d54c6d8d304e28faa91270a4d70096596100fb97d5a6ea74f7ad8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19d1db75c9d5
4c6d8d304e28faa91270a4d70096596100fb97d5a6ea74f7ad8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbddc88e12f323552ea813b6440e7770a844e187f892c11778a4a82241db38ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbddc88e12f323552ea813b6440e7770a844e187f892c11778a4a82241db38ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9df3b256d9a633666a3195e4fd0c9f409cfeb941932f46356e1a8e2e98d706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b9df3b256d9a633666a3195e4fd0c9f409cfeb941932f46356e1a8e2e98d706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b27d70a0f82b84c5ed47e2f3287c0faf9c449ed235ec78ef71fe769485f3e67f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b27d70a0f82b84c5ed47e2f3287c0faf9c449ed235ec78ef71fe769485f3e67f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"202
6-04-02T13:39:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nqhwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:45Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:45 crc kubenswrapper[4732]: I0402 13:39:45.913272 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:45Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:46 crc kubenswrapper[4732]: I0402 13:39:46.680031 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:39:46 crc kubenswrapper[4732]: E0402 13:39:46.680156 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-crx2z" podUID="386bd92b-c67e-4cc6-8a47-6f8d6e799bc7" Apr 02 13:39:47 crc kubenswrapper[4732]: I0402 13:39:47.399699 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 02 13:39:47 crc kubenswrapper[4732]: E0402 13:39:47.399848 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-02 13:40:03.399820523 +0000 UTC m=+160.304228096 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:39:47 crc kubenswrapper[4732]: I0402 13:39:47.399930 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:39:47 crc kubenswrapper[4732]: I0402 13:39:47.399970 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:39:47 crc kubenswrapper[4732]: I0402 13:39:47.399998 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:39:47 crc kubenswrapper[4732]: I0402 13:39:47.400045 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 13:39:47 crc kubenswrapper[4732]: E0402 13:39:47.400173 4732 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Apr 02 13:39:47 crc kubenswrapper[4732]: E0402 13:39:47.400243 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-04-02 13:40:03.400225543 +0000 UTC m=+160.304633116 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Apr 02 13:39:47 crc kubenswrapper[4732]: E0402 13:39:47.400269 4732 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Apr 02 13:39:47 crc kubenswrapper[4732]: E0402 13:39:47.400320 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-04-02 13:40:03.400306556 +0000 UTC m=+160.304714129 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Apr 02 13:39:47 crc kubenswrapper[4732]: E0402 13:39:47.400377 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 02 13:39:47 crc kubenswrapper[4732]: E0402 13:39:47.400423 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 02 13:39:47 crc kubenswrapper[4732]: E0402 13:39:47.400439 4732 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 02 13:39:47 crc kubenswrapper[4732]: E0402 13:39:47.400496 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 02 13:39:47 crc kubenswrapper[4732]: E0402 13:39:47.400524 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 02 13:39:47 crc kubenswrapper[4732]: E0402 13:39:47.400542 4732 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 02 13:39:47 crc kubenswrapper[4732]: E0402 13:39:47.400496 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-04-02 13:40:03.40048094 +0000 UTC m=+160.304888483 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 02 13:39:47 crc kubenswrapper[4732]: E0402 13:39:47.400671 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-04-02 13:40:03.400588743 +0000 UTC m=+160.304996406 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 02 13:39:47 crc kubenswrapper[4732]: I0402 13:39:47.470351 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8qmgp_a6f5e483-7d6b-4d6d-be84-303d8f07643e/ovnkube-controller/0.log" Apr 02 13:39:47 crc kubenswrapper[4732]: I0402 13:39:47.475021 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" event={"ID":"a6f5e483-7d6b-4d6d-be84-303d8f07643e","Type":"ContainerStarted","Data":"cd694417e4511ced3c44daa2fd791011d9c9065df5dc3851f87ace5361529ae6"} Apr 02 13:39:47 crc kubenswrapper[4732]: I0402 13:39:47.500919 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/386bd92b-c67e-4cc6-8a47-6f8d6e799bc7-metrics-certs\") pod \"network-metrics-daemon-crx2z\" (UID: \"386bd92b-c67e-4cc6-8a47-6f8d6e799bc7\") " pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:39:47 crc kubenswrapper[4732]: E0402 13:39:47.501164 4732 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 02 13:39:47 crc kubenswrapper[4732]: E0402 13:39:47.501269 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/386bd92b-c67e-4cc6-8a47-6f8d6e799bc7-metrics-certs podName:386bd92b-c67e-4cc6-8a47-6f8d6e799bc7 nodeName:}" failed. No retries permitted until 2026-04-02 13:40:03.501247493 +0000 UTC m=+160.405655136 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/386bd92b-c67e-4cc6-8a47-6f8d6e799bc7-metrics-certs") pod "network-metrics-daemon-crx2z" (UID: "386bd92b-c67e-4cc6-8a47-6f8d6e799bc7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 02 13:39:47 crc kubenswrapper[4732]: I0402 13:39:47.679254 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:39:47 crc kubenswrapper[4732]: I0402 13:39:47.679337 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:39:47 crc kubenswrapper[4732]: E0402 13:39:47.679393 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Apr 02 13:39:47 crc kubenswrapper[4732]: E0402 13:39:47.679515 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Apr 02 13:39:47 crc kubenswrapper[4732]: I0402 13:39:47.679337 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 13:39:47 crc kubenswrapper[4732]: E0402 13:39:47.679760 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Apr 02 13:39:48 crc kubenswrapper[4732]: I0402 13:39:48.480430 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8qmgp_a6f5e483-7d6b-4d6d-be84-303d8f07643e/ovnkube-controller/1.log" Apr 02 13:39:48 crc kubenswrapper[4732]: I0402 13:39:48.481179 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8qmgp_a6f5e483-7d6b-4d6d-be84-303d8f07643e/ovnkube-controller/0.log" Apr 02 13:39:48 crc kubenswrapper[4732]: I0402 13:39:48.484327 4732 generic.go:334] "Generic (PLEG): container finished" podID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" containerID="cd694417e4511ced3c44daa2fd791011d9c9065df5dc3851f87ace5361529ae6" exitCode=1 Apr 02 13:39:48 crc kubenswrapper[4732]: I0402 13:39:48.484377 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" event={"ID":"a6f5e483-7d6b-4d6d-be84-303d8f07643e","Type":"ContainerDied","Data":"cd694417e4511ced3c44daa2fd791011d9c9065df5dc3851f87ace5361529ae6"} Apr 02 13:39:48 crc kubenswrapper[4732]: I0402 13:39:48.484422 4732 scope.go:117] "RemoveContainer" containerID="28d7b0869af498c3b2a34b70d9bf3e997eb93afc6c7ad7ab005b8cf1f08bc32a" Apr 02 13:39:48 crc kubenswrapper[4732]: I0402 13:39:48.485044 4732 scope.go:117] "RemoveContainer" containerID="cd694417e4511ced3c44daa2fd791011d9c9065df5dc3851f87ace5361529ae6" Apr 02 13:39:48 crc kubenswrapper[4732]: E0402 13:39:48.485357 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-8qmgp_openshift-ovn-kubernetes(a6f5e483-7d6b-4d6d-be84-303d8f07643e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" podUID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" Apr 02 13:39:48 crc 
kubenswrapper[4732]: I0402 13:39:48.509547 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://960aab148ced3ecc2648e0b866cb9330dc955e8ab123f3afb95d377e3efa5a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:48Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:48 crc kubenswrapper[4732]: I0402 13:39:48.520825 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38409e5e-4545-49da-8f6c-4bfb30582878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8b639c60ac006909c664208b4a5aa24df48983728ace3f6722514164e330a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09dd989931f88ecf396194a22057cfe8c07809db42ff1c2b861bf3aa5db1673f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6vtmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:48Z is after 2025-08-24T17:21:41Z" Apr 02 
13:39:48 crc kubenswrapper[4732]: I0402 13:39:48.533378 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s52gj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad206957-df5c-4b3e-bd35-e798a07d2f4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169383eed1546547fe6f683a3607c929733ee7049497b740db106a44ce7d1c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"
mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dssdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s52gj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:48Z is after 2025-08-24T17:21:41Z" Apr 02 
13:39:48 crc kubenswrapper[4732]: I0402 13:39:48.555211 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6f5e483-7d6b-4d6d-be84-303d8f07643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde53c6f70593ab376e01989eadf8b52b7010c91ca4be564e5e36a781f1333f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://975b73da57fe61d6945d87d2a8cb9502c67a6d8fe43a80722342271e2a251f39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f988d8b8bd1da89ec27da36eafa964c8d1b06874110b4791aeadc05cb297b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09f4ea085c7a422d9d9378925b89d18cd60661c399eb45f69d1e2d39c3aa2191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39950e10da9295ee76088f08ae9d4977f5ff9f8d8440188f777855f40fb8049e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0b635201003f7f1a0fc544667e8376b4283c202dde45398518f99b96ce2c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd694417e4511ced3c44daa2fd791011d9c9065df5dc3851f87ace5361529ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28d7b0869af498c3b2a34b70d9bf3e997eb93afc6c7ad7ab005b8cf1f08bc32a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-04-02T13:39:44Z\\\",\\\"message\\\":\\\"lpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0402 13:39:44.595872 6680 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0402 
13:39:44.595947 6680 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0402 13:39:44.596357 6680 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0402 13:39:44.597485 6680 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0402 13:39:44.597521 6680 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0402 13:39:44.597536 6680 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0402 13:39:44.597546 6680 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0402 13:39:44.597598 6680 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0402 13:39:44.597607 6680 factory.go:656] Stopping watch factory\\\\nI0402 13:39:44.597640 6680 ovnkube.go:599] Stopped ovnkube\\\\nI0402 13:39:44.597661 6680 handler.go:208] Removed *v1.Node event handler 7\\\\nI0402 13:39:44.597670 6680 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI04\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd694417e4511ced3c44daa2fd791011d9c9065df5dc3851f87ace5361529ae6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-04-02T13:39:48Z\\\",\\\"message\\\":\\\"F0402 13:39:48.262702 6864 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error 
occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:48Z is after 2025-08-24T17:21:41Z]\\\\nI0402 13:39:48.262028 6864 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0402 13:39:48.262716 6864 services_controller.go:443] Built service openshift-config-operator/metrics LB cluster-wide co\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/et
c/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b9f2f9d419baeac3be35c4c0c40c012b721c99c2f89fb2638c635c41c5e93d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets
/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0d12a454b3463c55541ffa21937cda423517d31435683bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0d12a454b3463c55541ffa21937cda423517d31435683bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8qmgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:48Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:48 crc kubenswrapper[4732]: I0402 13:39:48.567439 4732 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b2l5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7185760-c057-4c47-8da2-60572500a472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76490beb8e54fc846562aaf130168383c45dfe45e9582e3d87c36c38a69b8fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bec3eb12ab864fa88c229cc24bf8ab853d0dc249a18430163a45fbe02e09c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-b2l5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:48Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:48 crc kubenswrapper[4732]: I0402 13:39:48.581521 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1efc9e3-71c0-47e0-bcf8-4ec41e8e25af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf50344e79adec991160632c244c40543c468db050f176949b5840a16d1777a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e000359211728ef1d437757b0246d85c224dae170131ba8bd221fa84dafa026c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf6191436462297532ffdc93e43468bbe37e850115ddcbcd0265c07359ac845a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-04-02T13:39:23Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0402 13:39:23.616086 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0402 13:39:23.616554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0402 13:39:23.618421 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-69827246/tls.crt::/tmp/serving-cert-69827246/tls.key\\\\\\\"\\\\nI0402 13:39:23.816116 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0402 13:39:23.819962 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0402 13:39:23.820008 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0402 13:39:23.820046 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0402 13:39:23.820061 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0402 13:39:23.827938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0402 13:39:23.827970 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827984 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0402 13:39:23.827988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0402 13:39:23.828005 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0402 13:39:23.828009 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0402 13:39:23.828004 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0402 13:39:23.830934 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b46da8b2cda4252f897adfca92275c92924a998d5b1ad57e6897ddea29b4571\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:48Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:48 crc kubenswrapper[4732]: I0402 13:39:48.591431 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-crx2z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"386bd92b-c67e-4cc6-8a47-6f8d6e799bc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-crx2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:48Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:48 crc 
kubenswrapper[4732]: I0402 13:39:48.603045 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:48Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:48 crc kubenswrapper[4732]: I0402 13:39:48.614824 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e388cf7666048a2a06ac7572afee8a8a73493dc8b315897b9118b66dfe72b69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-04-02T13:39:48Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:48 crc kubenswrapper[4732]: I0402 13:39:48.626019 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tg9vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fcb1965-4cef-41a4-8894-3eb24e0ff80c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e38eea7442762874b0db44fa8fcd1ffc4cbe0d149d992139a1fc0bc92dc386d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-w9vgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tg9vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:48Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:48 crc kubenswrapper[4732]: I0402 13:39:48.638771 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:48Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:48 crc kubenswrapper[4732]: I0402 13:39:48.652006 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37b84e7e2a23e3e100ff5bc979d4126bb3044a27f802ccc21c171516554dee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abfadcbcfff56cc9a6bf0c4068f03e943dea5db96705cd3fba2aea8cbd39111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:48Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:48 crc kubenswrapper[4732]: I0402 13:39:48.662975 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-glqvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57616858-4140-40f0-83e5-388787b685b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af55257a6baf7f9bcba13eb0d7aae49bd0671f182aeb11018802d4712df0ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqn9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-glqvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:48Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:48 crc kubenswrapper[4732]: I0402 13:39:48.676359 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nqhwm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658a9576-efdc-4e4b-937e-bd63032cbee6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b89438bcc57ca4588c00c0e9024fd7423eb6839c4c435c5c9bfca56d0fb1c33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de69c4d183706f6729301a7859d6c312fdedb379c0c9b4aac3d4061f89b82067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de69c4d183706f6729301a7859d6c312fdedb379c0c9b4aac3d4061f89b82067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d68d8de71d99
082a6cf12e05e279f97cf4b0eccd9defa756ac083fcb2c412945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d68d8de71d99082a6cf12e05e279f97cf4b0eccd9defa756ac083fcb2c412945\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d1db75c9d54c6d8d304e28faa91270a4d70096596100fb97d5a6ea74f7ad8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19d1db75c9d5
4c6d8d304e28faa91270a4d70096596100fb97d5a6ea74f7ad8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbddc88e12f323552ea813b6440e7770a844e187f892c11778a4a82241db38ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbddc88e12f323552ea813b6440e7770a844e187f892c11778a4a82241db38ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9df3b256d9a633666a3195e4fd0c9f409cfeb941932f46356e1a8e2e98d706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b9df3b256d9a633666a3195e4fd0c9f409cfeb941932f46356e1a8e2e98d706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b27d70a0f82b84c5ed47e2f3287c0faf9c449ed235ec78ef71fe769485f3e67f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b27d70a0f82b84c5ed47e2f3287c0faf9c449ed235ec78ef71fe769485f3e67f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"202
6-04-02T13:39:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nqhwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:48Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:48 crc kubenswrapper[4732]: I0402 13:39:48.679208 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:39:48 crc kubenswrapper[4732]: E0402 13:39:48.679327 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-crx2z" podUID="386bd92b-c67e-4cc6-8a47-6f8d6e799bc7" Apr 02 13:39:48 crc kubenswrapper[4732]: I0402 13:39:48.688946 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:48Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:48 crc kubenswrapper[4732]: I0402 13:39:48.699303 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"afb9093a-f17e-45d5-887a-6b693bbae3f4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d431bea9c2c5dcec52d15f7c9d0806d50d74ece8f3ba59562a6f3fd59e50a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ca33771abc8b4070c51fd6fcb9a5f88bf7097b87e6762d829b293bb628588a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ca33771abc8b4070c51fd6fcb9a5f88bf7097b87e6762d829b293bb628588a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:48Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:49 crc kubenswrapper[4732]: I0402 13:39:49.489019 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8qmgp_a6f5e483-7d6b-4d6d-be84-303d8f07643e/ovnkube-controller/1.log" Apr 02 13:39:49 crc kubenswrapper[4732]: I0402 13:39:49.493899 4732 scope.go:117] "RemoveContainer" containerID="cd694417e4511ced3c44daa2fd791011d9c9065df5dc3851f87ace5361529ae6" Apr 02 13:39:49 crc kubenswrapper[4732]: E0402 13:39:49.494072 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-8qmgp_openshift-ovn-kubernetes(a6f5e483-7d6b-4d6d-be84-303d8f07643e)\"" 
pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" podUID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" Apr 02 13:39:49 crc kubenswrapper[4732]: I0402 13:39:49.504936 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-crx2z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"386bd92b-c67e-4cc6-8a47-6f8d6e799bc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-crx2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:49Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:49 crc 
kubenswrapper[4732]: I0402 13:39:49.517382 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:49Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:49 crc kubenswrapper[4732]: I0402 13:39:49.532096 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e388cf7666048a2a06ac7572afee8a8a73493dc8b315897b9118b66dfe72b69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-04-02T13:39:49Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:49 crc kubenswrapper[4732]: I0402 13:39:49.543079 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tg9vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fcb1965-4cef-41a4-8894-3eb24e0ff80c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e38eea7442762874b0db44fa8fcd1ffc4cbe0d149d992139a1fc0bc92dc386d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-w9vgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tg9vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:49Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:49 crc kubenswrapper[4732]: I0402 13:39:49.556899 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:49Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:49 crc kubenswrapper[4732]: I0402 13:39:49.569015 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37b84e7e2a23e3e100ff5bc979d4126bb3044a27f802ccc21c171516554dee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abfadcbcfff56cc9a6bf0c4068f03e943dea5db96705cd3fba2aea8cbd39111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:49Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:49 crc kubenswrapper[4732]: I0402 13:39:49.580702 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-glqvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57616858-4140-40f0-83e5-388787b685b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af55257a6baf7f9bcba13eb0d7aae49bd0671f182aeb11018802d4712df0ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqn9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-glqvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:49Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:49 crc kubenswrapper[4732]: I0402 13:39:49.594109 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nqhwm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658a9576-efdc-4e4b-937e-bd63032cbee6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b89438bcc57ca4588c00c0e9024fd7423eb6839c4c435c5c9bfca56d0fb1c33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de69c4d183706f6729301a7859d6c312fdedb379c0c9b4aac3d4061f89b82067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de69c4d183706f6729301a7859d6c312fdedb379c0c9b4aac3d4061f89b82067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d68d8de71d99
082a6cf12e05e279f97cf4b0eccd9defa756ac083fcb2c412945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d68d8de71d99082a6cf12e05e279f97cf4b0eccd9defa756ac083fcb2c412945\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d1db75c9d54c6d8d304e28faa91270a4d70096596100fb97d5a6ea74f7ad8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19d1db75c9d5
4c6d8d304e28faa91270a4d70096596100fb97d5a6ea74f7ad8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbddc88e12f323552ea813b6440e7770a844e187f892c11778a4a82241db38ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbddc88e12f323552ea813b6440e7770a844e187f892c11778a4a82241db38ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9df3b256d9a633666a3195e4fd0c9f409cfeb941932f46356e1a8e2e98d706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b9df3b256d9a633666a3195e4fd0c9f409cfeb941932f46356e1a8e2e98d706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b27d70a0f82b84c5ed47e2f3287c0faf9c449ed235ec78ef71fe769485f3e67f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b27d70a0f82b84c5ed47e2f3287c0faf9c449ed235ec78ef71fe769485f3e67f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"202
6-04-02T13:39:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nqhwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:49Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:49 crc kubenswrapper[4732]: I0402 13:39:49.608377 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:49Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:49 crc kubenswrapper[4732]: I0402 13:39:49.625522 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"afb9093a-f17e-45d5-887a-6b693bbae3f4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d431bea9c2c5dcec52d15f7c9d0806d50d74ece8f3ba59562a6f3fd59e50a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ca33771abc8b4070c51fd6fcb9a5f88bf7097b87e6762d829b293bb628588a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ca33771abc8b4070c51fd6fcb9a5f88bf7097b87e6762d829b293bb628588a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:49Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:49 crc kubenswrapper[4732]: I0402 13:39:49.638934 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://960aab148ced3ecc2648e0b866cb9330dc955e8ab123f3afb95d377e3efa5a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:49Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:49 crc kubenswrapper[4732]: I0402 13:39:49.651040 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38409e5e-4545-49da-8f6c-4bfb30582878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8b639c60ac006909c664208b4a5aa24df48983728ace3f6722514164e330a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09dd989931f88ecf396194a22057cfe8c07809db42ff1c2b861bf3aa5db1673f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6vtmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:49Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:49 crc kubenswrapper[4732]: I0402 13:39:49.664229 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s52gj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad206957-df5c-4b3e-bd35-e798a07d2f4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169383eed1546547fe6f683a3607c929733ee7049497b740db106a44ce7d1c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dssdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s52gj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:49Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:49 crc kubenswrapper[4732]: I0402 13:39:49.679182 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 13:39:49 crc kubenswrapper[4732]: I0402 13:39:49.679212 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:39:49 crc kubenswrapper[4732]: I0402 13:39:49.679221 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:39:49 crc kubenswrapper[4732]: E0402 13:39:49.679311 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Apr 02 13:39:49 crc kubenswrapper[4732]: E0402 13:39:49.679408 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Apr 02 13:39:49 crc kubenswrapper[4732]: E0402 13:39:49.679473 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Apr 02 13:39:49 crc kubenswrapper[4732]: I0402 13:39:49.684295 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6f5e483-7d6b-4d6d-be84-303d8f07643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde53c6f70593ab376e01989eadf8b52b7010c91ca4be564e5e36a781f1333f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://975b73da57fe61d6945d87d2a8cb9502c67a6d8fe43a80722342271e2a251f39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f988d8b8bd1da89ec27da36eafa964c8d1b06874110b4791aeadc05cb297b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09f4ea085c7a422d9d9378925b89d18cd60661c399eb45f69d1e2d39c3aa2191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39950e10da9295ee76088f08ae9d4977f5ff9f8d8440188f777855f40fb8049e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0b635201003f7f1a0fc544667e8376b4283c202dde45398518f99b96ce2c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd694417e4511ced3c44daa2fd791011d9c9065df5dc3851f87ace5361529ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd694417e4511ced3c44daa2fd791011d9c9065df5dc3851f87ace5361529ae6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-04-02T13:39:48Z\\\",\\\"message\\\":\\\"F0402 13:39:48.262702 6864 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not 
added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:48Z is after 2025-08-24T17:21:41Z]\\\\nI0402 13:39:48.262028 6864 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0402 13:39:48.262716 6864 services_controller.go:443] Built service openshift-config-operator/metrics LB cluster-wide co\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8qmgp_openshift-ovn-kubernetes(a6f5e483-7d6b-4d6d-be84-303d8f07643e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b9f2f9d419baeac3be35c4c0c40c012b721c99c2f89fb2638c635c41c5e93d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0d12a454b3463c55541ffa21937cda423517d31435683bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0
d12a454b3463c55541ffa21937cda423517d31435683bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8qmgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:49Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:49 crc kubenswrapper[4732]: I0402 13:39:49.696098 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b2l5q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7185760-c057-4c47-8da2-60572500a472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76490beb8e54fc846562aaf130168383c45dfe45e9582e3d87c36c38a69b8fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bec3eb12ab864fa88c229cc24bf8ab853d0d
c249a18430163a45fbe02e09c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-b2l5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:49Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:49 crc kubenswrapper[4732]: I0402 13:39:49.710825 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1efc9e3-71c0-47e0-bcf8-4ec41e8e25af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf50344e79adec991160632c244c40543c468db050f176949b5840a16d1777a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e000359211728ef1d437757b0246d85c224dae170131ba8bd221fa84dafa026c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf6191436462297532ffdc93e43468bbe37e850115ddcbcd0265c07359ac845a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-04-02T13:39:23Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0402 13:39:23.616086 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0402 13:39:23.616554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0402 13:39:23.618421 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-69827246/tls.crt::/tmp/serving-cert-69827246/tls.key\\\\\\\"\\\\nI0402 13:39:23.816116 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0402 13:39:23.819962 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0402 13:39:23.820008 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0402 13:39:23.820046 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0402 13:39:23.820061 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0402 13:39:23.827938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0402 13:39:23.827970 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827984 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0402 13:39:23.827988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0402 13:39:23.828005 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0402 13:39:23.828009 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0402 13:39:23.828004 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0402 13:39:23.830934 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b46da8b2cda4252f897adfca92275c92924a998d5b1ad57e6897ddea29b4571\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:49Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:49 crc kubenswrapper[4732]: E0402 13:39:49.936502 4732 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Apr 02 13:39:50 crc kubenswrapper[4732]: I0402 13:39:50.679885 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:39:50 crc kubenswrapper[4732]: E0402 13:39:50.680052 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-crx2z" podUID="386bd92b-c67e-4cc6-8a47-6f8d6e799bc7" Apr 02 13:39:51 crc kubenswrapper[4732]: I0402 13:39:51.684771 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:39:51 crc kubenswrapper[4732]: I0402 13:39:51.684771 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:39:51 crc kubenswrapper[4732]: I0402 13:39:51.684867 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 13:39:51 crc kubenswrapper[4732]: E0402 13:39:51.685686 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Apr 02 13:39:51 crc kubenswrapper[4732]: E0402 13:39:51.685732 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Apr 02 13:39:51 crc kubenswrapper[4732]: E0402 13:39:51.685520 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Apr 02 13:39:52 crc kubenswrapper[4732]: I0402 13:39:52.679431 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:39:52 crc kubenswrapper[4732]: E0402 13:39:52.679647 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-crx2z" podUID="386bd92b-c67e-4cc6-8a47-6f8d6e799bc7" Apr 02 13:39:53 crc kubenswrapper[4732]: I0402 13:39:53.679212 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:39:53 crc kubenswrapper[4732]: I0402 13:39:53.679259 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 13:39:53 crc kubenswrapper[4732]: I0402 13:39:53.679330 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:39:53 crc kubenswrapper[4732]: E0402 13:39:53.680242 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Apr 02 13:39:53 crc kubenswrapper[4732]: E0402 13:39:53.680065 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Apr 02 13:39:53 crc kubenswrapper[4732]: E0402 13:39:53.680306 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Apr 02 13:39:54 crc kubenswrapper[4732]: I0402 13:39:54.680057 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:39:54 crc kubenswrapper[4732]: E0402 13:39:54.680283 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-crx2z" podUID="386bd92b-c67e-4cc6-8a47-6f8d6e799bc7" Apr 02 13:39:54 crc kubenswrapper[4732]: I0402 13:39:54.690939 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"afb9093a-f17e-45d5-887a-6b693bbae3f4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d431bea9c2c5dcec52d15f7c9d0806d50d74ece8f3ba59562a6f3fd59e50a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f1296
2a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ca33771abc8b4070c51fd6fcb9a5f88bf7097b87e6762d829b293bb628588a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ca33771abc8b4070c51fd6fcb9a5f88bf7097b87e6762d829b293bb628588a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:54Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:54 crc 
kubenswrapper[4732]: I0402 13:39:54.705188 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:54Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:54 crc kubenswrapper[4732]: I0402 13:39:54.720474 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37b84e7e2a23e3e100ff5bc979d4126bb3044a27f802ccc21c171516554dee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abfadcbcfff56cc9a6bf0c4068f03e943dea5db96705cd3fba2aea8cbd39111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:54Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:54 crc kubenswrapper[4732]: I0402 13:39:54.732424 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-glqvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57616858-4140-40f0-83e5-388787b685b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af55257a6baf7f9bcba13eb0d7aae49bd0671f182aeb11018802d4712df0ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqn9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-glqvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:54Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:54 crc kubenswrapper[4732]: I0402 13:39:54.753540 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nqhwm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658a9576-efdc-4e4b-937e-bd63032cbee6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b89438bcc57ca4588c00c0e9024fd7423eb6839c4c435c5c9bfca56d0fb1c33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de69c4d183706f6729301a7859d6c312fdedb379c0c9b4aac3d4061f89b82067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de69c4d183706f6729301a7859d6c312fdedb379c0c9b4aac3d4061f89b82067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d68d8de71d99
082a6cf12e05e279f97cf4b0eccd9defa756ac083fcb2c412945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d68d8de71d99082a6cf12e05e279f97cf4b0eccd9defa756ac083fcb2c412945\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d1db75c9d54c6d8d304e28faa91270a4d70096596100fb97d5a6ea74f7ad8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19d1db75c9d5
4c6d8d304e28faa91270a4d70096596100fb97d5a6ea74f7ad8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbddc88e12f323552ea813b6440e7770a844e187f892c11778a4a82241db38ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbddc88e12f323552ea813b6440e7770a844e187f892c11778a4a82241db38ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9df3b256d9a633666a3195e4fd0c9f409cfeb941932f46356e1a8e2e98d706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b9df3b256d9a633666a3195e4fd0c9f409cfeb941932f46356e1a8e2e98d706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b27d70a0f82b84c5ed47e2f3287c0faf9c449ed235ec78ef71fe769485f3e67f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b27d70a0f82b84c5ed47e2f3287c0faf9c449ed235ec78ef71fe769485f3e67f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"202
6-04-02T13:39:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nqhwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:54Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:54 crc kubenswrapper[4732]: I0402 13:39:54.769094 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:54Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:54 crc kubenswrapper[4732]: I0402 13:39:54.783679 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1efc9e3-71c0-47e0-bcf8-4ec41e8e25af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf50344e79adec991160632c244c40543c468db050f176949b5840a16d1777a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e000359211728ef1d437757b0246d85c224dae170131ba8bd221fa84dafa026c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf6191436462297532ffdc93e43468bbe37e850115ddcbcd0265c07359ac845a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-04-02T13:39:23Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0402 13:39:23.616086 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0402 13:39:23.616554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0402 13:39:23.618421 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-69827246/tls.crt::/tmp/serving-cert-69827246/tls.key\\\\\\\"\\\\nI0402 13:39:23.816116 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0402 13:39:23.819962 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0402 13:39:23.820008 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0402 13:39:23.820046 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0402 13:39:23.820061 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0402 13:39:23.827938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0402 13:39:23.827970 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827984 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0402 13:39:23.827988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0402 13:39:23.828005 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0402 13:39:23.828009 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0402 13:39:23.828004 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0402 13:39:23.830934 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b46da8b2cda4252f897adfca92275c92924a998d5b1ad57e6897ddea29b4571\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:54Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:54 crc kubenswrapper[4732]: I0402 13:39:54.798426 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://960aab148ced3ecc2648e0b866cb9330dc955e8ab123f3afb95d377e3efa5a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:54Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:54 crc kubenswrapper[4732]: I0402 13:39:54.811561 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38409e5e-4545-49da-8f6c-4bfb30582878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8b639c60ac006909c664208b4a5aa24df48983728ace3f6722514164e330a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09dd989931f88ecf396194a22057cfe8c07809db42ff1c2b861bf3aa5db1673f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6vtmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:54Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:54 crc kubenswrapper[4732]: I0402 13:39:54.830211 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s52gj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad206957-df5c-4b3e-bd35-e798a07d2f4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169383eed1546547fe6f683a3607c929733ee7049497b740db106a44ce7d1c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dssdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s52gj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:54Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:54 crc kubenswrapper[4732]: I0402 13:39:54.849699 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6f5e483-7d6b-4d6d-be84-303d8f07643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde53c6f70593ab376e01989eadf8b52b7010c91ca4be564e5e36a781f1333f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://975b73da57fe61d6945d87d2a8cb9502c67a6d8fe43a80722342271e2a251f39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f988d8b8bd1da89ec27da36eafa964c8d1b06874110b4791aeadc05cb297b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09f4ea085c7a422d9d9378925b89d18cd60661c399eb45f69d1e2d39c3aa2191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39950e10da9295ee76088f08ae9d4977f5ff9f8d8440188f777855f40fb8049e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0b635201003f7f1a0fc544667e8376b4283c202dde45398518f99b96ce2c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd694417e4511ced3c44daa2fd791011d9c9065df5dc3851f87ace5361529ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd694417e4511ced3c44daa2fd791011d9c9065df5dc3851f87ace5361529ae6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-04-02T13:39:48Z\\\",\\\"message\\\":\\\"F0402 13:39:48.262702 6864 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not 
added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:48Z is after 2025-08-24T17:21:41Z]\\\\nI0402 13:39:48.262028 6864 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0402 13:39:48.262716 6864 services_controller.go:443] Built service openshift-config-operator/metrics LB cluster-wide co\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8qmgp_openshift-ovn-kubernetes(a6f5e483-7d6b-4d6d-be84-303d8f07643e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b9f2f9d419baeac3be35c4c0c40c012b721c99c2f89fb2638c635c41c5e93d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0d12a454b3463c55541ffa21937cda423517d31435683bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0
d12a454b3463c55541ffa21937cda423517d31435683bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8qmgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:54Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:54 crc kubenswrapper[4732]: I0402 13:39:54.864052 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b2l5q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7185760-c057-4c47-8da2-60572500a472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76490beb8e54fc846562aaf130168383c45dfe45e9582e3d87c36c38a69b8fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bec3eb12ab864fa88c229cc24bf8ab853d0d
c249a18430163a45fbe02e09c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-b2l5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:54Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:54 crc kubenswrapper[4732]: I0402 13:39:54.879033 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-crx2z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"386bd92b-c67e-4cc6-8a47-6f8d6e799bc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-crx2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:54Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:54 crc 
kubenswrapper[4732]: I0402 13:39:54.893461 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:54Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:54 crc kubenswrapper[4732]: I0402 13:39:54.909033 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e388cf7666048a2a06ac7572afee8a8a73493dc8b315897b9118b66dfe72b69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-04-02T13:39:54Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:54 crc kubenswrapper[4732]: I0402 13:39:54.923687 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tg9vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fcb1965-4cef-41a4-8894-3eb24e0ff80c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e38eea7442762874b0db44fa8fcd1ffc4cbe0d149d992139a1fc0bc92dc386d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-w9vgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tg9vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:54Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:54 crc kubenswrapper[4732]: E0402 13:39:54.937796 4732 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Apr 02 13:39:55 crc kubenswrapper[4732]: I0402 13:39:55.648726 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:39:55 crc kubenswrapper[4732]: I0402 13:39:55.648793 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:39:55 crc kubenswrapper[4732]: I0402 13:39:55.648817 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:39:55 crc kubenswrapper[4732]: I0402 13:39:55.648845 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 02 13:39:55 crc kubenswrapper[4732]: I0402 13:39:55.648862 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-02T13:39:55Z","lastTransitionTime":"2026-04-02T13:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 02 13:39:55 crc kubenswrapper[4732]: E0402 13:39:55.670435 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69443537-6792-4af0-92ca-9d3f256e3009\\\",\\\"systemUUID\\\":\\\"3be5867b-5df6-4c65-8d4b-c54c471927ff\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:55Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:55 crc kubenswrapper[4732]: I0402 13:39:55.676175 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:39:55 crc kubenswrapper[4732]: I0402 13:39:55.676243 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:39:55 crc kubenswrapper[4732]: I0402 13:39:55.676269 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:39:55 crc kubenswrapper[4732]: I0402 13:39:55.676303 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 02 13:39:55 crc kubenswrapper[4732]: I0402 13:39:55.676385 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-02T13:39:55Z","lastTransitionTime":"2026-04-02T13:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Apr 02 13:39:55 crc kubenswrapper[4732]: I0402 13:39:55.679557 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:39:55 crc kubenswrapper[4732]: I0402 13:39:55.679669 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:39:55 crc kubenswrapper[4732]: I0402 13:39:55.679573 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 13:39:55 crc kubenswrapper[4732]: E0402 13:39:55.679807 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Apr 02 13:39:55 crc kubenswrapper[4732]: E0402 13:39:55.679894 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Apr 02 13:39:55 crc kubenswrapper[4732]: E0402 13:39:55.680206 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Apr 02 13:39:55 crc kubenswrapper[4732]: E0402 13:39:55.699067 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69443537-6792-4af0-92ca-9d3f256e3009\\\",\\\"systemUUID\\\":\\\"3be5867b-5df6-4c65-8d4b-c54c471927ff\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:55Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:55 crc kubenswrapper[4732]: I0402 13:39:55.703742 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:39:55 crc kubenswrapper[4732]: I0402 13:39:55.703821 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:39:55 crc kubenswrapper[4732]: I0402 13:39:55.703848 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:39:55 crc kubenswrapper[4732]: I0402 13:39:55.703885 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 02 13:39:55 crc kubenswrapper[4732]: I0402 13:39:55.703908 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-02T13:39:55Z","lastTransitionTime":"2026-04-02T13:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 02 13:39:55 crc kubenswrapper[4732]: E0402 13:39:55.720303 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69443537-6792-4af0-92ca-9d3f256e3009\\\",\\\"systemUUID\\\":\\\"3be5867b-5df6-4c65-8d4b-c54c471927ff\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:55Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:55 crc kubenswrapper[4732]: I0402 13:39:55.725150 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:39:55 crc kubenswrapper[4732]: I0402 13:39:55.725211 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:39:55 crc kubenswrapper[4732]: I0402 13:39:55.725231 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:39:55 crc kubenswrapper[4732]: I0402 13:39:55.725256 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 02 13:39:55 crc kubenswrapper[4732]: I0402 13:39:55.725277 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-02T13:39:55Z","lastTransitionTime":"2026-04-02T13:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 02 13:39:55 crc kubenswrapper[4732]: E0402 13:39:55.744956 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69443537-6792-4af0-92ca-9d3f256e3009\\\",\\\"systemUUID\\\":\\\"3be5867b-5df6-4c65-8d4b-c54c471927ff\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:55Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:55 crc kubenswrapper[4732]: I0402 13:39:55.748958 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:39:55 crc kubenswrapper[4732]: I0402 13:39:55.749118 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:39:55 crc kubenswrapper[4732]: I0402 13:39:55.749214 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:39:55 crc kubenswrapper[4732]: I0402 13:39:55.749288 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 02 13:39:55 crc kubenswrapper[4732]: I0402 13:39:55.749354 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-02T13:39:55Z","lastTransitionTime":"2026-04-02T13:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 02 13:39:55 crc kubenswrapper[4732]: E0402 13:39:55.763400 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:39:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69443537-6792-4af0-92ca-9d3f256e3009\\\",\\\"systemUUID\\\":\\\"3be5867b-5df6-4c65-8d4b-c54c471927ff\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:55Z is after 2025-08-24T17:21:41Z" Apr 02 13:39:55 crc kubenswrapper[4732]: E0402 13:39:55.763549 4732 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Apr 02 13:39:56 crc kubenswrapper[4732]: I0402 13:39:56.679976 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:39:56 crc kubenswrapper[4732]: E0402 13:39:56.680197 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-crx2z" podUID="386bd92b-c67e-4cc6-8a47-6f8d6e799bc7" Apr 02 13:39:57 crc kubenswrapper[4732]: I0402 13:39:57.679591 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 13:39:57 crc kubenswrapper[4732]: I0402 13:39:57.679725 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:39:57 crc kubenswrapper[4732]: E0402 13:39:57.679765 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Apr 02 13:39:57 crc kubenswrapper[4732]: I0402 13:39:57.679913 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:39:57 crc kubenswrapper[4732]: E0402 13:39:57.679975 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Apr 02 13:39:57 crc kubenswrapper[4732]: I0402 13:39:57.680542 4732 scope.go:117] "RemoveContainer" containerID="5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a" Apr 02 13:39:57 crc kubenswrapper[4732]: E0402 13:39:57.680858 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Apr 02 13:39:57 crc kubenswrapper[4732]: E0402 13:39:57.680837 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Apr 02 13:39:58 crc kubenswrapper[4732]: I0402 13:39:58.679194 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:39:58 crc kubenswrapper[4732]: E0402 13:39:58.679374 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-crx2z" podUID="386bd92b-c67e-4cc6-8a47-6f8d6e799bc7" Apr 02 13:39:59 crc kubenswrapper[4732]: I0402 13:39:59.679321 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:39:59 crc kubenswrapper[4732]: I0402 13:39:59.679427 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 13:39:59 crc kubenswrapper[4732]: E0402 13:39:59.679540 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Apr 02 13:39:59 crc kubenswrapper[4732]: E0402 13:39:59.679647 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Apr 02 13:39:59 crc kubenswrapper[4732]: I0402 13:39:59.679729 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:39:59 crc kubenswrapper[4732]: E0402 13:39:59.679784 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Apr 02 13:39:59 crc kubenswrapper[4732]: E0402 13:39:59.939831 4732 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Apr 02 13:40:00 crc kubenswrapper[4732]: I0402 13:40:00.679217 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:40:00 crc kubenswrapper[4732]: E0402 13:40:00.679372 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-crx2z" podUID="386bd92b-c67e-4cc6-8a47-6f8d6e799bc7" Apr 02 13:40:00 crc kubenswrapper[4732]: I0402 13:40:00.680180 4732 scope.go:117] "RemoveContainer" containerID="cd694417e4511ced3c44daa2fd791011d9c9065df5dc3851f87ace5361529ae6" Apr 02 13:40:01 crc kubenswrapper[4732]: I0402 13:40:01.537822 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8qmgp_a6f5e483-7d6b-4d6d-be84-303d8f07643e/ovnkube-controller/1.log" Apr 02 13:40:01 crc kubenswrapper[4732]: I0402 13:40:01.540836 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" event={"ID":"a6f5e483-7d6b-4d6d-be84-303d8f07643e","Type":"ContainerStarted","Data":"1d7701a1317a8ea04ada7ef0746fa41d271b302277504bd0ffdebd0b14db5a04"} Apr 02 13:40:01 crc kubenswrapper[4732]: I0402 13:40:01.541305 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" Apr 02 13:40:01 crc kubenswrapper[4732]: I0402 13:40:01.567876 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:01Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:01 crc kubenswrapper[4732]: I0402 13:40:01.584537 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37b84e7e2a23e3e100ff5bc979d4126bb3044a27f802ccc21c171516554dee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abfadcbcfff56cc9a6bf0c4068f03e943dea5db96705cd3fba2aea8cbd39111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:01Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:01 crc kubenswrapper[4732]: I0402 13:40:01.595363 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-glqvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57616858-4140-40f0-83e5-388787b685b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af55257a6baf7f9bcba13eb0d7aae49bd0671f182aeb11018802d4712df0ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqn9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-glqvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:01Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:01 crc kubenswrapper[4732]: I0402 13:40:01.608546 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nqhwm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658a9576-efdc-4e4b-937e-bd63032cbee6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b89438bcc57ca4588c00c0e9024fd7423eb6839c4c435c5c9bfca56d0fb1c33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de69c4d183706f6729301a7859d6c312fdedb379c0c9b4aac3d4061f89b82067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de69c4d183706f6729301a7859d6c312fdedb379c0c9b4aac3d4061f89b82067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d68d8de71d99
082a6cf12e05e279f97cf4b0eccd9defa756ac083fcb2c412945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d68d8de71d99082a6cf12e05e279f97cf4b0eccd9defa756ac083fcb2c412945\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d1db75c9d54c6d8d304e28faa91270a4d70096596100fb97d5a6ea74f7ad8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19d1db75c9d5
4c6d8d304e28faa91270a4d70096596100fb97d5a6ea74f7ad8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbddc88e12f323552ea813b6440e7770a844e187f892c11778a4a82241db38ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbddc88e12f323552ea813b6440e7770a844e187f892c11778a4a82241db38ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9df3b256d9a633666a3195e4fd0c9f409cfeb941932f46356e1a8e2e98d706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b9df3b256d9a633666a3195e4fd0c9f409cfeb941932f46356e1a8e2e98d706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b27d70a0f82b84c5ed47e2f3287c0faf9c449ed235ec78ef71fe769485f3e67f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b27d70a0f82b84c5ed47e2f3287c0faf9c449ed235ec78ef71fe769485f3e67f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"202
6-04-02T13:39:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nqhwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:01Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:01 crc kubenswrapper[4732]: I0402 13:40:01.619731 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:01Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:01 crc kubenswrapper[4732]: I0402 13:40:01.629859 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"afb9093a-f17e-45d5-887a-6b693bbae3f4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d431bea9c2c5dcec52d15f7c9d0806d50d74ece8f3ba59562a6f3fd59e50a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ca33771abc8b4070c51fd6fcb9a5f88bf7097b87e6762d829b293bb628588a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ca33771abc8b4070c51fd6fcb9a5f88bf7097b87e6762d829b293bb628588a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:01Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:01 crc kubenswrapper[4732]: I0402 13:40:01.643855 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://960aab148ced3ecc2648e0b866cb9330dc955e8ab123f3afb95d377e3efa5a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:01Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:01 crc kubenswrapper[4732]: I0402 13:40:01.657297 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38409e5e-4545-49da-8f6c-4bfb30582878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8b639c60ac006909c664208b4a5aa24df48983728ace3f6722514164e330a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09dd989931f88ecf396194a22057cfe8c07809db42ff1c2b861bf3aa5db1673f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6vtmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:01Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:01 crc kubenswrapper[4732]: I0402 13:40:01.670835 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s52gj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad206957-df5c-4b3e-bd35-e798a07d2f4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169383eed1546547fe6f683a3607c929733ee7049497b740db106a44ce7d1c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dssdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s52gj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:01Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:01 crc kubenswrapper[4732]: I0402 13:40:01.679894 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:40:01 crc kubenswrapper[4732]: I0402 13:40:01.679931 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 13:40:01 crc kubenswrapper[4732]: I0402 13:40:01.679962 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:40:01 crc kubenswrapper[4732]: E0402 13:40:01.680006 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Apr 02 13:40:01 crc kubenswrapper[4732]: E0402 13:40:01.680143 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Apr 02 13:40:01 crc kubenswrapper[4732]: E0402 13:40:01.680213 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Apr 02 13:40:01 crc kubenswrapper[4732]: I0402 13:40:01.688112 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6f5e483-7d6b-4d6d-be84-303d8f07643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde53c6f70593ab376e01989eadf8b52b7010c91ca4be564e5e36a781f1333f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://975b73da57fe61d6945d87d2a8cb9502c67a6d8fe43a80722342271e2a251f39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f988d8b8bd1da89ec27da36eafa964c8d1b06874110b4791aeadc05cb297b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09f4ea085c7a422d9d9378925b89d18cd60661c399eb45f69d1e2d39c3aa2191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39950e10da9295ee76088f08ae9d4977f5ff9f8d8440188f777855f40fb8049e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0b635201003f7f1a0fc544667e8376b4283c202dde45398518f99b96ce2c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d7701a1317a8ea04ada7ef0746fa41d271b302277504bd0ffdebd0b14db5a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd694417e4511ced3c44daa2fd791011d9c9065df5dc3851f87ace5361529ae6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-04-02T13:39:48Z\\\",\\\"message\\\":\\\"F0402 13:39:48.262702 6864 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not 
added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:48Z is after 2025-08-24T17:21:41Z]\\\\nI0402 13:39:48.262028 6864 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0402 13:39:48.262716 6864 services_controller.go:443] Built service openshift-config-operator/metrics LB cluster-wide 
co\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:40:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b9f2f9d419baeac3be35c4c0c40c012b721c99c2f89fb2638c635c41c5e93d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0d12a454b3463c55541ffa21937cda423517d31435683bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0d12a454b3463c55541ffa21937cda423517d31435683bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8qmgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:01Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:01 crc kubenswrapper[4732]: I0402 13:40:01.697560 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b2l5q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7185760-c057-4c47-8da2-60572500a472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76490beb8e54fc846562aaf130168383c45dfe45e9582e3d87c36c38a69b8fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bec3eb12ab864fa88c229cc24bf8ab853d0d
c249a18430163a45fbe02e09c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-b2l5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:01Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:01 crc kubenswrapper[4732]: I0402 13:40:01.709764 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1efc9e3-71c0-47e0-bcf8-4ec41e8e25af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf50344e79adec991160632c244c40543c468db050f176949b5840a16d1777a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e000359211728ef1d437757b0246d85c224dae170131ba8bd221fa84dafa026c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf6191436462297532ffdc93e43468bbe37e850115ddcbcd0265c07359ac845a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-04-02T13:39:23Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0402 13:39:23.616086 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0402 13:39:23.616554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0402 13:39:23.618421 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-69827246/tls.crt::/tmp/serving-cert-69827246/tls.key\\\\\\\"\\\\nI0402 13:39:23.816116 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0402 13:39:23.819962 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0402 13:39:23.820008 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0402 13:39:23.820046 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0402 13:39:23.820061 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0402 13:39:23.827938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0402 13:39:23.827970 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827984 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0402 13:39:23.827988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0402 13:39:23.828005 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0402 13:39:23.828009 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0402 13:39:23.828004 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0402 13:39:23.830934 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b46da8b2cda4252f897adfca92275c92924a998d5b1ad57e6897ddea29b4571\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:01Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:01 crc kubenswrapper[4732]: I0402 13:40:01.719993 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-crx2z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"386bd92b-c67e-4cc6-8a47-6f8d6e799bc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-crx2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:01Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:01 crc 
kubenswrapper[4732]: I0402 13:40:01.732713 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:01Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:01 crc kubenswrapper[4732]: I0402 13:40:01.744693 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e388cf7666048a2a06ac7572afee8a8a73493dc8b315897b9118b66dfe72b69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-04-02T13:40:01Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:01 crc kubenswrapper[4732]: I0402 13:40:01.753603 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tg9vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fcb1965-4cef-41a4-8894-3eb24e0ff80c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e38eea7442762874b0db44fa8fcd1ffc4cbe0d149d992139a1fc0bc92dc386d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-w9vgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tg9vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:01Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:02 crc kubenswrapper[4732]: I0402 13:40:02.549104 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8qmgp_a6f5e483-7d6b-4d6d-be84-303d8f07643e/ovnkube-controller/2.log" Apr 02 13:40:02 crc kubenswrapper[4732]: I0402 13:40:02.550222 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8qmgp_a6f5e483-7d6b-4d6d-be84-303d8f07643e/ovnkube-controller/1.log" Apr 02 13:40:02 crc kubenswrapper[4732]: I0402 13:40:02.552904 4732 generic.go:334] "Generic (PLEG): container finished" podID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" containerID="1d7701a1317a8ea04ada7ef0746fa41d271b302277504bd0ffdebd0b14db5a04" exitCode=1 Apr 02 13:40:02 crc kubenswrapper[4732]: I0402 13:40:02.552948 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" event={"ID":"a6f5e483-7d6b-4d6d-be84-303d8f07643e","Type":"ContainerDied","Data":"1d7701a1317a8ea04ada7ef0746fa41d271b302277504bd0ffdebd0b14db5a04"} Apr 02 13:40:02 crc kubenswrapper[4732]: I0402 13:40:02.552999 4732 scope.go:117] "RemoveContainer" containerID="cd694417e4511ced3c44daa2fd791011d9c9065df5dc3851f87ace5361529ae6" Apr 02 13:40:02 crc 
kubenswrapper[4732]: I0402 13:40:02.553890 4732 scope.go:117] "RemoveContainer" containerID="1d7701a1317a8ea04ada7ef0746fa41d271b302277504bd0ffdebd0b14db5a04" Apr 02 13:40:02 crc kubenswrapper[4732]: E0402 13:40:02.554189 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-8qmgp_openshift-ovn-kubernetes(a6f5e483-7d6b-4d6d-be84-303d8f07643e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" podUID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" Apr 02 13:40:02 crc kubenswrapper[4732]: I0402 13:40:02.577914 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s52gj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad206957-df5c-4b3e-bd35-e798a07d2f4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169383eed1546547fe6f683a3607c929733ee7049497b740db106a44ce7d1c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b2
6\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dssdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.1
68.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s52gj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:02Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:02 crc kubenswrapper[4732]: I0402 13:40:02.599900 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6f5e483-7d6b-4d6d-be84-303d8f07643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde53c6f70593ab376e01989eadf8b52b7010c91ca4be564e5e36a781f1333f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://975b73da57fe61d6945d87d2a8cb9502c67a6d8fe43a80722342271e2a251f39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f988d8b8bd1da89ec27da36eafa964c8d1b06874110b4791aeadc05cb297b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09f4ea085c7a422d9d9378925b89d18cd60661c399eb45f69d1e2d39c3aa2191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39950e10da9295ee76088f08ae9d4977f5ff9f8d8440188f777855f40fb8049e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0b635201003f7f1a0fc544667e8376b4283c202dde45398518f99b96ce2c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d7701a1317a8ea04ada7ef0746fa41d271b302277504bd0ffdebd0b14db5a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd694417e4511ced3c44daa2fd791011d9c9065df5dc3851f87ace5361529ae6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-04-02T13:39:48Z\\\",\\\"message\\\":\\\"F0402 13:39:48.262702 6864 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not 
added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:39:48Z is after 2025-08-24T17:21:41Z]\\\\nI0402 13:39:48.262028 6864 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0402 13:39:48.262716 6864 services_controller.go:443] Built service openshift-config-operator/metrics LB cluster-wide co\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d7701a1317a8ea04ada7ef0746fa41d271b302277504bd0ffdebd0b14db5a04\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-04-02T13:40:01Z\\\",\\\"message\\\":\\\"registry/node-ca-glqvf\\\\nI0402 13:40:01.774154 7045 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-glqvf\\\\nI0402 13:40:01.774162 7045 ovn.go:134] Ensuring zone local for Pod openshift-image-registry/node-ca-glqvf in node crc\\\\nI0402 13:40:01.774167 7045 obj_retry.go:386] Retry successful for *v1.Pod openshift-image-registry/node-ca-glqvf after 0 failed attempt(s)\\\\nF0402 13:40:01.774184 7045 ovnkube.go:137] failed to run ovnkube: [failed 
to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:01Z is after 2025-08-24T17:21:41Z]\\\\nI0402 13:40:01.774196 7045 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-8q\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:40:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\"
:\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b9f2f9d419baeac3be35c4c0c40c012b721c99c2f89fb2638c635c41c5e93d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0d12a454b3463c55541ffa21937cda423517d31435683bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0d12a454b3463c55541ffa21937cda423517d31435683bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8qmgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:02Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:02 crc kubenswrapper[4732]: I0402 13:40:02.613707 4732 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b2l5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7185760-c057-4c47-8da2-60572500a472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76490beb8e54fc846562aaf130168383c45dfe45e9582e3d87c36c38a69b8fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":
true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bec3eb12ab864fa88c229cc24bf8ab853d0dc249a18430163a45fbe02e09c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-b2l5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:02Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:02 crc kubenswrapper[4732]: I0402 13:40:02.629374 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1efc9e3-71c0-47e0-bcf8-4ec41e8e25af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf50344e79adec991160632c244c40543c468db050f176949b5840a16d1777a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e000359211728ef1d437757b0246d85c224dae170131ba8bd221fa84dafa026c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf6191436462297532ffdc93e43468bbe37e850115ddcbcd0265c07359ac845a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-04-02T13:39:23Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0402 13:39:23.616086 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0402 13:39:23.616554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0402 13:39:23.618421 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-69827246/tls.crt::/tmp/serving-cert-69827246/tls.key\\\\\\\"\\\\nI0402 13:39:23.816116 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0402 13:39:23.819962 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0402 13:39:23.820008 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0402 13:39:23.820046 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0402 13:39:23.820061 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0402 13:39:23.827938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0402 13:39:23.827970 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827984 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0402 13:39:23.827988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0402 13:39:23.828005 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0402 13:39:23.828009 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0402 13:39:23.828004 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0402 13:39:23.830934 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b46da8b2cda4252f897adfca92275c92924a998d5b1ad57e6897ddea29b4571\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:02Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:02 crc kubenswrapper[4732]: I0402 13:40:02.643416 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://960aab148ced3ecc2648e0b866cb9330dc955e8ab123f3afb95d377e3efa5a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:02Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:02 crc kubenswrapper[4732]: I0402 13:40:02.655894 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38409e5e-4545-49da-8f6c-4bfb30582878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8b639c60ac006909c664208b4a5aa24df48983728ace3f6722514164e330a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09dd989931f88ecf396194a22057cfe8c07809db42ff1c2b861bf3aa5db1673f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6vtmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:02Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:02 crc kubenswrapper[4732]: I0402 13:40:02.668218 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:02Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:02 crc kubenswrapper[4732]: I0402 13:40:02.680125 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:40:02 crc kubenswrapper[4732]: E0402 13:40:02.680575 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-crx2z" podUID="386bd92b-c67e-4cc6-8a47-6f8d6e799bc7" Apr 02 13:40:02 crc kubenswrapper[4732]: I0402 13:40:02.681471 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e388cf7666048a2a06ac7572afee8a8a73493dc8b315897b9118b66dfe72b69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:02Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:02 crc kubenswrapper[4732]: I0402 13:40:02.692189 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tg9vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fcb1965-4cef-41a4-8894-3eb24e0ff80c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e38eea7442762874b0db44fa8fcd1ffc4cbe0d149d992139a1fc0bc92dc386d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9vgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tg9vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:02Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:02 crc kubenswrapper[4732]: I0402 13:40:02.701256 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-crx2z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"386bd92b-c67e-4cc6-8a47-6f8d6e799bc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-crx2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:02Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:02 crc 
kubenswrapper[4732]: I0402 13:40:02.711496 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-glqvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57616858-4140-40f0-83e5-388787b685b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af55257a6baf7f9bcba13eb0d7aae49bd0671f182aeb11018802d4712df0ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h
qn9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-glqvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:02Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:02 crc kubenswrapper[4732]: I0402 13:40:02.726809 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nqhwm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658a9576-efdc-4e4b-937e-bd63032cbee6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b89438bcc57ca4588c00c0e9024fd7423eb6839c4c435c5c9bfca56d0fb1c33e\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de69c4d183706f6729301a7859d6c312fdedb379c0c9b4aac3d4061f89b82067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de69c4d183706f6729301a7859d6c312fdedb379c0c9b4aac3d4061f89b82067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d68d8de71d99082a6cf12e05e279f97cf4b0eccd9defa756ac083fcb2c412945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d68d8de71d99082a6cf12e05e279f97cf4b0eccd9defa756ac083fcb2c412945\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d1db75c9d54c6d8d304e28faa91270a4d70096596100fb97d5a6ea74f7ad8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19d1db75c9d54c6d8d304e28faa91270a4d70096596100fb97d5a6ea74f7ad8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbddc88e12f323552ea813b6440e7770a844e187f892c11778a4a82241db38ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbddc88e12f323552ea813b6440e7770a844e187f892c11778a4a82241db38ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9df3b256d9a633666a3195e4fd0c9f409cfeb941932f46356e1a8e2e98d706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b9df3b256d9a633666a3195e4fd0c9f409cfeb941932f46356e1a8e2e98d706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b27d70a0f82b84c5ed47e2f3287c0faf9c449ed235ec78ef71fe769485f3e67f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\
"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b27d70a0f82b84c5ed47e2f3287c0faf9c449ed235ec78ef71fe769485f3e67f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nqhwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:02Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:02 crc kubenswrapper[4732]: I0402 13:40:02.740350 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:02Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:02 crc kubenswrapper[4732]: I0402 13:40:02.750560 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"afb9093a-f17e-45d5-887a-6b693bbae3f4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d431bea9c2c5dcec52d15f7c9d0806d50d74ece8f3ba59562a6f3fd59e50a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ca33771abc8b4070c51fd6fcb9a5f88bf7097b87e6762d829b293bb628588a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ca33771abc8b4070c51fd6fcb9a5f88bf7097b87e6762d829b293bb628588a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:02Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:02 crc kubenswrapper[4732]: I0402 13:40:02.762342 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:02Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:02 crc kubenswrapper[4732]: I0402 13:40:02.774568 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37b84e7e2a23e3e100ff5bc979d4126bb3044a27f802ccc21c171516554dee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abfadcbcfff56cc9a6bf0c4068f03e943dea5db96705cd3fba2aea8cbd39111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:02Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:03 crc kubenswrapper[4732]: I0402 13:40:03.475329 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 02 13:40:03 crc kubenswrapper[4732]: I0402 13:40:03.475484 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:40:03 crc kubenswrapper[4732]: I0402 13:40:03.475522 4732 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:40:03 crc kubenswrapper[4732]: I0402 13:40:03.475557 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:40:03 crc kubenswrapper[4732]: I0402 13:40:03.475599 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 13:40:03 crc kubenswrapper[4732]: E0402 13:40:03.475688 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-02 13:40:35.475646464 +0000 UTC m=+192.380054057 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:40:03 crc kubenswrapper[4732]: E0402 13:40:03.475776 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 02 13:40:03 crc kubenswrapper[4732]: E0402 13:40:03.475779 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 02 13:40:03 crc kubenswrapper[4732]: E0402 13:40:03.475822 4732 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Apr 02 13:40:03 crc kubenswrapper[4732]: E0402 13:40:03.475832 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 02 13:40:03 crc kubenswrapper[4732]: E0402 13:40:03.475858 4732 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 02 13:40:03 crc kubenswrapper[4732]: E0402 13:40:03.475869 4732 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" 
not registered Apr 02 13:40:03 crc kubenswrapper[4732]: E0402 13:40:03.475888 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-04-02 13:40:35.47587067 +0000 UTC m=+192.380278263 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Apr 02 13:40:03 crc kubenswrapper[4732]: E0402 13:40:03.475801 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 02 13:40:03 crc kubenswrapper[4732]: E0402 13:40:03.475933 4732 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 02 13:40:03 crc kubenswrapper[4732]: E0402 13:40:03.475935 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-04-02 13:40:35.475914491 +0000 UTC m=+192.380322074 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Apr 02 13:40:03 crc kubenswrapper[4732]: E0402 13:40:03.475968 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-04-02 13:40:35.475955312 +0000 UTC m=+192.380362905 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 02 13:40:03 crc kubenswrapper[4732]: E0402 13:40:03.475987 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-04-02 13:40:35.475977633 +0000 UTC m=+192.380385226 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 02 13:40:03 crc kubenswrapper[4732]: I0402 13:40:03.559066 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8qmgp_a6f5e483-7d6b-4d6d-be84-303d8f07643e/ovnkube-controller/2.log" Apr 02 13:40:03 crc kubenswrapper[4732]: I0402 13:40:03.564032 4732 scope.go:117] "RemoveContainer" containerID="1d7701a1317a8ea04ada7ef0746fa41d271b302277504bd0ffdebd0b14db5a04" Apr 02 13:40:03 crc kubenswrapper[4732]: E0402 13:40:03.564350 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-8qmgp_openshift-ovn-kubernetes(a6f5e483-7d6b-4d6d-be84-303d8f07643e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" podUID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" Apr 02 13:40:03 crc kubenswrapper[4732]: I0402 13:40:03.577016 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/386bd92b-c67e-4cc6-8a47-6f8d6e799bc7-metrics-certs\") pod \"network-metrics-daemon-crx2z\" (UID: \"386bd92b-c67e-4cc6-8a47-6f8d6e799bc7\") " pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:40:03 crc kubenswrapper[4732]: E0402 13:40:03.577241 4732 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 02 13:40:03 crc kubenswrapper[4732]: E0402 13:40:03.577347 4732 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/386bd92b-c67e-4cc6-8a47-6f8d6e799bc7-metrics-certs podName:386bd92b-c67e-4cc6-8a47-6f8d6e799bc7 nodeName:}" failed. No retries permitted until 2026-04-02 13:40:35.577313861 +0000 UTC m=+192.481721444 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/386bd92b-c67e-4cc6-8a47-6f8d6e799bc7-metrics-certs") pod "network-metrics-daemon-crx2z" (UID: "386bd92b-c67e-4cc6-8a47-6f8d6e799bc7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 02 13:40:03 crc kubenswrapper[4732]: I0402 13:40:03.581982 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tg9vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fcb1965-4cef-41a4-8894-3eb24e0ff80c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e38eea7442762874b0db44fa8fcd1ffc4cbe0d149d992139a1fc0bc92dc386d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9vgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tg9vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:03Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:03 crc kubenswrapper[4732]: I0402 13:40:03.598233 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-crx2z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"386bd92b-c67e-4cc6-8a47-6f8d6e799bc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-crx2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:03Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:03 crc 
kubenswrapper[4732]: I0402 13:40:03.613577 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:03Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:03 crc kubenswrapper[4732]: I0402 13:40:03.632663 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e388cf7666048a2a06ac7572afee8a8a73493dc8b315897b9118b66dfe72b69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-04-02T13:40:03Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:03 crc kubenswrapper[4732]: I0402 13:40:03.650182 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:03Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:03 crc kubenswrapper[4732]: I0402 13:40:03.665547 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"afb9093a-f17e-45d5-887a-6b693bbae3f4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d431bea9c2c5dcec52d15f7c9d0806d50d74ece8f3ba59562a6f3fd59e50a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ca33771abc8b4070c51fd6fcb9a5f88bf7097b87e6762d829b293bb628588a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ca33771abc8b4070c51fd6fcb9a5f88bf7097b87e6762d829b293bb628588a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:03Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:03 crc kubenswrapper[4732]: I0402 13:40:03.679113 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:40:03 crc kubenswrapper[4732]: I0402 13:40:03.679201 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 13:40:03 crc kubenswrapper[4732]: E0402 13:40:03.679258 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Apr 02 13:40:03 crc kubenswrapper[4732]: I0402 13:40:03.679125 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:40:03 crc kubenswrapper[4732]: E0402 13:40:03.679791 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Apr 02 13:40:03 crc kubenswrapper[4732]: E0402 13:40:03.680071 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Apr 02 13:40:03 crc kubenswrapper[4732]: I0402 13:40:03.687141 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:03Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:03 crc kubenswrapper[4732]: I0402 13:40:03.703158 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37b84e7e2a23e3e100ff5bc979d4126bb3044a27f802ccc21c171516554dee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abfadcbcfff56cc9a6bf0c4068f03e943dea5db96705cd3fba2aea8cbd39111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:03Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:03 crc kubenswrapper[4732]: I0402 13:40:03.713704 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-glqvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57616858-4140-40f0-83e5-388787b685b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af55257a6baf7f9bcba13eb0d7aae49bd0671f182aeb11018802d4712df0ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqn9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-glqvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:03Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:03 crc kubenswrapper[4732]: I0402 13:40:03.730108 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nqhwm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658a9576-efdc-4e4b-937e-bd63032cbee6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b89438bcc57ca4588c00c0e9024fd7423eb6839c4c435c5c9bfca56d0fb1c33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de69c4d183706f6729301a7859d6c312fdedb379c0c9b4aac3d4061f89b82067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de69c4d183706f6729301a7859d6c312fdedb379c0c9b4aac3d4061f89b82067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d68d8de71d99
082a6cf12e05e279f97cf4b0eccd9defa756ac083fcb2c412945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d68d8de71d99082a6cf12e05e279f97cf4b0eccd9defa756ac083fcb2c412945\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d1db75c9d54c6d8d304e28faa91270a4d70096596100fb97d5a6ea74f7ad8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19d1db75c9d5
4c6d8d304e28faa91270a4d70096596100fb97d5a6ea74f7ad8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbddc88e12f323552ea813b6440e7770a844e187f892c11778a4a82241db38ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbddc88e12f323552ea813b6440e7770a844e187f892c11778a4a82241db38ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9df3b256d9a633666a3195e4fd0c9f409cfeb941932f46356e1a8e2e98d706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b9df3b256d9a633666a3195e4fd0c9f409cfeb941932f46356e1a8e2e98d706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b27d70a0f82b84c5ed47e2f3287c0faf9c449ed235ec78ef71fe769485f3e67f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b27d70a0f82b84c5ed47e2f3287c0faf9c449ed235ec78ef71fe769485f3e67f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"202
6-04-02T13:39:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nqhwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:03Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:03 crc kubenswrapper[4732]: I0402 13:40:03.744114 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b2l5q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7185760-c057-4c47-8da2-60572500a472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76490beb8e54fc846562aaf130168383c45dfe45e9582e3d87c36c38a69b8fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bec3eb12ab864fa88c229cc24bf8ab853d0d
c249a18430163a45fbe02e09c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-b2l5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:03Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:03 crc kubenswrapper[4732]: I0402 13:40:03.759757 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1efc9e3-71c0-47e0-bcf8-4ec41e8e25af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf50344e79adec991160632c244c40543c468db050f176949b5840a16d1777a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e000359211728ef1d437757b0246d85c224dae170131ba8bd221fa84dafa026c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf6191436462297532ffdc93e43468bbe37e850115ddcbcd0265c07359ac845a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-04-02T13:39:23Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0402 13:39:23.616086 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0402 13:39:23.616554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0402 13:39:23.618421 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-69827246/tls.crt::/tmp/serving-cert-69827246/tls.key\\\\\\\"\\\\nI0402 13:39:23.816116 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0402 13:39:23.819962 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0402 13:39:23.820008 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0402 13:39:23.820046 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0402 13:39:23.820061 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0402 13:39:23.827938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0402 13:39:23.827970 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827984 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0402 13:39:23.827988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0402 13:39:23.828005 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0402 13:39:23.828009 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0402 13:39:23.828004 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0402 13:39:23.830934 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b46da8b2cda4252f897adfca92275c92924a998d5b1ad57e6897ddea29b4571\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:03Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:03 crc kubenswrapper[4732]: I0402 13:40:03.776218 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://960aab148ced3ecc2648e0b866cb9330dc955e8ab123f3afb95d377e3efa5a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:03Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:03 crc kubenswrapper[4732]: I0402 13:40:03.790558 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38409e5e-4545-49da-8f6c-4bfb30582878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8b639c60ac006909c664208b4a5aa24df48983728ace3f6722514164e330a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09dd989931f88ecf396194a22057cfe8c07809db42ff1c2b861bf3aa5db1673f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6vtmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:03Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:03 crc kubenswrapper[4732]: I0402 13:40:03.804771 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s52gj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad206957-df5c-4b3e-bd35-e798a07d2f4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169383eed1546547fe6f683a3607c929733ee7049497b740db106a44ce7d1c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dssdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s52gj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:03Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:03 crc kubenswrapper[4732]: I0402 13:40:03.824979 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6f5e483-7d6b-4d6d-be84-303d8f07643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde53c6f70593ab376e01989eadf8b52b7010c91ca4be564e5e36a781f1333f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://975b73da57fe61d6945d87d2a8cb9502c67a6d8fe43a80722342271e2a251f39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f988d8b8bd1da89ec27da36eafa964c8d1b06874110b4791aeadc05cb297b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09f4ea085c7a422d9d9378925b89d18cd60661c399eb45f69d1e2d39c3aa2191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39950e10da9295ee76088f08ae9d4977f5ff9f8d8440188f777855f40fb8049e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0b635201003f7f1a0fc544667e8376b4283c202dde45398518f99b96ce2c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d7701a1317a8ea04ada7ef0746fa41d271b302277504bd0ffdebd0b14db5a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d7701a1317a8ea04ada7ef0746fa41d271b302277504bd0ffdebd0b14db5a04\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-04-02T13:40:01Z\\\",\\\"message\\\":\\\"registry/node-ca-glqvf\\\\nI0402 13:40:01.774154 7045 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-glqvf\\\\nI0402 13:40:01.774162 7045 ovn.go:134] Ensuring zone local for Pod openshift-image-registry/node-ca-glqvf in node crc\\\\nI0402 13:40:01.774167 7045 obj_retry.go:386] Retry successful for *v1.Pod 
openshift-image-registry/node-ca-glqvf after 0 failed attempt(s)\\\\nF0402 13:40:01.774184 7045 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:01Z is after 2025-08-24T17:21:41Z]\\\\nI0402 13:40:01.774196 7045 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-8q\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:40:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8qmgp_openshift-ovn-kubernetes(a6f5e483-7d6b-4d6d-be84-303d8f07643e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b9f2f9d419baeac3be35c4c0c40c012b721c99c2f89fb2638c635c41c5e93d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0d12a454b3463c55541ffa21937cda423517d31435683bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0
d12a454b3463c55541ffa21937cda423517d31435683bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8qmgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:03Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:04 crc kubenswrapper[4732]: I0402 13:40:04.679796 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:40:04 crc kubenswrapper[4732]: E0402 13:40:04.679999 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-crx2z" podUID="386bd92b-c67e-4cc6-8a47-6f8d6e799bc7" Apr 02 13:40:04 crc kubenswrapper[4732]: I0402 13:40:04.697806 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:04Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:04 crc kubenswrapper[4732]: I0402 13:40:04.716245 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37b84e7e2a23e3e100ff5bc979d4126bb3044a27f802ccc21c171516554dee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abfadcbcfff56cc9a6bf0c4068f03e943dea5db96705cd3fba2aea8cbd39111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:04Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:04 crc kubenswrapper[4732]: I0402 13:40:04.731001 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-glqvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57616858-4140-40f0-83e5-388787b685b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af55257a6baf7f9bcba13eb0d7aae49bd0671f182aeb11018802d4712df0ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqn9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-glqvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:04Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:04 crc kubenswrapper[4732]: I0402 13:40:04.747020 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nqhwm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658a9576-efdc-4e4b-937e-bd63032cbee6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b89438bcc57ca4588c00c0e9024fd7423eb6839c4c435c5c9bfca56d0fb1c33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de69c4d183706f6729301a7859d6c312fdedb379c0c9b4aac3d4061f89b82067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de69c4d183706f6729301a7859d6c312fdedb379c0c9b4aac3d4061f89b82067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d68d8de71d99
082a6cf12e05e279f97cf4b0eccd9defa756ac083fcb2c412945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d68d8de71d99082a6cf12e05e279f97cf4b0eccd9defa756ac083fcb2c412945\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d1db75c9d54c6d8d304e28faa91270a4d70096596100fb97d5a6ea74f7ad8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19d1db75c9d5
4c6d8d304e28faa91270a4d70096596100fb97d5a6ea74f7ad8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbddc88e12f323552ea813b6440e7770a844e187f892c11778a4a82241db38ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbddc88e12f323552ea813b6440e7770a844e187f892c11778a4a82241db38ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9df3b256d9a633666a3195e4fd0c9f409cfeb941932f46356e1a8e2e98d706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b9df3b256d9a633666a3195e4fd0c9f409cfeb941932f46356e1a8e2e98d706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b27d70a0f82b84c5ed47e2f3287c0faf9c449ed235ec78ef71fe769485f3e67f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b27d70a0f82b84c5ed47e2f3287c0faf9c449ed235ec78ef71fe769485f3e67f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"202
6-04-02T13:39:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nqhwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:04Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:04 crc kubenswrapper[4732]: I0402 13:40:04.763755 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:04Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:04 crc kubenswrapper[4732]: I0402 13:40:04.775778 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"afb9093a-f17e-45d5-887a-6b693bbae3f4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d431bea9c2c5dcec52d15f7c9d0806d50d74ece8f3ba59562a6f3fd59e50a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ca33771abc8b4070c51fd6fcb9a5f88bf7097b87e6762d829b293bb628588a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ca33771abc8b4070c51fd6fcb9a5f88bf7097b87e6762d829b293bb628588a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:04Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:04 crc kubenswrapper[4732]: I0402 13:40:04.787441 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://960aab148ced3ecc2648e0b866cb9330dc955e8ab123f3afb95d377e3efa5a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:04Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:04 crc kubenswrapper[4732]: I0402 13:40:04.797164 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38409e5e-4545-49da-8f6c-4bfb30582878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8b639c60ac006909c664208b4a5aa24df48983728ace3f6722514164e330a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09dd989931f88ecf396194a22057cfe8c07809db42ff1c2b861bf3aa5db1673f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6vtmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:04Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:04 crc kubenswrapper[4732]: I0402 13:40:04.815191 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s52gj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad206957-df5c-4b3e-bd35-e798a07d2f4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169383eed1546547fe6f683a3607c929733ee7049497b740db106a44ce7d1c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dssdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s52gj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:04Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:04 crc kubenswrapper[4732]: I0402 13:40:04.833626 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6f5e483-7d6b-4d6d-be84-303d8f07643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde53c6f70593ab376e01989eadf8b52b7010c91ca4be564e5e36a781f1333f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://975b73da57fe61d6945d87d2a8cb9502c67a6d8fe43a80722342271e2a251f39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f988d8b8bd1da89ec27da36eafa964c8d1b06874110b4791aeadc05cb297b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09f4ea085c7a422d9d9378925b89d18cd60661c399eb45f69d1e2d39c3aa2191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39950e10da9295ee76088f08ae9d4977f5ff9f8d8440188f777855f40fb8049e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0b635201003f7f1a0fc544667e8376b4283c202dde45398518f99b96ce2c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d7701a1317a8ea04ada7ef0746fa41d271b302277504bd0ffdebd0b14db5a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d7701a1317a8ea04ada7ef0746fa41d271b302277504bd0ffdebd0b14db5a04\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-04-02T13:40:01Z\\\",\\\"message\\\":\\\"registry/node-ca-glqvf\\\\nI0402 13:40:01.774154 7045 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-glqvf\\\\nI0402 13:40:01.774162 7045 ovn.go:134] Ensuring zone local for Pod openshift-image-registry/node-ca-glqvf in node crc\\\\nI0402 13:40:01.774167 7045 obj_retry.go:386] Retry successful for *v1.Pod 
openshift-image-registry/node-ca-glqvf after 0 failed attempt(s)\\\\nF0402 13:40:01.774184 7045 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:01Z is after 2025-08-24T17:21:41Z]\\\\nI0402 13:40:01.774196 7045 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-8q\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:40:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8qmgp_openshift-ovn-kubernetes(a6f5e483-7d6b-4d6d-be84-303d8f07643e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b9f2f9d419baeac3be35c4c0c40c012b721c99c2f89fb2638c635c41c5e93d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0d12a454b3463c55541ffa21937cda423517d31435683bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0
d12a454b3463c55541ffa21937cda423517d31435683bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8qmgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:04Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:04 crc kubenswrapper[4732]: I0402 13:40:04.843888 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b2l5q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7185760-c057-4c47-8da2-60572500a472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76490beb8e54fc846562aaf130168383c45dfe45e9582e3d87c36c38a69b8fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bec3eb12ab864fa88c229cc24bf8ab853d0d
c249a18430163a45fbe02e09c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-b2l5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:04Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:04 crc kubenswrapper[4732]: I0402 13:40:04.855271 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1efc9e3-71c0-47e0-bcf8-4ec41e8e25af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf50344e79adec991160632c244c40543c468db050f176949b5840a16d1777a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e000359211728ef1d437757b0246d85c224dae170131ba8bd221fa84dafa026c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf6191436462297532ffdc93e43468bbe37e850115ddcbcd0265c07359ac845a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-04-02T13:39:23Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0402 13:39:23.616086 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0402 13:39:23.616554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0402 13:39:23.618421 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-69827246/tls.crt::/tmp/serving-cert-69827246/tls.key\\\\\\\"\\\\nI0402 13:39:23.816116 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0402 13:39:23.819962 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0402 13:39:23.820008 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0402 13:39:23.820046 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0402 13:39:23.820061 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0402 13:39:23.827938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0402 13:39:23.827970 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827984 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0402 13:39:23.827988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0402 13:39:23.828005 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0402 13:39:23.828009 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0402 13:39:23.828004 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0402 13:39:23.830934 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b46da8b2cda4252f897adfca92275c92924a998d5b1ad57e6897ddea29b4571\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:04Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:04 crc kubenswrapper[4732]: I0402 13:40:04.869876 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-crx2z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"386bd92b-c67e-4cc6-8a47-6f8d6e799bc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-crx2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:04Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:04 crc 
kubenswrapper[4732]: I0402 13:40:04.890355 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:04Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:04 crc kubenswrapper[4732]: I0402 13:40:04.905723 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e388cf7666048a2a06ac7572afee8a8a73493dc8b315897b9118b66dfe72b69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-04-02T13:40:04Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:04 crc kubenswrapper[4732]: I0402 13:40:04.918014 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tg9vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fcb1965-4cef-41a4-8894-3eb24e0ff80c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e38eea7442762874b0db44fa8fcd1ffc4cbe0d149d992139a1fc0bc92dc386d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-w9vgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tg9vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:04Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:04 crc kubenswrapper[4732]: E0402 13:40:04.940411 4732 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Apr 02 13:40:05 crc kubenswrapper[4732]: I0402 13:40:05.679568 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:40:05 crc kubenswrapper[4732]: E0402 13:40:05.680011 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Apr 02 13:40:05 crc kubenswrapper[4732]: I0402 13:40:05.679642 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 13:40:05 crc kubenswrapper[4732]: E0402 13:40:05.680121 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Apr 02 13:40:05 crc kubenswrapper[4732]: I0402 13:40:05.679778 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:40:05 crc kubenswrapper[4732]: E0402 13:40:05.680193 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Apr 02 13:40:06 crc kubenswrapper[4732]: I0402 13:40:06.110931 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:40:06 crc kubenswrapper[4732]: I0402 13:40:06.110983 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:40:06 crc kubenswrapper[4732]: I0402 13:40:06.110998 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:40:06 crc kubenswrapper[4732]: I0402 13:40:06.111015 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 02 13:40:06 crc kubenswrapper[4732]: I0402 13:40:06.111026 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-02T13:40:06Z","lastTransitionTime":"2026-04-02T13:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 02 13:40:06 crc kubenswrapper[4732]: E0402 13:40:06.128874 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69443537-6792-4af0-92ca-9d3f256e3009\\\",\\\"systemUUID\\\":\\\"3be5867b-5df6-4c65-8d4b-c54c471927ff\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:06Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:06 crc kubenswrapper[4732]: I0402 13:40:06.132746 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:40:06 crc kubenswrapper[4732]: I0402 13:40:06.132792 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:40:06 crc kubenswrapper[4732]: I0402 13:40:06.132803 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:40:06 crc kubenswrapper[4732]: I0402 13:40:06.132818 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 02 13:40:06 crc kubenswrapper[4732]: I0402 13:40:06.132826 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-02T13:40:06Z","lastTransitionTime":"2026-04-02T13:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 02 13:40:06 crc kubenswrapper[4732]: E0402 13:40:06.143784 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69443537-6792-4af0-92ca-9d3f256e3009\\\",\\\"systemUUID\\\":\\\"3be5867b-5df6-4c65-8d4b-c54c471927ff\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:06Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:06 crc kubenswrapper[4732]: I0402 13:40:06.147104 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:40:06 crc kubenswrapper[4732]: I0402 13:40:06.147131 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:40:06 crc kubenswrapper[4732]: I0402 13:40:06.147140 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:40:06 crc kubenswrapper[4732]: I0402 13:40:06.147163 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 02 13:40:06 crc kubenswrapper[4732]: I0402 13:40:06.147174 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-02T13:40:06Z","lastTransitionTime":"2026-04-02T13:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 02 13:40:06 crc kubenswrapper[4732]: E0402 13:40:06.159866 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69443537-6792-4af0-92ca-9d3f256e3009\\\",\\\"systemUUID\\\":\\\"3be5867b-5df6-4c65-8d4b-c54c471927ff\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:06Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:06 crc kubenswrapper[4732]: I0402 13:40:06.164017 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:40:06 crc kubenswrapper[4732]: I0402 13:40:06.164100 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:40:06 crc kubenswrapper[4732]: I0402 13:40:06.164129 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:40:06 crc kubenswrapper[4732]: I0402 13:40:06.164165 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 02 13:40:06 crc kubenswrapper[4732]: I0402 13:40:06.164190 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-02T13:40:06Z","lastTransitionTime":"2026-04-02T13:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 02 13:40:06 crc kubenswrapper[4732]: E0402 13:40:06.180005 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69443537-6792-4af0-92ca-9d3f256e3009\\\",\\\"systemUUID\\\":\\\"3be5867b-5df6-4c65-8d4b-c54c471927ff\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:06Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:06 crc kubenswrapper[4732]: I0402 13:40:06.183316 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:40:06 crc kubenswrapper[4732]: I0402 13:40:06.183373 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:40:06 crc kubenswrapper[4732]: I0402 13:40:06.183399 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:40:06 crc kubenswrapper[4732]: I0402 13:40:06.183424 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 02 13:40:06 crc kubenswrapper[4732]: I0402 13:40:06.183446 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-02T13:40:06Z","lastTransitionTime":"2026-04-02T13:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 02 13:40:06 crc kubenswrapper[4732]: E0402 13:40:06.196851 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69443537-6792-4af0-92ca-9d3f256e3009\\\",\\\"systemUUID\\\":\\\"3be5867b-5df6-4c65-8d4b-c54c471927ff\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:06Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:06 crc kubenswrapper[4732]: E0402 13:40:06.197189 4732 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Apr 02 13:40:06 crc kubenswrapper[4732]: I0402 13:40:06.679942 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:40:06 crc kubenswrapper[4732]: E0402 13:40:06.680143 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-crx2z" podUID="386bd92b-c67e-4cc6-8a47-6f8d6e799bc7" Apr 02 13:40:07 crc kubenswrapper[4732]: I0402 13:40:07.679718 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:40:07 crc kubenswrapper[4732]: I0402 13:40:07.679771 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:40:07 crc kubenswrapper[4732]: E0402 13:40:07.679841 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Apr 02 13:40:07 crc kubenswrapper[4732]: E0402 13:40:07.679903 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Apr 02 13:40:07 crc kubenswrapper[4732]: I0402 13:40:07.679782 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 13:40:07 crc kubenswrapper[4732]: E0402 13:40:07.680178 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Apr 02 13:40:08 crc kubenswrapper[4732]: I0402 13:40:08.680210 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:40:08 crc kubenswrapper[4732]: E0402 13:40:08.680374 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-crx2z" podUID="386bd92b-c67e-4cc6-8a47-6f8d6e799bc7" Apr 02 13:40:09 crc kubenswrapper[4732]: I0402 13:40:09.679565 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 13:40:09 crc kubenswrapper[4732]: I0402 13:40:09.679646 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:40:09 crc kubenswrapper[4732]: I0402 13:40:09.679646 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:40:09 crc kubenswrapper[4732]: E0402 13:40:09.679794 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Apr 02 13:40:09 crc kubenswrapper[4732]: E0402 13:40:09.679906 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Apr 02 13:40:09 crc kubenswrapper[4732]: E0402 13:40:09.679981 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Apr 02 13:40:09 crc kubenswrapper[4732]: E0402 13:40:09.942022 4732 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Apr 02 13:40:10 crc kubenswrapper[4732]: I0402 13:40:10.679684 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:40:10 crc kubenswrapper[4732]: E0402 13:40:10.680048 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-crx2z" podUID="386bd92b-c67e-4cc6-8a47-6f8d6e799bc7" Apr 02 13:40:10 crc kubenswrapper[4732]: I0402 13:40:10.680177 4732 scope.go:117] "RemoveContainer" containerID="5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a" Apr 02 13:40:10 crc kubenswrapper[4732]: E0402 13:40:10.680355 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Apr 02 13:40:11 crc kubenswrapper[4732]: I0402 13:40:11.679299 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:40:11 crc kubenswrapper[4732]: I0402 13:40:11.679415 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:40:11 crc kubenswrapper[4732]: I0402 13:40:11.679336 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 13:40:11 crc kubenswrapper[4732]: E0402 13:40:11.679528 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Apr 02 13:40:11 crc kubenswrapper[4732]: E0402 13:40:11.679766 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Apr 02 13:40:11 crc kubenswrapper[4732]: E0402 13:40:11.679890 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Apr 02 13:40:12 crc kubenswrapper[4732]: I0402 13:40:12.679384 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:40:12 crc kubenswrapper[4732]: E0402 13:40:12.679503 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-crx2z" podUID="386bd92b-c67e-4cc6-8a47-6f8d6e799bc7" Apr 02 13:40:13 crc kubenswrapper[4732]: I0402 13:40:13.679315 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:40:13 crc kubenswrapper[4732]: I0402 13:40:13.679376 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:40:13 crc kubenswrapper[4732]: E0402 13:40:13.679479 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Apr 02 13:40:13 crc kubenswrapper[4732]: E0402 13:40:13.679692 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Apr 02 13:40:13 crc kubenswrapper[4732]: I0402 13:40:13.679336 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 13:40:13 crc kubenswrapper[4732]: E0402 13:40:13.679795 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Apr 02 13:40:14 crc kubenswrapper[4732]: I0402 13:40:14.680278 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:40:14 crc kubenswrapper[4732]: E0402 13:40:14.680433 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-crx2z" podUID="386bd92b-c67e-4cc6-8a47-6f8d6e799bc7" Apr 02 13:40:14 crc kubenswrapper[4732]: I0402 13:40:14.702107 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-crx2z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"386bd92b-c67e-4cc6-8a47-6f8d6e799bc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-crx2z\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:14Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:14 crc kubenswrapper[4732]: I0402 13:40:14.720319 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:14Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:14 crc kubenswrapper[4732]: I0402 13:40:14.738434 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e388cf7666048a2a06ac7572afee8a8a73493dc8b315897b9118b66dfe72b69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-04-02T13:40:14Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:14 crc kubenswrapper[4732]: I0402 13:40:14.754739 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tg9vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fcb1965-4cef-41a4-8894-3eb24e0ff80c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e38eea7442762874b0db44fa8fcd1ffc4cbe0d149d992139a1fc0bc92dc386d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-w9vgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tg9vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:14Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:14 crc kubenswrapper[4732]: I0402 13:40:14.772107 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:14Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:14 crc kubenswrapper[4732]: I0402 13:40:14.791948 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37b84e7e2a23e3e100ff5bc979d4126bb3044a27f802ccc21c171516554dee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abfadcbcfff56cc9a6bf0c4068f03e943dea5db96705cd3fba2aea8cbd39111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:14Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:14 crc kubenswrapper[4732]: I0402 13:40:14.808164 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-glqvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57616858-4140-40f0-83e5-388787b685b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af55257a6baf7f9bcba13eb0d7aae49bd0671f182aeb11018802d4712df0ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqn9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-glqvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:14Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:14 crc kubenswrapper[4732]: I0402 13:40:14.824498 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nqhwm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658a9576-efdc-4e4b-937e-bd63032cbee6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b89438bcc57ca4588c00c0e9024fd7423eb6839c4c435c5c9bfca56d0fb1c33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de69c4d183706f6729301a7859d6c312fdedb379c0c9b4aac3d4061f89b82067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de69c4d183706f6729301a7859d6c312fdedb379c0c9b4aac3d4061f89b82067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d68d8de71d99
082a6cf12e05e279f97cf4b0eccd9defa756ac083fcb2c412945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d68d8de71d99082a6cf12e05e279f97cf4b0eccd9defa756ac083fcb2c412945\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d1db75c9d54c6d8d304e28faa91270a4d70096596100fb97d5a6ea74f7ad8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19d1db75c9d5
4c6d8d304e28faa91270a4d70096596100fb97d5a6ea74f7ad8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbddc88e12f323552ea813b6440e7770a844e187f892c11778a4a82241db38ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbddc88e12f323552ea813b6440e7770a844e187f892c11778a4a82241db38ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9df3b256d9a633666a3195e4fd0c9f409cfeb941932f46356e1a8e2e98d706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b9df3b256d9a633666a3195e4fd0c9f409cfeb941932f46356e1a8e2e98d706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b27d70a0f82b84c5ed47e2f3287c0faf9c449ed235ec78ef71fe769485f3e67f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b27d70a0f82b84c5ed47e2f3287c0faf9c449ed235ec78ef71fe769485f3e67f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"202
6-04-02T13:39:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nqhwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:14Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:14 crc kubenswrapper[4732]: I0402 13:40:14.845339 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:14Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:14 crc kubenswrapper[4732]: I0402 13:40:14.864206 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"afb9093a-f17e-45d5-887a-6b693bbae3f4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d431bea9c2c5dcec52d15f7c9d0806d50d74ece8f3ba59562a6f3fd59e50a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ca33771abc8b4070c51fd6fcb9a5f88bf7097b87e6762d829b293bb628588a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ca33771abc8b4070c51fd6fcb9a5f88bf7097b87e6762d829b293bb628588a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:14Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:14 crc kubenswrapper[4732]: I0402 13:40:14.881031 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://960aab148ced3ecc2648e0b866cb9330dc955e8ab123f3afb95d377e3efa5a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:14Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:14 crc kubenswrapper[4732]: I0402 13:40:14.893836 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38409e5e-4545-49da-8f6c-4bfb30582878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8b639c60ac006909c664208b4a5aa24df48983728ace3f6722514164e330a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09dd989931f88ecf396194a22057cfe8c07809db42ff1c2b861bf3aa5db1673f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6vtmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:14Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:14 crc kubenswrapper[4732]: I0402 13:40:14.910343 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s52gj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad206957-df5c-4b3e-bd35-e798a07d2f4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169383eed1546547fe6f683a3607c929733ee7049497b740db106a44ce7d1c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dssdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s52gj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:14Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:14 crc kubenswrapper[4732]: I0402 13:40:14.930875 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6f5e483-7d6b-4d6d-be84-303d8f07643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde53c6f70593ab376e01989eadf8b52b7010c91ca4be564e5e36a781f1333f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://975b73da57fe61d6945d87d2a8cb9502c67a6d8fe43a80722342271e2a251f39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f988d8b8bd1da89ec27da36eafa964c8d1b06874110b4791aeadc05cb297b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09f4ea085c7a422d9d9378925b89d18cd60661c399eb45f69d1e2d39c3aa2191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39950e10da9295ee76088f08ae9d4977f5ff9f8d8440188f777855f40fb8049e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0b635201003f7f1a0fc544667e8376b4283c202dde45398518f99b96ce2c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d7701a1317a8ea04ada7ef0746fa41d271b302277504bd0ffdebd0b14db5a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d7701a1317a8ea04ada7ef0746fa41d271b302277504bd0ffdebd0b14db5a04\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-04-02T13:40:01Z\\\",\\\"message\\\":\\\"registry/node-ca-glqvf\\\\nI0402 13:40:01.774154 7045 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-glqvf\\\\nI0402 13:40:01.774162 7045 ovn.go:134] Ensuring zone local for Pod openshift-image-registry/node-ca-glqvf in node crc\\\\nI0402 13:40:01.774167 7045 obj_retry.go:386] Retry successful for *v1.Pod 
openshift-image-registry/node-ca-glqvf after 0 failed attempt(s)\\\\nF0402 13:40:01.774184 7045 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:01Z is after 2025-08-24T17:21:41Z]\\\\nI0402 13:40:01.774196 7045 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-8q\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:40:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8qmgp_openshift-ovn-kubernetes(a6f5e483-7d6b-4d6d-be84-303d8f07643e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b9f2f9d419baeac3be35c4c0c40c012b721c99c2f89fb2638c635c41c5e93d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0d12a454b3463c55541ffa21937cda423517d31435683bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0
d12a454b3463c55541ffa21937cda423517d31435683bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8qmgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:14Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:14 crc kubenswrapper[4732]: E0402 13:40:14.942489 4732 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Apr 02 13:40:14 crc kubenswrapper[4732]: I0402 13:40:14.948206 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b2l5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7185760-c057-4c47-8da2-60572500a472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76490beb8e54fc846562aaf130168383c45dfe45e9582e3d87c36c38a69b8fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bec3eb12ab864fa88c229cc24bf8ab853d0dc249a18430163a45fbe02e09c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-b2l5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:14Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:14 crc kubenswrapper[4732]: I0402 13:40:14.966586 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1efc9e3-71c0-47e0-bcf8-4ec41e8e25af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf50344e79adec991160632c244c40543c468db050f176949b5840a16d1777a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e000359211728ef1d437757b0246d85c224dae170131ba8bd221fa84dafa026c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf6191436462297532ffdc93e43468bbe37e850115ddcbcd0265c07359ac845a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-04-02T13:39:23Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0402 13:39:23.616086 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0402 13:39:23.616554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0402 13:39:23.618421 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-69827246/tls.crt::/tmp/serving-cert-69827246/tls.key\\\\\\\"\\\\nI0402 13:39:23.816116 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0402 13:39:23.819962 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0402 13:39:23.820008 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0402 13:39:23.820046 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0402 13:39:23.820061 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0402 13:39:23.827938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0402 13:39:23.827970 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827984 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0402 13:39:23.827988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0402 13:39:23.828005 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0402 13:39:23.828009 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0402 13:39:23.828004 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0402 13:39:23.830934 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b46da8b2cda4252f897adfca92275c92924a998d5b1ad57e6897ddea29b4571\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:14Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:15 crc kubenswrapper[4732]: I0402 13:40:15.679551 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:40:15 crc kubenswrapper[4732]: I0402 13:40:15.679561 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 13:40:15 crc kubenswrapper[4732]: E0402 13:40:15.679859 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Apr 02 13:40:15 crc kubenswrapper[4732]: E0402 13:40:15.679887 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Apr 02 13:40:15 crc kubenswrapper[4732]: I0402 13:40:15.679558 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:40:15 crc kubenswrapper[4732]: E0402 13:40:15.680052 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Apr 02 13:40:16 crc kubenswrapper[4732]: I0402 13:40:16.406325 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:40:16 crc kubenswrapper[4732]: I0402 13:40:16.406383 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:40:16 crc kubenswrapper[4732]: I0402 13:40:16.406400 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:40:16 crc kubenswrapper[4732]: I0402 13:40:16.406424 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 02 13:40:16 crc kubenswrapper[4732]: I0402 13:40:16.406441 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-02T13:40:16Z","lastTransitionTime":"2026-04-02T13:40:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 02 13:40:16 crc kubenswrapper[4732]: E0402 13:40:16.420034 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69443537-6792-4af0-92ca-9d3f256e3009\\\",\\\"systemUUID\\\":\\\"3be5867b-5df6-4c65-8d4b-c54c471927ff\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:16Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:16 crc kubenswrapper[4732]: I0402 13:40:16.424416 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:40:16 crc kubenswrapper[4732]: I0402 13:40:16.424472 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:40:16 crc kubenswrapper[4732]: I0402 13:40:16.424484 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:40:16 crc kubenswrapper[4732]: I0402 13:40:16.424501 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 02 13:40:16 crc kubenswrapper[4732]: I0402 13:40:16.424518 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-02T13:40:16Z","lastTransitionTime":"2026-04-02T13:40:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 02 13:40:16 crc kubenswrapper[4732]: E0402 13:40:16.438241 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69443537-6792-4af0-92ca-9d3f256e3009\\\",\\\"systemUUID\\\":\\\"3be5867b-5df6-4c65-8d4b-c54c471927ff\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:16Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:16 crc kubenswrapper[4732]: I0402 13:40:16.442476 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:40:16 crc kubenswrapper[4732]: I0402 13:40:16.442531 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:40:16 crc kubenswrapper[4732]: I0402 13:40:16.442543 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:40:16 crc kubenswrapper[4732]: I0402 13:40:16.442562 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 02 13:40:16 crc kubenswrapper[4732]: I0402 13:40:16.442574 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-02T13:40:16Z","lastTransitionTime":"2026-04-02T13:40:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 02 13:40:16 crc kubenswrapper[4732]: E0402 13:40:16.455095 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69443537-6792-4af0-92ca-9d3f256e3009\\\",\\\"systemUUID\\\":\\\"3be5867b-5df6-4c65-8d4b-c54c471927ff\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:16Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:16 crc kubenswrapper[4732]: I0402 13:40:16.458760 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:40:16 crc kubenswrapper[4732]: I0402 13:40:16.458815 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:40:16 crc kubenswrapper[4732]: I0402 13:40:16.458824 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:40:16 crc kubenswrapper[4732]: I0402 13:40:16.458837 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 02 13:40:16 crc kubenswrapper[4732]: I0402 13:40:16.458846 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-02T13:40:16Z","lastTransitionTime":"2026-04-02T13:40:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 02 13:40:16 crc kubenswrapper[4732]: E0402 13:40:16.474371 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69443537-6792-4af0-92ca-9d3f256e3009\\\",\\\"systemUUID\\\":\\\"3be5867b-5df6-4c65-8d4b-c54c471927ff\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:16Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:16 crc kubenswrapper[4732]: I0402 13:40:16.478068 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:40:16 crc kubenswrapper[4732]: I0402 13:40:16.478112 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:40:16 crc kubenswrapper[4732]: I0402 13:40:16.478123 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:40:16 crc kubenswrapper[4732]: I0402 13:40:16.478139 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 02 13:40:16 crc kubenswrapper[4732]: I0402 13:40:16.478150 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-02T13:40:16Z","lastTransitionTime":"2026-04-02T13:40:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 02 13:40:16 crc kubenswrapper[4732]: E0402 13:40:16.491444 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69443537-6792-4af0-92ca-9d3f256e3009\\\",\\\"systemUUID\\\":\\\"3be5867b-5df6-4c65-8d4b-c54c471927ff\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:16Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:16 crc kubenswrapper[4732]: E0402 13:40:16.491582 4732 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Apr 02 13:40:16 crc kubenswrapper[4732]: I0402 13:40:16.679752 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:40:16 crc kubenswrapper[4732]: E0402 13:40:16.679957 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-crx2z" podUID="386bd92b-c67e-4cc6-8a47-6f8d6e799bc7" Apr 02 13:40:17 crc kubenswrapper[4732]: I0402 13:40:17.679922 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:40:17 crc kubenswrapper[4732]: I0402 13:40:17.680058 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:40:17 crc kubenswrapper[4732]: I0402 13:40:17.680058 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 13:40:17 crc kubenswrapper[4732]: E0402 13:40:17.680374 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Apr 02 13:40:17 crc kubenswrapper[4732]: E0402 13:40:17.680793 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Apr 02 13:40:17 crc kubenswrapper[4732]: E0402 13:40:17.680851 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Apr 02 13:40:17 crc kubenswrapper[4732]: I0402 13:40:17.681179 4732 scope.go:117] "RemoveContainer" containerID="1d7701a1317a8ea04ada7ef0746fa41d271b302277504bd0ffdebd0b14db5a04" Apr 02 13:40:17 crc kubenswrapper[4732]: E0402 13:40:17.681688 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-8qmgp_openshift-ovn-kubernetes(a6f5e483-7d6b-4d6d-be84-303d8f07643e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" podUID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" Apr 02 13:40:17 crc kubenswrapper[4732]: I0402 13:40:17.694983 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Apr 02 13:40:18 crc kubenswrapper[4732]: I0402 13:40:18.679710 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:40:18 crc kubenswrapper[4732]: E0402 13:40:18.679873 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-crx2z" podUID="386bd92b-c67e-4cc6-8a47-6f8d6e799bc7" Apr 02 13:40:19 crc kubenswrapper[4732]: I0402 13:40:19.679362 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:40:19 crc kubenswrapper[4732]: I0402 13:40:19.679420 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 13:40:19 crc kubenswrapper[4732]: E0402 13:40:19.679485 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Apr 02 13:40:19 crc kubenswrapper[4732]: I0402 13:40:19.679498 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:40:19 crc kubenswrapper[4732]: E0402 13:40:19.679580 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Apr 02 13:40:19 crc kubenswrapper[4732]: E0402 13:40:19.679680 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Apr 02 13:40:19 crc kubenswrapper[4732]: E0402 13:40:19.944149 4732 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Apr 02 13:40:20 crc kubenswrapper[4732]: I0402 13:40:20.617495 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-s52gj_ad206957-df5c-4b3e-bd35-e798a07d2f4e/kube-multus/0.log" Apr 02 13:40:20 crc kubenswrapper[4732]: I0402 13:40:20.617543 4732 generic.go:334] "Generic (PLEG): container finished" podID="ad206957-df5c-4b3e-bd35-e798a07d2f4e" containerID="169383eed1546547fe6f683a3607c929733ee7049497b740db106a44ce7d1c64" exitCode=1 Apr 02 13:40:20 crc kubenswrapper[4732]: I0402 13:40:20.617570 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-s52gj" event={"ID":"ad206957-df5c-4b3e-bd35-e798a07d2f4e","Type":"ContainerDied","Data":"169383eed1546547fe6f683a3607c929733ee7049497b740db106a44ce7d1c64"} Apr 02 13:40:20 crc kubenswrapper[4732]: I0402 13:40:20.617943 4732 scope.go:117] "RemoveContainer" containerID="169383eed1546547fe6f683a3607c929733ee7049497b740db106a44ce7d1c64" Apr 02 13:40:20 crc kubenswrapper[4732]: I0402 13:40:20.630375 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38409e5e-4545-49da-8f6c-4bfb30582878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8b639c60ac006909c664208b4a5aa24df48983728ace3f6722514164e330a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09dd989931f88ecf396194a22057cfe8c07809db
42ff1c2b861bf3aa5db1673f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6vtmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:20Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:20 crc kubenswrapper[4732]: I0402 13:40:20.644155 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s52gj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad206957-df5c-4b3e-bd35-e798a07d2f4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169383eed1546547fe6f683a3607c929733ee7049497b740db106a44ce7d1c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169383eed1546547fe6f683a3607c929733ee7049497b740db106a44ce7d1c64\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-04-02T13:40:20Z\\\",\\\"message\\\":\\\"2026-04-02T13:39:33+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f3c911cc-3398-46ec-ae1b-12065fcdb02c\\\\n2026-04-02T13:39:33+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f3c911cc-3398-46ec-ae1b-12065fcdb02c to /host/opt/cni/bin/\\\\n2026-04-02T13:39:34Z [verbose] multus-daemon started\\\\n2026-04-02T13:39:34Z [verbose] Readiness Indicator file check\\\\n2026-04-02T13:40:19Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dssdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s52gj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:20Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:20 crc kubenswrapper[4732]: I0402 13:40:20.662955 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6f5e483-7d6b-4d6d-be84-303d8f07643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde53c6f70593ab376e01989eadf8b52b7010c91ca4be564e5e36a781f1333f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://975b73da57fe61d6945d87d2a8cb9502c67a6d8fe43a80722342271e2a251f39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f988d8b8bd1da89ec27da36eafa964c8d1b06874110b4791aeadc05cb297b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09f4ea085c7a422d9d9378925b89d18cd60661c399eb45f69d1e2d39c3aa2191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39950e10da9295ee76088f08ae9d4977f5ff9f8d8440188f777855f40fb8049e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0b635201003f7f1a0fc544667e8376b4283c202dde45398518f99b96ce2c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d7701a1317a8ea04ada7ef0746fa41d271b302277504bd0ffdebd0b14db5a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d7701a1317a8ea04ada7ef0746fa41d271b302277504bd0ffdebd0b14db5a04\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-04-02T13:40:01Z\\\",\\\"message\\\":\\\"registry/node-ca-glqvf\\\\nI0402 13:40:01.774154 7045 obj_retry.go:365] Adding new 
object: *v1.Pod openshift-image-registry/node-ca-glqvf\\\\nI0402 13:40:01.774162 7045 ovn.go:134] Ensuring zone local for Pod openshift-image-registry/node-ca-glqvf in node crc\\\\nI0402 13:40:01.774167 7045 obj_retry.go:386] Retry successful for *v1.Pod openshift-image-registry/node-ca-glqvf after 0 failed attempt(s)\\\\nF0402 13:40:01.774184 7045 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:01Z is after 2025-08-24T17:21:41Z]\\\\nI0402 13:40:01.774196 7045 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-8q\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:40:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8qmgp_openshift-ovn-kubernetes(a6f5e483-7d6b-4d6d-be84-303d8f07643e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b9f2f9d419baeac3be35c4c0c40c012b721c99c2f89fb2638c635c41c5e93d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0d12a454b3463c55541ffa21937cda423517d31435683bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0
d12a454b3463c55541ffa21937cda423517d31435683bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8qmgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:20Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:20 crc kubenswrapper[4732]: I0402 13:40:20.674553 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b2l5q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7185760-c057-4c47-8da2-60572500a472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76490beb8e54fc846562aaf130168383c45dfe45e9582e3d87c36c38a69b8fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bec3eb12ab864fa88c229cc24bf8ab853d0d
c249a18430163a45fbe02e09c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-b2l5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:20Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:20 crc kubenswrapper[4732]: I0402 13:40:20.679415 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:40:20 crc kubenswrapper[4732]: E0402 13:40:20.679534 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-crx2z" podUID="386bd92b-c67e-4cc6-8a47-6f8d6e799bc7" Apr 02 13:40:20 crc kubenswrapper[4732]: I0402 13:40:20.687631 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1efc9e3-71c0-47e0-bcf8-4ec41e8e25af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf50344e79adec991160632c244c40543c468db050f176949b5840a16d1777a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e000359211728ef1d437757b0246d85c224dae170131ba8bd221fa84dafa026c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://bf6191436462297532ffdc93e43468bbe37e850115ddcbcd0265c07359ac845a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-04-02T13:39:23Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0402 13:39:23.616086 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0402 13:39:23.616554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0402 13:39:23.618421 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-69827246/tls.crt::/tmp/serving-cert-69827246/tls.key\\\\\\\"\\\\nI0402 13:39:23.816116 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0402 13:39:23.819962 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0402 13:39:23.820008 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0402 13:39:23.820046 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0402 13:39:23.820061 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0402 13:39:23.827938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0402 13:39:23.827970 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827984 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0402 13:39:23.827988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0402 13:39:23.828005 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0402 13:39:23.828009 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0402 13:39:23.828004 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0402 13:39:23.830934 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b46da8b2cda4252f897adfca92275c92924a998d5b1ad57e6897ddea29b4571\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:20Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:20 crc kubenswrapper[4732]: I0402 13:40:20.702641 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://960aab148ced3ecc2648e0b866cb9330dc955e8ab123f3afb95d377e3efa5a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:20Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:20 crc kubenswrapper[4732]: I0402 13:40:20.714366 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-crx2z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"386bd92b-c67e-4cc6-8a47-6f8d6e799bc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-crx2z\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:20Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:20 crc kubenswrapper[4732]: I0402 13:40:20.724925 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:20Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:20 crc kubenswrapper[4732]: I0402 13:40:20.734370 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e388cf7666048a2a06ac7572afee8a8a73493dc8b315897b9118b66dfe72b69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-04-02T13:40:20Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:20 crc kubenswrapper[4732]: I0402 13:40:20.743098 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tg9vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fcb1965-4cef-41a4-8894-3eb24e0ff80c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e38eea7442762874b0db44fa8fcd1ffc4cbe0d149d992139a1fc0bc92dc386d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-w9vgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tg9vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:20Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:20 crc kubenswrapper[4732]: I0402 13:40:20.756027 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ce5c3a8-e8a4-4074-96da-cfc340c2873f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e597b0db2773d2bb9d7e673759dee264ef036a174be679e2ccfa8ba03cf5c6c9\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8320a4ebc7edb8ed6702aa82a986ae487f9903f3f52ca9e7388135c88a1a8dea\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-04-02T13:38:34Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0402 13:38:04.591905 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0402 13:38:04.592587 1 observer_polling.go:159] Starting file observer\\\\nI0402 13:38:04.593304 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0402 13:38:04.593855 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0402 13:38:33.852475 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0402 13:38:34.195764 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0402 13:38:34.195891 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:38:04Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:38:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef96f17edb5e968b50bab42d6ddcc7c4ffc0fae1b3c8bcdebc6150e18977740e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b40621c4cae0af5b1de321842441756156529e8dcb1aff14d7f5dc7db637127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5272db3013bda1d5fcb66bd2669d7714381d3bd39802499b4a27873cbccd6ff9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:20Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:20 crc kubenswrapper[4732]: I0402 13:40:20.767913 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37b84e7e2a23e3e100ff5bc979d4126bb3044a27f802ccc21c171516554dee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"star
ted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abfadcbcfff56cc9a6bf0c4068f03e943dea5db96705cd3fba2aea8cbd39111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:20Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:20 crc kubenswrapper[4732]: I0402 13:40:20.778244 4732 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-image-registry/node-ca-glqvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57616858-4140-40f0-83e5-388787b685b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af55257a6baf7f9bcba13eb0d7aae49bd0671f182aeb11018802d4712df0ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqn9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-glqvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:20Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:20 crc kubenswrapper[4732]: I0402 13:40:20.795840 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nqhwm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658a9576-efdc-4e4b-937e-bd63032cbee6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b89438bcc57ca4588c00c0e9024fd7423eb6839c4c435c5c9bfca56d0fb1c33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee6
5f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de69c4d183706f6729301a7859d6c312fdedb379c0c9b4aac3d4061f89b82067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de69c4d183706f6729301a7859d6c312fdedb379c0c9b4aac3d4061f89b82067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d68d8de71d99082a6cf12e05e279f97cf4b0eccd9defa756ac083fcb2c412945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d68d8de71d99082a6cf12e05e279f97cf4b0eccd9defa756ac083fcb2c412945\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d1db75c9d54c6d8d304e28faa91270a4d70096596100fb97d5a6ea74f7ad8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\
\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19d1db75c9d54c6d8d304e28faa91270a4d70096596100fb97d5a6ea74f7ad8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbddc88e12f323552ea813b6440e7770a844e187f892c11778a4a82241db38ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbddc88e12f323552ea813b6440e7770a844e187f892c11778a4a82241db38ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\
\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9df3b256d9a633666a3195e4fd0c9f409cfeb941932f46356e1a8e2e98d706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b9df3b256d9a633666a3195e4fd0c9f409cfeb941932f46356e1a8e2e98d706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b27d70a0f82b84c5ed47e2f3287c0faf9c449ed235ec78ef71fe769485f3e67f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b27d70a
0f82b84c5ed47e2f3287c0faf9c449ed235ec78ef71fe769485f3e67f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nqhwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:20Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:20 crc kubenswrapper[4732]: I0402 13:40:20.807276 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:20Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:20 crc kubenswrapper[4732]: I0402 13:40:20.817129 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"afb9093a-f17e-45d5-887a-6b693bbae3f4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d431bea9c2c5dcec52d15f7c9d0806d50d74ece8f3ba59562a6f3fd59e50a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ca33771abc8b4070c51fd6fcb9a5f88bf7097b87e6762d829b293bb628588a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ca33771abc8b4070c51fd6fcb9a5f88bf7097b87e6762d829b293bb628588a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:20Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:20 crc kubenswrapper[4732]: I0402 13:40:20.828854 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:20Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:21 crc kubenswrapper[4732]: I0402 13:40:21.622160 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-s52gj_ad206957-df5c-4b3e-bd35-e798a07d2f4e/kube-multus/0.log" Apr 02 
13:40:21 crc kubenswrapper[4732]: I0402 13:40:21.622232 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-s52gj" event={"ID":"ad206957-df5c-4b3e-bd35-e798a07d2f4e","Type":"ContainerStarted","Data":"59880ea9acc1fa69e9974f404371d2876c60b9ad12942fb9c6a0aa01b0632050"} Apr 02 13:40:21 crc kubenswrapper[4732]: I0402 13:40:21.633112 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tg9vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fcb1965-4cef-41a4-8894-3eb24e0ff80c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e38eea7442762874b0db44fa8fcd1ffc4cbe0d149d992139a1fc0bc92dc386d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\
\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9vgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tg9vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:21Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:21 crc kubenswrapper[4732]: I0402 13:40:21.645483 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ce5c3a8-e8a4-4074-96da-cfc340c2873f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e597b0db2773d2bb9d7e673759dee264ef036a174be679e2ccfa8ba03cf5c6c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8320a4ebc7edb8ed6702aa82a986ae487f9903f3f52ca9e7388135c88a1a8dea\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-04-02T13:38:34Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0402 13:38:04.591905 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0402 13:38:04.592587 1 observer_polling.go:159] Starting file observer\\\\nI0402 13:38:04.593304 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0402 13:38:04.593855 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0402 13:38:33.852475 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0402 13:38:34.195764 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0402 13:38:34.195891 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:38:04Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:38:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef96f17edb5e968b50bab42d6ddcc7c4ffc0fae1b3c8bcdebc6150e18977740e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b40621c4cae0af5b1de321842441756156529e8dcb1aff14d7f5dc7db637127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5272db3013bda1d5fcb66bd2669d7714381d3bd39802499b4a27873cbccd6ff9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:21Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:21 crc kubenswrapper[4732]: I0402 13:40:21.655563 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-crx2z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"386bd92b-c67e-4cc6-8a47-6f8d6e799bc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-crx2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:21Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:21 crc 
kubenswrapper[4732]: I0402 13:40:21.667794 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:21Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:21 crc kubenswrapper[4732]: I0402 13:40:21.678890 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e388cf7666048a2a06ac7572afee8a8a73493dc8b315897b9118b66dfe72b69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-04-02T13:40:21Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:21 crc kubenswrapper[4732]: I0402 13:40:21.679096 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:40:21 crc kubenswrapper[4732]: I0402 13:40:21.679111 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 13:40:21 crc kubenswrapper[4732]: I0402 13:40:21.679096 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:40:21 crc kubenswrapper[4732]: E0402 13:40:21.679194 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Apr 02 13:40:21 crc kubenswrapper[4732]: E0402 13:40:21.679268 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Apr 02 13:40:21 crc kubenswrapper[4732]: E0402 13:40:21.679316 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Apr 02 13:40:21 crc kubenswrapper[4732]: I0402 13:40:21.689230 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:21Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:21 crc kubenswrapper[4732]: I0402 13:40:21.698063 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"afb9093a-f17e-45d5-887a-6b693bbae3f4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d431bea9c2c5dcec52d15f7c9d0806d50d74ece8f3ba59562a6f3fd59e50a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ca33771abc8b4070c51fd6fcb9a5f88bf7097b87e6762d829b293bb628588a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ca33771abc8b4070c51fd6fcb9a5f88bf7097b87e6762d829b293bb628588a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:21Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:21 crc kubenswrapper[4732]: I0402 13:40:21.708750 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:21Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:21 crc kubenswrapper[4732]: I0402 13:40:21.725467 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37b84e7e2a23e3e100ff5bc979d4126bb3044a27f802ccc21c171516554dee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abfadcbcfff56cc9a6bf0c4068f03e943dea5db96705cd3fba2aea8cbd39111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:21Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:21 crc kubenswrapper[4732]: I0402 13:40:21.738576 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-glqvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57616858-4140-40f0-83e5-388787b685b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af55257a6baf7f9bcba13eb0d7aae49bd0671f182aeb11018802d4712df0ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqn9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-glqvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:21Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:21 crc kubenswrapper[4732]: I0402 13:40:21.754265 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nqhwm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658a9576-efdc-4e4b-937e-bd63032cbee6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b89438bcc57ca4588c00c0e9024fd7423eb6839c4c435c5c9bfca56d0fb1c33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de69c4d183706f6729301a7859d6c312fdedb379c0c9b4aac3d4061f89b82067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de69c4d183706f6729301a7859d6c312fdedb379c0c9b4aac3d4061f89b82067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d68d8de71d99
082a6cf12e05e279f97cf4b0eccd9defa756ac083fcb2c412945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d68d8de71d99082a6cf12e05e279f97cf4b0eccd9defa756ac083fcb2c412945\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d1db75c9d54c6d8d304e28faa91270a4d70096596100fb97d5a6ea74f7ad8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19d1db75c9d5
4c6d8d304e28faa91270a4d70096596100fb97d5a6ea74f7ad8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbddc88e12f323552ea813b6440e7770a844e187f892c11778a4a82241db38ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbddc88e12f323552ea813b6440e7770a844e187f892c11778a4a82241db38ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9df3b256d9a633666a3195e4fd0c9f409cfeb941932f46356e1a8e2e98d706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b9df3b256d9a633666a3195e4fd0c9f409cfeb941932f46356e1a8e2e98d706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b27d70a0f82b84c5ed47e2f3287c0faf9c449ed235ec78ef71fe769485f3e67f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b27d70a0f82b84c5ed47e2f3287c0faf9c449ed235ec78ef71fe769485f3e67f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"202
6-04-02T13:39:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nqhwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:21Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:21 crc kubenswrapper[4732]: I0402 13:40:21.765336 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b2l5q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7185760-c057-4c47-8da2-60572500a472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76490beb8e54fc846562aaf130168383c45dfe45e9582e3d87c36c38a69b8fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bec3eb12ab864fa88c229cc24bf8ab853d0d
c249a18430163a45fbe02e09c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-b2l5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:21Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:21 crc kubenswrapper[4732]: I0402 13:40:21.781280 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1efc9e3-71c0-47e0-bcf8-4ec41e8e25af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf50344e79adec991160632c244c40543c468db050f176949b5840a16d1777a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e000359211728ef1d437757b0246d85c224dae170131ba8bd221fa84dafa026c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf6191436462297532ffdc93e43468bbe37e850115ddcbcd0265c07359ac845a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-04-02T13:39:23Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0402 13:39:23.616086 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0402 13:39:23.616554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0402 13:39:23.618421 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-69827246/tls.crt::/tmp/serving-cert-69827246/tls.key\\\\\\\"\\\\nI0402 13:39:23.816116 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0402 13:39:23.819962 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0402 13:39:23.820008 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0402 13:39:23.820046 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0402 13:39:23.820061 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0402 13:39:23.827938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0402 13:39:23.827970 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827984 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0402 13:39:23.827988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0402 13:39:23.828005 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0402 13:39:23.828009 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0402 13:39:23.828004 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0402 13:39:23.830934 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b46da8b2cda4252f897adfca92275c92924a998d5b1ad57e6897ddea29b4571\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:21Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:21 crc kubenswrapper[4732]: I0402 13:40:21.794806 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://960aab148ced3ecc2648e0b866cb9330dc955e8ab123f3afb95d377e3efa5a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:21Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:21 crc kubenswrapper[4732]: I0402 13:40:21.807922 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38409e5e-4545-49da-8f6c-4bfb30582878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8b639c60ac006909c664208b4a5aa24df48983728ace3f6722514164e330a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09dd989931f88ecf396194a22057cfe8c07809db42ff1c2b861bf3aa5db1673f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6vtmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:21Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:21 crc kubenswrapper[4732]: I0402 13:40:21.824070 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s52gj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad206957-df5c-4b3e-bd35-e798a07d2f4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59880ea9acc1fa69e9974f404371d2876c60b9ad12942fb9c6a0aa01b0632050\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169383eed1546547fe6f683a3607c929733ee7049497b740db106a44ce7d1c64\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-04-02T13:40:20Z\\\",\\\"message\\\":\\\"2026-04-02T13:39:33+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f3c911cc-3398-46ec-ae1b-12065fcdb02c\\\\n2026-04-02T13:39:33+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f3c911cc-3398-46ec-ae1b-12065fcdb02c to /host/opt/cni/bin/\\\\n2026-04-02T13:39:34Z [verbose] multus-daemon started\\\\n2026-04-02T13:39:34Z [verbose] 
Readiness Indicator file check\\\\n2026-04-02T13:40:19Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:40:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dssdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s52gj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:21Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:21 crc kubenswrapper[4732]: I0402 13:40:21.843267 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6f5e483-7d6b-4d6d-be84-303d8f07643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde53c6f70593ab376e01989eadf8b52b7010c91ca4be564e5e36a781f1333f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://975b73da57fe61d6945d87d2a8cb9502c67a6d8fe43a80722342271e2a251f39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f988d8b8bd1da89ec27da36eafa964c8d1b06874110b4791aeadc05cb297b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09f4ea085c7a422d9d9378925b89d18cd60661c399eb45f69d1e2d39c3aa2191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39950e10da9295ee76088f08ae9d4977f5ff9f8d8440188f777855f40fb8049e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0b635201003f7f1a0fc544667e8376b4283c202dde45398518f99b96ce2c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d7701a1317a8ea04ada7ef0746fa41d271b302277504bd0ffdebd0b14db5a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d7701a1317a8ea04ada7ef0746fa41d271b302277504bd0ffdebd0b14db5a04\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-04-02T13:40:01Z\\\",\\\"message\\\":\\\"registry/node-ca-glqvf\\\\nI0402 13:40:01.774154 7045 obj_retry.go:365] Adding new 
object: *v1.Pod openshift-image-registry/node-ca-glqvf\\\\nI0402 13:40:01.774162 7045 ovn.go:134] Ensuring zone local for Pod openshift-image-registry/node-ca-glqvf in node crc\\\\nI0402 13:40:01.774167 7045 obj_retry.go:386] Retry successful for *v1.Pod openshift-image-registry/node-ca-glqvf after 0 failed attempt(s)\\\\nF0402 13:40:01.774184 7045 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:01Z is after 2025-08-24T17:21:41Z]\\\\nI0402 13:40:01.774196 7045 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-8q\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:40:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8qmgp_openshift-ovn-kubernetes(a6f5e483-7d6b-4d6d-be84-303d8f07643e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b9f2f9d419baeac3be35c4c0c40c012b721c99c2f89fb2638c635c41c5e93d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0d12a454b3463c55541ffa21937cda423517d31435683bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0
d12a454b3463c55541ffa21937cda423517d31435683bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8qmgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:21Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:22 crc kubenswrapper[4732]: I0402 13:40:22.679740 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:40:22 crc kubenswrapper[4732]: E0402 13:40:22.679857 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-crx2z" podUID="386bd92b-c67e-4cc6-8a47-6f8d6e799bc7" Apr 02 13:40:23 crc kubenswrapper[4732]: I0402 13:40:23.679199 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:40:23 crc kubenswrapper[4732]: E0402 13:40:23.679437 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Apr 02 13:40:23 crc kubenswrapper[4732]: I0402 13:40:23.679230 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 13:40:23 crc kubenswrapper[4732]: E0402 13:40:23.679576 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Apr 02 13:40:23 crc kubenswrapper[4732]: I0402 13:40:23.679199 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:40:23 crc kubenswrapper[4732]: E0402 13:40:23.679747 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Apr 02 13:40:24 crc kubenswrapper[4732]: I0402 13:40:24.679728 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:40:24 crc kubenswrapper[4732]: E0402 13:40:24.679873 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-crx2z" podUID="386bd92b-c67e-4cc6-8a47-6f8d6e799bc7" Apr 02 13:40:24 crc kubenswrapper[4732]: I0402 13:40:24.680739 4732 scope.go:117] "RemoveContainer" containerID="5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a" Apr 02 13:40:24 crc kubenswrapper[4732]: E0402 13:40:24.680972 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Apr 02 13:40:24 crc kubenswrapper[4732]: I0402 13:40:24.702018 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1efc9e3-71c0-47e0-bcf8-4ec41e8e25af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf50344e79adec991160632c244c40543c468db050f176949b5840a16d1777a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e000359211728ef1d437757b0246d85c224dae170131ba8bd221fa84dafa026c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf6191436462297532ffdc93e43468bbe37e850115ddcbcd0265c07359ac845a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-04-02T13:39:23Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0402 13:39:23.616086 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0402 13:39:23.616554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0402 13:39:23.618421 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-69827246/tls.crt::/tmp/serving-cert-69827246/tls.key\\\\\\\"\\\\nI0402 13:39:23.816116 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0402 13:39:23.819962 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0402 13:39:23.820008 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0402 13:39:23.820046 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0402 13:39:23.820061 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0402 13:39:23.827938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0402 13:39:23.827970 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827984 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0402 13:39:23.827988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0402 13:39:23.828005 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0402 13:39:23.828009 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0402 13:39:23.828004 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0402 13:39:23.830934 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b46da8b2cda4252f897adfca92275c92924a998d5b1ad57e6897ddea29b4571\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:24Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:24 crc kubenswrapper[4732]: I0402 13:40:24.724275 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://960aab148ced3ecc2648e0b866cb9330dc955e8ab123f3afb95d377e3efa5a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:24Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:24 crc kubenswrapper[4732]: I0402 13:40:24.748518 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38409e5e-4545-49da-8f6c-4bfb30582878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8b639c60ac006909c664208b4a5aa24df48983728ace3f6722514164e330a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09dd989931f88ecf396194a22057cfe8c07809db42ff1c2b861bf3aa5db1673f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6vtmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:24Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:24 crc kubenswrapper[4732]: I0402 13:40:24.767711 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s52gj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad206957-df5c-4b3e-bd35-e798a07d2f4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59880ea9acc1fa69e9974f404371d2876c60b9ad12942fb9c6a0aa01b0632050\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169383eed1546547fe6f683a3607c929733ee7049497b740db106a44ce7d1c64\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-04-02T13:40:20Z\\\",\\\"message\\\":\\\"2026-04-02T13:39:33+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f3c911cc-3398-46ec-ae1b-12065fcdb02c\\\\n2026-04-02T13:39:33+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f3c911cc-3398-46ec-ae1b-12065fcdb02c to /host/opt/cni/bin/\\\\n2026-04-02T13:39:34Z [verbose] multus-daemon started\\\\n2026-04-02T13:39:34Z [verbose] 
Readiness Indicator file check\\\\n2026-04-02T13:40:19Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:40:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dssdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s52gj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:24Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:24 crc kubenswrapper[4732]: I0402 13:40:24.793035 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6f5e483-7d6b-4d6d-be84-303d8f07643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde53c6f70593ab376e01989eadf8b52b7010c91ca4be564e5e36a781f1333f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://975b73da57fe61d6945d87d2a8cb9502c67a6d8fe43a80722342271e2a251f39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f988d8b8bd1da89ec27da36eafa964c8d1b06874110b4791aeadc05cb297b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09f4ea085c7a422d9d9378925b89d18cd60661c399eb45f69d1e2d39c3aa2191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39950e10da9295ee76088f08ae9d4977f5ff9f8d8440188f777855f40fb8049e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0b635201003f7f1a0fc544667e8376b4283c202dde45398518f99b96ce2c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d7701a1317a8ea04ada7ef0746fa41d271b302277504bd0ffdebd0b14db5a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d7701a1317a8ea04ada7ef0746fa41d271b302277504bd0ffdebd0b14db5a04\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-04-02T13:40:01Z\\\",\\\"message\\\":\\\"registry/node-ca-glqvf\\\\nI0402 13:40:01.774154 7045 obj_retry.go:365] Adding new 
object: *v1.Pod openshift-image-registry/node-ca-glqvf\\\\nI0402 13:40:01.774162 7045 ovn.go:134] Ensuring zone local for Pod openshift-image-registry/node-ca-glqvf in node crc\\\\nI0402 13:40:01.774167 7045 obj_retry.go:386] Retry successful for *v1.Pod openshift-image-registry/node-ca-glqvf after 0 failed attempt(s)\\\\nF0402 13:40:01.774184 7045 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:01Z is after 2025-08-24T17:21:41Z]\\\\nI0402 13:40:01.774196 7045 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-8q\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:40:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8qmgp_openshift-ovn-kubernetes(a6f5e483-7d6b-4d6d-be84-303d8f07643e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b9f2f9d419baeac3be35c4c0c40c012b721c99c2f89fb2638c635c41c5e93d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0d12a454b3463c55541ffa21937cda423517d31435683bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0
d12a454b3463c55541ffa21937cda423517d31435683bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8qmgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:24Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:24 crc kubenswrapper[4732]: I0402 13:40:24.813030 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b2l5q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7185760-c057-4c47-8da2-60572500a472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76490beb8e54fc846562aaf130168383c45dfe45e9582e3d87c36c38a69b8fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bec3eb12ab864fa88c229cc24bf8ab853d0d
c249a18430163a45fbe02e09c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-b2l5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:24Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:24 crc kubenswrapper[4732]: I0402 13:40:24.838528 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ce5c3a8-e8a4-4074-96da-cfc340c2873f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e597b0db2773d2bb9d7e673759dee264ef036a174be679e2ccfa8ba03cf5c6c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8320a4ebc7edb8ed6702aa82a986ae487f9903f3f52ca9e7388135c88a1a8dea\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-04-02T13:38:34Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0402 13:38:04.591905 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0402 13:38:04.592587 1 observer_polling.go:159] Starting file observer\\\\nI0402 13:38:04.593304 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0402 13:38:04.593855 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0402 13:38:33.852475 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0402 13:38:34.195764 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0402 13:38:34.195891 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:38:04Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:38:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef96f17edb5e968b50bab42d6ddcc7c4ffc0fae1b3c8bcdebc6150e18977740e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b40621c4cae0af5b1de321842441756156529e8dcb1aff14d7f5dc7db637127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5272db3013bda1d5fcb66bd2669d7714381d3bd39802499b4a27873cbccd6ff9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:24Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:24 crc kubenswrapper[4732]: I0402 13:40:24.858475 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-crx2z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"386bd92b-c67e-4cc6-8a47-6f8d6e799bc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-crx2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:24Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:24 crc 
kubenswrapper[4732]: I0402 13:40:24.879291 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:24Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:24 crc kubenswrapper[4732]: I0402 13:40:24.892895 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e388cf7666048a2a06ac7572afee8a8a73493dc8b315897b9118b66dfe72b69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-04-02T13:40:24Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:24 crc kubenswrapper[4732]: I0402 13:40:24.905411 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tg9vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fcb1965-4cef-41a4-8894-3eb24e0ff80c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e38eea7442762874b0db44fa8fcd1ffc4cbe0d149d992139a1fc0bc92dc386d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-w9vgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tg9vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:24Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:24 crc kubenswrapper[4732]: I0402 13:40:24.919977 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"afb9093a-f17e-45d5-887a-6b693bbae3f4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d431bea9c2c5dcec52d15f7c9d0806d50d74ece8f3ba59562a6f3fd59e50a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ca33771abc8b4070c51fd6fcb9a5f88bf7097b87e6762d829b293bb628588a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ca33771abc8b4070c51fd6fcb9a5f88bf7097b87e6762d829b293bb628588a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:24Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:24 crc kubenswrapper[4732]: I0402 13:40:24.939422 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:24Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:24 crc kubenswrapper[4732]: E0402 13:40:24.945012 4732 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Apr 02 13:40:24 crc kubenswrapper[4732]: I0402 13:40:24.959195 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37b84e7e2a23e3e100ff5bc979d4126bb3044a27f802ccc21c171516554dee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abfadcbcfff56cc9a6bf0c4068f03e943dea5db96705cd3fba2aea8cbd39111\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:24Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:24 crc kubenswrapper[4732]: I0402 13:40:24.972241 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-glqvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57616858-4140-40f0-83e5-388787b685b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af55257a6baf7f9bcba13eb0d7aae49bd0671f182aeb11018802d4712df0ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqn9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-glqvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:24Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:24 crc kubenswrapper[4732]: I0402 13:40:24.992951 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nqhwm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658a9576-efdc-4e4b-937e-bd63032cbee6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b89438bcc57ca4588c00c0e9024fd7423eb6839c4c435c5c9bfca56d0fb1c33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de69c4d183706f6729301a7859d6c312fdedb379c0c9b4aac3d4061f89b82067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de69c4d183706f6729301a7859d6c312fdedb379c0c9b4aac3d4061f89b82067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d68d8de71d99
082a6cf12e05e279f97cf4b0eccd9defa756ac083fcb2c412945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d68d8de71d99082a6cf12e05e279f97cf4b0eccd9defa756ac083fcb2c412945\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d1db75c9d54c6d8d304e28faa91270a4d70096596100fb97d5a6ea74f7ad8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19d1db75c9d5
4c6d8d304e28faa91270a4d70096596100fb97d5a6ea74f7ad8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbddc88e12f323552ea813b6440e7770a844e187f892c11778a4a82241db38ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbddc88e12f323552ea813b6440e7770a844e187f892c11778a4a82241db38ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9df3b256d9a633666a3195e4fd0c9f409cfeb941932f46356e1a8e2e98d706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b9df3b256d9a633666a3195e4fd0c9f409cfeb941932f46356e1a8e2e98d706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b27d70a0f82b84c5ed47e2f3287c0faf9c449ed235ec78ef71fe769485f3e67f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b27d70a0f82b84c5ed47e2f3287c0faf9c449ed235ec78ef71fe769485f3e67f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"202
6-04-02T13:39:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nqhwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:24Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:25 crc kubenswrapper[4732]: I0402 13:40:25.010186 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:25Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:25 crc kubenswrapper[4732]: I0402 13:40:25.679732 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:40:25 crc kubenswrapper[4732]: I0402 13:40:25.679771 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:40:25 crc kubenswrapper[4732]: E0402 13:40:25.681790 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Apr 02 13:40:25 crc kubenswrapper[4732]: I0402 13:40:25.679823 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 13:40:25 crc kubenswrapper[4732]: E0402 13:40:25.681899 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Apr 02 13:40:25 crc kubenswrapper[4732]: E0402 13:40:25.682166 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Apr 02 13:40:26 crc kubenswrapper[4732]: I0402 13:40:26.680226 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:40:26 crc kubenswrapper[4732]: E0402 13:40:26.680413 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-crx2z" podUID="386bd92b-c67e-4cc6-8a47-6f8d6e799bc7" Apr 02 13:40:26 crc kubenswrapper[4732]: I0402 13:40:26.860563 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:40:26 crc kubenswrapper[4732]: I0402 13:40:26.860670 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:40:26 crc kubenswrapper[4732]: I0402 13:40:26.860704 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:40:26 crc kubenswrapper[4732]: I0402 13:40:26.860732 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 02 13:40:26 crc kubenswrapper[4732]: I0402 13:40:26.860752 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-02T13:40:26Z","lastTransitionTime":"2026-04-02T13:40:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 02 13:40:26 crc kubenswrapper[4732]: E0402 13:40:26.884023 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69443537-6792-4af0-92ca-9d3f256e3009\\\",\\\"systemUUID\\\":\\\"3be5867b-5df6-4c65-8d4b-c54c471927ff\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:26Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:26 crc kubenswrapper[4732]: I0402 13:40:26.890447 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:40:26 crc kubenswrapper[4732]: I0402 13:40:26.890504 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:40:26 crc kubenswrapper[4732]: I0402 13:40:26.890521 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:40:26 crc kubenswrapper[4732]: I0402 13:40:26.890545 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 02 13:40:26 crc kubenswrapper[4732]: I0402 13:40:26.890589 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-02T13:40:26Z","lastTransitionTime":"2026-04-02T13:40:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 02 13:40:26 crc kubenswrapper[4732]: E0402 13:40:26.914074 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69443537-6792-4af0-92ca-9d3f256e3009\\\",\\\"systemUUID\\\":\\\"3be5867b-5df6-4c65-8d4b-c54c471927ff\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:26Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:26 crc kubenswrapper[4732]: I0402 13:40:26.921785 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:40:26 crc kubenswrapper[4732]: I0402 13:40:26.921837 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:40:26 crc kubenswrapper[4732]: I0402 13:40:26.921854 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:40:26 crc kubenswrapper[4732]: I0402 13:40:26.921878 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 02 13:40:26 crc kubenswrapper[4732]: I0402 13:40:26.921897 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-02T13:40:26Z","lastTransitionTime":"2026-04-02T13:40:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 02 13:40:26 crc kubenswrapper[4732]: E0402 13:40:26.945053 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69443537-6792-4af0-92ca-9d3f256e3009\\\",\\\"systemUUID\\\":\\\"3be5867b-5df6-4c65-8d4b-c54c471927ff\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:26Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:26 crc kubenswrapper[4732]: I0402 13:40:26.951605 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:40:26 crc kubenswrapper[4732]: I0402 13:40:26.951738 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:40:26 crc kubenswrapper[4732]: I0402 13:40:26.951763 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:40:26 crc kubenswrapper[4732]: I0402 13:40:26.951794 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 02 13:40:26 crc kubenswrapper[4732]: I0402 13:40:26.951817 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-02T13:40:26Z","lastTransitionTime":"2026-04-02T13:40:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 02 13:40:26 crc kubenswrapper[4732]: E0402 13:40:26.974683 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69443537-6792-4af0-92ca-9d3f256e3009\\\",\\\"systemUUID\\\":\\\"3be5867b-5df6-4c65-8d4b-c54c471927ff\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:26Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:26 crc kubenswrapper[4732]: I0402 13:40:26.980676 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:40:26 crc kubenswrapper[4732]: I0402 13:40:26.980738 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:40:26 crc kubenswrapper[4732]: I0402 13:40:26.980759 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:40:26 crc kubenswrapper[4732]: I0402 13:40:26.980789 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 02 13:40:26 crc kubenswrapper[4732]: I0402 13:40:26.980810 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-02T13:40:26Z","lastTransitionTime":"2026-04-02T13:40:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 02 13:40:26 crc kubenswrapper[4732]: E0402 13:40:26.998986 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69443537-6792-4af0-92ca-9d3f256e3009\\\",\\\"systemUUID\\\":\\\"3be5867b-5df6-4c65-8d4b-c54c471927ff\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:26Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:26 crc kubenswrapper[4732]: E0402 13:40:26.999139 4732 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Apr 02 13:40:27 crc kubenswrapper[4732]: I0402 13:40:27.679884 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 13:40:27 crc kubenswrapper[4732]: I0402 13:40:27.679914 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:40:27 crc kubenswrapper[4732]: I0402 13:40:27.679954 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:40:27 crc kubenswrapper[4732]: E0402 13:40:27.680038 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Apr 02 13:40:27 crc kubenswrapper[4732]: E0402 13:40:27.680113 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Apr 02 13:40:27 crc kubenswrapper[4732]: E0402 13:40:27.680177 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Apr 02 13:40:28 crc kubenswrapper[4732]: I0402 13:40:28.680279 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:40:28 crc kubenswrapper[4732]: E0402 13:40:28.680438 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-crx2z" podUID="386bd92b-c67e-4cc6-8a47-6f8d6e799bc7" Apr 02 13:40:29 crc kubenswrapper[4732]: I0402 13:40:29.679969 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:40:29 crc kubenswrapper[4732]: E0402 13:40:29.680412 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Apr 02 13:40:29 crc kubenswrapper[4732]: I0402 13:40:29.680306 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 13:40:29 crc kubenswrapper[4732]: E0402 13:40:29.681101 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Apr 02 13:40:29 crc kubenswrapper[4732]: I0402 13:40:29.680284 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:40:29 crc kubenswrapper[4732]: E0402 13:40:29.681452 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Apr 02 13:40:29 crc kubenswrapper[4732]: E0402 13:40:29.946228 4732 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Apr 02 13:40:30 crc kubenswrapper[4732]: I0402 13:40:30.679938 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:40:30 crc kubenswrapper[4732]: E0402 13:40:30.680095 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-crx2z" podUID="386bd92b-c67e-4cc6-8a47-6f8d6e799bc7" Apr 02 13:40:31 crc kubenswrapper[4732]: I0402 13:40:31.679403 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:40:31 crc kubenswrapper[4732]: I0402 13:40:31.679481 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:40:31 crc kubenswrapper[4732]: I0402 13:40:31.679425 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 13:40:31 crc kubenswrapper[4732]: E0402 13:40:31.679654 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Apr 02 13:40:31 crc kubenswrapper[4732]: E0402 13:40:31.680098 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Apr 02 13:40:31 crc kubenswrapper[4732]: E0402 13:40:31.680223 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Apr 02 13:40:31 crc kubenswrapper[4732]: I0402 13:40:31.680817 4732 scope.go:117] "RemoveContainer" containerID="1d7701a1317a8ea04ada7ef0746fa41d271b302277504bd0ffdebd0b14db5a04" Apr 02 13:40:32 crc kubenswrapper[4732]: I0402 13:40:32.679577 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:40:32 crc kubenswrapper[4732]: E0402 13:40:32.679787 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-crx2z" podUID="386bd92b-c67e-4cc6-8a47-6f8d6e799bc7" Apr 02 13:40:33 crc kubenswrapper[4732]: I0402 13:40:33.666537 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8qmgp_a6f5e483-7d6b-4d6d-be84-303d8f07643e/ovnkube-controller/2.log" Apr 02 13:40:33 crc kubenswrapper[4732]: I0402 13:40:33.669850 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" event={"ID":"a6f5e483-7d6b-4d6d-be84-303d8f07643e","Type":"ContainerStarted","Data":"8438ae7262b1ff33b5168b83661bede090ebb04853882f438b8fad8650d131d1"} Apr 02 13:40:33 crc kubenswrapper[4732]: I0402 13:40:33.680212 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:40:33 crc kubenswrapper[4732]: I0402 13:40:33.680290 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 13:40:33 crc kubenswrapper[4732]: I0402 13:40:33.680250 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:40:33 crc kubenswrapper[4732]: E0402 13:40:33.680439 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Apr 02 13:40:33 crc kubenswrapper[4732]: E0402 13:40:33.680579 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Apr 02 13:40:33 crc kubenswrapper[4732]: E0402 13:40:33.680758 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Apr 02 13:40:33 crc kubenswrapper[4732]: I0402 13:40:33.706731 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Apr 02 13:40:34 crc kubenswrapper[4732]: I0402 13:40:34.695867 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:40:34 crc kubenswrapper[4732]: E0402 13:40:34.696028 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-crx2z" podUID="386bd92b-c67e-4cc6-8a47-6f8d6e799bc7" Apr 02 13:40:34 crc kubenswrapper[4732]: I0402 13:40:34.722754 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6f5e483-7d6b-4d6d-be84-303d8f07643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde53c6f70593ab376e01989eadf8b52b7010c91ca4be564e5e36a781f1333f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://975b73da57fe61d6945d87d2a8cb9502c67a6d8fe43a80722342271e2a251f39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f988d8b8bd1da89ec27da36eafa964c8d1b06874110b4791aeadc05cb297b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09f4ea085c7a422d9d9378925b89d18cd60661c399eb45f69d1e2d39c3aa2191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39950e10da9295ee76088f08ae9d4977f5ff9f8d8440188f777855f40fb8049e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0b635201003f7f1a0fc544667e8376b4283c202dde45398518f99b96ce2c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d7701a1317a8ea04ada7ef0746fa41d271b302277504bd0ffdebd0b14db5a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d7701a1317a8ea04ada7ef0746fa41d271b302277504bd0ffdebd0b14db5a04\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-04-02T13:40:01Z\\\",\\\"message\\\":\\\"registry/node-ca-glqvf\\\\nI0402 13:40:01.774154 7045 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-glqvf\\\\nI0402 13:40:01.774162 7045 ovn.go:134] Ensuring zone local for Pod openshift-image-registry/node-ca-glqvf in node crc\\\\nI0402 13:40:01.774167 7045 obj_retry.go:386] Retry successful for *v1.Pod 
openshift-image-registry/node-ca-glqvf after 0 failed attempt(s)\\\\nF0402 13:40:01.774184 7045 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:01Z is after 2025-08-24T17:21:41Z]\\\\nI0402 13:40:01.774196 7045 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-8q\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:40:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8qmgp_openshift-ovn-kubernetes(a6f5e483-7d6b-4d6d-be84-303d8f07643e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b9f2f9d419baeac3be35c4c0c40c012b721c99c2f89fb2638c635c41c5e93d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0d12a454b3463c55541ffa21937cda423517d31435683bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0
d12a454b3463c55541ffa21937cda423517d31435683bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8qmgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:34Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:34 crc kubenswrapper[4732]: I0402 13:40:34.742476 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b2l5q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7185760-c057-4c47-8da2-60572500a472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76490beb8e54fc846562aaf130168383c45dfe45e9582e3d87c36c38a69b8fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bec3eb12ab864fa88c229cc24bf8ab853d0d
c249a18430163a45fbe02e09c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-b2l5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:34Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:34 crc kubenswrapper[4732]: I0402 13:40:34.761078 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1efc9e3-71c0-47e0-bcf8-4ec41e8e25af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf50344e79adec991160632c244c40543c468db050f176949b5840a16d1777a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e000359211728ef1d437757b0246d85c224dae170131ba8bd221fa84dafa026c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf6191436462297532ffdc93e43468bbe37e850115ddcbcd0265c07359ac845a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-04-02T13:39:23Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0402 13:39:23.616086 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0402 13:39:23.616554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0402 13:39:23.618421 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-69827246/tls.crt::/tmp/serving-cert-69827246/tls.key\\\\\\\"\\\\nI0402 13:39:23.816116 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0402 13:39:23.819962 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0402 13:39:23.820008 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0402 13:39:23.820046 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0402 13:39:23.820061 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0402 13:39:23.827938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0402 13:39:23.827970 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827984 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0402 13:39:23.827988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0402 13:39:23.828005 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0402 13:39:23.828009 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0402 13:39:23.828004 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0402 13:39:23.830934 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b46da8b2cda4252f897adfca92275c92924a998d5b1ad57e6897ddea29b4571\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:34Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:34 crc kubenswrapper[4732]: I0402 13:40:34.772924 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://960aab148ced3ecc2648e0b866cb9330dc955e8ab123f3afb95d377e3efa5a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:34Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:34 crc kubenswrapper[4732]: I0402 13:40:34.782599 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38409e5e-4545-49da-8f6c-4bfb30582878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8b639c60ac006909c664208b4a5aa24df48983728ace3f6722514164e330a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09dd989931f88ecf396194a22057cfe8c07809db42ff1c2b861bf3aa5db1673f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6vtmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:34Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:34 crc kubenswrapper[4732]: I0402 13:40:34.793863 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s52gj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad206957-df5c-4b3e-bd35-e798a07d2f4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59880ea9acc1fa69e9974f404371d2876c60b9ad12942fb9c6a0aa01b0632050\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169383eed1546547fe6f683a3607c929733ee7049497b740db106a44ce7d1c64\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-04-02T13:40:20Z\\\",\\\"message\\\":\\\"2026-04-02T13:39:33+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f3c911cc-3398-46ec-ae1b-12065fcdb02c\\\\n2026-04-02T13:39:33+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f3c911cc-3398-46ec-ae1b-12065fcdb02c to /host/opt/cni/bin/\\\\n2026-04-02T13:39:34Z [verbose] multus-daemon started\\\\n2026-04-02T13:39:34Z [verbose] 
Readiness Indicator file check\\\\n2026-04-02T13:40:19Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:40:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dssdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s52gj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:34Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:34 crc kubenswrapper[4732]: I0402 13:40:34.804805 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e388cf7666048a2a06ac7572afee8a8a73493dc8b315897b9118b66dfe72b69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512
335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:34Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:34 crc kubenswrapper[4732]: I0402 13:40:34.813962 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tg9vx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fcb1965-4cef-41a4-8894-3eb24e0ff80c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e38eea7442762874b0db44fa8fcd1ffc4cbe0d149d992139a1fc0bc92dc386d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9vgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tg9vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:34Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:34 crc kubenswrapper[4732]: I0402 13:40:34.830874 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ce5c3a8-e8a4-4074-96da-cfc340c2873f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e597b0db2773d2bb9d7e673759dee264ef036a174be679e2ccfa8ba03cf5c6c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"c
ontainerID\\\":\\\"cri-o://8320a4ebc7edb8ed6702aa82a986ae487f9903f3f52ca9e7388135c88a1a8dea\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-04-02T13:38:34Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0402 13:38:04.591905 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0402 13:38:04.592587 1 observer_polling.go:159] Starting file observer\\\\nI0402 13:38:04.593304 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0402 13:38:04.593855 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0402 13:38:33.852475 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0402 13:38:34.195764 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0402 13:38:34.195891 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:38:04Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:38:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef96f17edb5e968b50bab42d6ddcc7c4ffc0fae1b3c8bcdebc6150e18977740e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b40621c4cae0af5b1de321842441756156529e8dcb1aff14d7f5dc7db637127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5272db3013bda1d5fcb66bd2669d7714381d3bd39802499b4a27873cbccd6ff9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:34Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:34 crc kubenswrapper[4732]: I0402 13:40:34.851795 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1718331-906f-4303-bb47-e0146bc821fb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://934b0c14e04d9b1d043cf692c8195d6f093d1e40f0e5873dc489354895244800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cf70a7e805d515372a0e5a88d43a670c0bcfbb5bed48de6518be6c42255e4a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc4b20f8f4f165cef3181c09e1a50a16a2b09f86e0b91c15c9bba453c7612ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d79199d36c34a73bc1959b79e0ce4c75b0ca34217c84040b93de082d60ccbfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd7aa0f0f63ba0d0047b0ce4d9ce045e2647c090d5e27474d4eca5c40b97045\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://890c62e37bef5a99941b91022b6cb3fb23e9a9910a937f8ba205a806278fa864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://890c62e37bef5a99941b91022b6cb3fb23e9a9910a937f8ba205a806278fa864\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-04-02T13:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a38338477586597973eafe96225e3fd49deba97373d4d0eda3f7bce9f78dd5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38338477586597973eafe96225e3fd49deba97373d4d0eda3f7bce9f78dd5b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8499bb6d75a647adea985e455603f74d25d327c6738adf2acf5e621c77247c48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8499bb6d75a647adea985e455603f74d25d327c6738adf2acf5e621c77247c48\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:34Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:34 crc kubenswrapper[4732]: I0402 13:40:34.863264 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-crx2z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"386bd92b-c67e-4cc6-8a47-6f8d6e799bc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-crx2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:34Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:34 crc 
kubenswrapper[4732]: I0402 13:40:34.875510 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:34Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:34 crc kubenswrapper[4732]: I0402 13:40:34.891503 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nqhwm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"658a9576-efdc-4e4b-937e-bd63032cbee6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b89438bcc57ca4588c00c0e9024fd7423eb6839c4c435c5c9bfca56d0fb1c33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de69c4d183706f6729301a7859d6c312fdedb379c0c9b4aac3d4061f89b82067\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de69c4d183706f6729301a7859d6c312fdedb379c0c9b4aac3d4061f89b82067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d68d8de71d99082a6cf12e05e279f97cf4b0eccd9defa756ac083fcb2c412945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d68d8de71d99082a6cf12e05e279f97cf4b0eccd9defa756ac083fcb2c412945\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:33Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d1db75c9d54c6d8d304e28faa91270a4d70096596100fb97d5a6ea74f7ad8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19d1db75c9d54c6d8d304e28faa91270a4d70096596100fb97d5a6ea74f7ad8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbddc
88e12f323552ea813b6440e7770a844e187f892c11778a4a82241db38ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbddc88e12f323552ea813b6440e7770a844e187f892c11778a4a82241db38ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9df3b256d9a633666a3195e4fd0c9f409cfeb941932f46356e1a8e2e98d706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b9df3b256d9a633666a3195e4fd0c9f409cfeb941932f46356e1a8e2e98d706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:42Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b27d70a0f82b84c5ed47e2f3287c0faf9c449ed235ec78ef71fe769485f3e67f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b27d70a0f82b84c5ed47e2f3287c0faf9c449ed235ec78ef71fe769485f3e67f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nqhwm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:34Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:34 crc kubenswrapper[4732]: I0402 13:40:34.902444 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:34Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:34 crc kubenswrapper[4732]: I0402 13:40:34.911814 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"afb9093a-f17e-45d5-887a-6b693bbae3f4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d431bea9c2c5dcec52d15f7c9d0806d50d74ece8f3ba59562a6f3fd59e50a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ca33771abc8b4070c51fd6fcb9a5f88bf7097b87e6762d829b293bb628588a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ca33771abc8b4070c51fd6fcb9a5f88bf7097b87e6762d829b293bb628588a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:34Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:34 crc kubenswrapper[4732]: I0402 13:40:34.924176 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:34Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:34 crc kubenswrapper[4732]: I0402 13:40:34.934871 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37b84e7e2a23e3e100ff5bc979d4126bb3044a27f802ccc21c171516554dee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abfadcbcfff56cc9a6bf0c4068f03e943dea5db96705cd3fba2aea8cbd39111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:34Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:34 crc kubenswrapper[4732]: I0402 13:40:34.944362 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-glqvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57616858-4140-40f0-83e5-388787b685b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af55257a6baf7f9bcba13eb0d7aae49bd0671f182aeb11018802d4712df0ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqn9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-glqvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:34Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:34 crc kubenswrapper[4732]: E0402 13:40:34.947033 4732 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Apr 02 13:40:35 crc kubenswrapper[4732]: I0402 13:40:35.528919 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 02 13:40:35 crc kubenswrapper[4732]: E0402 13:40:35.529129 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-02 13:41:39.529091579 +0000 UTC m=+256.433499172 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:40:35 crc kubenswrapper[4732]: I0402 13:40:35.529543 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:40:35 crc kubenswrapper[4732]: I0402 13:40:35.529655 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:40:35 crc kubenswrapper[4732]: E0402 13:40:35.529738 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 02 13:40:35 crc kubenswrapper[4732]: E0402 13:40:35.529766 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 02 13:40:35 crc kubenswrapper[4732]: E0402 13:40:35.529786 4732 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 02 13:40:35 crc kubenswrapper[4732]: E0402 13:40:35.529839 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-04-02 13:41:39.529821658 +0000 UTC m=+256.434229211 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 02 13:40:35 crc kubenswrapper[4732]: E0402 13:40:35.529919 4732 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Apr 02 13:40:35 crc kubenswrapper[4732]: I0402 13:40:35.529751 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:40:35 crc kubenswrapper[4732]: E0402 13:40:35.530094 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-04-02 13:41:39.530055864 +0000 UTC m=+256.434463547 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Apr 02 13:40:35 crc kubenswrapper[4732]: E0402 13:40:35.529933 4732 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Apr 02 13:40:35 crc kubenswrapper[4732]: E0402 13:40:35.530183 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-04-02 13:41:39.530168967 +0000 UTC m=+256.434576540 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Apr 02 13:40:35 crc kubenswrapper[4732]: I0402 13:40:35.530261 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 13:40:35 crc kubenswrapper[4732]: E0402 13:40:35.530400 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 02 13:40:35 crc kubenswrapper[4732]: E0402 13:40:35.530483 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 02 13:40:35 crc kubenswrapper[4732]: E0402 13:40:35.530546 4732 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 02 13:40:35 crc kubenswrapper[4732]: E0402 13:40:35.530648 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-04-02 13:41:39.530638789 +0000 UTC m=+256.435046342 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 02 13:40:35 crc kubenswrapper[4732]: I0402 13:40:35.631443 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/386bd92b-c67e-4cc6-8a47-6f8d6e799bc7-metrics-certs\") pod \"network-metrics-daemon-crx2z\" (UID: \"386bd92b-c67e-4cc6-8a47-6f8d6e799bc7\") " pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:40:35 crc kubenswrapper[4732]: E0402 13:40:35.631714 4732 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 02 13:40:35 crc kubenswrapper[4732]: E0402 13:40:35.631792 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/386bd92b-c67e-4cc6-8a47-6f8d6e799bc7-metrics-certs podName:386bd92b-c67e-4cc6-8a47-6f8d6e799bc7 nodeName:}" failed. No retries permitted until 2026-04-02 13:41:39.631774622 +0000 UTC m=+256.536182175 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/386bd92b-c67e-4cc6-8a47-6f8d6e799bc7-metrics-certs") pod "network-metrics-daemon-crx2z" (UID: "386bd92b-c67e-4cc6-8a47-6f8d6e799bc7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 02 13:40:35 crc kubenswrapper[4732]: I0402 13:40:35.680252 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:40:35 crc kubenswrapper[4732]: I0402 13:40:35.680256 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 13:40:35 crc kubenswrapper[4732]: I0402 13:40:35.680364 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:40:35 crc kubenswrapper[4732]: E0402 13:40:35.680512 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Apr 02 13:40:35 crc kubenswrapper[4732]: E0402 13:40:35.680752 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Apr 02 13:40:35 crc kubenswrapper[4732]: E0402 13:40:35.680819 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Apr 02 13:40:35 crc kubenswrapper[4732]: I0402 13:40:35.707574 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8qmgp_a6f5e483-7d6b-4d6d-be84-303d8f07643e/ovnkube-controller/3.log" Apr 02 13:40:35 crc kubenswrapper[4732]: I0402 13:40:35.708791 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8qmgp_a6f5e483-7d6b-4d6d-be84-303d8f07643e/ovnkube-controller/2.log" Apr 02 13:40:35 crc kubenswrapper[4732]: I0402 13:40:35.711380 4732 generic.go:334] "Generic (PLEG): container finished" podID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" containerID="8438ae7262b1ff33b5168b83661bede090ebb04853882f438b8fad8650d131d1" exitCode=1 Apr 02 13:40:35 crc kubenswrapper[4732]: I0402 13:40:35.711725 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" event={"ID":"a6f5e483-7d6b-4d6d-be84-303d8f07643e","Type":"ContainerDied","Data":"8438ae7262b1ff33b5168b83661bede090ebb04853882f438b8fad8650d131d1"} Apr 02 13:40:35 crc kubenswrapper[4732]: I0402 13:40:35.711979 4732 scope.go:117] "RemoveContainer" containerID="1d7701a1317a8ea04ada7ef0746fa41d271b302277504bd0ffdebd0b14db5a04" Apr 02 13:40:35 crc kubenswrapper[4732]: I0402 13:40:35.712424 4732 scope.go:117] "RemoveContainer" containerID="8438ae7262b1ff33b5168b83661bede090ebb04853882f438b8fad8650d131d1" Apr 02 13:40:35 crc kubenswrapper[4732]: E0402 13:40:35.712670 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-8qmgp_openshift-ovn-kubernetes(a6f5e483-7d6b-4d6d-be84-303d8f07643e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" podUID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" Apr 02 13:40:35 crc 
kubenswrapper[4732]: I0402 13:40:35.725465 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e388cf7666048a2a06ac7572afee8a8a73493dc8b315897b9118b66dfe72b69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:35Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:35 crc kubenswrapper[4732]: I0402 13:40:35.739161 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tg9vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fcb1965-4cef-41a4-8894-3eb24e0ff80c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e38eea7442762874b0db44fa8fcd1ffc4cbe0d149d992139a1fc0bc92dc386d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
6-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9vgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tg9vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:35Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:35 crc kubenswrapper[4732]: I0402 13:40:35.756476 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ce5c3a8-e8a4-4074-96da-cfc340c2873f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e597b0db2773d2bb9d7e673759dee264ef036a174be679e2ccfa8ba03cf5c6c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8320a4ebc7edb8ed6702aa82a986ae487f9903f3f52ca9e7388135c88a1a8dea\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-04-02T13:38:34Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0402 13:38:04.591905 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0402 13:38:04.592587 1 observer_polling.go:159] Starting file observer\\\\nI0402 13:38:04.593304 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0402 13:38:04.593855 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0402 13:38:33.852475 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0402 13:38:34.195764 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0402 13:38:34.195891 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:38:04Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:38:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef96f17edb5e968b50bab42d6ddcc7c4ffc0fae1b3c8bcdebc6150e18977740e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b40621c4cae0af5b1de321842441756156529e8dcb1aff14d7f5dc7db637127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5272db3013bda1d5fcb66bd2669d7714381d3bd39802499b4a27873cbccd6ff9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:35Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:35 crc kubenswrapper[4732]: I0402 13:40:35.775975 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1718331-906f-4303-bb47-e0146bc821fb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://934b0c14e04d9b1d043cf692c8195d6f093d1e40f0e5873dc489354895244800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cf70a7e805d515372a0e5a88d43a670c0bcfbb5bed48de6518be6c42255e4a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc4b20f8f4f165cef3181c09e1a50a16a2b09f86e0b91c15c9bba453c7612ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d79199d36c34a73bc1959b79e0ce4c75b0ca34217c84040b93de082d60ccbfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd7aa0f0f63ba0d0047b0ce4d9ce045e2647c090d5e27474d4eca5c40b97045\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://890c62e37bef5a99941b91022b6cb3fb23e9a9910a937f8ba205a806278fa864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://890c62e37bef5a99941b91022b6cb3fb23e9a9910a937f8ba205a806278fa864\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-04-02T13:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a38338477586597973eafe96225e3fd49deba97373d4d0eda3f7bce9f78dd5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38338477586597973eafe96225e3fd49deba97373d4d0eda3f7bce9f78dd5b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8499bb6d75a647adea985e455603f74d25d327c6738adf2acf5e621c77247c48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8499bb6d75a647adea985e455603f74d25d327c6738adf2acf5e621c77247c48\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:35Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:35 crc kubenswrapper[4732]: I0402 13:40:35.786768 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-crx2z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"386bd92b-c67e-4cc6-8a47-6f8d6e799bc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-crx2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:35Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:35 crc 
kubenswrapper[4732]: I0402 13:40:35.798790 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:35Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:35 crc kubenswrapper[4732]: I0402 13:40:35.811265 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nqhwm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"658a9576-efdc-4e4b-937e-bd63032cbee6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b89438bcc57ca4588c00c0e9024fd7423eb6839c4c435c5c9bfca56d0fb1c33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de69c4d183706f6729301a7859d6c312fdedb379c0c9b4aac3d4061f89b82067\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de69c4d183706f6729301a7859d6c312fdedb379c0c9b4aac3d4061f89b82067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d68d8de71d99082a6cf12e05e279f97cf4b0eccd9defa756ac083fcb2c412945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d68d8de71d99082a6cf12e05e279f97cf4b0eccd9defa756ac083fcb2c412945\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:33Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d1db75c9d54c6d8d304e28faa91270a4d70096596100fb97d5a6ea74f7ad8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19d1db75c9d54c6d8d304e28faa91270a4d70096596100fb97d5a6ea74f7ad8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbddc
88e12f323552ea813b6440e7770a844e187f892c11778a4a82241db38ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbddc88e12f323552ea813b6440e7770a844e187f892c11778a4a82241db38ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9df3b256d9a633666a3195e4fd0c9f409cfeb941932f46356e1a8e2e98d706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b9df3b256d9a633666a3195e4fd0c9f409cfeb941932f46356e1a8e2e98d706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:42Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b27d70a0f82b84c5ed47e2f3287c0faf9c449ed235ec78ef71fe769485f3e67f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b27d70a0f82b84c5ed47e2f3287c0faf9c449ed235ec78ef71fe769485f3e67f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nqhwm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:35Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:35 crc kubenswrapper[4732]: I0402 13:40:35.823259 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:35Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:35 crc kubenswrapper[4732]: I0402 13:40:35.833222 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"afb9093a-f17e-45d5-887a-6b693bbae3f4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d431bea9c2c5dcec52d15f7c9d0806d50d74ece8f3ba59562a6f3fd59e50a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ca33771abc8b4070c51fd6fcb9a5f88bf7097b87e6762d829b293bb628588a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ca33771abc8b4070c51fd6fcb9a5f88bf7097b87e6762d829b293bb628588a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:35Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:35 crc kubenswrapper[4732]: I0402 13:40:35.845272 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:35Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:35 crc kubenswrapper[4732]: I0402 13:40:35.856487 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37b84e7e2a23e3e100ff5bc979d4126bb3044a27f802ccc21c171516554dee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abfadcbcfff56cc9a6bf0c4068f03e943dea5db96705cd3fba2aea8cbd39111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:35Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:35 crc kubenswrapper[4732]: I0402 13:40:35.867080 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-glqvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57616858-4140-40f0-83e5-388787b685b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af55257a6baf7f9bcba13eb0d7aae49bd0671f182aeb11018802d4712df0ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqn9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-glqvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:35Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:35 crc kubenswrapper[4732]: I0402 13:40:35.885234 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6f5e483-7d6b-4d6d-be84-303d8f07643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde53c6f70593ab376e01989eadf8b52b7010c91ca4be564e5e36a781f1333f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://975b73da57fe61d6945d87d2a8cb9502c67a6d8fe43a80722342271e2a251f39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f988d8b8bd1da89ec27da36eafa964c8d1b06874110b4791aeadc05cb297b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09f4ea085c7a422d9d9378925b89d18cd60661c399eb45f69d1e2d39c3aa2191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39950e10da9295ee76088f08ae9d4977f5ff9f8d8440188f777855f40fb8049e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0b635201003f7f1a0fc544667e8376b4283c202dde45398518f99b96ce2c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8438ae7262b1ff33b5168b83661bede090ebb04853882f438b8fad8650d131d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d7701a1317a8ea04ada7ef0746fa41d271b302277504bd0ffdebd0b14db5a04\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-04-02T13:40:01Z\\\",\\\"message\\\":\\\"registry/node-ca-glqvf\\\\nI0402 13:40:01.774154 7045 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-glqvf\\\\nI0402 13:40:01.774162 7045 ovn.go:134] Ensuring zone local for Pod openshift-image-registry/node-ca-glqvf in node crc\\\\nI0402 13:40:01.774167 7045 obj_retry.go:386] Retry successful for *v1.Pod 
openshift-image-registry/node-ca-glqvf after 0 failed attempt(s)\\\\nF0402 13:40:01.774184 7045 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:01Z is after 2025-08-24T17:21:41Z]\\\\nI0402 13:40:01.774196 7045 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-8q\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:40:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8438ae7262b1ff33b5168b83661bede090ebb04853882f438b8fad8650d131d1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-04-02T13:40:35Z\\\",\\\"message\\\":\\\"y.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:35Z is after 2025-08-24T17:21:41Z]\\\\nI0402 13:40:35.100686 7382 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0402 13:40:35.100697 7382 obj_retry.go:365] Adding new object: *v1.Pod 
openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0402 13:40:35.100698 7382 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-crx2z] creating logical port openshift-multus_network-metrics-daemon-crx2z for pod on switch crc\\\\nI0402 13:40:35.100704 7382 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI0402 13:40:35.100709 7382 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI0402 13:40:35.100714 7382 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0402 13:40:35.100538 7382 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b2l5q in node crc\\\\nI0402 13:40:35.100730 7382\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:40:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\
\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b9f2f9d419baeac3be35c4c0c40c012b721c99c2f89fb2638c635c41c5e93d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/r
un/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0d12a454b3463c55541ffa21937cda423517d31435683bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0d12a454b3463c55541ffa21937cda423517d31435683bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8qmgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:35Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:35 crc kubenswrapper[4732]: I0402 13:40:35.899673 4732 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b2l5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7185760-c057-4c47-8da2-60572500a472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76490beb8e54fc846562aaf130168383c45dfe45e9582e3d87c36c38a69b8fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bec3eb12ab864fa88c229cc24bf8ab853d0dc249a18430163a45fbe02e09c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-b2l5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:35Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:35 crc kubenswrapper[4732]: I0402 13:40:35.914755 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1efc9e3-71c0-47e0-bcf8-4ec41e8e25af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf50344e79adec991160632c244c40543c468db050f176949b5840a16d1777a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e000359211728ef1d437757b0246d85c224dae170131ba8bd221fa84dafa026c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf6191436462297532ffdc93e43468bbe37e850115ddcbcd0265c07359ac845a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-04-02T13:39:23Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0402 13:39:23.616086 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0402 13:39:23.616554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0402 13:39:23.618421 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-69827246/tls.crt::/tmp/serving-cert-69827246/tls.key\\\\\\\"\\\\nI0402 13:39:23.816116 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0402 13:39:23.819962 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0402 13:39:23.820008 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0402 13:39:23.820046 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0402 13:39:23.820061 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0402 13:39:23.827938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0402 13:39:23.827970 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827984 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0402 13:39:23.827988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0402 13:39:23.828005 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0402 13:39:23.828009 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0402 13:39:23.828004 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0402 13:39:23.830934 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b46da8b2cda4252f897adfca92275c92924a998d5b1ad57e6897ddea29b4571\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:35Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:35 crc kubenswrapper[4732]: I0402 13:40:35.927954 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://960aab148ced3ecc2648e0b866cb9330dc955e8ab123f3afb95d377e3efa5a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:35Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:35 crc kubenswrapper[4732]: I0402 13:40:35.937969 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38409e5e-4545-49da-8f6c-4bfb30582878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8b639c60ac006909c664208b4a5aa24df48983728ace3f6722514164e330a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09dd989931f88ecf396194a22057cfe8c07809db42ff1c2b861bf3aa5db1673f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6vtmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:35Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:35 crc kubenswrapper[4732]: I0402 13:40:35.949167 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s52gj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad206957-df5c-4b3e-bd35-e798a07d2f4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59880ea9acc1fa69e9974f404371d2876c60b9ad12942fb9c6a0aa01b0632050\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169383eed1546547fe6f683a3607c929733ee7049497b740db106a44ce7d1c64\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-04-02T13:40:20Z\\\",\\\"message\\\":\\\"2026-04-02T13:39:33+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f3c911cc-3398-46ec-ae1b-12065fcdb02c\\\\n2026-04-02T13:39:33+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f3c911cc-3398-46ec-ae1b-12065fcdb02c to /host/opt/cni/bin/\\\\n2026-04-02T13:39:34Z [verbose] multus-daemon started\\\\n2026-04-02T13:39:34Z [verbose] 
Readiness Indicator file check\\\\n2026-04-02T13:40:19Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:40:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dssdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s52gj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:35Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:36 crc kubenswrapper[4732]: I0402 13:40:36.679451 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:40:36 crc kubenswrapper[4732]: E0402 13:40:36.679585 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-crx2z" podUID="386bd92b-c67e-4cc6-8a47-6f8d6e799bc7" Apr 02 13:40:36 crc kubenswrapper[4732]: I0402 13:40:36.681185 4732 scope.go:117] "RemoveContainer" containerID="5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a" Apr 02 13:40:36 crc kubenswrapper[4732]: E0402 13:40:36.681669 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Apr 02 13:40:36 crc kubenswrapper[4732]: I0402 13:40:36.716928 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8qmgp_a6f5e483-7d6b-4d6d-be84-303d8f07643e/ovnkube-controller/3.log" Apr 02 13:40:36 crc kubenswrapper[4732]: I0402 13:40:36.721166 4732 scope.go:117] "RemoveContainer" containerID="8438ae7262b1ff33b5168b83661bede090ebb04853882f438b8fad8650d131d1" Apr 02 13:40:36 crc kubenswrapper[4732]: E0402 13:40:36.721533 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-8qmgp_openshift-ovn-kubernetes(a6f5e483-7d6b-4d6d-be84-303d8f07643e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" podUID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" Apr 02 13:40:36 crc kubenswrapper[4732]: I0402 13:40:36.736809 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ce5c3a8-e8a4-4074-96da-cfc340c2873f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e597b0db2773d2bb9d7e673759dee264ef036a174be679e2ccfa8ba03cf5c6c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8320a4ebc7edb8ed6702aa82a986ae487f9903f3f52ca9e7388135c88a1a8dea\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-04-02T13:38:34Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0402 13:38:04.591905 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0402 13:38:04.592587 1 observer_polling.go:159] Starting file observer\\\\nI0402 13:38:04.593304 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0402 13:38:04.593855 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0402 13:38:33.852475 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0402 13:38:34.195764 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0402 13:38:34.195891 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:38:04Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:38:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef96f17edb5e968b50bab42d6ddcc7c4ffc0fae1b3c8bcdebc6150e18977740e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b40621c4cae0af5b1de321842441756156529e8dcb1aff14d7f5dc7db637127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5272db3013bda1d5fcb66bd2669d7714381d3bd39802499b4a27873cbccd6ff9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:36Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:36 crc kubenswrapper[4732]: I0402 13:40:36.760406 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1718331-906f-4303-bb47-e0146bc821fb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://934b0c14e04d9b1d043cf692c8195d6f093d1e40f0e5873dc489354895244800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cf70a7e805d515372a0e5a88d43a670c0bcfbb5bed48de6518be6c42255e4a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc4b20f8f4f165cef3181c09e1a50a16a2b09f86e0b91c15c9bba453c7612ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d79199d36c34a73bc1959b79e0ce4c75b0ca34217c84040b93de082d60ccbfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd7aa0f0f63ba0d0047b0ce4d9ce045e2647c090d5e27474d4eca5c40b97045\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://890c62e37bef5a99941b91022b6cb3fb23e9a9910a937f8ba205a806278fa864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://890c62e37bef5a99941b91022b6cb3fb23e9a9910a937f8ba205a806278fa864\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-04-02T13:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a38338477586597973eafe96225e3fd49deba97373d4d0eda3f7bce9f78dd5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38338477586597973eafe96225e3fd49deba97373d4d0eda3f7bce9f78dd5b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8499bb6d75a647adea985e455603f74d25d327c6738adf2acf5e621c77247c48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8499bb6d75a647adea985e455603f74d25d327c6738adf2acf5e621c77247c48\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:36Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:36 crc kubenswrapper[4732]: I0402 13:40:36.772182 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-crx2z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"386bd92b-c67e-4cc6-8a47-6f8d6e799bc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-crx2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:36Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:36 crc 
kubenswrapper[4732]: I0402 13:40:36.788972 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:36Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:36 crc kubenswrapper[4732]: I0402 13:40:36.801396 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e388cf7666048a2a06ac7572afee8a8a73493dc8b315897b9118b66dfe72b69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-04-02T13:40:36Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:36 crc kubenswrapper[4732]: I0402 13:40:36.812881 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tg9vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fcb1965-4cef-41a4-8894-3eb24e0ff80c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e38eea7442762874b0db44fa8fcd1ffc4cbe0d149d992139a1fc0bc92dc386d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-w9vgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tg9vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:36Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:36 crc kubenswrapper[4732]: I0402 13:40:36.825066 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"afb9093a-f17e-45d5-887a-6b693bbae3f4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d431bea9c2c5dcec52d15f7c9d0806d50d74ece8f3ba59562a6f3fd59e50a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ca33771abc8b4070c51fd6fcb9a5f88bf7097b87e6762d829b293bb628588a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ca33771abc8b4070c51fd6fcb9a5f88bf7097b87e6762d829b293bb628588a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:36Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:36 crc kubenswrapper[4732]: I0402 13:40:36.837733 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:36Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:36 crc kubenswrapper[4732]: I0402 13:40:36.849091 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37b84e7e2a23e3e100ff5bc979d4126bb3044a27f802ccc21c171516554dee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abfadcbcfff56cc9a6bf0c4068f03e943dea5db96705cd3fba2aea8cbd39111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:36Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:36 crc kubenswrapper[4732]: I0402 13:40:36.859356 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-glqvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57616858-4140-40f0-83e5-388787b685b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af55257a6baf7f9bcba13eb0d7aae49bd0671f182aeb11018802d4712df0ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqn9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-glqvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:36Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:36 crc kubenswrapper[4732]: I0402 13:40:36.873967 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nqhwm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658a9576-efdc-4e4b-937e-bd63032cbee6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b89438bcc57ca4588c00c0e9024fd7423eb6839c4c435c5c9bfca56d0fb1c33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de69c4d183706f6729301a7859d6c312fdedb379c0c9b4aac3d4061f89b82067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de69c4d183706f6729301a7859d6c312fdedb379c0c9b4aac3d4061f89b82067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d68d8de71d99
082a6cf12e05e279f97cf4b0eccd9defa756ac083fcb2c412945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d68d8de71d99082a6cf12e05e279f97cf4b0eccd9defa756ac083fcb2c412945\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d1db75c9d54c6d8d304e28faa91270a4d70096596100fb97d5a6ea74f7ad8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19d1db75c9d5
4c6d8d304e28faa91270a4d70096596100fb97d5a6ea74f7ad8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbddc88e12f323552ea813b6440e7770a844e187f892c11778a4a82241db38ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbddc88e12f323552ea813b6440e7770a844e187f892c11778a4a82241db38ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9df3b256d9a633666a3195e4fd0c9f409cfeb941932f46356e1a8e2e98d706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b9df3b256d9a633666a3195e4fd0c9f409cfeb941932f46356e1a8e2e98d706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b27d70a0f82b84c5ed47e2f3287c0faf9c449ed235ec78ef71fe769485f3e67f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b27d70a0f82b84c5ed47e2f3287c0faf9c449ed235ec78ef71fe769485f3e67f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"202
6-04-02T13:39:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nqhwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:36Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:36 crc kubenswrapper[4732]: I0402 13:40:36.885074 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:36Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:36 crc kubenswrapper[4732]: I0402 13:40:36.901292 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1efc9e3-71c0-47e0-bcf8-4ec41e8e25af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf50344e79adec991160632c244c40543c468db050f176949b5840a16d1777a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e000359211728ef1d437757b0246d85c224dae170131ba8bd221fa84dafa026c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf6191436462297532ffdc93e43468bbe37e850115ddcbcd0265c07359ac845a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-04-02T13:39:23Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0402 13:39:23.616086 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0402 13:39:23.616554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0402 13:39:23.618421 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-69827246/tls.crt::/tmp/serving-cert-69827246/tls.key\\\\\\\"\\\\nI0402 13:39:23.816116 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0402 13:39:23.819962 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0402 13:39:23.820008 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0402 13:39:23.820046 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0402 13:39:23.820061 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0402 13:39:23.827938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0402 13:39:23.827970 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827984 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0402 13:39:23.827988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0402 13:39:23.828005 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0402 13:39:23.828009 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0402 13:39:23.828004 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0402 13:39:23.830934 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b46da8b2cda4252f897adfca92275c92924a998d5b1ad57e6897ddea29b4571\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:36Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:36 crc kubenswrapper[4732]: I0402 13:40:36.914504 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://960aab148ced3ecc2648e0b866cb9330dc955e8ab123f3afb95d377e3efa5a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:36Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:36 crc kubenswrapper[4732]: I0402 13:40:36.924955 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38409e5e-4545-49da-8f6c-4bfb30582878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8b639c60ac006909c664208b4a5aa24df48983728ace3f6722514164e330a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09dd989931f88ecf396194a22057cfe8c07809db42ff1c2b861bf3aa5db1673f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6vtmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:36Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:36 crc kubenswrapper[4732]: I0402 13:40:36.937017 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s52gj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad206957-df5c-4b3e-bd35-e798a07d2f4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59880ea9acc1fa69e9974f404371d2876c60b9ad12942fb9c6a0aa01b0632050\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169383eed1546547fe6f683a3607c929733ee7049497b740db106a44ce7d1c64\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-04-02T13:40:20Z\\\",\\\"message\\\":\\\"2026-04-02T13:39:33+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f3c911cc-3398-46ec-ae1b-12065fcdb02c\\\\n2026-04-02T13:39:33+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f3c911cc-3398-46ec-ae1b-12065fcdb02c to /host/opt/cni/bin/\\\\n2026-04-02T13:39:34Z [verbose] multus-daemon started\\\\n2026-04-02T13:39:34Z [verbose] 
Readiness Indicator file check\\\\n2026-04-02T13:40:19Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:40:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dssdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s52gj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:36Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:36 crc kubenswrapper[4732]: I0402 13:40:36.957081 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6f5e483-7d6b-4d6d-be84-303d8f07643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde53c6f70593ab376e01989eadf8b52b7010c91ca4be564e5e36a781f1333f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://975b73da57fe61d6945d87d2a8cb9502c67a6d8fe43a80722342271e2a251f39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f988d8b8bd1da89ec27da36eafa964c8d1b06874110b4791aeadc05cb297b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09f4ea085c7a422d9d9378925b89d18cd60661c399eb45f69d1e2d39c3aa2191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39950e10da9295ee76088f08ae9d4977f5ff9f8d8440188f777855f40fb8049e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0b635201003f7f1a0fc544667e8376b4283c202dde45398518f99b96ce2c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8438ae7262b1ff33b5168b83661bede090ebb04853882f438b8fad8650d131d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8438ae7262b1ff33b5168b83661bede090ebb04853882f438b8fad8650d131d1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-04-02T13:40:35Z\\\",\\\"message\\\":\\\"y.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:35Z is after 2025-08-24T17:21:41Z]\\\\nI0402 13:40:35.100686 7382 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0402 13:40:35.100697 7382 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0402 13:40:35.100698 7382 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-crx2z] creating logical port openshift-multus_network-metrics-daemon-crx2z for pod on switch crc\\\\nI0402 13:40:35.100704 7382 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI0402 13:40:35.100709 7382 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI0402 13:40:35.100714 7382 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0402 13:40:35.100538 7382 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b2l5q in node crc\\\\nI0402 13:40:35.100730 7382\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:40:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8qmgp_openshift-ovn-kubernetes(a6f5e483-7d6b-4d6d-be84-303d8f07643e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b9f2f9d419baeac3be35c4c0c40c012b721c99c2f89fb2638c635c41c5e93d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0d12a454b3463c55541ffa21937cda423517d31435683bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0
d12a454b3463c55541ffa21937cda423517d31435683bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8qmgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:36Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:36 crc kubenswrapper[4732]: I0402 13:40:36.968894 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b2l5q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7185760-c057-4c47-8da2-60572500a472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76490beb8e54fc846562aaf130168383c45dfe45e9582e3d87c36c38a69b8fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bec3eb12ab864fa88c229cc24bf8ab853d0d
c249a18430163a45fbe02e09c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-b2l5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:36Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:37 crc kubenswrapper[4732]: I0402 13:40:37.265685 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:40:37 crc kubenswrapper[4732]: I0402 13:40:37.265769 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:40:37 crc kubenswrapper[4732]: I0402 13:40:37.265783 4732 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:40:37 crc kubenswrapper[4732]: I0402 13:40:37.265829 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 02 13:40:37 crc kubenswrapper[4732]: I0402 13:40:37.265843 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-02T13:40:37Z","lastTransitionTime":"2026-04-02T13:40:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Apr 02 13:40:37 crc kubenswrapper[4732]: E0402 13:40:37.280111 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:37Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69443537-6792-4af0-92ca-9d3f256e3009\\\",\\\"systemUUID\\\":\\\"3be5867b-5df6-4c65-8d4b-c54c471927ff\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:37Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:37 crc kubenswrapper[4732]: I0402 13:40:37.284255 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:40:37 crc kubenswrapper[4732]: I0402 13:40:37.284291 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:40:37 crc kubenswrapper[4732]: I0402 13:40:37.284301 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:40:37 crc kubenswrapper[4732]: I0402 13:40:37.284320 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 02 13:40:37 crc kubenswrapper[4732]: I0402 13:40:37.284332 4732 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-02T13:40:37Z","lastTransitionTime":"2026-04-02T13:40:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Apr 02 13:40:37 crc kubenswrapper[4732]: E0402 13:40:37.296330 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69443537-6792-4af0-92ca-9d3f256e3009\\\",\\\"systemUUID\\\":\\\"3be5867b-5df6-4c65-8d4b-c54c471927ff\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:37Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:37 crc kubenswrapper[4732]: I0402 13:40:37.299850 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:40:37 crc kubenswrapper[4732]: I0402 13:40:37.299892 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:40:37 crc kubenswrapper[4732]: I0402 13:40:37.299903 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:40:37 crc kubenswrapper[4732]: I0402 13:40:37.299921 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 02 13:40:37 crc kubenswrapper[4732]: I0402 13:40:37.299933 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-02T13:40:37Z","lastTransitionTime":"2026-04-02T13:40:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 02 13:40:37 crc kubenswrapper[4732]: E0402 13:40:37.312206 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69443537-6792-4af0-92ca-9d3f256e3009\\\",\\\"systemUUID\\\":\\\"3be5867b-5df6-4c65-8d4b-c54c471927ff\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:37Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:37 crc kubenswrapper[4732]: I0402 13:40:37.316143 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:40:37 crc kubenswrapper[4732]: I0402 13:40:37.316341 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:40:37 crc kubenswrapper[4732]: I0402 13:40:37.316432 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:40:37 crc kubenswrapper[4732]: I0402 13:40:37.316516 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 02 13:40:37 crc kubenswrapper[4732]: I0402 13:40:37.316604 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-02T13:40:37Z","lastTransitionTime":"2026-04-02T13:40:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 02 13:40:37 crc kubenswrapper[4732]: E0402 13:40:37.327937 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69443537-6792-4af0-92ca-9d3f256e3009\\\",\\\"systemUUID\\\":\\\"3be5867b-5df6-4c65-8d4b-c54c471927ff\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:37Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:37 crc kubenswrapper[4732]: I0402 13:40:37.331229 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:40:37 crc kubenswrapper[4732]: I0402 13:40:37.331265 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:40:37 crc kubenswrapper[4732]: I0402 13:40:37.331279 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:40:37 crc kubenswrapper[4732]: I0402 13:40:37.331295 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 02 13:40:37 crc kubenswrapper[4732]: I0402 13:40:37.331308 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-02T13:40:37Z","lastTransitionTime":"2026-04-02T13:40:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 02 13:40:37 crc kubenswrapper[4732]: E0402 13:40:37.345055 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69443537-6792-4af0-92ca-9d3f256e3009\\\",\\\"systemUUID\\\":\\\"3be5867b-5df6-4c65-8d4b-c54c471927ff\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:37Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:37 crc kubenswrapper[4732]: E0402 13:40:37.345221 4732 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Apr 02 13:40:37 crc kubenswrapper[4732]: I0402 13:40:37.679800 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:40:37 crc kubenswrapper[4732]: I0402 13:40:37.679915 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:40:37 crc kubenswrapper[4732]: I0402 13:40:37.679950 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 13:40:37 crc kubenswrapper[4732]: E0402 13:40:37.680026 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Apr 02 13:40:37 crc kubenswrapper[4732]: E0402 13:40:37.680258 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Apr 02 13:40:37 crc kubenswrapper[4732]: E0402 13:40:37.680438 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Apr 02 13:40:38 crc kubenswrapper[4732]: I0402 13:40:38.679547 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:40:38 crc kubenswrapper[4732]: E0402 13:40:38.679722 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-crx2z" podUID="386bd92b-c67e-4cc6-8a47-6f8d6e799bc7" Apr 02 13:40:39 crc kubenswrapper[4732]: I0402 13:40:39.679742 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:40:39 crc kubenswrapper[4732]: I0402 13:40:39.679787 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 13:40:39 crc kubenswrapper[4732]: I0402 13:40:39.679737 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:40:39 crc kubenswrapper[4732]: E0402 13:40:39.679876 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Apr 02 13:40:39 crc kubenswrapper[4732]: E0402 13:40:39.679994 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Apr 02 13:40:39 crc kubenswrapper[4732]: E0402 13:40:39.680065 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Apr 02 13:40:39 crc kubenswrapper[4732]: E0402 13:40:39.948479 4732 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Apr 02 13:40:40 crc kubenswrapper[4732]: I0402 13:40:40.680701 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:40:40 crc kubenswrapper[4732]: E0402 13:40:40.680893 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-crx2z" podUID="386bd92b-c67e-4cc6-8a47-6f8d6e799bc7" Apr 02 13:40:41 crc kubenswrapper[4732]: I0402 13:40:41.679520 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:40:41 crc kubenswrapper[4732]: I0402 13:40:41.679648 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:40:41 crc kubenswrapper[4732]: E0402 13:40:41.679714 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Apr 02 13:40:41 crc kubenswrapper[4732]: E0402 13:40:41.679814 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Apr 02 13:40:41 crc kubenswrapper[4732]: I0402 13:40:41.680168 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 13:40:41 crc kubenswrapper[4732]: E0402 13:40:41.680418 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Apr 02 13:40:42 crc kubenswrapper[4732]: I0402 13:40:42.679832 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:40:42 crc kubenswrapper[4732]: E0402 13:40:42.680172 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-crx2z" podUID="386bd92b-c67e-4cc6-8a47-6f8d6e799bc7" Apr 02 13:40:42 crc kubenswrapper[4732]: I0402 13:40:42.689862 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Apr 02 13:40:43 crc kubenswrapper[4732]: I0402 13:40:43.679472 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:40:43 crc kubenswrapper[4732]: I0402 13:40:43.679554 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 13:40:43 crc kubenswrapper[4732]: I0402 13:40:43.679512 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:40:43 crc kubenswrapper[4732]: E0402 13:40:43.679650 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Apr 02 13:40:43 crc kubenswrapper[4732]: E0402 13:40:43.679770 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Apr 02 13:40:43 crc kubenswrapper[4732]: E0402 13:40:43.679826 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Apr 02 13:40:44 crc kubenswrapper[4732]: I0402 13:40:44.679828 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:40:44 crc kubenswrapper[4732]: E0402 13:40:44.679975 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-crx2z" podUID="386bd92b-c67e-4cc6-8a47-6f8d6e799bc7" Apr 02 13:40:44 crc kubenswrapper[4732]: I0402 13:40:44.703557 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ce5c3a8-e8a4-4074-96da-cfc340c2873f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e597b0db2773d2bb9d7e673759dee264ef036a174be679e2ccfa8ba03cf5c6c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fde
e88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8320a4ebc7edb8ed6702aa82a986ae487f9903f3f52ca9e7388135c88a1a8dea\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-04-02T13:38:34Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0402 13:38:04.591905 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0402 13:38:04.592587 1 observer_polling.go:159] Starting file observer\\\\nI0402 13:38:04.593304 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0402 13:38:04.593855 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0402 13:38:33.852475 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0402 13:38:34.195764 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0402 13:38:34.195891 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:38:04Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:38:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef96f17edb5e968b50bab42d6ddcc7c4ffc0fae1b3c8bcdebc6150e18977740e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b40621c4cae0af5b1de321842441756156529e8dcb1aff14d7f5dc7db637127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5272db3013bda1d5fcb66bd2669d7714381d3bd39802499b4a27873cbccd6ff9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:44Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:44 crc kubenswrapper[4732]: I0402 13:40:44.727251 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1718331-906f-4303-bb47-e0146bc821fb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://934b0c14e04d9b1d043cf692c8195d6f093d1e40f0e5873dc489354895244800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cf70a7e805d515372a0e5a88d43a670c0bcfbb5bed48de6518be6c42255e4a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc4b20f8f4f165cef3181c09e1a50a16a2b09f86e0b91c15c9bba453c7612ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d79199d36c34a73bc1959b79e0ce4c75b0ca34217c84040b93de082d60ccbfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd7aa0f0f63ba0d0047b0ce4d9ce045e2647c090d5e27474d4eca5c40b97045\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://890c62e37bef5a99941b91022b6cb3fb23e9a9910a937f8ba205a806278fa864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://890c62e37bef5a99941b91022b6cb3fb23e9a9910a937f8ba205a806278fa864\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-04-02T13:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a38338477586597973eafe96225e3fd49deba97373d4d0eda3f7bce9f78dd5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38338477586597973eafe96225e3fd49deba97373d4d0eda3f7bce9f78dd5b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8499bb6d75a647adea985e455603f74d25d327c6738adf2acf5e621c77247c48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8499bb6d75a647adea985e455603f74d25d327c6738adf2acf5e621c77247c48\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:44Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:44 crc kubenswrapper[4732]: I0402 13:40:44.742105 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-crx2z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"386bd92b-c67e-4cc6-8a47-6f8d6e799bc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-crx2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:44Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:44 crc 
kubenswrapper[4732]: I0402 13:40:44.757958 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:44Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:44 crc kubenswrapper[4732]: I0402 13:40:44.767739 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e388cf7666048a2a06ac7572afee8a8a73493dc8b315897b9118b66dfe72b69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-04-02T13:40:44Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:44 crc kubenswrapper[4732]: I0402 13:40:44.776665 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tg9vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fcb1965-4cef-41a4-8894-3eb24e0ff80c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e38eea7442762874b0db44fa8fcd1ffc4cbe0d149d992139a1fc0bc92dc386d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-w9vgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tg9vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:44Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:44 crc kubenswrapper[4732]: I0402 13:40:44.786140 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"afb9093a-f17e-45d5-887a-6b693bbae3f4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d431bea9c2c5dcec52d15f7c9d0806d50d74ece8f3ba59562a6f3fd59e50a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ca33771abc8b4070c51fd6fcb9a5f88bf7097b87e6762d829b293bb628588a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ca33771abc8b4070c51fd6fcb9a5f88bf7097b87e6762d829b293bb628588a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:44Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:44 crc kubenswrapper[4732]: I0402 13:40:44.798442 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:44Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:44 crc kubenswrapper[4732]: I0402 13:40:44.811546 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37b84e7e2a23e3e100ff5bc979d4126bb3044a27f802ccc21c171516554dee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abfadcbcfff56cc9a6bf0c4068f03e943dea5db96705cd3fba2aea8cbd39111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:44Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:44 crc kubenswrapper[4732]: I0402 13:40:44.821547 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-glqvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57616858-4140-40f0-83e5-388787b685b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af55257a6baf7f9bcba13eb0d7aae49bd0671f182aeb11018802d4712df0ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqn9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-glqvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:44Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:44 crc kubenswrapper[4732]: I0402 13:40:44.849447 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nqhwm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658a9576-efdc-4e4b-937e-bd63032cbee6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b89438bcc57ca4588c00c0e9024fd7423eb6839c4c435c5c9bfca56d0fb1c33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de69c4d183706f6729301a7859d6c312fdedb379c0c9b4aac3d4061f89b82067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de69c4d183706f6729301a7859d6c312fdedb379c0c9b4aac3d4061f89b82067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d68d8de71d99
082a6cf12e05e279f97cf4b0eccd9defa756ac083fcb2c412945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d68d8de71d99082a6cf12e05e279f97cf4b0eccd9defa756ac083fcb2c412945\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d1db75c9d54c6d8d304e28faa91270a4d70096596100fb97d5a6ea74f7ad8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19d1db75c9d5
4c6d8d304e28faa91270a4d70096596100fb97d5a6ea74f7ad8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbddc88e12f323552ea813b6440e7770a844e187f892c11778a4a82241db38ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbddc88e12f323552ea813b6440e7770a844e187f892c11778a4a82241db38ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9df3b256d9a633666a3195e4fd0c9f409cfeb941932f46356e1a8e2e98d706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b9df3b256d9a633666a3195e4fd0c9f409cfeb941932f46356e1a8e2e98d706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b27d70a0f82b84c5ed47e2f3287c0faf9c449ed235ec78ef71fe769485f3e67f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b27d70a0f82b84c5ed47e2f3287c0faf9c449ed235ec78ef71fe769485f3e67f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"202
6-04-02T13:39:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nqhwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:44Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:44 crc kubenswrapper[4732]: I0402 13:40:44.870111 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:44Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:44 crc kubenswrapper[4732]: I0402 13:40:44.889334 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1efc9e3-71c0-47e0-bcf8-4ec41e8e25af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf50344e79adec991160632c244c40543c468db050f176949b5840a16d1777a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e000359211728ef1d437757b0246d85c224dae170131ba8bd221fa84dafa026c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf6191436462297532ffdc93e43468bbe37e850115ddcbcd0265c07359ac845a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-04-02T13:39:23Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0402 13:39:23.616086 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0402 13:39:23.616554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0402 13:39:23.618421 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-69827246/tls.crt::/tmp/serving-cert-69827246/tls.key\\\\\\\"\\\\nI0402 13:39:23.816116 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0402 13:39:23.819962 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0402 13:39:23.820008 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0402 13:39:23.820046 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0402 13:39:23.820061 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0402 13:39:23.827938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0402 13:39:23.827970 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827984 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0402 13:39:23.827988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0402 13:39:23.828005 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0402 13:39:23.828009 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0402 13:39:23.828004 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0402 13:39:23.830934 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b46da8b2cda4252f897adfca92275c92924a998d5b1ad57e6897ddea29b4571\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:44Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:44 crc kubenswrapper[4732]: I0402 13:40:44.900805 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://960aab148ced3ecc2648e0b866cb9330dc955e8ab123f3afb95d377e3efa5a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:44Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:44 crc kubenswrapper[4732]: I0402 13:40:44.910669 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38409e5e-4545-49da-8f6c-4bfb30582878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8b639c60ac006909c664208b4a5aa24df48983728ace3f6722514164e330a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09dd989931f88ecf396194a22057cfe8c07809db42ff1c2b861bf3aa5db1673f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6vtmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:44Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:44 crc kubenswrapper[4732]: I0402 13:40:44.922989 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s52gj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad206957-df5c-4b3e-bd35-e798a07d2f4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59880ea9acc1fa69e9974f404371d2876c60b9ad12942fb9c6a0aa01b0632050\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169383eed1546547fe6f683a3607c929733ee7049497b740db106a44ce7d1c64\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-04-02T13:40:20Z\\\",\\\"message\\\":\\\"2026-04-02T13:39:33+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f3c911cc-3398-46ec-ae1b-12065fcdb02c\\\\n2026-04-02T13:39:33+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f3c911cc-3398-46ec-ae1b-12065fcdb02c to /host/opt/cni/bin/\\\\n2026-04-02T13:39:34Z [verbose] multus-daemon started\\\\n2026-04-02T13:39:34Z [verbose] 
Readiness Indicator file check\\\\n2026-04-02T13:40:19Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:40:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dssdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s52gj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:44Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:44 crc kubenswrapper[4732]: I0402 13:40:44.940127 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6f5e483-7d6b-4d6d-be84-303d8f07643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde53c6f70593ab376e01989eadf8b52b7010c91ca4be564e5e36a781f1333f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://975b73da57fe61d6945d87d2a8cb9502c67a6d8fe43a80722342271e2a251f39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f988d8b8bd1da89ec27da36eafa964c8d1b06874110b4791aeadc05cb297b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09f4ea085c7a422d9d9378925b89d18cd60661c399eb45f69d1e2d39c3aa2191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39950e10da9295ee76088f08ae9d4977f5ff9f8d8440188f777855f40fb8049e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0b635201003f7f1a0fc544667e8376b4283c202dde45398518f99b96ce2c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8438ae7262b1ff33b5168b83661bede090ebb04853882f438b8fad8650d131d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8438ae7262b1ff33b5168b83661bede090ebb04853882f438b8fad8650d131d1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-04-02T13:40:35Z\\\",\\\"message\\\":\\\"y.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:35Z is after 2025-08-24T17:21:41Z]\\\\nI0402 13:40:35.100686 7382 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0402 13:40:35.100697 7382 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0402 13:40:35.100698 7382 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-crx2z] creating logical port openshift-multus_network-metrics-daemon-crx2z for pod on switch crc\\\\nI0402 13:40:35.100704 7382 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI0402 13:40:35.100709 7382 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI0402 13:40:35.100714 7382 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0402 13:40:35.100538 7382 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b2l5q in node crc\\\\nI0402 13:40:35.100730 7382\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:40:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8qmgp_openshift-ovn-kubernetes(a6f5e483-7d6b-4d6d-be84-303d8f07643e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b9f2f9d419baeac3be35c4c0c40c012b721c99c2f89fb2638c635c41c5e93d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0d12a454b3463c55541ffa21937cda423517d31435683bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0
d12a454b3463c55541ffa21937cda423517d31435683bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8qmgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:44Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:44 crc kubenswrapper[4732]: E0402 13:40:44.948890 4732 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Apr 02 13:40:44 crc kubenswrapper[4732]: I0402 13:40:44.952127 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b2l5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7185760-c057-4c47-8da2-60572500a472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76490beb8e54fc846562aaf130168383c45dfe45e9582e3d87c36c38a69b8fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bec3eb12ab864fa88c229cc24bf8ab853d0dc249a18430163a45fbe02e09c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-b2l5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:44Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:44 crc kubenswrapper[4732]: I0402 13:40:44.964720 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0654c6f-b103-4146-b591-b2acee4900a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f4f8c5575530ca3fe7ef2e6cd25a4f63888841ae0564192385b7371093305d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e03db4bfdbd69007c0b4a4d3f465fd21957c5a829c84cac0dd39914617e44cf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb15ce78ba79858cf5ae05773b302a03fd75dd3a3cdd53d8ca5ff2bdbc6d48d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18c38540c1e54421b3069733149cadd0da6b74c4f6aa0160090c8a14429797dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://18c38540c1e54421b3069733149cadd0da6b74c4f6aa0160090c8a14429797dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:44Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:45 crc kubenswrapper[4732]: I0402 13:40:45.679892 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:40:45 crc kubenswrapper[4732]: I0402 13:40:45.679957 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 13:40:45 crc kubenswrapper[4732]: I0402 13:40:45.679921 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:40:45 crc kubenswrapper[4732]: E0402 13:40:45.680036 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Apr 02 13:40:45 crc kubenswrapper[4732]: E0402 13:40:45.680140 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Apr 02 13:40:45 crc kubenswrapper[4732]: E0402 13:40:45.680252 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Apr 02 13:40:46 crc kubenswrapper[4732]: I0402 13:40:46.680202 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:40:46 crc kubenswrapper[4732]: E0402 13:40:46.680357 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-crx2z" podUID="386bd92b-c67e-4cc6-8a47-6f8d6e799bc7" Apr 02 13:40:47 crc kubenswrapper[4732]: I0402 13:40:47.536607 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:40:47 crc kubenswrapper[4732]: I0402 13:40:47.536678 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:40:47 crc kubenswrapper[4732]: I0402 13:40:47.536688 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:40:47 crc kubenswrapper[4732]: I0402 13:40:47.536735 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 02 13:40:47 crc kubenswrapper[4732]: I0402 13:40:47.536752 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-02T13:40:47Z","lastTransitionTime":"2026-04-02T13:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 02 13:40:47 crc kubenswrapper[4732]: E0402 13:40:47.548834 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69443537-6792-4af0-92ca-9d3f256e3009\\\",\\\"systemUUID\\\":\\\"3be5867b-5df6-4c65-8d4b-c54c471927ff\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:47Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:47 crc kubenswrapper[4732]: I0402 13:40:47.551916 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:40:47 crc kubenswrapper[4732]: I0402 13:40:47.551956 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:40:47 crc kubenswrapper[4732]: I0402 13:40:47.551966 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:40:47 crc kubenswrapper[4732]: I0402 13:40:47.551981 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 02 13:40:47 crc kubenswrapper[4732]: I0402 13:40:47.551990 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-02T13:40:47Z","lastTransitionTime":"2026-04-02T13:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 02 13:40:47 crc kubenswrapper[4732]: E0402 13:40:47.562374 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69443537-6792-4af0-92ca-9d3f256e3009\\\",\\\"systemUUID\\\":\\\"3be5867b-5df6-4c65-8d4b-c54c471927ff\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:47Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:47 crc kubenswrapper[4732]: I0402 13:40:47.566414 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:40:47 crc kubenswrapper[4732]: I0402 13:40:47.566450 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:40:47 crc kubenswrapper[4732]: I0402 13:40:47.566460 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:40:47 crc kubenswrapper[4732]: I0402 13:40:47.566478 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 02 13:40:47 crc kubenswrapper[4732]: I0402 13:40:47.566491 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-02T13:40:47Z","lastTransitionTime":"2026-04-02T13:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 02 13:40:47 crc kubenswrapper[4732]: E0402 13:40:47.577725 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69443537-6792-4af0-92ca-9d3f256e3009\\\",\\\"systemUUID\\\":\\\"3be5867b-5df6-4c65-8d4b-c54c471927ff\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:47Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:47 crc kubenswrapper[4732]: I0402 13:40:47.581458 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:40:47 crc kubenswrapper[4732]: I0402 13:40:47.581504 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:40:47 crc kubenswrapper[4732]: I0402 13:40:47.581513 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:40:47 crc kubenswrapper[4732]: I0402 13:40:47.581529 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 02 13:40:47 crc kubenswrapper[4732]: I0402 13:40:47.581540 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-02T13:40:47Z","lastTransitionTime":"2026-04-02T13:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 02 13:40:47 crc kubenswrapper[4732]: E0402 13:40:47.594548 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69443537-6792-4af0-92ca-9d3f256e3009\\\",\\\"systemUUID\\\":\\\"3be5867b-5df6-4c65-8d4b-c54c471927ff\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:47Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:47 crc kubenswrapper[4732]: I0402 13:40:47.598589 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:40:47 crc kubenswrapper[4732]: I0402 13:40:47.598664 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:40:47 crc kubenswrapper[4732]: I0402 13:40:47.598676 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:40:47 crc kubenswrapper[4732]: I0402 13:40:47.598693 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 02 13:40:47 crc kubenswrapper[4732]: I0402 13:40:47.598707 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-02T13:40:47Z","lastTransitionTime":"2026-04-02T13:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 02 13:40:47 crc kubenswrapper[4732]: E0402 13:40:47.612444 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:40:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69443537-6792-4af0-92ca-9d3f256e3009\\\",\\\"systemUUID\\\":\\\"3be5867b-5df6-4c65-8d4b-c54c471927ff\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:47Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:47 crc kubenswrapper[4732]: E0402 13:40:47.612594 4732 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Apr 02 13:40:47 crc kubenswrapper[4732]: I0402 13:40:47.680204 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:40:47 crc kubenswrapper[4732]: I0402 13:40:47.680279 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:40:47 crc kubenswrapper[4732]: I0402 13:40:47.680204 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 13:40:47 crc kubenswrapper[4732]: E0402 13:40:47.680342 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Apr 02 13:40:47 crc kubenswrapper[4732]: E0402 13:40:47.680414 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Apr 02 13:40:47 crc kubenswrapper[4732]: E0402 13:40:47.680725 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Apr 02 13:40:48 crc kubenswrapper[4732]: I0402 13:40:48.679532 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:40:48 crc kubenswrapper[4732]: E0402 13:40:48.679718 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-crx2z" podUID="386bd92b-c67e-4cc6-8a47-6f8d6e799bc7" Apr 02 13:40:49 crc kubenswrapper[4732]: I0402 13:40:49.679354 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:40:49 crc kubenswrapper[4732]: E0402 13:40:49.679490 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Apr 02 13:40:49 crc kubenswrapper[4732]: I0402 13:40:49.679354 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:40:49 crc kubenswrapper[4732]: I0402 13:40:49.679548 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 13:40:49 crc kubenswrapper[4732]: E0402 13:40:49.679728 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Apr 02 13:40:49 crc kubenswrapper[4732]: E0402 13:40:49.679861 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Apr 02 13:40:49 crc kubenswrapper[4732]: I0402 13:40:49.680217 4732 scope.go:117] "RemoveContainer" containerID="5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a" Apr 02 13:40:49 crc kubenswrapper[4732]: E0402 13:40:49.950359 4732 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Apr 02 13:40:50 crc kubenswrapper[4732]: I0402 13:40:50.679428 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:40:50 crc kubenswrapper[4732]: E0402 13:40:50.679606 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-crx2z" podUID="386bd92b-c67e-4cc6-8a47-6f8d6e799bc7" Apr 02 13:40:50 crc kubenswrapper[4732]: I0402 13:40:50.680218 4732 scope.go:117] "RemoveContainer" containerID="8438ae7262b1ff33b5168b83661bede090ebb04853882f438b8fad8650d131d1" Apr 02 13:40:50 crc kubenswrapper[4732]: E0402 13:40:50.680379 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-8qmgp_openshift-ovn-kubernetes(a6f5e483-7d6b-4d6d-be84-303d8f07643e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" podUID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" Apr 02 13:40:50 crc kubenswrapper[4732]: I0402 13:40:50.761853 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/4.log" Apr 02 13:40:50 crc kubenswrapper[4732]: I0402 13:40:50.763956 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"19f830282d549fbab00b4a19b6698b2abc656f62287b73347cd4d0698fd5de90"} Apr 02 13:40:50 crc kubenswrapper[4732]: I0402 13:40:50.764209 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 02 13:40:50 crc kubenswrapper[4732]: I0402 13:40:50.780423 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0654c6f-b103-4146-b591-b2acee4900a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f4f8c5575530ca3fe7ef2e6cd25a4f63888841ae0564192385b7371093305d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e03db4bfdbd69007c0b4a4d3f465fd21957c5a829c84cac0dd39914617e44cf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb15ce78ba79858cf5ae05773b302a03fd75dd3a3cdd53d8ca5ff2bdbc6d48d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18c38540c1e54421b3069733149cadd0da6b74c4f6aa0160090c8a14429797dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://18c38540c1e54421b3069733149cadd0da6b74c4f6aa0160090c8a14429797dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:50Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:50 crc kubenswrapper[4732]: I0402 13:40:50.802521 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1718331-906f-4303-bb47-e0146bc821fb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://934b0c14e04d9b1d043cf692c8195d6f093d1e40f0e5873dc489354895244800\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cf70a7e805d515372a0e5a88d43a670c0bcfbb5bed48de6518be6c42255e4a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc4b20f8f4f165cef3181c09e1a50a16a2b09f86e0b91c15c9bba453c7612ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d79199d36c34a73bc1959b79e0ce4c75b0ca34217c84040b93de082d60ccbfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd7aa0f0f63ba0d0047b0ce4d9ce045e2647c090d5e27474d4eca5c40b97045\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://890c62e37bef5a99941b91022b6cb3fb23e9a9910a937f8ba205a806278fa864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://890c62e37bef5a99941b91022b6cb3fb23e9a9910a937f8ba205a806278fa864\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a38338477586597973eafe96225e3fd49deba97373d4d0eda3f7bce9f78dd5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38338477586597973eafe96225e3fd49deba97373d4d0eda3f7bce9f78dd5b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://8499bb6d75a647adea985e455603f74d25d327c6738adf2acf5e621c77247c48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8499bb6d75a647adea985e455603f74d25d327c6738adf2acf5e621c77247c48\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:50Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:50 crc kubenswrapper[4732]: I0402 13:40:50.814043 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-crx2z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"386bd92b-c67e-4cc6-8a47-6f8d6e799bc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-crx2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:50Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:50 crc 
kubenswrapper[4732]: I0402 13:40:50.827130 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:50Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:50 crc kubenswrapper[4732]: I0402 13:40:50.844965 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e388cf7666048a2a06ac7572afee8a8a73493dc8b315897b9118b66dfe72b69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-04-02T13:40:50Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:50 crc kubenswrapper[4732]: I0402 13:40:50.857753 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tg9vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fcb1965-4cef-41a4-8894-3eb24e0ff80c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e38eea7442762874b0db44fa8fcd1ffc4cbe0d149d992139a1fc0bc92dc386d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-w9vgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tg9vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:50Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:50 crc kubenswrapper[4732]: I0402 13:40:50.870954 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ce5c3a8-e8a4-4074-96da-cfc340c2873f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e597b0db2773d2bb9d7e673759dee264ef036a174be679e2ccfa8ba03cf5c6c9\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8320a4ebc7edb8ed6702aa82a986ae487f9903f3f52ca9e7388135c88a1a8dea\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-04-02T13:38:34Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0402 13:38:04.591905 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0402 13:38:04.592587 1 observer_polling.go:159] Starting file observer\\\\nI0402 13:38:04.593304 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0402 13:38:04.593855 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0402 13:38:33.852475 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0402 13:38:34.195764 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0402 13:38:34.195891 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:38:04Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:38:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef96f17edb5e968b50bab42d6ddcc7c4ffc0fae1b3c8bcdebc6150e18977740e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b40621c4cae0af5b1de321842441756156529e8dcb1aff14d7f5dc7db637127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5272db3013bda1d5fcb66bd2669d7714381d3bd39802499b4a27873cbccd6ff9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:50Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:50 crc kubenswrapper[4732]: I0402 13:40:50.885952 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:50Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:50 crc kubenswrapper[4732]: I0402 13:40:50.897867 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37b84e7e2a23e3e100ff5bc979d4126bb3044a27f802ccc21c171516554dee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abfadcbcfff56cc9a6bf0c4068f03e943dea5db96705cd3fba2aea8cbd39111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:50Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:50 crc kubenswrapper[4732]: I0402 13:40:50.905891 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-glqvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57616858-4140-40f0-83e5-388787b685b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af55257a6baf7f9bcba13eb0d7aae49bd0671f182aeb11018802d4712df0ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqn9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-glqvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:50Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:50 crc kubenswrapper[4732]: I0402 13:40:50.921934 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nqhwm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658a9576-efdc-4e4b-937e-bd63032cbee6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b89438bcc57ca4588c00c0e9024fd7423eb6839c4c435c5c9bfca56d0fb1c33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de69c4d183706f6729301a7859d6c312fdedb379c0c9b4aac3d4061f89b82067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de69c4d183706f6729301a7859d6c312fdedb379c0c9b4aac3d4061f89b82067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d68d8de71d99
082a6cf12e05e279f97cf4b0eccd9defa756ac083fcb2c412945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d68d8de71d99082a6cf12e05e279f97cf4b0eccd9defa756ac083fcb2c412945\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d1db75c9d54c6d8d304e28faa91270a4d70096596100fb97d5a6ea74f7ad8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19d1db75c9d5
4c6d8d304e28faa91270a4d70096596100fb97d5a6ea74f7ad8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbddc88e12f323552ea813b6440e7770a844e187f892c11778a4a82241db38ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbddc88e12f323552ea813b6440e7770a844e187f892c11778a4a82241db38ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9df3b256d9a633666a3195e4fd0c9f409cfeb941932f46356e1a8e2e98d706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b9df3b256d9a633666a3195e4fd0c9f409cfeb941932f46356e1a8e2e98d706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b27d70a0f82b84c5ed47e2f3287c0faf9c449ed235ec78ef71fe769485f3e67f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b27d70a0f82b84c5ed47e2f3287c0faf9c449ed235ec78ef71fe769485f3e67f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"202
6-04-02T13:39:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzv9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nqhwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:50Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:50 crc kubenswrapper[4732]: I0402 13:40:50.932910 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:50Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:50 crc kubenswrapper[4732]: I0402 13:40:50.942641 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"afb9093a-f17e-45d5-887a-6b693bbae3f4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d431bea9c2c5dcec52d15f7c9d0806d50d74ece8f3ba59562a6f3fd59e50a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ca33771abc8b4070c51fd6fcb9a5f88bf7097b87e6762d829b293bb628588a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ca33771abc8b4070c51fd6fcb9a5f88bf7097b87e6762d829b293bb628588a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:50Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:50 crc kubenswrapper[4732]: I0402 13:40:50.954738 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://960aab148ced3ecc2648e0b866cb9330dc955e8ab123f3afb95d377e3efa5a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:50Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:50 crc kubenswrapper[4732]: I0402 13:40:50.966505 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38409e5e-4545-49da-8f6c-4bfb30582878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8b639c60ac006909c664208b4a5aa24df48983728ace3f6722514164e330a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09dd989931f88ecf396194a22057cfe8c07809db42ff1c2b861bf3aa5db1673f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74dmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6vtmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:50Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:50 crc kubenswrapper[4732]: I0402 13:40:50.985124 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s52gj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad206957-df5c-4b3e-bd35-e798a07d2f4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:40:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59880ea9acc1fa69e9974f404371d2876c60b9ad12942fb9c6a0aa01b0632050\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169383eed1546547fe6f683a3607c929733ee7049497b740db106a44ce7d1c64\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-04-02T13:40:20Z\\\",\\\"message\\\":\\\"2026-04-02T13:39:33+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f3c911cc-3398-46ec-ae1b-12065fcdb02c\\\\n2026-04-02T13:39:33+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f3c911cc-3398-46ec-ae1b-12065fcdb02c to /host/opt/cni/bin/\\\\n2026-04-02T13:39:34Z [verbose] multus-daemon started\\\\n2026-04-02T13:39:34Z [verbose] 
Readiness Indicator file check\\\\n2026-04-02T13:40:19Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:40:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dssdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s52gj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:50Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:51 crc kubenswrapper[4732]: I0402 13:40:51.001790 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6f5e483-7d6b-4d6d-be84-303d8f07643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde53c6f70593ab376e01989eadf8b52b7010c91ca4be564e5e36a781f1333f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://975b73da57fe61d6945d87d2a8cb9502c67a6d8fe43a80722342271e2a251f39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f988d8b8bd1da89ec27da36eafa964c8d1b06874110b4791aeadc05cb297b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09f4ea085c7a422d9d9378925b89d18cd60661c399eb45f69d1e2d39c3aa2191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39950e10da9295ee76088f08ae9d4977f5ff9f8d8440188f777855f40fb8049e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0b635201003f7f1a0fc544667e8376b4283c202dde45398518f99b96ce2c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8438ae7262b1ff33b5168b83661bede090ebb04853882f438b8fad8650d131d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8438ae7262b1ff33b5168b83661bede090ebb04853882f438b8fad8650d131d1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-04-02T13:40:35Z\\\",\\\"message\\\":\\\"y.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:35Z is after 2025-08-24T17:21:41Z]\\\\nI0402 13:40:35.100686 7382 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0402 13:40:35.100697 7382 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0402 13:40:35.100698 7382 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-crx2z] creating logical port openshift-multus_network-metrics-daemon-crx2z for pod on switch crc\\\\nI0402 13:40:35.100704 7382 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI0402 13:40:35.100709 7382 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI0402 13:40:35.100714 7382 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0402 13:40:35.100538 7382 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b2l5q in node crc\\\\nI0402 13:40:35.100730 7382\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:40:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8qmgp_openshift-ovn-kubernetes(a6f5e483-7d6b-4d6d-be84-303d8f07643e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b9f2f9d419baeac3be35c4c0c40c012b721c99c2f89fb2638c635c41c5e93d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0d12a454b3463c55541ffa21937cda423517d31435683bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b90e2dbff4d4b955f0
d12a454b3463c55541ffa21937cda423517d31435683bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jljzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8qmgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:50Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:51 crc kubenswrapper[4732]: I0402 13:40:51.012964 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b2l5q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7185760-c057-4c47-8da2-60572500a472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:39:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76490beb8e54fc846562aaf130168383c45dfe45e9582e3d87c36c38a69b8fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bec3eb12ab864fa88c229cc24bf8ab853d0d
c249a18430163a45fbe02e09c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c275t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-b2l5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:51Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:51 crc kubenswrapper[4732]: I0402 13:40:51.026432 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1efc9e3-71c0-47e0-bcf8-4ec41e8e25af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-02T13:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf50344e79adec991160632c244c40543c468db050f176949b5840a16d1777a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e000359211728ef1d437757b0246d85c224dae170131ba8bd221fa84dafa026c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf6191436462297532ffdc93e43468bbe37e850115ddcbcd0265c07359ac845a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19f830282d549fbab00b4a19b6698b2abc656f62287b73347cd4d0698fd5de90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-04-02T13:39:23Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0402 13:39:23.616086 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0402 13:39:23.616554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0402 13:39:23.618421 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-69827246/tls.crt::/tmp/serving-cert-69827246/tls.key\\\\\\\"\\\\nI0402 13:39:23.816116 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0402 13:39:23.819962 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0402 13:39:23.820008 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0402 13:39:23.820046 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0402 13:39:23.820061 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0402 13:39:23.827938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0402 13:39:23.827970 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0402 13:39:23.827984 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0402 13:39:23.827988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0402 13:39:23.828005 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0402 13:39:23.828009 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0402 13:39:23.828004 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0402 13:39:23.830934 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-04-02T13:39:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b46da8b2cda4252f897adfca92275c92924a998d5b1ad57e6897ddea29b4571\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:37:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-04-02T13:37:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-02T13:40:51Z is after 2025-08-24T17:21:41Z" Apr 02 13:40:51 crc kubenswrapper[4732]: I0402 13:40:51.679660 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:40:51 crc kubenswrapper[4732]: I0402 13:40:51.679803 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 13:40:51 crc kubenswrapper[4732]: E0402 13:40:51.680002 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Apr 02 13:40:51 crc kubenswrapper[4732]: I0402 13:40:51.680071 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:40:51 crc kubenswrapper[4732]: E0402 13:40:51.680158 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Apr 02 13:40:51 crc kubenswrapper[4732]: E0402 13:40:51.680251 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Apr 02 13:40:52 crc kubenswrapper[4732]: I0402 13:40:52.680124 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:40:52 crc kubenswrapper[4732]: E0402 13:40:52.680312 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-crx2z" podUID="386bd92b-c67e-4cc6-8a47-6f8d6e799bc7" Apr 02 13:40:53 crc kubenswrapper[4732]: I0402 13:40:53.680079 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 13:40:53 crc kubenswrapper[4732]: I0402 13:40:53.680241 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:40:53 crc kubenswrapper[4732]: E0402 13:40:53.680303 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Apr 02 13:40:53 crc kubenswrapper[4732]: I0402 13:40:53.680393 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:40:53 crc kubenswrapper[4732]: E0402 13:40:53.680451 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Apr 02 13:40:53 crc kubenswrapper[4732]: E0402 13:40:53.680506 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Apr 02 13:40:54 crc kubenswrapper[4732]: I0402 13:40:54.679951 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:40:54 crc kubenswrapper[4732]: E0402 13:40:54.680156 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-crx2z" podUID="386bd92b-c67e-4cc6-8a47-6f8d6e799bc7" Apr 02 13:40:54 crc kubenswrapper[4732]: I0402 13:40:54.746310 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podStartSLOduration=134.746286207 podStartE2EDuration="2m14.746286207s" podCreationTimestamp="2026-04-02 13:38:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:40:54.745838455 +0000 UTC m=+211.650246038" watchObservedRunningTime="2026-04-02 13:40:54.746286207 +0000 UTC m=+211.650693800" Apr 02 13:40:54 crc kubenswrapper[4732]: I0402 13:40:54.760474 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-s52gj" podStartSLOduration=134.760454611 podStartE2EDuration="2m14.760454611s" podCreationTimestamp="2026-04-02 13:38:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:40:54.76038906 +0000 UTC m=+211.664796623" watchObservedRunningTime="2026-04-02 13:40:54.760454611 +0000 UTC m=+211.664862164" Apr 02 13:40:54 crc kubenswrapper[4732]: I0402 13:40:54.826163 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=83.826146858 podStartE2EDuration="1m23.826146858s" podCreationTimestamp="2026-04-02 13:39:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:40:54.825892731 +0000 UTC m=+211.730300304" watchObservedRunningTime="2026-04-02 13:40:54.826146858 +0000 UTC m=+211.730554411" Apr 02 13:40:54 crc kubenswrapper[4732]: I0402 13:40:54.826665 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b2l5q" podStartSLOduration=133.826658631 podStartE2EDuration="2m13.826658631s" podCreationTimestamp="2026-04-02 13:38:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:40:54.805735258 +0000 UTC m=+211.710142821" watchObservedRunningTime="2026-04-02 13:40:54.826658631 +0000 UTC m=+211.731066184" Apr 02 13:40:54 crc kubenswrapper[4732]: I0402 13:40:54.862643 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=21.862608182 podStartE2EDuration="21.862608182s" podCreationTimestamp="2026-04-02 13:40:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:40:54.861868972 +0000 UTC m=+211.766276535" watchObservedRunningTime="2026-04-02 13:40:54.862608182 +0000 UTC m=+211.767015735" Apr 02 13:40:54 crc kubenswrapper[4732]: I0402 13:40:54.862823 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=12.862817277 podStartE2EDuration="12.862817277s" podCreationTimestamp="2026-04-02 13:40:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:40:54.838356581 +0000 UTC m=+211.742764134" watchObservedRunningTime="2026-04-02 13:40:54.862817277 +0000 UTC m=+211.767224850" Apr 02 13:40:54 crc kubenswrapper[4732]: I0402 13:40:54.919260 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=37.919243939 podStartE2EDuration="37.919243939s" podCreationTimestamp="2026-04-02 13:40:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:40:54.918844148 +0000 UTC m=+211.823251711" watchObservedRunningTime="2026-04-02 13:40:54.919243939 +0000 UTC m=+211.823651502" Apr 02 13:40:54 crc kubenswrapper[4732]: I0402 13:40:54.919380 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-tg9vx" podStartSLOduration=134.919375902 podStartE2EDuration="2m14.919375902s" podCreationTimestamp="2026-04-02 13:38:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:40:54.905073344 +0000 UTC m=+211.809480897" watchObservedRunningTime="2026-04-02 13:40:54.919375902 +0000 UTC m=+211.823783465" Apr 02 13:40:54 crc kubenswrapper[4732]: E0402 13:40:54.950697 4732 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Apr 02 13:40:54 crc kubenswrapper[4732]: I0402 13:40:54.953077 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-glqvf" podStartSLOduration=134.953064703 podStartE2EDuration="2m14.953064703s" podCreationTimestamp="2026-04-02 13:38:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:40:54.952376155 +0000 UTC m=+211.856783718" watchObservedRunningTime="2026-04-02 13:40:54.953064703 +0000 UTC m=+211.857472256" Apr 02 13:40:54 crc kubenswrapper[4732]: I0402 13:40:54.979126 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-nqhwm" podStartSLOduration=134.979101841 podStartE2EDuration="2m14.979101841s" podCreationTimestamp="2026-04-02 13:38:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:40:54.968372938 +0000 UTC m=+211.872780521" watchObservedRunningTime="2026-04-02 13:40:54.979101841 +0000 UTC m=+211.883509404" Apr 02 13:40:54 crc kubenswrapper[4732]: I0402 13:40:54.987494 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=73.987473743 podStartE2EDuration="1m13.987473743s" podCreationTimestamp="2026-04-02 13:39:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:40:54.986829536 +0000 UTC m=+211.891237119" watchObservedRunningTime="2026-04-02 13:40:54.987473743 +0000 UTC m=+211.891881316" Apr 02 13:40:55 crc kubenswrapper[4732]: I0402 13:40:55.679598 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:40:55 crc kubenswrapper[4732]: I0402 13:40:55.679722 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:40:55 crc kubenswrapper[4732]: E0402 13:40:55.679774 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Apr 02 13:40:55 crc kubenswrapper[4732]: I0402 13:40:55.679734 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 13:40:55 crc kubenswrapper[4732]: E0402 13:40:55.679891 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Apr 02 13:40:55 crc kubenswrapper[4732]: E0402 13:40:55.679957 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Apr 02 13:40:56 crc kubenswrapper[4732]: I0402 13:40:56.679976 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:40:56 crc kubenswrapper[4732]: E0402 13:40:56.680099 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-crx2z" podUID="386bd92b-c67e-4cc6-8a47-6f8d6e799bc7" Apr 02 13:40:57 crc kubenswrapper[4732]: I0402 13:40:57.679259 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 13:40:57 crc kubenswrapper[4732]: I0402 13:40:57.679259 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:40:57 crc kubenswrapper[4732]: I0402 13:40:57.679284 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:40:57 crc kubenswrapper[4732]: E0402 13:40:57.679640 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Apr 02 13:40:57 crc kubenswrapper[4732]: E0402 13:40:57.679781 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Apr 02 13:40:57 crc kubenswrapper[4732]: E0402 13:40:57.679375 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Apr 02 13:40:57 crc kubenswrapper[4732]: I0402 13:40:57.761764 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 02 13:40:57 crc kubenswrapper[4732]: I0402 13:40:57.761939 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 02 13:40:57 crc kubenswrapper[4732]: I0402 13:40:57.761954 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 02 13:40:57 crc kubenswrapper[4732]: I0402 13:40:57.761973 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 02 13:40:57 crc kubenswrapper[4732]: I0402 13:40:57.761988 4732 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-02T13:40:57Z","lastTransitionTime":"2026-04-02T13:40:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Apr 02 13:40:57 crc kubenswrapper[4732]: I0402 13:40:57.800187 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-4brjf"] Apr 02 13:40:57 crc kubenswrapper[4732]: I0402 13:40:57.800525 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4brjf" Apr 02 13:40:57 crc kubenswrapper[4732]: I0402 13:40:57.802273 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Apr 02 13:40:57 crc kubenswrapper[4732]: I0402 13:40:57.802494 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Apr 02 13:40:57 crc kubenswrapper[4732]: I0402 13:40:57.802592 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Apr 02 13:40:57 crc kubenswrapper[4732]: I0402 13:40:57.807187 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Apr 02 13:40:57 crc kubenswrapper[4732]: I0402 13:40:57.960242 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/b5b5b183-8709-49fa-8d99-4b5f89659384-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-4brjf\" (UID: \"b5b5b183-8709-49fa-8d99-4b5f89659384\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4brjf" Apr 02 
13:40:57 crc kubenswrapper[4732]: I0402 13:40:57.960294 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/b5b5b183-8709-49fa-8d99-4b5f89659384-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-4brjf\" (UID: \"b5b5b183-8709-49fa-8d99-4b5f89659384\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4brjf" Apr 02 13:40:57 crc kubenswrapper[4732]: I0402 13:40:57.960334 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b5b5b183-8709-49fa-8d99-4b5f89659384-service-ca\") pod \"cluster-version-operator-5c965bbfc6-4brjf\" (UID: \"b5b5b183-8709-49fa-8d99-4b5f89659384\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4brjf" Apr 02 13:40:57 crc kubenswrapper[4732]: I0402 13:40:57.960504 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b5b5b183-8709-49fa-8d99-4b5f89659384-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-4brjf\" (UID: \"b5b5b183-8709-49fa-8d99-4b5f89659384\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4brjf" Apr 02 13:40:57 crc kubenswrapper[4732]: I0402 13:40:57.960746 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5b5b183-8709-49fa-8d99-4b5f89659384-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-4brjf\" (UID: \"b5b5b183-8709-49fa-8d99-4b5f89659384\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4brjf" Apr 02 13:40:58 crc kubenswrapper[4732]: I0402 13:40:58.061876 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/b5b5b183-8709-49fa-8d99-4b5f89659384-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-4brjf\" (UID: \"b5b5b183-8709-49fa-8d99-4b5f89659384\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4brjf" Apr 02 13:40:58 crc kubenswrapper[4732]: I0402 13:40:58.061986 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/b5b5b183-8709-49fa-8d99-4b5f89659384-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-4brjf\" (UID: \"b5b5b183-8709-49fa-8d99-4b5f89659384\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4brjf" Apr 02 13:40:58 crc kubenswrapper[4732]: I0402 13:40:58.062044 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/b5b5b183-8709-49fa-8d99-4b5f89659384-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-4brjf\" (UID: \"b5b5b183-8709-49fa-8d99-4b5f89659384\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4brjf" Apr 02 13:40:58 crc kubenswrapper[4732]: I0402 13:40:58.062123 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b5b5b183-8709-49fa-8d99-4b5f89659384-service-ca\") pod \"cluster-version-operator-5c965bbfc6-4brjf\" (UID: \"b5b5b183-8709-49fa-8d99-4b5f89659384\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4brjf" Apr 02 13:40:58 crc kubenswrapper[4732]: I0402 13:40:58.062199 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b5b5b183-8709-49fa-8d99-4b5f89659384-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-4brjf\" (UID: \"b5b5b183-8709-49fa-8d99-4b5f89659384\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4brjf" Apr 02 13:40:58 
crc kubenswrapper[4732]: I0402 13:40:58.062232 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/b5b5b183-8709-49fa-8d99-4b5f89659384-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-4brjf\" (UID: \"b5b5b183-8709-49fa-8d99-4b5f89659384\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4brjf" Apr 02 13:40:58 crc kubenswrapper[4732]: I0402 13:40:58.062233 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/b5b5b183-8709-49fa-8d99-4b5f89659384-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-4brjf\" (UID: \"b5b5b183-8709-49fa-8d99-4b5f89659384\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4brjf" Apr 02 13:40:58 crc kubenswrapper[4732]: I0402 13:40:58.063008 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b5b5b183-8709-49fa-8d99-4b5f89659384-service-ca\") pod \"cluster-version-operator-5c965bbfc6-4brjf\" (UID: \"b5b5b183-8709-49fa-8d99-4b5f89659384\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4brjf" Apr 02 13:40:58 crc kubenswrapper[4732]: I0402 13:40:58.067761 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5b5b183-8709-49fa-8d99-4b5f89659384-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-4brjf\" (UID: \"b5b5b183-8709-49fa-8d99-4b5f89659384\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4brjf" Apr 02 13:40:58 crc kubenswrapper[4732]: I0402 13:40:58.090585 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b5b5b183-8709-49fa-8d99-4b5f89659384-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-4brjf\" (UID: 
\"b5b5b183-8709-49fa-8d99-4b5f89659384\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4brjf" Apr 02 13:40:58 crc kubenswrapper[4732]: I0402 13:40:58.116923 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4brjf" Apr 02 13:40:58 crc kubenswrapper[4732]: I0402 13:40:58.678377 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Apr 02 13:40:58 crc kubenswrapper[4732]: I0402 13:40:58.679552 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:40:58 crc kubenswrapper[4732]: E0402 13:40:58.679989 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-crx2z" podUID="386bd92b-c67e-4cc6-8a47-6f8d6e799bc7" Apr 02 13:40:58 crc kubenswrapper[4732]: I0402 13:40:58.686404 4732 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Apr 02 13:40:58 crc kubenswrapper[4732]: I0402 13:40:58.788238 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4brjf" event={"ID":"b5b5b183-8709-49fa-8d99-4b5f89659384","Type":"ContainerStarted","Data":"ce8b67dee07c5516f2ef7a889b55238ab07c4a9f6acac44a6b69feb72338d6ee"} Apr 02 13:40:58 crc kubenswrapper[4732]: I0402 13:40:58.788283 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4brjf" event={"ID":"b5b5b183-8709-49fa-8d99-4b5f89659384","Type":"ContainerStarted","Data":"ef80c3aa307738f654a1d49941c2570e51e22626f5ad7d17270eb9314620e151"} Apr 02 13:40:58 crc kubenswrapper[4732]: I0402 13:40:58.803371 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4brjf" podStartSLOduration=138.803354615 podStartE2EDuration="2m18.803354615s" podCreationTimestamp="2026-04-02 13:38:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:40:58.802557884 +0000 UTC m=+215.706965447" watchObservedRunningTime="2026-04-02 13:40:58.803354615 +0000 UTC m=+215.707762168" Apr 02 13:40:59 crc kubenswrapper[4732]: I0402 13:40:59.679395 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:40:59 crc kubenswrapper[4732]: I0402 13:40:59.679489 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:40:59 crc kubenswrapper[4732]: I0402 13:40:59.679428 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 13:40:59 crc kubenswrapper[4732]: E0402 13:40:59.679553 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Apr 02 13:40:59 crc kubenswrapper[4732]: E0402 13:40:59.679693 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Apr 02 13:40:59 crc kubenswrapper[4732]: E0402 13:40:59.679785 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Apr 02 13:40:59 crc kubenswrapper[4732]: E0402 13:40:59.951701 4732 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Apr 02 13:41:00 crc kubenswrapper[4732]: I0402 13:41:00.679657 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:41:00 crc kubenswrapper[4732]: E0402 13:41:00.679846 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-crx2z" podUID="386bd92b-c67e-4cc6-8a47-6f8d6e799bc7" Apr 02 13:41:01 crc kubenswrapper[4732]: I0402 13:41:01.547078 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 02 13:41:01 crc kubenswrapper[4732]: I0402 13:41:01.679486 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:41:01 crc kubenswrapper[4732]: I0402 13:41:01.679486 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 13:41:01 crc kubenswrapper[4732]: I0402 13:41:01.679548 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:41:01 crc kubenswrapper[4732]: E0402 13:41:01.679760 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Apr 02 13:41:01 crc kubenswrapper[4732]: E0402 13:41:01.679796 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Apr 02 13:41:01 crc kubenswrapper[4732]: E0402 13:41:01.679852 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Apr 02 13:41:01 crc kubenswrapper[4732]: I0402 13:41:01.958192 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" Apr 02 13:41:01 crc kubenswrapper[4732]: I0402 13:41:01.959002 4732 scope.go:117] "RemoveContainer" containerID="8438ae7262b1ff33b5168b83661bede090ebb04853882f438b8fad8650d131d1" Apr 02 13:41:01 crc kubenswrapper[4732]: E0402 13:41:01.959159 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-8qmgp_openshift-ovn-kubernetes(a6f5e483-7d6b-4d6d-be84-303d8f07643e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" podUID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" Apr 02 13:41:02 crc kubenswrapper[4732]: I0402 13:41:02.679834 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:41:02 crc kubenswrapper[4732]: E0402 13:41:02.679980 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-crx2z" podUID="386bd92b-c67e-4cc6-8a47-6f8d6e799bc7" Apr 02 13:41:03 crc kubenswrapper[4732]: I0402 13:41:03.679823 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:41:03 crc kubenswrapper[4732]: I0402 13:41:03.679858 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:41:03 crc kubenswrapper[4732]: I0402 13:41:03.679929 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 13:41:03 crc kubenswrapper[4732]: E0402 13:41:03.679953 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Apr 02 13:41:03 crc kubenswrapper[4732]: E0402 13:41:03.680066 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Apr 02 13:41:03 crc kubenswrapper[4732]: E0402 13:41:03.680143 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Apr 02 13:41:04 crc kubenswrapper[4732]: I0402 13:41:04.679879 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:41:04 crc kubenswrapper[4732]: E0402 13:41:04.680820 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-crx2z" podUID="386bd92b-c67e-4cc6-8a47-6f8d6e799bc7" Apr 02 13:41:04 crc kubenswrapper[4732]: E0402 13:41:04.952194 4732 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Apr 02 13:41:05 crc kubenswrapper[4732]: I0402 13:41:05.679424 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 13:41:05 crc kubenswrapper[4732]: I0402 13:41:05.679424 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:41:05 crc kubenswrapper[4732]: I0402 13:41:05.679440 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:41:05 crc kubenswrapper[4732]: E0402 13:41:05.679665 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Apr 02 13:41:05 crc kubenswrapper[4732]: E0402 13:41:05.679551 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Apr 02 13:41:05 crc kubenswrapper[4732]: E0402 13:41:05.679818 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Apr 02 13:41:06 crc kubenswrapper[4732]: I0402 13:41:06.679944 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:41:06 crc kubenswrapper[4732]: E0402 13:41:06.680136 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-crx2z" podUID="386bd92b-c67e-4cc6-8a47-6f8d6e799bc7" Apr 02 13:41:06 crc kubenswrapper[4732]: I0402 13:41:06.813561 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-s52gj_ad206957-df5c-4b3e-bd35-e798a07d2f4e/kube-multus/1.log" Apr 02 13:41:06 crc kubenswrapper[4732]: I0402 13:41:06.813953 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-s52gj_ad206957-df5c-4b3e-bd35-e798a07d2f4e/kube-multus/0.log" Apr 02 13:41:06 crc kubenswrapper[4732]: I0402 13:41:06.814008 4732 generic.go:334] "Generic (PLEG): container finished" podID="ad206957-df5c-4b3e-bd35-e798a07d2f4e" containerID="59880ea9acc1fa69e9974f404371d2876c60b9ad12942fb9c6a0aa01b0632050" exitCode=1 Apr 02 13:41:06 crc kubenswrapper[4732]: I0402 13:41:06.814040 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-s52gj" event={"ID":"ad206957-df5c-4b3e-bd35-e798a07d2f4e","Type":"ContainerDied","Data":"59880ea9acc1fa69e9974f404371d2876c60b9ad12942fb9c6a0aa01b0632050"} Apr 02 13:41:06 crc kubenswrapper[4732]: I0402 13:41:06.814086 4732 scope.go:117] "RemoveContainer" containerID="169383eed1546547fe6f683a3607c929733ee7049497b740db106a44ce7d1c64" Apr 02 13:41:06 crc kubenswrapper[4732]: I0402 13:41:06.814477 4732 scope.go:117] "RemoveContainer" containerID="59880ea9acc1fa69e9974f404371d2876c60b9ad12942fb9c6a0aa01b0632050" Apr 02 13:41:06 crc kubenswrapper[4732]: E0402 13:41:06.814634 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-s52gj_openshift-multus(ad206957-df5c-4b3e-bd35-e798a07d2f4e)\"" pod="openshift-multus/multus-s52gj" podUID="ad206957-df5c-4b3e-bd35-e798a07d2f4e" Apr 02 13:41:07 crc kubenswrapper[4732]: I0402 13:41:07.679374 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:41:07 crc kubenswrapper[4732]: I0402 13:41:07.679432 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:41:07 crc kubenswrapper[4732]: I0402 13:41:07.679492 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 13:41:07 crc kubenswrapper[4732]: E0402 13:41:07.679610 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Apr 02 13:41:07 crc kubenswrapper[4732]: E0402 13:41:07.679832 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Apr 02 13:41:07 crc kubenswrapper[4732]: E0402 13:41:07.679939 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Apr 02 13:41:07 crc kubenswrapper[4732]: I0402 13:41:07.818113 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-s52gj_ad206957-df5c-4b3e-bd35-e798a07d2f4e/kube-multus/1.log" Apr 02 13:41:08 crc kubenswrapper[4732]: I0402 13:41:08.679576 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:41:08 crc kubenswrapper[4732]: E0402 13:41:08.679800 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-crx2z" podUID="386bd92b-c67e-4cc6-8a47-6f8d6e799bc7" Apr 02 13:41:09 crc kubenswrapper[4732]: I0402 13:41:09.679827 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:41:09 crc kubenswrapper[4732]: E0402 13:41:09.679962 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Apr 02 13:41:09 crc kubenswrapper[4732]: I0402 13:41:09.679988 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:41:09 crc kubenswrapper[4732]: E0402 13:41:09.680106 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Apr 02 13:41:09 crc kubenswrapper[4732]: I0402 13:41:09.680119 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 13:41:09 crc kubenswrapper[4732]: E0402 13:41:09.680295 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Apr 02 13:41:09 crc kubenswrapper[4732]: E0402 13:41:09.953913 4732 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Apr 02 13:41:10 crc kubenswrapper[4732]: I0402 13:41:10.679646 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:41:10 crc kubenswrapper[4732]: E0402 13:41:10.679799 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-crx2z" podUID="386bd92b-c67e-4cc6-8a47-6f8d6e799bc7" Apr 02 13:41:11 crc kubenswrapper[4732]: I0402 13:41:11.679997 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:41:11 crc kubenswrapper[4732]: I0402 13:41:11.680086 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:41:11 crc kubenswrapper[4732]: E0402 13:41:11.680150 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Apr 02 13:41:11 crc kubenswrapper[4732]: I0402 13:41:11.680098 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 13:41:11 crc kubenswrapper[4732]: E0402 13:41:11.680274 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Apr 02 13:41:11 crc kubenswrapper[4732]: E0402 13:41:11.680427 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Apr 02 13:41:12 crc kubenswrapper[4732]: I0402 13:41:12.680352 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:41:12 crc kubenswrapper[4732]: E0402 13:41:12.680539 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-crx2z" podUID="386bd92b-c67e-4cc6-8a47-6f8d6e799bc7" Apr 02 13:41:13 crc kubenswrapper[4732]: I0402 13:41:13.680113 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:41:13 crc kubenswrapper[4732]: I0402 13:41:13.680178 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 13:41:13 crc kubenswrapper[4732]: E0402 13:41:13.680260 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Apr 02 13:41:13 crc kubenswrapper[4732]: I0402 13:41:13.680312 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:41:13 crc kubenswrapper[4732]: E0402 13:41:13.680450 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Apr 02 13:41:13 crc kubenswrapper[4732]: E0402 13:41:13.680569 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Apr 02 13:41:14 crc kubenswrapper[4732]: I0402 13:41:14.680098 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:41:14 crc kubenswrapper[4732]: E0402 13:41:14.680253 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-crx2z" podUID="386bd92b-c67e-4cc6-8a47-6f8d6e799bc7" Apr 02 13:41:14 crc kubenswrapper[4732]: E0402 13:41:14.954461 4732 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Apr 02 13:41:15 crc kubenswrapper[4732]: I0402 13:41:15.679552 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:41:15 crc kubenswrapper[4732]: I0402 13:41:15.679559 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:41:15 crc kubenswrapper[4732]: I0402 13:41:15.679648 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 13:41:15 crc kubenswrapper[4732]: E0402 13:41:15.680193 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Apr 02 13:41:15 crc kubenswrapper[4732]: E0402 13:41:15.680393 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Apr 02 13:41:15 crc kubenswrapper[4732]: E0402 13:41:15.680609 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Apr 02 13:41:15 crc kubenswrapper[4732]: I0402 13:41:15.681251 4732 scope.go:117] "RemoveContainer" containerID="8438ae7262b1ff33b5168b83661bede090ebb04853882f438b8fad8650d131d1" Apr 02 13:41:15 crc kubenswrapper[4732]: I0402 13:41:15.845596 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8qmgp_a6f5e483-7d6b-4d6d-be84-303d8f07643e/ovnkube-controller/3.log" Apr 02 13:41:15 crc kubenswrapper[4732]: I0402 13:41:15.848221 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" event={"ID":"a6f5e483-7d6b-4d6d-be84-303d8f07643e","Type":"ContainerStarted","Data":"3e2ff95a85f25b815a62b7e159ba1b483e166fb7951d270c49c1d1cd5ba86e09"} Apr 02 13:41:16 crc kubenswrapper[4732]: I0402 13:41:16.511601 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-crx2z"] Apr 02 13:41:16 crc 
kubenswrapper[4732]: I0402 13:41:16.511752 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:41:16 crc kubenswrapper[4732]: E0402 13:41:16.511882 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-crx2z" podUID="386bd92b-c67e-4cc6-8a47-6f8d6e799bc7" Apr 02 13:41:16 crc kubenswrapper[4732]: I0402 13:41:16.851876 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" Apr 02 13:41:17 crc kubenswrapper[4732]: I0402 13:41:17.679395 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:41:17 crc kubenswrapper[4732]: I0402 13:41:17.679487 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:41:17 crc kubenswrapper[4732]: E0402 13:41:17.679525 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Apr 02 13:41:17 crc kubenswrapper[4732]: E0402 13:41:17.679605 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Apr 02 13:41:17 crc kubenswrapper[4732]: I0402 13:41:17.679412 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 13:41:17 crc kubenswrapper[4732]: E0402 13:41:17.679799 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Apr 02 13:41:18 crc kubenswrapper[4732]: I0402 13:41:18.679573 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:41:18 crc kubenswrapper[4732]: E0402 13:41:18.679980 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-crx2z" podUID="386bd92b-c67e-4cc6-8a47-6f8d6e799bc7" Apr 02 13:41:18 crc kubenswrapper[4732]: I0402 13:41:18.680388 4732 scope.go:117] "RemoveContainer" containerID="59880ea9acc1fa69e9974f404371d2876c60b9ad12942fb9c6a0aa01b0632050" Apr 02 13:41:18 crc kubenswrapper[4732]: I0402 13:41:18.709523 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" podStartSLOduration=158.70950326 podStartE2EDuration="2m38.70950326s" podCreationTimestamp="2026-04-02 13:38:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:41:16.886729115 +0000 UTC m=+233.791136698" watchObservedRunningTime="2026-04-02 13:41:18.70950326 +0000 UTC m=+235.613910813" Apr 02 13:41:18 crc kubenswrapper[4732]: I0402 13:41:18.859840 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-s52gj_ad206957-df5c-4b3e-bd35-e798a07d2f4e/kube-multus/1.log" Apr 02 13:41:18 crc kubenswrapper[4732]: I0402 13:41:18.859894 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-s52gj" event={"ID":"ad206957-df5c-4b3e-bd35-e798a07d2f4e","Type":"ContainerStarted","Data":"820f77a95a045dd94c63cb41cc026f01d115283a3748994d26bdeafeace13fd8"} Apr 02 13:41:19 crc kubenswrapper[4732]: I0402 13:41:19.679353 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 13:41:19 crc kubenswrapper[4732]: I0402 13:41:19.679474 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:41:19 crc kubenswrapper[4732]: E0402 13:41:19.679511 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Apr 02 13:41:19 crc kubenswrapper[4732]: I0402 13:41:19.679392 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:41:19 crc kubenswrapper[4732]: E0402 13:41:19.680016 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Apr 02 13:41:19 crc kubenswrapper[4732]: E0402 13:41:19.680109 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Apr 02 13:41:19 crc kubenswrapper[4732]: E0402 13:41:19.956854 4732 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Apr 02 13:41:20 crc kubenswrapper[4732]: I0402 13:41:20.679965 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:41:20 crc kubenswrapper[4732]: E0402 13:41:20.680159 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-crx2z" podUID="386bd92b-c67e-4cc6-8a47-6f8d6e799bc7" Apr 02 13:41:21 crc kubenswrapper[4732]: I0402 13:41:21.679401 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 13:41:21 crc kubenswrapper[4732]: I0402 13:41:21.679401 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:41:21 crc kubenswrapper[4732]: E0402 13:41:21.679541 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Apr 02 13:41:21 crc kubenswrapper[4732]: I0402 13:41:21.679422 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:41:21 crc kubenswrapper[4732]: E0402 13:41:21.679754 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Apr 02 13:41:21 crc kubenswrapper[4732]: E0402 13:41:21.679667 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Apr 02 13:41:22 crc kubenswrapper[4732]: I0402 13:41:22.679567 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:41:22 crc kubenswrapper[4732]: E0402 13:41:22.679758 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-crx2z" podUID="386bd92b-c67e-4cc6-8a47-6f8d6e799bc7" Apr 02 13:41:23 crc kubenswrapper[4732]: I0402 13:41:23.679632 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:41:23 crc kubenswrapper[4732]: I0402 13:41:23.679749 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:41:23 crc kubenswrapper[4732]: I0402 13:41:23.679811 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 13:41:23 crc kubenswrapper[4732]: E0402 13:41:23.679776 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Apr 02 13:41:23 crc kubenswrapper[4732]: E0402 13:41:23.679964 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Apr 02 13:41:23 crc kubenswrapper[4732]: E0402 13:41:23.680058 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Apr 02 13:41:24 crc kubenswrapper[4732]: I0402 13:41:24.679381 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:41:24 crc kubenswrapper[4732]: E0402 13:41:24.681447 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-crx2z" podUID="386bd92b-c67e-4cc6-8a47-6f8d6e799bc7" Apr 02 13:41:25 crc kubenswrapper[4732]: I0402 13:41:25.679497 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:41:25 crc kubenswrapper[4732]: I0402 13:41:25.679514 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:41:25 crc kubenswrapper[4732]: I0402 13:41:25.679564 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 13:41:25 crc kubenswrapper[4732]: I0402 13:41:25.682009 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Apr 02 13:41:25 crc kubenswrapper[4732]: I0402 13:41:25.682078 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Apr 02 13:41:25 crc kubenswrapper[4732]: I0402 13:41:25.682271 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Apr 02 13:41:25 crc kubenswrapper[4732]: I0402 13:41:25.683190 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Apr 02 13:41:26 crc kubenswrapper[4732]: I0402 13:41:26.679809 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:41:26 crc kubenswrapper[4732]: I0402 13:41:26.682750 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Apr 02 13:41:26 crc kubenswrapper[4732]: I0402 13:41:26.683288 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.468391 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.497941 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-jbmtp"] Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.498450 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jbmtp" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.498989 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-6fp76"] Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.499899 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6fp76" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.500360 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-86zsp"] Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.500788 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-86zsp" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.501486 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m7sxt"] Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.501976 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m7sxt" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.502509 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-vmngg"] Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.502966 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vmngg" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.503112 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.509152 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-qdvzn"] Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.509582 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-qdvzn" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.511965 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.513314 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r98n2"] Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.513870 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r98n2" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.514046 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.514080 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.514131 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.514244 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.514354 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.514371 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.514398 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.514472 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.514660 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.514666 4732 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.514733 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.514736 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.514804 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.515214 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-76hwt"] Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.515230 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.515237 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.515287 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.515386 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.515395 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.515457 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Apr 02 13:41:28 crc 
kubenswrapper[4732]: I0402 13:41:28.515539 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.515604 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-76hwt" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.515606 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.515662 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.515729 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.515762 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.515824 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.515835 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.515847 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.515868 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.515880 4732 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.515958 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.515961 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.515976 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.516058 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.522123 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.522491 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.522560 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.522575 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.522587 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.523887 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Apr 02 13:41:28 crc kubenswrapper[4732]: 
I0402 13:41:28.531591 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.537719 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.541707 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f998c"] Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.543095 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-54khq"] Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.543751 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.548921 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-54khq" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.548945 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.556340 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.556471 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.556723 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.556953 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f998c" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.557237 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.557728 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.557751 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.559162 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-fvjh7"] Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.559540 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.559694 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fvjh7" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.559885 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sxw8j"] Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.560294 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sxw8j" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.562321 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-95tg7"] Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.562530 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.562687 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-95tg7" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.562902 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-xl58c"] Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.563339 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xl58c" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.564117 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-szr89"] Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.564855 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-szr89" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.571126 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.571973 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.572012 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.572214 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.572364 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.572724 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.572856 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.573217 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.573391 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.573680 4732 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.573841 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.574155 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.574272 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zhxz4"] Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.574856 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.575674 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-845fh"] Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.576275 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-845fh" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.576780 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.576923 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.576783 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.576866 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.577334 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.577979 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.578205 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.578271 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.585273 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.585410 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Apr 02 13:41:28 crc 
kubenswrapper[4732]: I0402 13:41:28.585320 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.585597 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.585383 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.585739 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.593758 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.594151 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.594246 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.594198 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8dxlz"] Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.594217 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.594836 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.594924 4732 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5425ff81-73c8-4fca-b208-9c9dbc6a949d-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-m7sxt\" (UID: \"5425ff81-73c8-4fca-b208-9c9dbc6a949d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m7sxt" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.594987 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c70b5281-74d8-44ff-8f4b-326a3d7192aa-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-86zsp\" (UID: \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-86zsp" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.595031 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c70b5281-74d8-44ff-8f4b-326a3d7192aa-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-86zsp\" (UID: \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-86zsp" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.595047 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.595060 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78ttp\" (UniqueName: \"kubernetes.io/projected/2c662639-32a4-4f78-af37-9b1e65bab4e8-kube-api-access-78ttp\") pod \"route-controller-manager-6576b87f9c-jbmtp\" (UID: \"2c662639-32a4-4f78-af37-9b1e65bab4e8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jbmtp" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.595086 
4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/49c54f44-4a94-4b19-b03d-8469355931d0-etcd-ca\") pod \"etcd-operator-b45778765-qdvzn\" (UID: \"49c54f44-4a94-4b19-b03d-8469355931d0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qdvzn" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.595144 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c70b5281-74d8-44ff-8f4b-326a3d7192aa-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-86zsp\" (UID: \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-86zsp" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.595228 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0bcf0e9-75b6-443e-afd8-e2fb6f807e90-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-95tg7\" (UID: \"d0bcf0e9-75b6-443e-afd8-e2fb6f807e90\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-95tg7" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.595260 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c70b5281-74d8-44ff-8f4b-326a3d7192aa-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-86zsp\" (UID: \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-86zsp" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.595280 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/6d20f42f-7f98-4c97-a28e-17749d977819-trusted-ca\") pod \"ingress-operator-5b745b69d9-fvjh7\" (UID: \"6d20f42f-7f98-4c97-a28e-17749d977819\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fvjh7" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.595297 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e1e85ba5-52ed-4b2f-9901-b04090159f4c-auth-proxy-config\") pod \"machine-approver-56656f9798-6fp76\" (UID: \"e1e85ba5-52ed-4b2f-9901-b04090159f4c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6fp76" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.595312 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/e1e85ba5-52ed-4b2f-9901-b04090159f4c-machine-approver-tls\") pod \"machine-approver-56656f9798-6fp76\" (UID: \"e1e85ba5-52ed-4b2f-9901-b04090159f4c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6fp76" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.595356 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2d75d692-d062-4dfa-b7e7-ea53683bc549-proxy-tls\") pod \"machine-config-controller-84d6567774-xl58c\" (UID: \"2d75d692-d062-4dfa-b7e7-ea53683bc549\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xl58c" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.595385 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c662639-32a4-4f78-af37-9b1e65bab4e8-serving-cert\") pod \"route-controller-manager-6576b87f9c-jbmtp\" (UID: \"2c662639-32a4-4f78-af37-9b1e65bab4e8\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jbmtp" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.595408 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a34a201-8137-4efe-a99a-1ebd89e40c68-config\") pod \"console-operator-58897d9998-76hwt\" (UID: \"4a34a201-8137-4efe-a99a-1ebd89e40c68\") " pod="openshift-console-operator/console-operator-58897d9998-76hwt" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.595430 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmhxj\" (UniqueName: \"kubernetes.io/projected/c70b5281-74d8-44ff-8f4b-326a3d7192aa-kube-api-access-tmhxj\") pod \"oauth-openshift-558db77b4-86zsp\" (UID: \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-86zsp" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.595452 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2d75d692-d062-4dfa-b7e7-ea53683bc549-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-xl58c\" (UID: \"2d75d692-d062-4dfa-b7e7-ea53683bc549\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xl58c" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.595475 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c70b5281-74d8-44ff-8f4b-326a3d7192aa-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-86zsp\" (UID: \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-86zsp" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.595501 4732 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c70b5281-74d8-44ff-8f4b-326a3d7192aa-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-86zsp\" (UID: \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-86zsp" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.595521 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a34a201-8137-4efe-a99a-1ebd89e40c68-serving-cert\") pod \"console-operator-58897d9998-76hwt\" (UID: \"4a34a201-8137-4efe-a99a-1ebd89e40c68\") " pod="openshift-console-operator/console-operator-58897d9998-76hwt" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.595543 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/36f26e27-d72e-42f2-9380-598616e5626b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-f998c\" (UID: \"36f26e27-d72e-42f2-9380-598616e5626b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f998c" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.595564 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ff7c4e9d-5437-412b-867d-1e44dfc73df5-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-r98n2\" (UID: \"ff7c4e9d-5437-412b-867d-1e44dfc73df5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r98n2" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.595586 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6qfk\" (UniqueName: 
\"kubernetes.io/projected/5425ff81-73c8-4fca-b208-9c9dbc6a949d-kube-api-access-q6qfk\") pod \"openshift-apiserver-operator-796bbdcf4f-m7sxt\" (UID: \"5425ff81-73c8-4fca-b208-9c9dbc6a949d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m7sxt" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.595614 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/16ec121b-fdf2-452d-8963-08d6132f7c5c-client-ca\") pod \"controller-manager-879f6c89f-54khq\" (UID: \"16ec121b-fdf2-452d-8963-08d6132f7c5c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-54khq" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.595654 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c70b5281-74d8-44ff-8f4b-326a3d7192aa-audit-dir\") pod \"oauth-openshift-558db77b4-86zsp\" (UID: \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-86zsp" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.595695 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2c662639-32a4-4f78-af37-9b1e65bab4e8-client-ca\") pod \"route-controller-manager-6576b87f9c-jbmtp\" (UID: \"2c662639-32a4-4f78-af37-9b1e65bab4e8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jbmtp" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.595733 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8btn6\" (UniqueName: \"kubernetes.io/projected/16ec121b-fdf2-452d-8963-08d6132f7c5c-kube-api-access-8btn6\") pod \"controller-manager-879f6c89f-54khq\" (UID: \"16ec121b-fdf2-452d-8963-08d6132f7c5c\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-54khq" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.595758 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/36f26e27-d72e-42f2-9380-598616e5626b-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-f998c\" (UID: \"36f26e27-d72e-42f2-9380-598616e5626b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f998c" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.595776 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac5f29d0-13ce-46eb-babc-70f32ac34feb-config\") pod \"kube-apiserver-operator-766d6c64bb-sxw8j\" (UID: \"ac5f29d0-13ce-46eb-babc-70f32ac34feb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sxw8j" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.595794 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/f0c6d23c-1aff-4a2f-b92b-6cb07a0ca8b6-available-featuregates\") pod \"openshift-config-operator-7777fb866f-vmngg\" (UID: \"f0c6d23c-1aff-4a2f-b92b-6cb07a0ca8b6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vmngg" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.595812 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c662639-32a4-4f78-af37-9b1e65bab4e8-config\") pod \"route-controller-manager-6576b87f9c-jbmtp\" (UID: \"2c662639-32a4-4f78-af37-9b1e65bab4e8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jbmtp" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.595869 4732 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16ec121b-fdf2-452d-8963-08d6132f7c5c-serving-cert\") pod \"controller-manager-879f6c89f-54khq\" (UID: \"16ec121b-fdf2-452d-8963-08d6132f7c5c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-54khq" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.595899 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16ec121b-fdf2-452d-8963-08d6132f7c5c-config\") pod \"controller-manager-879f6c89f-54khq\" (UID: \"16ec121b-fdf2-452d-8963-08d6132f7c5c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-54khq" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.595915 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zltjk\" (UniqueName: \"kubernetes.io/projected/36f26e27-d72e-42f2-9380-598616e5626b-kube-api-access-zltjk\") pod \"cluster-image-registry-operator-dc59b4c8b-f998c\" (UID: \"36f26e27-d72e-42f2-9380-598616e5626b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f998c" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.595960 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c70b5281-74d8-44ff-8f4b-326a3d7192aa-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-86zsp\" (UID: \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-86zsp" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.595988 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49c54f44-4a94-4b19-b03d-8469355931d0-config\") pod 
\"etcd-operator-b45778765-qdvzn\" (UID: \"49c54f44-4a94-4b19-b03d-8469355931d0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qdvzn" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.596008 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c70b5281-74d8-44ff-8f4b-326a3d7192aa-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-86zsp\" (UID: \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-86zsp" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.596026 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac5f29d0-13ce-46eb-babc-70f32ac34feb-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-sxw8j\" (UID: \"ac5f29d0-13ce-46eb-babc-70f32ac34feb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sxw8j" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.596080 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6d20f42f-7f98-4c97-a28e-17749d977819-bound-sa-token\") pod \"ingress-operator-5b745b69d9-fvjh7\" (UID: \"6d20f42f-7f98-4c97-a28e-17749d977819\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fvjh7" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.596096 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5mx6\" (UniqueName: \"kubernetes.io/projected/49c54f44-4a94-4b19-b03d-8469355931d0-kube-api-access-c5mx6\") pod \"etcd-operator-b45778765-qdvzn\" (UID: \"49c54f44-4a94-4b19-b03d-8469355931d0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qdvzn" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.596117 4732 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zj7fw\" (UniqueName: \"kubernetes.io/projected/2d75d692-d062-4dfa-b7e7-ea53683bc549-kube-api-access-zj7fw\") pod \"machine-config-controller-84d6567774-xl58c\" (UID: \"2d75d692-d062-4dfa-b7e7-ea53683bc549\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xl58c" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.596133 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0c6d23c-1aff-4a2f-b92b-6cb07a0ca8b6-serving-cert\") pod \"openshift-config-operator-7777fb866f-vmngg\" (UID: \"f0c6d23c-1aff-4a2f-b92b-6cb07a0ca8b6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vmngg" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.596150 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6d20f42f-7f98-4c97-a28e-17749d977819-metrics-tls\") pod \"ingress-operator-5b745b69d9-fvjh7\" (UID: \"6d20f42f-7f98-4c97-a28e-17749d977819\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fvjh7" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.596170 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/36f26e27-d72e-42f2-9380-598616e5626b-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-f998c\" (UID: \"36f26e27-d72e-42f2-9380-598616e5626b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f998c" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.596184 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/49c54f44-4a94-4b19-b03d-8469355931d0-serving-cert\") pod \"etcd-operator-b45778765-qdvzn\" (UID: \"49c54f44-4a94-4b19-b03d-8469355931d0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qdvzn" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.596197 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ac5f29d0-13ce-46eb-babc-70f32ac34feb-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-sxw8j\" (UID: \"ac5f29d0-13ce-46eb-babc-70f32ac34feb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sxw8j" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.596240 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c70b5281-74d8-44ff-8f4b-326a3d7192aa-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-86zsp\" (UID: \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-86zsp" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.596256 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1e85ba5-52ed-4b2f-9901-b04090159f4c-config\") pod \"machine-approver-56656f9798-6fp76\" (UID: \"e1e85ba5-52ed-4b2f-9901-b04090159f4c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6fp76" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.596284 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/49c54f44-4a94-4b19-b03d-8469355931d0-etcd-client\") pod \"etcd-operator-b45778765-qdvzn\" (UID: \"49c54f44-4a94-4b19-b03d-8469355931d0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qdvzn" Apr 02 13:41:28 
crc kubenswrapper[4732]: I0402 13:41:28.596300 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0bcf0e9-75b6-443e-afd8-e2fb6f807e90-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-95tg7\" (UID: \"d0bcf0e9-75b6-443e-afd8-e2fb6f807e90\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-95tg7" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.596316 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c70b5281-74d8-44ff-8f4b-326a3d7192aa-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-86zsp\" (UID: \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-86zsp" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.596333 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rv8j\" (UniqueName: \"kubernetes.io/projected/ff7c4e9d-5437-412b-867d-1e44dfc73df5-kube-api-access-6rv8j\") pod \"cluster-samples-operator-665b6dd947-r98n2\" (UID: \"ff7c4e9d-5437-412b-867d-1e44dfc73df5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r98n2" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.596349 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4a34a201-8137-4efe-a99a-1ebd89e40c68-trusted-ca\") pod \"console-operator-58897d9998-76hwt\" (UID: \"4a34a201-8137-4efe-a99a-1ebd89e40c68\") " pod="openshift-console-operator/console-operator-58897d9998-76hwt" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.596364 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xznt7\" 
(UniqueName: \"kubernetes.io/projected/e1e85ba5-52ed-4b2f-9901-b04090159f4c-kube-api-access-xznt7\") pod \"machine-approver-56656f9798-6fp76\" (UID: \"e1e85ba5-52ed-4b2f-9901-b04090159f4c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6fp76" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.596384 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d0bcf0e9-75b6-443e-afd8-e2fb6f807e90-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-95tg7\" (UID: \"d0bcf0e9-75b6-443e-afd8-e2fb6f807e90\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-95tg7" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.596402 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c70b5281-74d8-44ff-8f4b-326a3d7192aa-audit-policies\") pod \"oauth-openshift-558db77b4-86zsp\" (UID: \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-86zsp" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.596419 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c70b5281-74d8-44ff-8f4b-326a3d7192aa-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-86zsp\" (UID: \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-86zsp" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.596436 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/16ec121b-fdf2-452d-8963-08d6132f7c5c-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-54khq\" (UID: 
\"16ec121b-fdf2-452d-8963-08d6132f7c5c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-54khq" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.596451 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c54f44-4a94-4b19-b03d-8469355931d0-etcd-service-ca\") pod \"etcd-operator-b45778765-qdvzn\" (UID: \"49c54f44-4a94-4b19-b03d-8469355931d0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qdvzn" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.596466 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcckl\" (UniqueName: \"kubernetes.io/projected/4a34a201-8137-4efe-a99a-1ebd89e40c68-kube-api-access-zcckl\") pod \"console-operator-58897d9998-76hwt\" (UID: \"4a34a201-8137-4efe-a99a-1ebd89e40c68\") " pod="openshift-console-operator/console-operator-58897d9998-76hwt" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.596488 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4qvl\" (UniqueName: \"kubernetes.io/projected/6d20f42f-7f98-4c97-a28e-17749d977819-kube-api-access-r4qvl\") pod \"ingress-operator-5b745b69d9-fvjh7\" (UID: \"6d20f42f-7f98-4c97-a28e-17749d977819\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fvjh7" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.596504 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5425ff81-73c8-4fca-b208-9c9dbc6a949d-config\") pod \"openshift-apiserver-operator-796bbdcf4f-m7sxt\" (UID: \"5425ff81-73c8-4fca-b208-9c9dbc6a949d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m7sxt" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.596520 4732 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s255l\" (UniqueName: \"kubernetes.io/projected/f0c6d23c-1aff-4a2f-b92b-6cb07a0ca8b6-kube-api-access-s255l\") pod \"openshift-config-operator-7777fb866f-vmngg\" (UID: \"f0c6d23c-1aff-4a2f-b92b-6cb07a0ca8b6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vmngg" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.597657 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.597867 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.598200 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-8qffd"] Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.598745 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-8qffd" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.601571 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8dxlz" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.603837 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-m4dzs"] Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.604394 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-m4dzs" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.605496 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-s6sm2"] Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.616345 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-s6sm2" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.615030 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.615062 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.618000 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.621604 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.631604 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.633858 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-kpg9f"] Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.634628 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.634878 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-kpg9f" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.635058 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.635766 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-8qcmg"] Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.636545 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-8qcmg" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.637155 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-dq9x9"] Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.637785 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-dq9x9" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.638399 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.641114 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.642254 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.642854 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.643089 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Apr 02 
13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.643279 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.643469 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.643867 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.651756 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2gdhw"]
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.652236 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t8m6h"]
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.652486 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-xzg9j"]
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.652684 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t8m6h"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.652690 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2gdhw"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.653514 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-xzg9j"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.654908 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-6qzc4"]
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.655071 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.655483 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-6qzc4"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.656466 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-w7tht"]
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.657302 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w7tht"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.658710 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-t9dlm"]
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.659239 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-t9dlm"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.659729 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-zmhn7"]
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.660359 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-zmhn7"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.660390 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.664113 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-77jmk"]
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.664836 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ggc4c"]
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.665999 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ggc4c"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.666259 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-77jmk"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.666403 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m7sxt"]
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.667721 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-brl6n"]
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.668301 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29585610-wbzhk"]
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.668804 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29585610-wbzhk"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.669056 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-brl6n"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.669056 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29585620-t897v"]
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.669696 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585620-t897v"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.674661 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-dcwb2"]
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.675442 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dcwb2"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.675507 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-vmngg"]
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.677484 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-jbmtp"]
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.678378 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-86zsp"]
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.679049 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ntdkq"]
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.679747 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ntdkq"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.684949 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.689114 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r98n2"]
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.689226 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f998c"]
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.689296 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-j2z2d"]
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.689944 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-95tg7"]
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.690031 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-fvjh7"]
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.690101 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sxw8j"]
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.690163 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-wbcwp"]
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.690673 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8dxlz"]
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.690766 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-qdvzn"]
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.690895 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-wbcwp"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.691165 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2z2d"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.691184 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zhxz4"]
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.692163 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-8qffd"]
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.693224 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-m4dzs"]
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.697686 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6d20f42f-7f98-4c97-a28e-17749d977819-metrics-tls\") pod \"ingress-operator-5b745b69d9-fvjh7\" (UID: \"6d20f42f-7f98-4c97-a28e-17749d977819\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fvjh7"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.697725 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/36f26e27-d72e-42f2-9380-598616e5626b-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-f998c\" (UID: \"36f26e27-d72e-42f2-9380-598616e5626b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f998c"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.697749 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49c54f44-4a94-4b19-b03d-8469355931d0-serving-cert\") pod \"etcd-operator-b45778765-qdvzn\" (UID: \"49c54f44-4a94-4b19-b03d-8469355931d0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qdvzn"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.697771 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ac5f29d0-13ce-46eb-babc-70f32ac34feb-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-sxw8j\" (UID: \"ac5f29d0-13ce-46eb-babc-70f32ac34feb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sxw8j"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.697800 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9d200061-d82a-4b89-9bea-83a1c7d9eca8-srv-cert\") pod \"catalog-operator-68c6474976-ggc4c\" (UID: \"9d200061-d82a-4b89-9bea-83a1c7d9eca8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ggc4c"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.697822 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1e85ba5-52ed-4b2f-9901-b04090159f4c-config\") pod \"machine-approver-56656f9798-6fp76\" (UID: \"e1e85ba5-52ed-4b2f-9901-b04090159f4c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6fp76"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.697842 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/803d2a64-1416-46cb-ae46-5a8462b057f9-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-2gdhw\" (UID: \"803d2a64-1416-46cb-ae46-5a8462b057f9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2gdhw"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.697913 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c70b5281-74d8-44ff-8f4b-326a3d7192aa-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-86zsp\" (UID: \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-86zsp"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.697959 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3568fcc7-10bd-4972-9782-b97aa3c9c8a0-images\") pod \"machine-api-operator-5694c8668f-kpg9f\" (UID: \"3568fcc7-10bd-4972-9782-b97aa3c9c8a0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kpg9f"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.697982 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4c6l\" (UniqueName: \"kubernetes.io/projected/446ace70-43fa-494c-8bce-13a5ea3ca452-kube-api-access-k4c6l\") pod \"dns-operator-744455d44c-t9dlm\" (UID: \"446ace70-43fa-494c-8bce-13a5ea3ca452\") " pod="openshift-dns-operator/dns-operator-744455d44c-t9dlm"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.698017 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/49c54f44-4a94-4b19-b03d-8469355931d0-etcd-client\") pod \"etcd-operator-b45778765-qdvzn\" (UID: \"49c54f44-4a94-4b19-b03d-8469355931d0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qdvzn"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.698039 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rv8j\" (UniqueName: \"kubernetes.io/projected/ff7c4e9d-5437-412b-867d-1e44dfc73df5-kube-api-access-6rv8j\") pod \"cluster-samples-operator-665b6dd947-r98n2\" (UID: \"ff7c4e9d-5437-412b-867d-1e44dfc73df5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r98n2"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.698060 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3568fcc7-10bd-4972-9782-b97aa3c9c8a0-config\") pod \"machine-api-operator-5694c8668f-kpg9f\" (UID: \"3568fcc7-10bd-4972-9782-b97aa3c9c8a0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kpg9f"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.698079 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cs7h4\" (UniqueName: \"kubernetes.io/projected/d8ff2a93-ff6b-4ef8-9109-de3c22a6f108-kube-api-access-cs7h4\") pod \"router-default-5444994796-6qzc4\" (UID: \"d8ff2a93-ff6b-4ef8-9109-de3c22a6f108\") " pod="openshift-ingress/router-default-5444994796-6qzc4"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.698101 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4a34a201-8137-4efe-a99a-1ebd89e40c68-trusted-ca\") pod \"console-operator-58897d9998-76hwt\" (UID: \"4a34a201-8137-4efe-a99a-1ebd89e40c68\") " pod="openshift-console-operator/console-operator-58897d9998-76hwt"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.698125 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/1eee8837-1a56-40df-b564-bb65ad94d593-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-szr89\" (UID: \"1eee8837-1a56-40df-b564-bb65ad94d593\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-szr89"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.698148 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xznt7\" (UniqueName: \"kubernetes.io/projected/e1e85ba5-52ed-4b2f-9901-b04090159f4c-kube-api-access-xznt7\") pod \"machine-approver-56656f9798-6fp76\" (UID: \"e1e85ba5-52ed-4b2f-9901-b04090159f4c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6fp76"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.698166 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cbd2d545-1f43-45dd-8f7e-23ea76dbc0a4-serving-cert\") pod \"service-ca-operator-777779d784-m4dzs\" (UID: \"cbd2d545-1f43-45dd-8f7e-23ea76dbc0a4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m4dzs"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.698190 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d0bcf0e9-75b6-443e-afd8-e2fb6f807e90-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-95tg7\" (UID: \"d0bcf0e9-75b6-443e-afd8-e2fb6f807e90\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-95tg7"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.698214 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/99e5508c-0d75-4f87-9c07-b53509e461aa-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-s6sm2\" (UID: \"99e5508c-0d75-4f87-9c07-b53509e461aa\") " pod="openshift-marketplace/marketplace-operator-79b997595-s6sm2"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.698242 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c70b5281-74d8-44ff-8f4b-326a3d7192aa-audit-policies\") pod \"oauth-openshift-558db77b4-86zsp\" (UID: \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-86zsp"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.698262 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/16ec121b-fdf2-452d-8963-08d6132f7c5c-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-54khq\" (UID: \"16ec121b-fdf2-452d-8963-08d6132f7c5c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-54khq"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.698284 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2pmz\" (UniqueName: \"kubernetes.io/projected/9d200061-d82a-4b89-9bea-83a1c7d9eca8-kube-api-access-v2pmz\") pod \"catalog-operator-68c6474976-ggc4c\" (UID: \"9d200061-d82a-4b89-9bea-83a1c7d9eca8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ggc4c"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.698306 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4d77c191-7d04-4381-838f-b7a355e7c2d4-console-config\") pod \"console-f9d7485db-dq9x9\" (UID: \"4d77c191-7d04-4381-838f-b7a355e7c2d4\") " pod="openshift-console/console-f9d7485db-dq9x9"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.698325 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4d77c191-7d04-4381-838f-b7a355e7c2d4-service-ca\") pod \"console-f9d7485db-dq9x9\" (UID: \"4d77c191-7d04-4381-838f-b7a355e7c2d4\") " pod="openshift-console/console-f9d7485db-dq9x9"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.698346 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/3568fcc7-10bd-4972-9782-b97aa3c9c8a0-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-kpg9f\" (UID: \"3568fcc7-10bd-4972-9782-b97aa3c9c8a0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kpg9f"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.698364 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d77c191-7d04-4381-838f-b7a355e7c2d4-trusted-ca-bundle\") pod \"console-f9d7485db-dq9x9\" (UID: \"4d77c191-7d04-4381-838f-b7a355e7c2d4\") " pod="openshift-console/console-f9d7485db-dq9x9"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.698383 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/446ace70-43fa-494c-8bce-13a5ea3ca452-metrics-tls\") pod \"dns-operator-744455d44c-t9dlm\" (UID: \"446ace70-43fa-494c-8bce-13a5ea3ca452\") " pod="openshift-dns-operator/dns-operator-744455d44c-t9dlm"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.698404 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c54f44-4a94-4b19-b03d-8469355931d0-etcd-service-ca\") pod \"etcd-operator-b45778765-qdvzn\" (UID: \"49c54f44-4a94-4b19-b03d-8469355931d0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qdvzn"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.698426 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64a68003-b71d-4ac2-aaaf-76b67ed758cd-service-ca-bundle\") pod \"authentication-operator-69f744f599-8qcmg\" (UID: \"64a68003-b71d-4ac2-aaaf-76b67ed758cd\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8qcmg"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.698445 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5425ff81-73c8-4fca-b208-9c9dbc6a949d-config\") pod \"openshift-apiserver-operator-796bbdcf4f-m7sxt\" (UID: \"5425ff81-73c8-4fca-b208-9c9dbc6a949d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m7sxt"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.698469 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s255l\" (UniqueName: \"kubernetes.io/projected/f0c6d23c-1aff-4a2f-b92b-6cb07a0ca8b6-kube-api-access-s255l\") pod \"openshift-config-operator-7777fb866f-vmngg\" (UID: \"f0c6d23c-1aff-4a2f-b92b-6cb07a0ca8b6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vmngg"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.698489 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/d8ff2a93-ff6b-4ef8-9109-de3c22a6f108-stats-auth\") pod \"router-default-5444994796-6qzc4\" (UID: \"d8ff2a93-ff6b-4ef8-9109-de3c22a6f108\") " pod="openshift-ingress/router-default-5444994796-6qzc4"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.698509 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4qvl\" (UniqueName: \"kubernetes.io/projected/6d20f42f-7f98-4c97-a28e-17749d977819-kube-api-access-r4qvl\") pod \"ingress-operator-5b745b69d9-fvjh7\" (UID: \"6d20f42f-7f98-4c97-a28e-17749d977819\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fvjh7"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.698607 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/67577cc5-6dee-4465-beee-ea424d976972-secret-volume\") pod \"collect-profiles-29585610-wbzhk\" (UID: \"67577cc5-6dee-4465-beee-ea424d976972\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29585610-wbzhk"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.698685 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d8ff2a93-ff6b-4ef8-9109-de3c22a6f108-metrics-certs\") pod \"router-default-5444994796-6qzc4\" (UID: \"d8ff2a93-ff6b-4ef8-9109-de3c22a6f108\") " pod="openshift-ingress/router-default-5444994796-6qzc4"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.698812 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78ttp\" (UniqueName: \"kubernetes.io/projected/2c662639-32a4-4f78-af37-9b1e65bab4e8-kube-api-access-78ttp\") pod \"route-controller-manager-6576b87f9c-jbmtp\" (UID: \"2c662639-32a4-4f78-af37-9b1e65bab4e8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jbmtp"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.698833 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/67577cc5-6dee-4465-beee-ea424d976972-config-volume\") pod \"collect-profiles-29585610-wbzhk\" (UID: \"67577cc5-6dee-4465-beee-ea424d976972\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29585610-wbzhk"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.698882 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c70b5281-74d8-44ff-8f4b-326a3d7192aa-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-86zsp\" (UID: \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-86zsp"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.698912 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0bcf0e9-75b6-443e-afd8-e2fb6f807e90-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-95tg7\" (UID: \"d0bcf0e9-75b6-443e-afd8-e2fb6f807e90\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-95tg7"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.698937 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dd5add8c-a0d5-412d-925d-bd21c5893935-auth-proxy-config\") pod \"machine-config-operator-74547568cd-w7tht\" (UID: \"dd5add8c-a0d5-412d-925d-bd21c5893935\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w7tht"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.698961 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c70b5281-74d8-44ff-8f4b-326a3d7192aa-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-86zsp\" (UID: \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-86zsp"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.698982 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6d20f42f-7f98-4c97-a28e-17749d977819-trusted-ca\") pod \"ingress-operator-5b745b69d9-fvjh7\" (UID: \"6d20f42f-7f98-4c97-a28e-17749d977819\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fvjh7"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.699002 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c70b5281-74d8-44ff-8f4b-326a3d7192aa-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-86zsp\" (UID: \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-86zsp"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.699020 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/e1e85ba5-52ed-4b2f-9901-b04090159f4c-machine-approver-tls\") pod \"machine-approver-56656f9798-6fp76\" (UID: \"e1e85ba5-52ed-4b2f-9901-b04090159f4c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6fp76"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.699040 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a34a201-8137-4efe-a99a-1ebd89e40c68-config\") pod \"console-operator-58897d9998-76hwt\" (UID: \"4a34a201-8137-4efe-a99a-1ebd89e40c68\") " pod="openshift-console-operator/console-operator-58897d9998-76hwt"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.699061 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2d75d692-d062-4dfa-b7e7-ea53683bc549-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-xl58c\" (UID: \"2d75d692-d062-4dfa-b7e7-ea53683bc549\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xl58c"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.699081 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5cb3c06a-e3cf-4a60-b180-82759b9d55fc-signing-key\") pod \"service-ca-9c57cc56f-brl6n\" (UID: \"5cb3c06a-e3cf-4a60-b180-82759b9d55fc\") " pod="openshift-service-ca/service-ca-9c57cc56f-brl6n"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.699099 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6t4g\" (UniqueName: \"kubernetes.io/projected/bb62d6fb-d819-4fc1-aa43-35fb1012a1ba-kube-api-access-q6t4g\") pod \"olm-operator-6b444d44fb-845fh\" (UID: \"bb62d6fb-d819-4fc1-aa43-35fb1012a1ba\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-845fh"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.699119 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4d77c191-7d04-4381-838f-b7a355e7c2d4-oauth-serving-cert\") pod \"console-f9d7485db-dq9x9\" (UID: \"4d77c191-7d04-4381-838f-b7a355e7c2d4\") " pod="openshift-console/console-f9d7485db-dq9x9"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.699141 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ff7c4e9d-5437-412b-867d-1e44dfc73df5-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-r98n2\" (UID: \"ff7c4e9d-5437-412b-867d-1e44dfc73df5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r98n2"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.699162 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vmdm\" (UniqueName: \"kubernetes.io/projected/3568fcc7-10bd-4972-9782-b97aa3c9c8a0-kube-api-access-5vmdm\") pod \"machine-api-operator-5694c8668f-kpg9f\" (UID: \"3568fcc7-10bd-4972-9782-b97aa3c9c8a0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kpg9f"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.699177 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4d77c191-7d04-4381-838f-b7a355e7c2d4-console-serving-cert\") pod \"console-f9d7485db-dq9x9\" (UID: \"4d77c191-7d04-4381-838f-b7a355e7c2d4\") " pod="openshift-console/console-f9d7485db-dq9x9"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.699199 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a34a201-8137-4efe-a99a-1ebd89e40c68-serving-cert\") pod \"console-operator-58897d9998-76hwt\" (UID: \"4a34a201-8137-4efe-a99a-1ebd89e40c68\") " pod="openshift-console-operator/console-operator-58897d9998-76hwt"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.699224 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/36f26e27-d72e-42f2-9380-598616e5626b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-f998c\" (UID: \"36f26e27-d72e-42f2-9380-598616e5626b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f998c"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.699244 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/16ec121b-fdf2-452d-8963-08d6132f7c5c-client-ca\") pod \"controller-manager-879f6c89f-54khq\" (UID: \"16ec121b-fdf2-452d-8963-08d6132f7c5c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-54khq"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.699262 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c70b5281-74d8-44ff-8f4b-326a3d7192aa-audit-dir\") pod \"oauth-openshift-558db77b4-86zsp\" (UID: \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-86zsp"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.699279 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/36f26e27-d72e-42f2-9380-598616e5626b-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-f998c\" (UID: \"36f26e27-d72e-42f2-9380-598616e5626b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f998c"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.699299 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16ec121b-fdf2-452d-8963-08d6132f7c5c-serving-cert\") pod \"controller-manager-879f6c89f-54khq\" (UID: \"16ec121b-fdf2-452d-8963-08d6132f7c5c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-54khq"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.699319 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5cb3c06a-e3cf-4a60-b180-82759b9d55fc-signing-cabundle\") pod \"service-ca-9c57cc56f-brl6n\" (UID: \"5cb3c06a-e3cf-4a60-b180-82759b9d55fc\") " pod="openshift-service-ca/service-ca-9c57cc56f-brl6n"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.699338 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49c54f44-4a94-4b19-b03d-8469355931d0-config\") pod \"etcd-operator-b45778765-qdvzn\" (UID: \"49c54f44-4a94-4b19-b03d-8469355931d0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qdvzn"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.699355 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zltjk\" (UniqueName: \"kubernetes.io/projected/36f26e27-d72e-42f2-9380-598616e5626b-kube-api-access-zltjk\") pod \"cluster-image-registry-operator-dc59b4c8b-f998c\" (UID: \"36f26e27-d72e-42f2-9380-598616e5626b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f998c"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.699376 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbd2d545-1f43-45dd-8f7e-23ea76dbc0a4-config\") pod \"service-ca-operator-777779d784-m4dzs\" (UID: \"cbd2d545-1f43-45dd-8f7e-23ea76dbc0a4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m4dzs"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.699397 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c70b5281-74d8-44ff-8f4b-326a3d7192aa-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-86zsp\" (UID: \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-86zsp"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.699418 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6d20f42f-7f98-4c97-a28e-17749d977819-bound-sa-token\") pod \"ingress-operator-5b745b69d9-fvjh7\" (UID: \"6d20f42f-7f98-4c97-a28e-17749d977819\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fvjh7"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.699440 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/bb62d6fb-d819-4fc1-aa43-35fb1012a1ba-profile-collector-cert\") pod \"olm-operator-6b444d44fb-845fh\" (UID: \"bb62d6fb-d819-4fc1-aa43-35fb1012a1ba\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-845fh"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.699457 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dd5add8c-a0d5-412d-925d-bd21c5893935-proxy-tls\") pod \"machine-config-operator-74547568cd-w7tht\" (UID: \"dd5add8c-a0d5-412d-925d-bd21c5893935\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w7tht"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.699476 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64a68003-b71d-4ac2-aaaf-76b67ed758cd-config\") pod \"authentication-operator-69f744f599-8qcmg\" (UID: \"64a68003-b71d-4ac2-aaaf-76b67ed758cd\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8qcmg"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.699497 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zj7fw\" (UniqueName: \"kubernetes.io/projected/2d75d692-d062-4dfa-b7e7-ea53683bc549-kube-api-access-zj7fw\") pod \"machine-config-controller-84d6567774-xl58c\" (UID: \"2d75d692-d062-4dfa-b7e7-ea53683bc549\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xl58c"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.699517 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0c6d23c-1aff-4a2f-b92b-6cb07a0ca8b6-serving-cert\") pod \"openshift-config-operator-7777fb866f-vmngg\" (UID: \"f0c6d23c-1aff-4a2f-b92b-6cb07a0ca8b6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vmngg"
Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.699688 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/dd5add8c-a0d5-412d-925d-bd21c5893935-images\") pod \"machine-config-operator-74547568cd-w7tht\" (UID: \"dd5add8c-a0d5-412d-925d-bd21c5893935\") "
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w7tht" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.700005 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0bcf0e9-75b6-443e-afd8-e2fb6f807e90-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-95tg7\" (UID: \"d0bcf0e9-75b6-443e-afd8-e2fb6f807e90\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-95tg7" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.700066 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82sjv\" (UniqueName: \"kubernetes.io/projected/803d2a64-1416-46cb-ae46-5a8462b057f9-kube-api-access-82sjv\") pod \"openshift-controller-manager-operator-756b6f6bc6-2gdhw\" (UID: \"803d2a64-1416-46cb-ae46-5a8462b057f9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2gdhw" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.700098 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqcj6\" (UniqueName: \"kubernetes.io/projected/67577cc5-6dee-4465-beee-ea424d976972-kube-api-access-rqcj6\") pod \"collect-profiles-29585610-wbzhk\" (UID: \"67577cc5-6dee-4465-beee-ea424d976972\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29585610-wbzhk" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.700134 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c70b5281-74d8-44ff-8f4b-326a3d7192aa-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-86zsp\" (UID: \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-86zsp" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.700157 4732 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkkwm\" (UniqueName: \"kubernetes.io/projected/4d77c191-7d04-4381-838f-b7a355e7c2d4-kube-api-access-nkkwm\") pod \"console-f9d7485db-dq9x9\" (UID: \"4d77c191-7d04-4381-838f-b7a355e7c2d4\") " pod="openshift-console/console-f9d7485db-dq9x9" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.700446 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c70b5281-74d8-44ff-8f4b-326a3d7192aa-audit-policies\") pod \"oauth-openshift-558db77b4-86zsp\" (UID: \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-86zsp" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.700937 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1e85ba5-52ed-4b2f-9901-b04090159f4c-config\") pod \"machine-approver-56656f9798-6fp76\" (UID: \"e1e85ba5-52ed-4b2f-9901-b04090159f4c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6fp76" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.701425 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/16ec121b-fdf2-452d-8963-08d6132f7c5c-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-54khq\" (UID: \"16ec121b-fdf2-452d-8963-08d6132f7c5c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-54khq" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.701950 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c54f44-4a94-4b19-b03d-8469355931d0-etcd-service-ca\") pod \"etcd-operator-b45778765-qdvzn\" (UID: \"49c54f44-4a94-4b19-b03d-8469355931d0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qdvzn" Apr 02 13:41:28 crc 
kubenswrapper[4732]: I0402 13:41:28.702403 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-dcwb2"] Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.702432 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5425ff81-73c8-4fca-b208-9c9dbc6a949d-config\") pod \"openshift-apiserver-operator-796bbdcf4f-m7sxt\" (UID: \"5425ff81-73c8-4fca-b208-9c9dbc6a949d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m7sxt" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.702447 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-t9dlm"] Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.702548 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-76hwt"] Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.702893 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.704098 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/99e5508c-0d75-4f87-9c07-b53509e461aa-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-s6sm2\" (UID: \"99e5508c-0d75-4f87-9c07-b53509e461aa\") " pod="openshift-marketplace/marketplace-operator-79b997595-s6sm2" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.704174 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5f0e45b7-fac8-406c-bbe9-e92490d95fda-webhook-cert\") pod \"packageserver-d55dfcdfc-77jmk\" (UID: \"5f0e45b7-fac8-406c-bbe9-e92490d95fda\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-77jmk" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.704209 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c70b5281-74d8-44ff-8f4b-326a3d7192aa-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-86zsp\" (UID: \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-86zsp" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.704242 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/5f0e45b7-fac8-406c-bbe9-e92490d95fda-tmpfs\") pod \"packageserver-d55dfcdfc-77jmk\" (UID: \"5f0e45b7-fac8-406c-bbe9-e92490d95fda\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-77jmk" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.704330 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0bcf0e9-75b6-443e-afd8-e2fb6f807e90-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-95tg7\" (UID: \"d0bcf0e9-75b6-443e-afd8-e2fb6f807e90\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-95tg7" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.704437 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c70b5281-74d8-44ff-8f4b-326a3d7192aa-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-86zsp\" (UID: \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-86zsp" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.705416 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/ff413ea9-0a19-4fc0-8067-9521bc9e472c-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-8qffd\" (UID: \"ff413ea9-0a19-4fc0-8067-9521bc9e472c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8qffd" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.706138 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/49c54f44-4a94-4b19-b03d-8469355931d0-etcd-client\") pod \"etcd-operator-b45778765-qdvzn\" (UID: \"49c54f44-4a94-4b19-b03d-8469355931d0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qdvzn" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.706449 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a34a201-8137-4efe-a99a-1ebd89e40c68-config\") pod \"console-operator-58897d9998-76hwt\" (UID: \"4a34a201-8137-4efe-a99a-1ebd89e40c68\") " pod="openshift-console-operator/console-operator-58897d9998-76hwt" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.706577 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4a34a201-8137-4efe-a99a-1ebd89e40c68-trusted-ca\") pod \"console-operator-58897d9998-76hwt\" (UID: \"4a34a201-8137-4efe-a99a-1ebd89e40c68\") " pod="openshift-console-operator/console-operator-58897d9998-76hwt" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.707356 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2d75d692-d062-4dfa-b7e7-ea53683bc549-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-xl58c\" (UID: \"2d75d692-d062-4dfa-b7e7-ea53683bc549\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xl58c" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.707446 4732 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-zcckl\" (UniqueName: \"kubernetes.io/projected/4a34a201-8137-4efe-a99a-1ebd89e40c68-kube-api-access-zcckl\") pod \"console-operator-58897d9998-76hwt\" (UID: \"4a34a201-8137-4efe-a99a-1ebd89e40c68\") " pod="openshift-console-operator/console-operator-58897d9998-76hwt" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.709593 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/e1e85ba5-52ed-4b2f-9901-b04090159f4c-machine-approver-tls\") pod \"machine-approver-56656f9798-6fp76\" (UID: \"e1e85ba5-52ed-4b2f-9901-b04090159f4c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6fp76" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.711856 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c70b5281-74d8-44ff-8f4b-326a3d7192aa-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-86zsp\" (UID: \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-86zsp" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.712152 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a34a201-8137-4efe-a99a-1ebd89e40c68-serving-cert\") pod \"console-operator-58897d9998-76hwt\" (UID: \"4a34a201-8137-4efe-a99a-1ebd89e40c68\") " pod="openshift-console-operator/console-operator-58897d9998-76hwt" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.707500 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8qcb\" (UniqueName: \"kubernetes.io/projected/5cb3c06a-e3cf-4a60-b180-82759b9d55fc-kube-api-access-m8qcb\") pod \"service-ca-9c57cc56f-brl6n\" (UID: \"5cb3c06a-e3cf-4a60-b180-82759b9d55fc\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-brl6n" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.712197 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/16ec121b-fdf2-452d-8963-08d6132f7c5c-client-ca\") pod \"controller-manager-879f6c89f-54khq\" (UID: \"16ec121b-fdf2-452d-8963-08d6132f7c5c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-54khq" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.712219 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/49c54f44-4a94-4b19-b03d-8469355931d0-etcd-ca\") pod \"etcd-operator-b45778765-qdvzn\" (UID: \"49c54f44-4a94-4b19-b03d-8469355931d0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qdvzn" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.712242 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5425ff81-73c8-4fca-b208-9c9dbc6a949d-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-m7sxt\" (UID: \"5425ff81-73c8-4fca-b208-9c9dbc6a949d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m7sxt" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.712287 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c70b5281-74d8-44ff-8f4b-326a3d7192aa-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-86zsp\" (UID: \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-86zsp" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.712308 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/9d200061-d82a-4b89-9bea-83a1c7d9eca8-profile-collector-cert\") pod \"catalog-operator-68c6474976-ggc4c\" (UID: \"9d200061-d82a-4b89-9bea-83a1c7d9eca8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ggc4c" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.712327 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6kwg\" (UniqueName: \"kubernetes.io/projected/dd5add8c-a0d5-412d-925d-bd21c5893935-kube-api-access-v6kwg\") pod \"machine-config-operator-74547568cd-w7tht\" (UID: \"dd5add8c-a0d5-412d-925d-bd21c5893935\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w7tht" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.712355 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d8ff2a93-ff6b-4ef8-9109-de3c22a6f108-service-ca-bundle\") pod \"router-default-5444994796-6qzc4\" (UID: \"d8ff2a93-ff6b-4ef8-9109-de3c22a6f108\") " pod="openshift-ingress/router-default-5444994796-6qzc4" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.712370 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqtpb\" (UniqueName: \"kubernetes.io/projected/99e5508c-0d75-4f87-9c07-b53509e461aa-kube-api-access-kqtpb\") pod \"marketplace-operator-79b997595-s6sm2\" (UID: \"99e5508c-0d75-4f87-9c07-b53509e461aa\") " pod="openshift-marketplace/marketplace-operator-79b997595-s6sm2" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.712379 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c70b5281-74d8-44ff-8f4b-326a3d7192aa-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-86zsp\" (UID: \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-86zsp" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.712498 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49c54f44-4a94-4b19-b03d-8469355931d0-serving-cert\") pod \"etcd-operator-b45778765-qdvzn\" (UID: \"49c54f44-4a94-4b19-b03d-8469355931d0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qdvzn" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.712964 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ff7c4e9d-5437-412b-867d-1e44dfc73df5-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-r98n2\" (UID: \"ff7c4e9d-5437-412b-867d-1e44dfc73df5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r98n2" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.713008 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/36f26e27-d72e-42f2-9380-598616e5626b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-f998c\" (UID: \"36f26e27-d72e-42f2-9380-598616e5626b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f998c" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.713013 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c70b5281-74d8-44ff-8f4b-326a3d7192aa-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-86zsp\" (UID: \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-86zsp" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.713043 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/c70b5281-74d8-44ff-8f4b-326a3d7192aa-audit-dir\") pod \"oauth-openshift-558db77b4-86zsp\" (UID: \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-86zsp" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.713294 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49c54f44-4a94-4b19-b03d-8469355931d0-config\") pod \"etcd-operator-b45778765-qdvzn\" (UID: \"49c54f44-4a94-4b19-b03d-8469355931d0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qdvzn" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.713367 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-77jmk"] Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.713799 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/36f26e27-d72e-42f2-9380-598616e5626b-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-f998c\" (UID: \"36f26e27-d72e-42f2-9380-598616e5626b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f998c" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.713879 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c70b5281-74d8-44ff-8f4b-326a3d7192aa-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-86zsp\" (UID: \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-86zsp" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.713906 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6d20f42f-7f98-4c97-a28e-17749d977819-trusted-ca\") pod \"ingress-operator-5b745b69d9-fvjh7\" (UID: \"6d20f42f-7f98-4c97-a28e-17749d977819\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fvjh7" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.713950 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/49c54f44-4a94-4b19-b03d-8469355931d0-etcd-ca\") pod \"etcd-operator-b45778765-qdvzn\" (UID: \"49c54f44-4a94-4b19-b03d-8469355931d0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qdvzn" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.714021 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6d20f42f-7f98-4c97-a28e-17749d977819-metrics-tls\") pod \"ingress-operator-5b745b69d9-fvjh7\" (UID: \"6d20f42f-7f98-4c97-a28e-17749d977819\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fvjh7" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.714196 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0c6d23c-1aff-4a2f-b92b-6cb07a0ca8b6-serving-cert\") pod \"openshift-config-operator-7777fb866f-vmngg\" (UID: \"f0c6d23c-1aff-4a2f-b92b-6cb07a0ca8b6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vmngg" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.714210 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-s6sm2"] Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.714258 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c662639-32a4-4f78-af37-9b1e65bab4e8-serving-cert\") pod \"route-controller-manager-6576b87f9c-jbmtp\" (UID: \"2c662639-32a4-4f78-af37-9b1e65bab4e8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jbmtp" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.714290 4732 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e1e85ba5-52ed-4b2f-9901-b04090159f4c-auth-proxy-config\") pod \"machine-approver-56656f9798-6fp76\" (UID: \"e1e85ba5-52ed-4b2f-9901-b04090159f4c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6fp76" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.714310 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/803d2a64-1416-46cb-ae46-5a8462b057f9-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-2gdhw\" (UID: \"803d2a64-1416-46cb-ae46-5a8462b057f9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2gdhw" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.714347 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8njx\" (UniqueName: \"kubernetes.io/projected/ff413ea9-0a19-4fc0-8067-9521bc9e472c-kube-api-access-f8njx\") pod \"multus-admission-controller-857f4d67dd-8qffd\" (UID: \"ff413ea9-0a19-4fc0-8067-9521bc9e472c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8qffd" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.714828 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0bcf0e9-75b6-443e-afd8-e2fb6f807e90-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-95tg7\" (UID: \"d0bcf0e9-75b6-443e-afd8-e2fb6f807e90\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-95tg7" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.714846 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e1e85ba5-52ed-4b2f-9901-b04090159f4c-auth-proxy-config\") pod 
\"machine-approver-56656f9798-6fp76\" (UID: \"e1e85ba5-52ed-4b2f-9901-b04090159f4c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6fp76" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.714885 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2d75d692-d062-4dfa-b7e7-ea53683bc549-proxy-tls\") pod \"machine-config-controller-84d6567774-xl58c\" (UID: \"2d75d692-d062-4dfa-b7e7-ea53683bc549\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xl58c" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.714910 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzpjt\" (UniqueName: \"kubernetes.io/projected/5f0e45b7-fac8-406c-bbe9-e92490d95fda-kube-api-access-jzpjt\") pod \"packageserver-d55dfcdfc-77jmk\" (UID: \"5f0e45b7-fac8-406c-bbe9-e92490d95fda\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-77jmk" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.714937 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5425ff81-73c8-4fca-b208-9c9dbc6a949d-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-m7sxt\" (UID: \"5425ff81-73c8-4fca-b208-9c9dbc6a949d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m7sxt" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.714946 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c70b5281-74d8-44ff-8f4b-326a3d7192aa-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-86zsp\" (UID: \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-86zsp" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.714974 
4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmhxj\" (UniqueName: \"kubernetes.io/projected/c70b5281-74d8-44ff-8f4b-326a3d7192aa-kube-api-access-tmhxj\") pod \"oauth-openshift-558db77b4-86zsp\" (UID: \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-86zsp" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.715074 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/d8ff2a93-ff6b-4ef8-9109-de3c22a6f108-default-certificate\") pod \"router-default-5444994796-6qzc4\" (UID: \"d8ff2a93-ff6b-4ef8-9109-de3c22a6f108\") " pod="openshift-ingress/router-default-5444994796-6qzc4" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.715107 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc9pl\" (UniqueName: \"kubernetes.io/projected/1eee8837-1a56-40df-b564-bb65ad94d593-kube-api-access-xc9pl\") pod \"control-plane-machine-set-operator-78cbb6b69f-szr89\" (UID: \"1eee8837-1a56-40df-b564-bb65ad94d593\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-szr89" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.715128 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c70b5281-74d8-44ff-8f4b-326a3d7192aa-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-86zsp\" (UID: \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-86zsp" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.715163 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6qfk\" (UniqueName: \"kubernetes.io/projected/5425ff81-73c8-4fca-b208-9c9dbc6a949d-kube-api-access-q6qfk\") pod 
\"openshift-apiserver-operator-796bbdcf4f-m7sxt\" (UID: \"5425ff81-73c8-4fca-b208-9c9dbc6a949d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m7sxt" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.715183 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8fmm\" (UniqueName: \"kubernetes.io/projected/4e9d3578-0893-4852-80b6-999e5a7ccdc5-kube-api-access-f8fmm\") pod \"downloads-7954f5f757-zmhn7\" (UID: \"4e9d3578-0893-4852-80b6-999e5a7ccdc5\") " pod="openshift-console/downloads-7954f5f757-zmhn7" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.715330 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5f0e45b7-fac8-406c-bbe9-e92490d95fda-apiservice-cert\") pod \"packageserver-d55dfcdfc-77jmk\" (UID: \"5f0e45b7-fac8-406c-bbe9-e92490d95fda\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-77jmk" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.715356 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2c662639-32a4-4f78-af37-9b1e65bab4e8-client-ca\") pod \"route-controller-manager-6576b87f9c-jbmtp\" (UID: \"2c662639-32a4-4f78-af37-9b1e65bab4e8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jbmtp" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.715463 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8btn6\" (UniqueName: \"kubernetes.io/projected/16ec121b-fdf2-452d-8963-08d6132f7c5c-kube-api-access-8btn6\") pod \"controller-manager-879f6c89f-54khq\" (UID: \"16ec121b-fdf2-452d-8963-08d6132f7c5c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-54khq" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.715520 4732 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c662639-32a4-4f78-af37-9b1e65bab4e8-config\") pod \"route-controller-manager-6576b87f9c-jbmtp\" (UID: \"2c662639-32a4-4f78-af37-9b1e65bab4e8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jbmtp" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.715666 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac5f29d0-13ce-46eb-babc-70f32ac34feb-config\") pod \"kube-apiserver-operator-766d6c64bb-sxw8j\" (UID: \"ac5f29d0-13ce-46eb-babc-70f32ac34feb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sxw8j" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.715813 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/f0c6d23c-1aff-4a2f-b92b-6cb07a0ca8b6-available-featuregates\") pod \"openshift-config-operator-7777fb866f-vmngg\" (UID: \"f0c6d23c-1aff-4a2f-b92b-6cb07a0ca8b6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vmngg" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.716178 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2c662639-32a4-4f78-af37-9b1e65bab4e8-client-ca\") pod \"route-controller-manager-6576b87f9c-jbmtp\" (UID: \"2c662639-32a4-4f78-af37-9b1e65bab4e8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jbmtp" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.716407 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16ec121b-fdf2-452d-8963-08d6132f7c5c-serving-cert\") pod \"controller-manager-879f6c89f-54khq\" (UID: \"16ec121b-fdf2-452d-8963-08d6132f7c5c\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-54khq" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.716450 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/f0c6d23c-1aff-4a2f-b92b-6cb07a0ca8b6-available-featuregates\") pod \"openshift-config-operator-7777fb866f-vmngg\" (UID: \"f0c6d23c-1aff-4a2f-b92b-6cb07a0ca8b6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vmngg" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.716685 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c662639-32a4-4f78-af37-9b1e65bab4e8-config\") pod \"route-controller-manager-6576b87f9c-jbmtp\" (UID: \"2c662639-32a4-4f78-af37-9b1e65bab4e8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jbmtp" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.716734 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac5f29d0-13ce-46eb-babc-70f32ac34feb-config\") pod \"kube-apiserver-operator-766d6c64bb-sxw8j\" (UID: \"ac5f29d0-13ce-46eb-babc-70f32ac34feb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sxw8j" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.716744 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dktjz\" (UniqueName: \"kubernetes.io/projected/cbd2d545-1f43-45dd-8f7e-23ea76dbc0a4-kube-api-access-dktjz\") pod \"service-ca-operator-777779d784-m4dzs\" (UID: \"cbd2d545-1f43-45dd-8f7e-23ea76dbc0a4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m4dzs" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.716803 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/c70b5281-74d8-44ff-8f4b-326a3d7192aa-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-86zsp\" (UID: \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-86zsp" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.716860 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16ec121b-fdf2-452d-8963-08d6132f7c5c-config\") pod \"controller-manager-879f6c89f-54khq\" (UID: \"16ec121b-fdf2-452d-8963-08d6132f7c5c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-54khq" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.716969 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c70b5281-74d8-44ff-8f4b-326a3d7192aa-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-86zsp\" (UID: \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-86zsp" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.717027 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64a68003-b71d-4ac2-aaaf-76b67ed758cd-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-8qcmg\" (UID: \"64a68003-b71d-4ac2-aaaf-76b67ed758cd\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8qcmg" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.717050 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64a68003-b71d-4ac2-aaaf-76b67ed758cd-serving-cert\") pod \"authentication-operator-69f744f599-8qcmg\" (UID: \"64a68003-b71d-4ac2-aaaf-76b67ed758cd\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-8qcmg" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.717069 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfsxg\" (UniqueName: \"kubernetes.io/projected/64a68003-b71d-4ac2-aaaf-76b67ed758cd-kube-api-access-vfsxg\") pod \"authentication-operator-69f744f599-8qcmg\" (UID: \"64a68003-b71d-4ac2-aaaf-76b67ed758cd\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8qcmg" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.717133 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/bb62d6fb-d819-4fc1-aa43-35fb1012a1ba-srv-cert\") pod \"olm-operator-6b444d44fb-845fh\" (UID: \"bb62d6fb-d819-4fc1-aa43-35fb1012a1ba\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-845fh" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.717151 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4d77c191-7d04-4381-838f-b7a355e7c2d4-console-oauth-config\") pod \"console-f9d7485db-dq9x9\" (UID: \"4d77c191-7d04-4381-838f-b7a355e7c2d4\") " pod="openshift-console/console-f9d7485db-dq9x9" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.717170 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac5f29d0-13ce-46eb-babc-70f32ac34feb-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-sxw8j\" (UID: \"ac5f29d0-13ce-46eb-babc-70f32ac34feb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sxw8j" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.717784 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/c70b5281-74d8-44ff-8f4b-326a3d7192aa-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-86zsp\" (UID: \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-86zsp" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.717872 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5mx6\" (UniqueName: \"kubernetes.io/projected/49c54f44-4a94-4b19-b03d-8469355931d0-kube-api-access-c5mx6\") pod \"etcd-operator-b45778765-qdvzn\" (UID: \"49c54f44-4a94-4b19-b03d-8469355931d0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qdvzn" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.718400 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16ec121b-fdf2-452d-8963-08d6132f7c5c-config\") pod \"controller-manager-879f6c89f-54khq\" (UID: \"16ec121b-fdf2-452d-8963-08d6132f7c5c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-54khq" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.718562 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c70b5281-74d8-44ff-8f4b-326a3d7192aa-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-86zsp\" (UID: \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-86zsp" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.719031 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-brl6n"] Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.720983 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-xl58c"] Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.720737 4732 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-api"/"kube-rbac-proxy" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.721447 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2d75d692-d062-4dfa-b7e7-ea53683bc549-proxy-tls\") pod \"machine-config-controller-84d6567774-xl58c\" (UID: \"2d75d692-d062-4dfa-b7e7-ea53683bc549\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xl58c" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.721992 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c662639-32a4-4f78-af37-9b1e65bab4e8-serving-cert\") pod \"route-controller-manager-6576b87f9c-jbmtp\" (UID: \"2c662639-32a4-4f78-af37-9b1e65bab4e8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jbmtp" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.723830 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac5f29d0-13ce-46eb-babc-70f32ac34feb-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-sxw8j\" (UID: \"ac5f29d0-13ce-46eb-babc-70f32ac34feb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sxw8j" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.723936 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c70b5281-74d8-44ff-8f4b-326a3d7192aa-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-86zsp\" (UID: \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-86zsp" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.724306 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2gdhw"] Apr 02 13:41:28 crc 
kubenswrapper[4732]: I0402 13:41:28.724387 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c70b5281-74d8-44ff-8f4b-326a3d7192aa-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-86zsp\" (UID: \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-86zsp" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.725205 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c70b5281-74d8-44ff-8f4b-326a3d7192aa-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-86zsp\" (UID: \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-86zsp" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.726596 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-54khq"] Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.727738 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-szr89"] Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.729743 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-dq9x9"] Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.733954 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-w7tht"] Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.735907 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-845fh"] Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.737386 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29585610-wbzhk"] Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.739358 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-xzg9j"] Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.741817 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ggc4c"] Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.743534 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.747819 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-tfwcw"] Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.748581 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-tfwcw" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.753701 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-cbp4s"] Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.754375 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-cbp4s" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.756341 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wbcwp"] Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.758695 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-8qcmg"] Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.760725 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-zmhn7"] Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.760736 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.762928 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-j2z2d"] Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.764114 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t8m6h"] Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.765137 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585620-t897v"] Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.766230 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-kpg9f"] Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.768171 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-cbp4s"] Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.770240 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ntdkq"] Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 
13:41:28.771058 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-k2sdm"] Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.772134 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-k2sdm" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.772671 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-k2sdm"] Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.779949 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.808350 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.818483 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/d8ff2a93-ff6b-4ef8-9109-de3c22a6f108-default-certificate\") pod \"router-default-5444994796-6qzc4\" (UID: \"d8ff2a93-ff6b-4ef8-9109-de3c22a6f108\") " pod="openshift-ingress/router-default-5444994796-6qzc4" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.818525 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xc9pl\" (UniqueName: \"kubernetes.io/projected/1eee8837-1a56-40df-b564-bb65ad94d593-kube-api-access-xc9pl\") pod \"control-plane-machine-set-operator-78cbb6b69f-szr89\" (UID: \"1eee8837-1a56-40df-b564-bb65ad94d593\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-szr89" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.818552 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8fmm\" (UniqueName: 
\"kubernetes.io/projected/4e9d3578-0893-4852-80b6-999e5a7ccdc5-kube-api-access-f8fmm\") pod \"downloads-7954f5f757-zmhn7\" (UID: \"4e9d3578-0893-4852-80b6-999e5a7ccdc5\") " pod="openshift-console/downloads-7954f5f757-zmhn7" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.818568 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5f0e45b7-fac8-406c-bbe9-e92490d95fda-apiservice-cert\") pod \"packageserver-d55dfcdfc-77jmk\" (UID: \"5f0e45b7-fac8-406c-bbe9-e92490d95fda\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-77jmk" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.818592 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dktjz\" (UniqueName: \"kubernetes.io/projected/cbd2d545-1f43-45dd-8f7e-23ea76dbc0a4-kube-api-access-dktjz\") pod \"service-ca-operator-777779d784-m4dzs\" (UID: \"cbd2d545-1f43-45dd-8f7e-23ea76dbc0a4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m4dzs" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.818608 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64a68003-b71d-4ac2-aaaf-76b67ed758cd-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-8qcmg\" (UID: \"64a68003-b71d-4ac2-aaaf-76b67ed758cd\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8qcmg" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.818640 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64a68003-b71d-4ac2-aaaf-76b67ed758cd-serving-cert\") pod \"authentication-operator-69f744f599-8qcmg\" (UID: \"64a68003-b71d-4ac2-aaaf-76b67ed758cd\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8qcmg" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 
13:41:28.818658 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfsxg\" (UniqueName: \"kubernetes.io/projected/64a68003-b71d-4ac2-aaaf-76b67ed758cd-kube-api-access-vfsxg\") pod \"authentication-operator-69f744f599-8qcmg\" (UID: \"64a68003-b71d-4ac2-aaaf-76b67ed758cd\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8qcmg" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.818671 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/bb62d6fb-d819-4fc1-aa43-35fb1012a1ba-srv-cert\") pod \"olm-operator-6b444d44fb-845fh\" (UID: \"bb62d6fb-d819-4fc1-aa43-35fb1012a1ba\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-845fh" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.818685 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4d77c191-7d04-4381-838f-b7a355e7c2d4-console-oauth-config\") pod \"console-f9d7485db-dq9x9\" (UID: \"4d77c191-7d04-4381-838f-b7a355e7c2d4\") " pod="openshift-console/console-f9d7485db-dq9x9" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.818720 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9d200061-d82a-4b89-9bea-83a1c7d9eca8-srv-cert\") pod \"catalog-operator-68c6474976-ggc4c\" (UID: \"9d200061-d82a-4b89-9bea-83a1c7d9eca8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ggc4c" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.818734 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/803d2a64-1416-46cb-ae46-5a8462b057f9-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-2gdhw\" (UID: \"803d2a64-1416-46cb-ae46-5a8462b057f9\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2gdhw" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.818751 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4c6l\" (UniqueName: \"kubernetes.io/projected/446ace70-43fa-494c-8bce-13a5ea3ca452-kube-api-access-k4c6l\") pod \"dns-operator-744455d44c-t9dlm\" (UID: \"446ace70-43fa-494c-8bce-13a5ea3ca452\") " pod="openshift-dns-operator/dns-operator-744455d44c-t9dlm" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.818774 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3568fcc7-10bd-4972-9782-b97aa3c9c8a0-images\") pod \"machine-api-operator-5694c8668f-kpg9f\" (UID: \"3568fcc7-10bd-4972-9782-b97aa3c9c8a0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kpg9f" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.818795 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3568fcc7-10bd-4972-9782-b97aa3c9c8a0-config\") pod \"machine-api-operator-5694c8668f-kpg9f\" (UID: \"3568fcc7-10bd-4972-9782-b97aa3c9c8a0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kpg9f" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.818810 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cs7h4\" (UniqueName: \"kubernetes.io/projected/d8ff2a93-ff6b-4ef8-9109-de3c22a6f108-kube-api-access-cs7h4\") pod \"router-default-5444994796-6qzc4\" (UID: \"d8ff2a93-ff6b-4ef8-9109-de3c22a6f108\") " pod="openshift-ingress/router-default-5444994796-6qzc4" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.818826 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/1eee8837-1a56-40df-b564-bb65ad94d593-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-szr89\" (UID: \"1eee8837-1a56-40df-b564-bb65ad94d593\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-szr89" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.818847 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cbd2d545-1f43-45dd-8f7e-23ea76dbc0a4-serving-cert\") pod \"service-ca-operator-777779d784-m4dzs\" (UID: \"cbd2d545-1f43-45dd-8f7e-23ea76dbc0a4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m4dzs" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.818868 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/99e5508c-0d75-4f87-9c07-b53509e461aa-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-s6sm2\" (UID: \"99e5508c-0d75-4f87-9c07-b53509e461aa\") " pod="openshift-marketplace/marketplace-operator-79b997595-s6sm2" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.818884 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2pmz\" (UniqueName: \"kubernetes.io/projected/9d200061-d82a-4b89-9bea-83a1c7d9eca8-kube-api-access-v2pmz\") pod \"catalog-operator-68c6474976-ggc4c\" (UID: \"9d200061-d82a-4b89-9bea-83a1c7d9eca8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ggc4c" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.818898 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4d77c191-7d04-4381-838f-b7a355e7c2d4-console-config\") pod \"console-f9d7485db-dq9x9\" (UID: \"4d77c191-7d04-4381-838f-b7a355e7c2d4\") " pod="openshift-console/console-f9d7485db-dq9x9" Apr 02 13:41:28 crc 
kubenswrapper[4732]: I0402 13:41:28.818911 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4d77c191-7d04-4381-838f-b7a355e7c2d4-service-ca\") pod \"console-f9d7485db-dq9x9\" (UID: \"4d77c191-7d04-4381-838f-b7a355e7c2d4\") " pod="openshift-console/console-f9d7485db-dq9x9" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.818925 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d77c191-7d04-4381-838f-b7a355e7c2d4-trusted-ca-bundle\") pod \"console-f9d7485db-dq9x9\" (UID: \"4d77c191-7d04-4381-838f-b7a355e7c2d4\") " pod="openshift-console/console-f9d7485db-dq9x9" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.818940 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/446ace70-43fa-494c-8bce-13a5ea3ca452-metrics-tls\") pod \"dns-operator-744455d44c-t9dlm\" (UID: \"446ace70-43fa-494c-8bce-13a5ea3ca452\") " pod="openshift-dns-operator/dns-operator-744455d44c-t9dlm" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.818957 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64a68003-b71d-4ac2-aaaf-76b67ed758cd-service-ca-bundle\") pod \"authentication-operator-69f744f599-8qcmg\" (UID: \"64a68003-b71d-4ac2-aaaf-76b67ed758cd\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8qcmg" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.818972 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/3568fcc7-10bd-4972-9782-b97aa3c9c8a0-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-kpg9f\" (UID: \"3568fcc7-10bd-4972-9782-b97aa3c9c8a0\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-kpg9f" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.818991 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/d8ff2a93-ff6b-4ef8-9109-de3c22a6f108-stats-auth\") pod \"router-default-5444994796-6qzc4\" (UID: \"d8ff2a93-ff6b-4ef8-9109-de3c22a6f108\") " pod="openshift-ingress/router-default-5444994796-6qzc4" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.819011 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/67577cc5-6dee-4465-beee-ea424d976972-secret-volume\") pod \"collect-profiles-29585610-wbzhk\" (UID: \"67577cc5-6dee-4465-beee-ea424d976972\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29585610-wbzhk" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.819027 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d8ff2a93-ff6b-4ef8-9109-de3c22a6f108-metrics-certs\") pod \"router-default-5444994796-6qzc4\" (UID: \"d8ff2a93-ff6b-4ef8-9109-de3c22a6f108\") " pod="openshift-ingress/router-default-5444994796-6qzc4" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.819045 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/67577cc5-6dee-4465-beee-ea424d976972-config-volume\") pod \"collect-profiles-29585610-wbzhk\" (UID: \"67577cc5-6dee-4465-beee-ea424d976972\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29585610-wbzhk" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.819085 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dd5add8c-a0d5-412d-925d-bd21c5893935-auth-proxy-config\") pod 
\"machine-config-operator-74547568cd-w7tht\" (UID: \"dd5add8c-a0d5-412d-925d-bd21c5893935\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w7tht" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.819103 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5cb3c06a-e3cf-4a60-b180-82759b9d55fc-signing-key\") pod \"service-ca-9c57cc56f-brl6n\" (UID: \"5cb3c06a-e3cf-4a60-b180-82759b9d55fc\") " pod="openshift-service-ca/service-ca-9c57cc56f-brl6n" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.819120 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4d77c191-7d04-4381-838f-b7a355e7c2d4-oauth-serving-cert\") pod \"console-f9d7485db-dq9x9\" (UID: \"4d77c191-7d04-4381-838f-b7a355e7c2d4\") " pod="openshift-console/console-f9d7485db-dq9x9" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.819136 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6t4g\" (UniqueName: \"kubernetes.io/projected/bb62d6fb-d819-4fc1-aa43-35fb1012a1ba-kube-api-access-q6t4g\") pod \"olm-operator-6b444d44fb-845fh\" (UID: \"bb62d6fb-d819-4fc1-aa43-35fb1012a1ba\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-845fh" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.819152 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vmdm\" (UniqueName: \"kubernetes.io/projected/3568fcc7-10bd-4972-9782-b97aa3c9c8a0-kube-api-access-5vmdm\") pod \"machine-api-operator-5694c8668f-kpg9f\" (UID: \"3568fcc7-10bd-4972-9782-b97aa3c9c8a0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kpg9f" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.819167 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/4d77c191-7d04-4381-838f-b7a355e7c2d4-console-serving-cert\") pod \"console-f9d7485db-dq9x9\" (UID: \"4d77c191-7d04-4381-838f-b7a355e7c2d4\") " pod="openshift-console/console-f9d7485db-dq9x9" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.819184 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5cb3c06a-e3cf-4a60-b180-82759b9d55fc-signing-cabundle\") pod \"service-ca-9c57cc56f-brl6n\" (UID: \"5cb3c06a-e3cf-4a60-b180-82759b9d55fc\") " pod="openshift-service-ca/service-ca-9c57cc56f-brl6n" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.819205 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbd2d545-1f43-45dd-8f7e-23ea76dbc0a4-config\") pod \"service-ca-operator-777779d784-m4dzs\" (UID: \"cbd2d545-1f43-45dd-8f7e-23ea76dbc0a4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m4dzs" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.819225 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dd5add8c-a0d5-412d-925d-bd21c5893935-proxy-tls\") pod \"machine-config-operator-74547568cd-w7tht\" (UID: \"dd5add8c-a0d5-412d-925d-bd21c5893935\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w7tht" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.819240 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64a68003-b71d-4ac2-aaaf-76b67ed758cd-config\") pod \"authentication-operator-69f744f599-8qcmg\" (UID: \"64a68003-b71d-4ac2-aaaf-76b67ed758cd\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8qcmg" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.819255 4732 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/bb62d6fb-d819-4fc1-aa43-35fb1012a1ba-profile-collector-cert\") pod \"olm-operator-6b444d44fb-845fh\" (UID: \"bb62d6fb-d819-4fc1-aa43-35fb1012a1ba\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-845fh" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.819290 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/dd5add8c-a0d5-412d-925d-bd21c5893935-images\") pod \"machine-config-operator-74547568cd-w7tht\" (UID: \"dd5add8c-a0d5-412d-925d-bd21c5893935\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w7tht" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.819307 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82sjv\" (UniqueName: \"kubernetes.io/projected/803d2a64-1416-46cb-ae46-5a8462b057f9-kube-api-access-82sjv\") pod \"openshift-controller-manager-operator-756b6f6bc6-2gdhw\" (UID: \"803d2a64-1416-46cb-ae46-5a8462b057f9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2gdhw" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.819323 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqcj6\" (UniqueName: \"kubernetes.io/projected/67577cc5-6dee-4465-beee-ea424d976972-kube-api-access-rqcj6\") pod \"collect-profiles-29585610-wbzhk\" (UID: \"67577cc5-6dee-4465-beee-ea424d976972\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29585610-wbzhk" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.819338 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkkwm\" (UniqueName: \"kubernetes.io/projected/4d77c191-7d04-4381-838f-b7a355e7c2d4-kube-api-access-nkkwm\") pod \"console-f9d7485db-dq9x9\" (UID: 
\"4d77c191-7d04-4381-838f-b7a355e7c2d4\") " pod="openshift-console/console-f9d7485db-dq9x9" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.819353 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/99e5508c-0d75-4f87-9c07-b53509e461aa-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-s6sm2\" (UID: \"99e5508c-0d75-4f87-9c07-b53509e461aa\") " pod="openshift-marketplace/marketplace-operator-79b997595-s6sm2" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.819371 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5f0e45b7-fac8-406c-bbe9-e92490d95fda-webhook-cert\") pod \"packageserver-d55dfcdfc-77jmk\" (UID: \"5f0e45b7-fac8-406c-bbe9-e92490d95fda\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-77jmk" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.819388 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ff413ea9-0a19-4fc0-8067-9521bc9e472c-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-8qffd\" (UID: \"ff413ea9-0a19-4fc0-8067-9521bc9e472c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8qffd" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.819402 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/5f0e45b7-fac8-406c-bbe9-e92490d95fda-tmpfs\") pod \"packageserver-d55dfcdfc-77jmk\" (UID: \"5f0e45b7-fac8-406c-bbe9-e92490d95fda\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-77jmk" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.819425 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8qcb\" (UniqueName: 
\"kubernetes.io/projected/5cb3c06a-e3cf-4a60-b180-82759b9d55fc-kube-api-access-m8qcb\") pod \"service-ca-9c57cc56f-brl6n\" (UID: \"5cb3c06a-e3cf-4a60-b180-82759b9d55fc\") " pod="openshift-service-ca/service-ca-9c57cc56f-brl6n" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.819447 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9d200061-d82a-4b89-9bea-83a1c7d9eca8-profile-collector-cert\") pod \"catalog-operator-68c6474976-ggc4c\" (UID: \"9d200061-d82a-4b89-9bea-83a1c7d9eca8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ggc4c" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.819462 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6kwg\" (UniqueName: \"kubernetes.io/projected/dd5add8c-a0d5-412d-925d-bd21c5893935-kube-api-access-v6kwg\") pod \"machine-config-operator-74547568cd-w7tht\" (UID: \"dd5add8c-a0d5-412d-925d-bd21c5893935\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w7tht" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.819482 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d8ff2a93-ff6b-4ef8-9109-de3c22a6f108-service-ca-bundle\") pod \"router-default-5444994796-6qzc4\" (UID: \"d8ff2a93-ff6b-4ef8-9109-de3c22a6f108\") " pod="openshift-ingress/router-default-5444994796-6qzc4" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.819497 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqtpb\" (UniqueName: \"kubernetes.io/projected/99e5508c-0d75-4f87-9c07-b53509e461aa-kube-api-access-kqtpb\") pod \"marketplace-operator-79b997595-s6sm2\" (UID: \"99e5508c-0d75-4f87-9c07-b53509e461aa\") " pod="openshift-marketplace/marketplace-operator-79b997595-s6sm2" Apr 02 13:41:28 crc kubenswrapper[4732]: 
I0402 13:41:28.819513 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/803d2a64-1416-46cb-ae46-5a8462b057f9-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-2gdhw\" (UID: \"803d2a64-1416-46cb-ae46-5a8462b057f9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2gdhw" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.819528 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8njx\" (UniqueName: \"kubernetes.io/projected/ff413ea9-0a19-4fc0-8067-9521bc9e472c-kube-api-access-f8njx\") pod \"multus-admission-controller-857f4d67dd-8qffd\" (UID: \"ff413ea9-0a19-4fc0-8067-9521bc9e472c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8qffd" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.819544 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzpjt\" (UniqueName: \"kubernetes.io/projected/5f0e45b7-fac8-406c-bbe9-e92490d95fda-kube-api-access-jzpjt\") pod \"packageserver-d55dfcdfc-77jmk\" (UID: \"5f0e45b7-fac8-406c-bbe9-e92490d95fda\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-77jmk" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.820367 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbd2d545-1f43-45dd-8f7e-23ea76dbc0a4-config\") pod \"service-ca-operator-777779d784-m4dzs\" (UID: \"cbd2d545-1f43-45dd-8f7e-23ea76dbc0a4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m4dzs" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.820691 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3568fcc7-10bd-4972-9782-b97aa3c9c8a0-images\") pod \"machine-api-operator-5694c8668f-kpg9f\" (UID: 
\"3568fcc7-10bd-4972-9782-b97aa3c9c8a0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kpg9f" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.820887 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64a68003-b71d-4ac2-aaaf-76b67ed758cd-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-8qcmg\" (UID: \"64a68003-b71d-4ac2-aaaf-76b67ed758cd\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8qcmg" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.821568 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3568fcc7-10bd-4972-9782-b97aa3c9c8a0-config\") pod \"machine-api-operator-5694c8668f-kpg9f\" (UID: \"3568fcc7-10bd-4972-9782-b97aa3c9c8a0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kpg9f" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.822578 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/5f0e45b7-fac8-406c-bbe9-e92490d95fda-tmpfs\") pod \"packageserver-d55dfcdfc-77jmk\" (UID: \"5f0e45b7-fac8-406c-bbe9-e92490d95fda\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-77jmk" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.823815 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dd5add8c-a0d5-412d-925d-bd21c5893935-auth-proxy-config\") pod \"machine-config-operator-74547568cd-w7tht\" (UID: \"dd5add8c-a0d5-412d-925d-bd21c5893935\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w7tht" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.824043 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/99e5508c-0d75-4f87-9c07-b53509e461aa-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-s6sm2\" (UID: \"99e5508c-0d75-4f87-9c07-b53509e461aa\") " pod="openshift-marketplace/marketplace-operator-79b997595-s6sm2" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.824363 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64a68003-b71d-4ac2-aaaf-76b67ed758cd-serving-cert\") pod \"authentication-operator-69f744f599-8qcmg\" (UID: \"64a68003-b71d-4ac2-aaaf-76b67ed758cd\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8qcmg" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.824476 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.824517 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/bb62d6fb-d819-4fc1-aa43-35fb1012a1ba-profile-collector-cert\") pod \"olm-operator-6b444d44fb-845fh\" (UID: \"bb62d6fb-d819-4fc1-aa43-35fb1012a1ba\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-845fh" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.825311 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/1eee8837-1a56-40df-b564-bb65ad94d593-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-szr89\" (UID: \"1eee8837-1a56-40df-b564-bb65ad94d593\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-szr89" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.825692 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/3568fcc7-10bd-4972-9782-b97aa3c9c8a0-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-kpg9f\" (UID: \"3568fcc7-10bd-4972-9782-b97aa3c9c8a0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kpg9f" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.825778 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/bb62d6fb-d819-4fc1-aa43-35fb1012a1ba-srv-cert\") pod \"olm-operator-6b444d44fb-845fh\" (UID: \"bb62d6fb-d819-4fc1-aa43-35fb1012a1ba\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-845fh" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.826233 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/99e5508c-0d75-4f87-9c07-b53509e461aa-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-s6sm2\" (UID: \"99e5508c-0d75-4f87-9c07-b53509e461aa\") " pod="openshift-marketplace/marketplace-operator-79b997595-s6sm2" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.826755 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/67577cc5-6dee-4465-beee-ea424d976972-secret-volume\") pod \"collect-profiles-29585610-wbzhk\" (UID: \"67577cc5-6dee-4465-beee-ea424d976972\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29585610-wbzhk" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.826760 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ff413ea9-0a19-4fc0-8067-9521bc9e472c-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-8qffd\" (UID: \"ff413ea9-0a19-4fc0-8067-9521bc9e472c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8qffd" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.827560 4732 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cbd2d545-1f43-45dd-8f7e-23ea76dbc0a4-serving-cert\") pod \"service-ca-operator-777779d784-m4dzs\" (UID: \"cbd2d545-1f43-45dd-8f7e-23ea76dbc0a4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m4dzs" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.829753 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9d200061-d82a-4b89-9bea-83a1c7d9eca8-profile-collector-cert\") pod \"catalog-operator-68c6474976-ggc4c\" (UID: \"9d200061-d82a-4b89-9bea-83a1c7d9eca8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ggc4c" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.831925 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64a68003-b71d-4ac2-aaaf-76b67ed758cd-config\") pod \"authentication-operator-69f744f599-8qcmg\" (UID: \"64a68003-b71d-4ac2-aaaf-76b67ed758cd\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8qcmg" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.840739 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.843381 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64a68003-b71d-4ac2-aaaf-76b67ed758cd-service-ca-bundle\") pod \"authentication-operator-69f744f599-8qcmg\" (UID: \"64a68003-b71d-4ac2-aaaf-76b67ed758cd\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8qcmg" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.860291 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Apr 02 13:41:28 crc 
kubenswrapper[4732]: I0402 13:41:28.880366 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.900929 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.906078 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4d77c191-7d04-4381-838f-b7a355e7c2d4-console-config\") pod \"console-f9d7485db-dq9x9\" (UID: \"4d77c191-7d04-4381-838f-b7a355e7c2d4\") " pod="openshift-console/console-f9d7485db-dq9x9" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.921954 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.941088 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.954355 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4d77c191-7d04-4381-838f-b7a355e7c2d4-console-serving-cert\") pod \"console-f9d7485db-dq9x9\" (UID: \"4d77c191-7d04-4381-838f-b7a355e7c2d4\") " pod="openshift-console/console-f9d7485db-dq9x9" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.960853 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.973588 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4d77c191-7d04-4381-838f-b7a355e7c2d4-console-oauth-config\") pod \"console-f9d7485db-dq9x9\" (UID: \"4d77c191-7d04-4381-838f-b7a355e7c2d4\") " 
pod="openshift-console/console-f9d7485db-dq9x9" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.980664 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Apr 02 13:41:28 crc kubenswrapper[4732]: I0402 13:41:28.984234 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4d77c191-7d04-4381-838f-b7a355e7c2d4-service-ca\") pod \"console-f9d7485db-dq9x9\" (UID: \"4d77c191-7d04-4381-838f-b7a355e7c2d4\") " pod="openshift-console/console-f9d7485db-dq9x9" Apr 02 13:41:29 crc kubenswrapper[4732]: I0402 13:41:29.007577 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Apr 02 13:41:29 crc kubenswrapper[4732]: I0402 13:41:29.013925 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d77c191-7d04-4381-838f-b7a355e7c2d4-trusted-ca-bundle\") pod \"console-f9d7485db-dq9x9\" (UID: \"4d77c191-7d04-4381-838f-b7a355e7c2d4\") " pod="openshift-console/console-f9d7485db-dq9x9" Apr 02 13:41:29 crc kubenswrapper[4732]: I0402 13:41:29.021881 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Apr 02 13:41:29 crc kubenswrapper[4732]: I0402 13:41:29.030795 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4d77c191-7d04-4381-838f-b7a355e7c2d4-oauth-serving-cert\") pod \"console-f9d7485db-dq9x9\" (UID: \"4d77c191-7d04-4381-838f-b7a355e7c2d4\") " pod="openshift-console/console-f9d7485db-dq9x9" Apr 02 13:41:29 crc kubenswrapper[4732]: I0402 13:41:29.040774 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Apr 02 13:41:29 crc kubenswrapper[4732]: I0402 13:41:29.060665 4732 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-console"/"openshift-service-ca.crt" Apr 02 13:41:29 crc kubenswrapper[4732]: I0402 13:41:29.101123 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Apr 02 13:41:29 crc kubenswrapper[4732]: I0402 13:41:29.120898 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Apr 02 13:41:29 crc kubenswrapper[4732]: I0402 13:41:29.141524 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Apr 02 13:41:29 crc kubenswrapper[4732]: I0402 13:41:29.160989 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Apr 02 13:41:29 crc kubenswrapper[4732]: I0402 13:41:29.181257 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Apr 02 13:41:29 crc kubenswrapper[4732]: I0402 13:41:29.201078 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Apr 02 13:41:29 crc kubenswrapper[4732]: I0402 13:41:29.221922 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Apr 02 13:41:29 crc kubenswrapper[4732]: I0402 13:41:29.240764 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Apr 02 13:41:29 crc kubenswrapper[4732]: I0402 13:41:29.251659 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/803d2a64-1416-46cb-ae46-5a8462b057f9-config\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-2gdhw\" (UID: \"803d2a64-1416-46cb-ae46-5a8462b057f9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2gdhw" Apr 02 13:41:29 crc kubenswrapper[4732]: I0402 13:41:29.262098 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Apr 02 13:41:29 crc kubenswrapper[4732]: I0402 13:41:29.281766 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Apr 02 13:41:29 crc kubenswrapper[4732]: I0402 13:41:29.287025 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/803d2a64-1416-46cb-ae46-5a8462b057f9-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-2gdhw\" (UID: \"803d2a64-1416-46cb-ae46-5a8462b057f9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2gdhw" Apr 02 13:41:29 crc kubenswrapper[4732]: I0402 13:41:29.300597 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Apr 02 13:41:29 crc kubenswrapper[4732]: I0402 13:41:29.321443 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Apr 02 13:41:29 crc kubenswrapper[4732]: I0402 13:41:29.349261 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Apr 02 13:41:29 crc kubenswrapper[4732]: I0402 13:41:29.360985 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Apr 02 13:41:29 crc kubenswrapper[4732]: I0402 13:41:29.382162 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" 
Apr 02 13:41:29 crc kubenswrapper[4732]: I0402 13:41:29.400998 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Apr 02 13:41:29 crc kubenswrapper[4732]: I0402 13:41:29.421847 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Apr 02 13:41:29 crc kubenswrapper[4732]: I0402 13:41:29.441239 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Apr 02 13:41:29 crc kubenswrapper[4732]: I0402 13:41:29.462293 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Apr 02 13:41:29 crc kubenswrapper[4732]: I0402 13:41:29.482034 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Apr 02 13:41:29 crc kubenswrapper[4732]: I0402 13:41:29.501567 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Apr 02 13:41:29 crc kubenswrapper[4732]: I0402 13:41:29.521257 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Apr 02 13:41:29 crc kubenswrapper[4732]: I0402 13:41:29.527905 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d8ff2a93-ff6b-4ef8-9109-de3c22a6f108-metrics-certs\") pod \"router-default-5444994796-6qzc4\" (UID: \"d8ff2a93-ff6b-4ef8-9109-de3c22a6f108\") " pod="openshift-ingress/router-default-5444994796-6qzc4" Apr 02 13:41:29 crc kubenswrapper[4732]: I0402 13:41:29.541379 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Apr 02 13:41:29 crc kubenswrapper[4732]: I0402 13:41:29.560717 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Apr 02 13:41:29 crc kubenswrapper[4732]: I0402 13:41:29.568682 
4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/d8ff2a93-ff6b-4ef8-9109-de3c22a6f108-stats-auth\") pod \"router-default-5444994796-6qzc4\" (UID: \"d8ff2a93-ff6b-4ef8-9109-de3c22a6f108\") " pod="openshift-ingress/router-default-5444994796-6qzc4" Apr 02 13:41:29 crc kubenswrapper[4732]: I0402 13:41:29.581786 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Apr 02 13:41:29 crc kubenswrapper[4732]: I0402 13:41:29.593905 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/d8ff2a93-ff6b-4ef8-9109-de3c22a6f108-default-certificate\") pod \"router-default-5444994796-6qzc4\" (UID: \"d8ff2a93-ff6b-4ef8-9109-de3c22a6f108\") " pod="openshift-ingress/router-default-5444994796-6qzc4" Apr 02 13:41:29 crc kubenswrapper[4732]: I0402 13:41:29.601851 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Apr 02 13:41:29 crc kubenswrapper[4732]: I0402 13:41:29.621706 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Apr 02 13:41:29 crc kubenswrapper[4732]: I0402 13:41:29.641311 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Apr 02 13:41:29 crc kubenswrapper[4732]: I0402 13:41:29.643585 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d8ff2a93-ff6b-4ef8-9109-de3c22a6f108-service-ca-bundle\") pod \"router-default-5444994796-6qzc4\" (UID: \"d8ff2a93-ff6b-4ef8-9109-de3c22a6f108\") " pod="openshift-ingress/router-default-5444994796-6qzc4" Apr 02 13:41:29 crc kubenswrapper[4732]: I0402 13:41:29.663775 4732 request.go:700] Waited for 1.006167938s due to client-side throttling, not priority and fairness, request: 
GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/configmaps?fieldSelector=metadata.name%3Dmachine-config-operator-images&limit=500&resourceVersion=0 Apr 02 13:41:29 crc kubenswrapper[4732]: I0402 13:41:29.666919 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Apr 02 13:41:29 crc kubenswrapper[4732]: I0402 13:41:29.673397 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/dd5add8c-a0d5-412d-925d-bd21c5893935-images\") pod \"machine-config-operator-74547568cd-w7tht\" (UID: \"dd5add8c-a0d5-412d-925d-bd21c5893935\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w7tht" Apr 02 13:41:29 crc kubenswrapper[4732]: I0402 13:41:29.682704 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Apr 02 13:41:29 crc kubenswrapper[4732]: I0402 13:41:29.701076 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Apr 02 13:41:29 crc kubenswrapper[4732]: I0402 13:41:29.716331 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dd5add8c-a0d5-412d-925d-bd21c5893935-proxy-tls\") pod \"machine-config-operator-74547568cd-w7tht\" (UID: \"dd5add8c-a0d5-412d-925d-bd21c5893935\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w7tht" Apr 02 13:41:29 crc kubenswrapper[4732]: I0402 13:41:29.721055 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Apr 02 13:41:29 crc kubenswrapper[4732]: I0402 13:41:29.727682 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/446ace70-43fa-494c-8bce-13a5ea3ca452-metrics-tls\") pod \"dns-operator-744455d44c-t9dlm\" (UID: \"446ace70-43fa-494c-8bce-13a5ea3ca452\") " pod="openshift-dns-operator/dns-operator-744455d44c-t9dlm" Apr 02 13:41:29 crc kubenswrapper[4732]: I0402 13:41:29.742920 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Apr 02 13:41:29 crc kubenswrapper[4732]: I0402 13:41:29.760385 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Apr 02 13:41:29 crc kubenswrapper[4732]: I0402 13:41:29.781498 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Apr 02 13:41:29 crc kubenswrapper[4732]: I0402 13:41:29.801755 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Apr 02 13:41:29 crc kubenswrapper[4732]: E0402 13:41:29.819562 4732 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Apr 02 13:41:29 crc kubenswrapper[4732]: E0402 13:41:29.819739 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f0e45b7-fac8-406c-bbe9-e92490d95fda-apiservice-cert podName:5f0e45b7-fac8-406c-bbe9-e92490d95fda nodeName:}" failed. No retries permitted until 2026-04-02 13:41:30.319703309 +0000 UTC m=+247.224110902 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/5f0e45b7-fac8-406c-bbe9-e92490d95fda-apiservice-cert") pod "packageserver-d55dfcdfc-77jmk" (UID: "5f0e45b7-fac8-406c-bbe9-e92490d95fda") : failed to sync secret cache: timed out waiting for the condition Apr 02 13:41:29 crc kubenswrapper[4732]: E0402 13:41:29.819821 4732 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Apr 02 13:41:29 crc kubenswrapper[4732]: E0402 13:41:29.819907 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d200061-d82a-4b89-9bea-83a1c7d9eca8-srv-cert podName:9d200061-d82a-4b89-9bea-83a1c7d9eca8 nodeName:}" failed. No retries permitted until 2026-04-02 13:41:30.319879024 +0000 UTC m=+247.224286617 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/9d200061-d82a-4b89-9bea-83a1c7d9eca8-srv-cert") pod "catalog-operator-68c6474976-ggc4c" (UID: "9d200061-d82a-4b89-9bea-83a1c7d9eca8") : failed to sync secret cache: timed out waiting for the condition Apr 02 13:41:29 crc kubenswrapper[4732]: E0402 13:41:29.819992 4732 configmap.go:193] Couldn't get configMap openshift-service-ca/signing-cabundle: failed to sync configmap cache: timed out waiting for the condition Apr 02 13:41:29 crc kubenswrapper[4732]: E0402 13:41:29.820048 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5cb3c06a-e3cf-4a60-b180-82759b9d55fc-signing-cabundle podName:5cb3c06a-e3cf-4a60-b180-82759b9d55fc nodeName:}" failed. No retries permitted until 2026-04-02 13:41:30.320028738 +0000 UTC m=+247.224436331 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "signing-cabundle" (UniqueName: "kubernetes.io/configmap/5cb3c06a-e3cf-4a60-b180-82759b9d55fc-signing-cabundle") pod "service-ca-9c57cc56f-brl6n" (UID: "5cb3c06a-e3cf-4a60-b180-82759b9d55fc") : failed to sync configmap cache: timed out waiting for the condition Apr 02 13:41:29 crc kubenswrapper[4732]: I0402 13:41:29.821290 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Apr 02 13:41:29 crc kubenswrapper[4732]: E0402 13:41:29.821815 4732 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Apr 02 13:41:29 crc kubenswrapper[4732]: E0402 13:41:29.821974 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f0e45b7-fac8-406c-bbe9-e92490d95fda-webhook-cert podName:5f0e45b7-fac8-406c-bbe9-e92490d95fda nodeName:}" failed. No retries permitted until 2026-04-02 13:41:30.321901208 +0000 UTC m=+247.226308811 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/5f0e45b7-fac8-406c-bbe9-e92490d95fda-webhook-cert") pod "packageserver-d55dfcdfc-77jmk" (UID: "5f0e45b7-fac8-406c-bbe9-e92490d95fda") : failed to sync secret cache: timed out waiting for the condition Apr 02 13:41:29 crc kubenswrapper[4732]: E0402 13:41:29.823365 4732 configmap.go:193] Couldn't get configMap openshift-operator-lifecycle-manager/collect-profiles-config: failed to sync configmap cache: timed out waiting for the condition Apr 02 13:41:29 crc kubenswrapper[4732]: E0402 13:41:29.823456 4732 secret.go:188] Couldn't get secret openshift-service-ca/signing-key: failed to sync secret cache: timed out waiting for the condition Apr 02 13:41:29 crc kubenswrapper[4732]: E0402 13:41:29.823470 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/67577cc5-6dee-4465-beee-ea424d976972-config-volume podName:67577cc5-6dee-4465-beee-ea424d976972 nodeName:}" failed. No retries permitted until 2026-04-02 13:41:30.323443448 +0000 UTC m=+247.227851041 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/67577cc5-6dee-4465-beee-ea424d976972-config-volume") pod "collect-profiles-29585610-wbzhk" (UID: "67577cc5-6dee-4465-beee-ea424d976972") : failed to sync configmap cache: timed out waiting for the condition Apr 02 13:41:29 crc kubenswrapper[4732]: E0402 13:41:29.823527 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5cb3c06a-e3cf-4a60-b180-82759b9d55fc-signing-key podName:5cb3c06a-e3cf-4a60-b180-82759b9d55fc nodeName:}" failed. No retries permitted until 2026-04-02 13:41:30.32350296 +0000 UTC m=+247.227910553 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "signing-key" (UniqueName: "kubernetes.io/secret/5cb3c06a-e3cf-4a60-b180-82759b9d55fc-signing-key") pod "service-ca-9c57cc56f-brl6n" (UID: "5cb3c06a-e3cf-4a60-b180-82759b9d55fc") : failed to sync secret cache: timed out waiting for the condition Apr 02 13:41:29 crc kubenswrapper[4732]: I0402 13:41:29.842043 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Apr 02 13:41:29 crc kubenswrapper[4732]: I0402 13:41:29.860741 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Apr 02 13:41:29 crc kubenswrapper[4732]: I0402 13:41:29.880834 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Apr 02 13:41:29 crc kubenswrapper[4732]: I0402 13:41:29.900684 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Apr 02 13:41:29 crc kubenswrapper[4732]: I0402 13:41:29.923037 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Apr 02 13:41:29 crc kubenswrapper[4732]: I0402 13:41:29.940979 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Apr 02 13:41:29 crc kubenswrapper[4732]: I0402 13:41:29.960557 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Apr 02 13:41:29 crc kubenswrapper[4732]: I0402 13:41:29.981458 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.002310 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 
13:41:30.021090 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.041320 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.062054 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.081061 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.101515 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.120509 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.141868 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.160983 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.201682 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.220933 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Apr 02 13:41:30 crc kubenswrapper[4732]: 
I0402 13:41:30.240480 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.261678 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.281008 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.300902 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.321370 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.341706 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.344073 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5cb3c06a-e3cf-4a60-b180-82759b9d55fc-signing-cabundle\") pod \"service-ca-9c57cc56f-brl6n\" (UID: \"5cb3c06a-e3cf-4a60-b180-82759b9d55fc\") " pod="openshift-service-ca/service-ca-9c57cc56f-brl6n" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.344327 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5f0e45b7-fac8-406c-bbe9-e92490d95fda-webhook-cert\") pod \"packageserver-d55dfcdfc-77jmk\" (UID: \"5f0e45b7-fac8-406c-bbe9-e92490d95fda\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-77jmk" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.344593 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/5f0e45b7-fac8-406c-bbe9-e92490d95fda-apiservice-cert\") pod \"packageserver-d55dfcdfc-77jmk\" (UID: \"5f0e45b7-fac8-406c-bbe9-e92490d95fda\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-77jmk" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.344821 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9d200061-d82a-4b89-9bea-83a1c7d9eca8-srv-cert\") pod \"catalog-operator-68c6474976-ggc4c\" (UID: \"9d200061-d82a-4b89-9bea-83a1c7d9eca8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ggc4c" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.345806 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5cb3c06a-e3cf-4a60-b180-82759b9d55fc-signing-cabundle\") pod \"service-ca-9c57cc56f-brl6n\" (UID: \"5cb3c06a-e3cf-4a60-b180-82759b9d55fc\") " pod="openshift-service-ca/service-ca-9c57cc56f-brl6n" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.346077 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/67577cc5-6dee-4465-beee-ea424d976972-config-volume\") pod \"collect-profiles-29585610-wbzhk\" (UID: \"67577cc5-6dee-4465-beee-ea424d976972\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29585610-wbzhk" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.346320 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5cb3c06a-e3cf-4a60-b180-82759b9d55fc-signing-key\") pod \"service-ca-9c57cc56f-brl6n\" (UID: \"5cb3c06a-e3cf-4a60-b180-82759b9d55fc\") " pod="openshift-service-ca/service-ca-9c57cc56f-brl6n" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.347187 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/configmap/67577cc5-6dee-4465-beee-ea424d976972-config-volume\") pod \"collect-profiles-29585610-wbzhk\" (UID: \"67577cc5-6dee-4465-beee-ea424d976972\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29585610-wbzhk" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.348688 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5f0e45b7-fac8-406c-bbe9-e92490d95fda-webhook-cert\") pod \"packageserver-d55dfcdfc-77jmk\" (UID: \"5f0e45b7-fac8-406c-bbe9-e92490d95fda\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-77jmk" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.349559 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9d200061-d82a-4b89-9bea-83a1c7d9eca8-srv-cert\") pod \"catalog-operator-68c6474976-ggc4c\" (UID: \"9d200061-d82a-4b89-9bea-83a1c7d9eca8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ggc4c" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.349717 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5f0e45b7-fac8-406c-bbe9-e92490d95fda-apiservice-cert\") pod \"packageserver-d55dfcdfc-77jmk\" (UID: \"5f0e45b7-fac8-406c-bbe9-e92490d95fda\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-77jmk" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.353240 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5cb3c06a-e3cf-4a60-b180-82759b9d55fc-signing-key\") pod \"service-ca-9c57cc56f-brl6n\" (UID: \"5cb3c06a-e3cf-4a60-b180-82759b9d55fc\") " pod="openshift-service-ca/service-ca-9c57cc56f-brl6n" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.360701 4732 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-oauth-apiserver"/"encryption-config-1" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.382036 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.401054 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.420742 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.469963 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4qvl\" (UniqueName: \"kubernetes.io/projected/6d20f42f-7f98-4c97-a28e-17749d977819-kube-api-access-r4qvl\") pod \"ingress-operator-5b745b69d9-fvjh7\" (UID: \"6d20f42f-7f98-4c97-a28e-17749d977819\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fvjh7" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.490217 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d0bcf0e9-75b6-443e-afd8-e2fb6f807e90-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-95tg7\" (UID: \"d0bcf0e9-75b6-443e-afd8-e2fb6f807e90\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-95tg7" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.498808 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xznt7\" (UniqueName: \"kubernetes.io/projected/e1e85ba5-52ed-4b2f-9901-b04090159f4c-kube-api-access-xznt7\") pod \"machine-approver-56656f9798-6fp76\" (UID: \"e1e85ba5-52ed-4b2f-9901-b04090159f4c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6fp76" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.518122 4732 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78ttp\" (UniqueName: \"kubernetes.io/projected/2c662639-32a4-4f78-af37-9b1e65bab4e8-kube-api-access-78ttp\") pod \"route-controller-manager-6576b87f9c-jbmtp\" (UID: \"2c662639-32a4-4f78-af37-9b1e65bab4e8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jbmtp" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.539886 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/36f26e27-d72e-42f2-9380-598616e5626b-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-f998c\" (UID: \"36f26e27-d72e-42f2-9380-598616e5626b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f998c" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.561275 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ac5f29d0-13ce-46eb-babc-70f32ac34feb-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-sxw8j\" (UID: \"ac5f29d0-13ce-46eb-babc-70f32ac34feb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sxw8j" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.565807 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-95tg7" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.574398 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s255l\" (UniqueName: \"kubernetes.io/projected/f0c6d23c-1aff-4a2f-b92b-6cb07a0ca8b6-kube-api-access-s255l\") pod \"openshift-config-operator-7777fb866f-vmngg\" (UID: \"f0c6d23c-1aff-4a2f-b92b-6cb07a0ca8b6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vmngg" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.595353 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rv8j\" (UniqueName: \"kubernetes.io/projected/ff7c4e9d-5437-412b-867d-1e44dfc73df5-kube-api-access-6rv8j\") pod \"cluster-samples-operator-665b6dd947-r98n2\" (UID: \"ff7c4e9d-5437-412b-867d-1e44dfc73df5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r98n2" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.614149 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zltjk\" (UniqueName: \"kubernetes.io/projected/36f26e27-d72e-42f2-9380-598616e5626b-kube-api-access-zltjk\") pod \"cluster-image-registry-operator-dc59b4c8b-f998c\" (UID: \"36f26e27-d72e-42f2-9380-598616e5626b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f998c" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.623175 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jbmtp" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.632084 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6fp76" Apr 02 13:41:30 crc kubenswrapper[4732]: W0402 13:41:30.646556 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1e85ba5_52ed_4b2f_9901_b04090159f4c.slice/crio-ab66e7087e460ffcf809ce5c407ca7a5d54a69d224faaaef6d28d668573b577b WatchSource:0}: Error finding container ab66e7087e460ffcf809ce5c407ca7a5d54a69d224faaaef6d28d668573b577b: Status 404 returned error can't find the container with id ab66e7087e460ffcf809ce5c407ca7a5d54a69d224faaaef6d28d668573b577b Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.663235 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6d20f42f-7f98-4c97-a28e-17749d977819-bound-sa-token\") pod \"ingress-operator-5b745b69d9-fvjh7\" (UID: \"6d20f42f-7f98-4c97-a28e-17749d977819\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fvjh7" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.663645 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcckl\" (UniqueName: \"kubernetes.io/projected/4a34a201-8137-4efe-a99a-1ebd89e40c68-kube-api-access-zcckl\") pod \"console-operator-58897d9998-76hwt\" (UID: \"4a34a201-8137-4efe-a99a-1ebd89e40c68\") " pod="openshift-console-operator/console-operator-58897d9998-76hwt" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.675055 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zj7fw\" (UniqueName: \"kubernetes.io/projected/2d75d692-d062-4dfa-b7e7-ea53683bc549-kube-api-access-zj7fw\") pod \"machine-config-controller-84d6567774-xl58c\" (UID: \"2d75d692-d062-4dfa-b7e7-ea53683bc549\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xl58c" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.679839 4732 request.go:700] 
Waited for 1.964537151s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/serviceaccounts/oauth-openshift/token Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.695171 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmhxj\" (UniqueName: \"kubernetes.io/projected/c70b5281-74d8-44ff-8f4b-326a3d7192aa-kube-api-access-tmhxj\") pod \"oauth-openshift-558db77b4-86zsp\" (UID: \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-86zsp" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.696103 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vmngg" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.722413 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6qfk\" (UniqueName: \"kubernetes.io/projected/5425ff81-73c8-4fca-b208-9c9dbc6a949d-kube-api-access-q6qfk\") pod \"openshift-apiserver-operator-796bbdcf4f-m7sxt\" (UID: \"5425ff81-73c8-4fca-b208-9c9dbc6a949d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m7sxt" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.736941 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r98n2" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.740714 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8btn6\" (UniqueName: \"kubernetes.io/projected/16ec121b-fdf2-452d-8963-08d6132f7c5c-kube-api-access-8btn6\") pod \"controller-manager-879f6c89f-54khq\" (UID: \"16ec121b-fdf2-452d-8963-08d6132f7c5c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-54khq" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.750292 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-76hwt" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.758527 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-54khq" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.760324 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.766017 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5mx6\" (UniqueName: \"kubernetes.io/projected/49c54f44-4a94-4b19-b03d-8469355931d0-kube-api-access-c5mx6\") pod \"etcd-operator-b45778765-qdvzn\" (UID: \"49c54f44-4a94-4b19-b03d-8469355931d0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qdvzn" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.767249 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f998c" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.777762 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fvjh7" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.780967 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.802142 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.814519 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-95tg7"] Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.820747 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.852721 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.854851 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-jbmtp"] Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.858816 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sxw8j" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.860847 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.872977 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xl58c" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.884206 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.887583 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-vmngg"] Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.901176 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.917181 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-95tg7" event={"ID":"d0bcf0e9-75b6-443e-afd8-e2fb6f807e90","Type":"ContainerStarted","Data":"bad1ab1f60ef0544ebdab7e528c39971ab987c8e1621bdd2d038ad096c1c5c13"} Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.918148 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6fp76" event={"ID":"e1e85ba5-52ed-4b2f-9901-b04090159f4c","Type":"ContainerStarted","Data":"ab66e7087e460ffcf809ce5c407ca7a5d54a69d224faaaef6d28d668573b577b"} Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.920325 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jbmtp" event={"ID":"2c662639-32a4-4f78-af37-9b1e65bab4e8","Type":"ContainerStarted","Data":"e10769e8ff494320d92a409f25730369c48a152f098c3c74207c6950e2854368"} Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.921334 4732 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.941751 4732 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.958157 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-86zsp" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.974472 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m7sxt" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.989119 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc9pl\" (UniqueName: \"kubernetes.io/projected/1eee8837-1a56-40df-b564-bb65ad94d593-kube-api-access-xc9pl\") pod \"control-plane-machine-set-operator-78cbb6b69f-szr89\" (UID: \"1eee8837-1a56-40df-b564-bb65ad94d593\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-szr89" Apr 02 13:41:30 crc kubenswrapper[4732]: I0402 13:41:30.997421 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8fmm\" (UniqueName: \"kubernetes.io/projected/4e9d3578-0893-4852-80b6-999e5a7ccdc5-kube-api-access-f8fmm\") pod \"downloads-7954f5f757-zmhn7\" (UID: \"4e9d3578-0893-4852-80b6-999e5a7ccdc5\") " pod="openshift-console/downloads-7954f5f757-zmhn7" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.018678 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dktjz\" (UniqueName: \"kubernetes.io/projected/cbd2d545-1f43-45dd-8f7e-23ea76dbc0a4-kube-api-access-dktjz\") pod \"service-ca-operator-777779d784-m4dzs\" (UID: \"cbd2d545-1f43-45dd-8f7e-23ea76dbc0a4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m4dzs" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.023463 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-qdvzn" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.032824 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-zmhn7" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.035289 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzpjt\" (UniqueName: \"kubernetes.io/projected/5f0e45b7-fac8-406c-bbe9-e92490d95fda-kube-api-access-jzpjt\") pod \"packageserver-d55dfcdfc-77jmk\" (UID: \"5f0e45b7-fac8-406c-bbe9-e92490d95fda\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-77jmk" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.044829 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-77jmk" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.058972 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6t4g\" (UniqueName: \"kubernetes.io/projected/bb62d6fb-d819-4fc1-aa43-35fb1012a1ba-kube-api-access-q6t4g\") pod \"olm-operator-6b444d44fb-845fh\" (UID: \"bb62d6fb-d819-4fc1-aa43-35fb1012a1ba\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-845fh" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.081019 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vmdm\" (UniqueName: \"kubernetes.io/projected/3568fcc7-10bd-4972-9782-b97aa3c9c8a0-kube-api-access-5vmdm\") pod \"machine-api-operator-5694c8668f-kpg9f\" (UID: \"3568fcc7-10bd-4972-9782-b97aa3c9c8a0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kpg9f" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.086343 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-54khq"] Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 
13:41:31.101844 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4c6l\" (UniqueName: \"kubernetes.io/projected/446ace70-43fa-494c-8bce-13a5ea3ca452-kube-api-access-k4c6l\") pod \"dns-operator-744455d44c-t9dlm\" (UID: \"446ace70-43fa-494c-8bce-13a5ea3ca452\") " pod="openshift-dns-operator/dns-operator-744455d44c-t9dlm" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.116438 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82sjv\" (UniqueName: \"kubernetes.io/projected/803d2a64-1416-46cb-ae46-5a8462b057f9-kube-api-access-82sjv\") pod \"openshift-controller-manager-operator-756b6f6bc6-2gdhw\" (UID: \"803d2a64-1416-46cb-ae46-5a8462b057f9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2gdhw" Apr 02 13:41:31 crc kubenswrapper[4732]: W0402 13:41:31.134984 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16ec121b_fdf2_452d_8963_08d6132f7c5c.slice/crio-ae54703e3a6fed5d4bcd093b60943f903041de0c2e8f7e509ce77149953c3ed9 WatchSource:0}: Error finding container ae54703e3a6fed5d4bcd093b60943f903041de0c2e8f7e509ce77149953c3ed9: Status 404 returned error can't find the container with id ae54703e3a6fed5d4bcd093b60943f903041de0c2e8f7e509ce77149953c3ed9 Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.143691 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8qcb\" (UniqueName: \"kubernetes.io/projected/5cb3c06a-e3cf-4a60-b180-82759b9d55fc-kube-api-access-m8qcb\") pod \"service-ca-9c57cc56f-brl6n\" (UID: \"5cb3c06a-e3cf-4a60-b180-82759b9d55fc\") " pod="openshift-service-ca/service-ca-9c57cc56f-brl6n" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.144712 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-fvjh7"] Apr 02 13:41:31 crc 
kubenswrapper[4732]: I0402 13:41:31.162818 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqcj6\" (UniqueName: \"kubernetes.io/projected/67577cc5-6dee-4465-beee-ea424d976972-kube-api-access-rqcj6\") pod \"collect-profiles-29585610-wbzhk\" (UID: \"67577cc5-6dee-4465-beee-ea424d976972\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29585610-wbzhk" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.178198 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cs7h4\" (UniqueName: \"kubernetes.io/projected/d8ff2a93-ff6b-4ef8-9109-de3c22a6f108-kube-api-access-cs7h4\") pod \"router-default-5444994796-6qzc4\" (UID: \"d8ff2a93-ff6b-4ef8-9109-de3c22a6f108\") " pod="openshift-ingress/router-default-5444994796-6qzc4" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.183408 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-szr89" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.198973 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-845fh" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.200919 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-76hwt"] Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.202130 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkkwm\" (UniqueName: \"kubernetes.io/projected/4d77c191-7d04-4381-838f-b7a355e7c2d4-kube-api-access-nkkwm\") pod \"console-f9d7485db-dq9x9\" (UID: \"4d77c191-7d04-4381-838f-b7a355e7c2d4\") " pod="openshift-console/console-f9d7485db-dq9x9" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.211700 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r98n2"] Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.222455 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-m4dzs" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.231560 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqtpb\" (UniqueName: \"kubernetes.io/projected/99e5508c-0d75-4f87-9c07-b53509e461aa-kube-api-access-kqtpb\") pod \"marketplace-operator-79b997595-s6sm2\" (UID: \"99e5508c-0d75-4f87-9c07-b53509e461aa\") " pod="openshift-marketplace/marketplace-operator-79b997595-s6sm2" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.238605 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6kwg\" (UniqueName: \"kubernetes.io/projected/dd5add8c-a0d5-412d-925d-bd21c5893935-kube-api-access-v6kwg\") pod \"machine-config-operator-74547568cd-w7tht\" (UID: \"dd5add8c-a0d5-412d-925d-bd21c5893935\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w7tht" Apr 02 13:41:31 crc kubenswrapper[4732]: 
I0402 13:41:31.255930 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-s6sm2" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.262578 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfsxg\" (UniqueName: \"kubernetes.io/projected/64a68003-b71d-4ac2-aaaf-76b67ed758cd-kube-api-access-vfsxg\") pod \"authentication-operator-69f744f599-8qcmg\" (UID: \"64a68003-b71d-4ac2-aaaf-76b67ed758cd\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8qcmg" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.262817 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-kpg9f" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.270038 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-8qcmg" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.277344 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2pmz\" (UniqueName: \"kubernetes.io/projected/9d200061-d82a-4b89-9bea-83a1c7d9eca8-kube-api-access-v2pmz\") pod \"catalog-operator-68c6474976-ggc4c\" (UID: \"9d200061-d82a-4b89-9bea-83a1c7d9eca8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ggc4c" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.279869 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-dq9x9" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.289994 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2gdhw" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.302685 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8njx\" (UniqueName: \"kubernetes.io/projected/ff413ea9-0a19-4fc0-8067-9521bc9e472c-kube-api-access-f8njx\") pod \"multus-admission-controller-857f4d67dd-8qffd\" (UID: \"ff413ea9-0a19-4fc0-8067-9521bc9e472c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8qffd" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.304882 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-6qzc4" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.316637 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w7tht" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.321263 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-t9dlm" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.329740 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f998c"] Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.338663 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ggc4c" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.351974 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29585610-wbzhk" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.361598 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-brl6n" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.362008 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jkkc\" (UniqueName: \"kubernetes.io/projected/9a82c61a-7d7e-4401-963a-1f1fe908002c-kube-api-access-2jkkc\") pod \"auto-csr-approver-29585620-t897v\" (UID: \"9a82c61a-7d7e-4401-963a-1f1fe908002c\") " pod="openshift-infra/auto-csr-approver-29585620-t897v" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.362095 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gjnv\" (UniqueName: \"kubernetes.io/projected/ea530987-3884-4994-9574-b73fc76fcdde-kube-api-access-5gjnv\") pod \"apiserver-76f77b778f-xzg9j\" (UID: \"ea530987-3884-4994-9574-b73fc76fcdde\") " pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.362123 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc020ab9-6c58-4571-ad12-3e22c8472a85-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-t8m6h\" (UID: \"fc020ab9-6c58-4571-ad12-3e22c8472a85\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t8m6h" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.362206 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea530987-3884-4994-9574-b73fc76fcdde-serving-cert\") pod \"apiserver-76f77b778f-xzg9j\" (UID: \"ea530987-3884-4994-9574-b73fc76fcdde\") " pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.362232 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-zw692\" (UniqueName: \"kubernetes.io/projected/8f520dcd-7f31-4d5b-bd4c-369a0ede6d70-kube-api-access-zw692\") pod \"package-server-manager-789f6589d5-8dxlz\" (UID: \"8f520dcd-7f31-4d5b-bd4c-369a0ede6d70\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8dxlz" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.362252 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ea530987-3884-4994-9574-b73fc76fcdde-audit-dir\") pod \"apiserver-76f77b778f-xzg9j\" (UID: \"ea530987-3884-4994-9574-b73fc76fcdde\") " pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.362312 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea530987-3884-4994-9574-b73fc76fcdde-config\") pod \"apiserver-76f77b778f-xzg9j\" (UID: \"ea530987-3884-4994-9574-b73fc76fcdde\") " pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.362334 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e2570535-673c-495c-a5aa-392f14ceebb1-installation-pull-secrets\") pod \"image-registry-697d97f7c8-zhxz4\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.362379 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e2570535-673c-495c-a5aa-392f14ceebb1-bound-sa-token\") pod \"image-registry-697d97f7c8-zhxz4\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" Apr 02 
13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.362426 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gdbx\" (UniqueName: \"kubernetes.io/projected/e2570535-673c-495c-a5aa-392f14ceebb1-kube-api-access-5gdbx\") pod \"image-registry-697d97f7c8-zhxz4\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.362443 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9dpz\" (UniqueName: \"kubernetes.io/projected/fc020ab9-6c58-4571-ad12-3e22c8472a85-kube-api-access-g9dpz\") pod \"kube-storage-version-migrator-operator-b67b599dd-t8m6h\" (UID: \"fc020ab9-6c58-4571-ad12-3e22c8472a85\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t8m6h" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.362461 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e2570535-673c-495c-a5aa-392f14ceebb1-ca-trust-extracted\") pod \"image-registry-697d97f7c8-zhxz4\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.362502 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ea530987-3884-4994-9574-b73fc76fcdde-encryption-config\") pod \"apiserver-76f77b778f-xzg9j\" (UID: \"ea530987-3884-4994-9574-b73fc76fcdde\") " pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.362546 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/fc020ab9-6c58-4571-ad12-3e22c8472a85-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-t8m6h\" (UID: \"fc020ab9-6c58-4571-ad12-3e22c8472a85\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t8m6h" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.362583 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f520dcd-7f31-4d5b-bd4c-369a0ede6d70-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-8dxlz\" (UID: \"8f520dcd-7f31-4d5b-bd4c-369a0ede6d70\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8dxlz" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.362604 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ea530987-3884-4994-9574-b73fc76fcdde-node-pullsecrets\") pod \"apiserver-76f77b778f-xzg9j\" (UID: \"ea530987-3884-4994-9574-b73fc76fcdde\") " pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.362672 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/ea530987-3884-4994-9574-b73fc76fcdde-audit\") pod \"apiserver-76f77b778f-xzg9j\" (UID: \"ea530987-3884-4994-9574-b73fc76fcdde\") " pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.362688 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ea530987-3884-4994-9574-b73fc76fcdde-etcd-client\") pod \"apiserver-76f77b778f-xzg9j\" (UID: \"ea530987-3884-4994-9574-b73fc76fcdde\") " 
pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.362703 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e2570535-673c-495c-a5aa-392f14ceebb1-registry-certificates\") pod \"image-registry-697d97f7c8-zhxz4\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.362754 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e2570535-673c-495c-a5aa-392f14ceebb1-registry-tls\") pod \"image-registry-697d97f7c8-zhxz4\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.362770 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2570535-673c-495c-a5aa-392f14ceebb1-trusted-ca\") pod \"image-registry-697d97f7c8-zhxz4\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.362804 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea530987-3884-4994-9574-b73fc76fcdde-trusted-ca-bundle\") pod \"apiserver-76f77b778f-xzg9j\" (UID: \"ea530987-3884-4994-9574-b73fc76fcdde\") " pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.362821 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: 
\"kubernetes.io/configmap/ea530987-3884-4994-9574-b73fc76fcdde-image-import-ca\") pod \"apiserver-76f77b778f-xzg9j\" (UID: \"ea530987-3884-4994-9574-b73fc76fcdde\") " pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.362855 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhxz4\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.362892 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ea530987-3884-4994-9574-b73fc76fcdde-etcd-serving-ca\") pod \"apiserver-76f77b778f-xzg9j\" (UID: \"ea530987-3884-4994-9574-b73fc76fcdde\") " pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" Apr 02 13:41:31 crc kubenswrapper[4732]: E0402 13:41:31.364215 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-02 13:41:31.864201188 +0000 UTC m=+248.768608741 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhxz4" (UID: "e2570535-673c-495c-a5aa-392f14ceebb1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.409883 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sxw8j"] Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.414075 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-xl58c"] Apr 02 13:41:31 crc kubenswrapper[4732]: W0402 13:41:31.444374 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36f26e27_d72e_42f2_9380_598616e5626b.slice/crio-95211b1e7130d0fbcce12dae1cc7a2122a3af90b313fb53bc07636fa6e8b2fdf WatchSource:0}: Error finding container 95211b1e7130d0fbcce12dae1cc7a2122a3af90b313fb53bc07636fa6e8b2fdf: Status 404 returned error can't find the container with id 95211b1e7130d0fbcce12dae1cc7a2122a3af90b313fb53bc07636fa6e8b2fdf Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.464394 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.464568 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjlds\" (UniqueName: 
\"kubernetes.io/projected/c314f9c9-f0e4-450b-a152-3be055d1bd46-kube-api-access-jjlds\") pod \"machine-config-server-tfwcw\" (UID: \"c314f9c9-f0e4-450b-a152-3be055d1bd46\") " pod="openshift-machine-config-operator/machine-config-server-tfwcw" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.464593 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f520dcd-7f31-4d5b-bd4c-369a0ede6d70-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-8dxlz\" (UID: \"8f520dcd-7f31-4d5b-bd4c-369a0ede6d70\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8dxlz" Apr 02 13:41:31 crc kubenswrapper[4732]: E0402 13:41:31.464635 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-02 13:41:31.964603243 +0000 UTC m=+248.869010796 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.464704 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ea530987-3884-4994-9574-b73fc76fcdde-node-pullsecrets\") pod \"apiserver-76f77b778f-xzg9j\" (UID: \"ea530987-3884-4994-9574-b73fc76fcdde\") " pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.464774 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1a7ade12-5bbc-4ece-a3b6-b85a31c36f16-metrics-tls\") pod \"dns-default-wbcwp\" (UID: \"1a7ade12-5bbc-4ece-a3b6-b85a31c36f16\") " pod="openshift-dns/dns-default-wbcwp" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.464803 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/ea530987-3884-4994-9574-b73fc76fcdde-audit\") pod \"apiserver-76f77b778f-xzg9j\" (UID: \"ea530987-3884-4994-9574-b73fc76fcdde\") " pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.464822 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4xnx\" (UniqueName: \"kubernetes.io/projected/76af458d-5d8b-44eb-84f9-8a142a0a9169-kube-api-access-s4xnx\") pod \"ingress-canary-cbp4s\" (UID: \"76af458d-5d8b-44eb-84f9-8a142a0a9169\") " 
pod="openshift-ingress-canary/ingress-canary-cbp4s" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.464874 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ea530987-3884-4994-9574-b73fc76fcdde-etcd-client\") pod \"apiserver-76f77b778f-xzg9j\" (UID: \"ea530987-3884-4994-9574-b73fc76fcdde\") " pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.464893 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e2570535-673c-495c-a5aa-392f14ceebb1-registry-certificates\") pod \"image-registry-697d97f7c8-zhxz4\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.464910 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/c314f9c9-f0e4-450b-a152-3be055d1bd46-certs\") pod \"machine-config-server-tfwcw\" (UID: \"c314f9c9-f0e4-450b-a152-3be055d1bd46\") " pod="openshift-machine-config-operator/machine-config-server-tfwcw" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.465010 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e2570535-673c-495c-a5aa-392f14ceebb1-registry-tls\") pod \"image-registry-697d97f7c8-zhxz4\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.465025 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2570535-673c-495c-a5aa-392f14ceebb1-trusted-ca\") pod \"image-registry-697d97f7c8-zhxz4\" (UID: 
\"e2570535-673c-495c-a5aa-392f14ceebb1\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.465040 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/10c1127e-ac61-4432-b5a8-828e3b84d61e-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-ntdkq\" (UID: \"10c1127e-ac61-4432-b5a8-828e3b84d61e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ntdkq" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.465060 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea530987-3884-4994-9574-b73fc76fcdde-trusted-ca-bundle\") pod \"apiserver-76f77b778f-xzg9j\" (UID: \"ea530987-3884-4994-9574-b73fc76fcdde\") " pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.465075 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/9b3f7b50-e118-4aca-986d-0f52e772edc3-plugins-dir\") pod \"csi-hostpathplugin-k2sdm\" (UID: \"9b3f7b50-e118-4aca-986d-0f52e772edc3\") " pod="hostpath-provisioner/csi-hostpathplugin-k2sdm" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.465091 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/ea530987-3884-4994-9574-b73fc76fcdde-image-import-ca\") pod \"apiserver-76f77b778f-xzg9j\" (UID: \"ea530987-3884-4994-9574-b73fc76fcdde\") " pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.465107 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/a1a685af-f17e-4937-a7d6-edca2c96e842-etcd-client\") pod \"apiserver-7bbb656c7d-j2z2d\" (UID: \"a1a685af-f17e-4937-a7d6-edca2c96e842\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2z2d" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.465138 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhxz4\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.465170 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ea530987-3884-4994-9574-b73fc76fcdde-etcd-serving-ca\") pod \"apiserver-76f77b778f-xzg9j\" (UID: \"ea530987-3884-4994-9574-b73fc76fcdde\") " pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.465216 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/9b3f7b50-e118-4aca-986d-0f52e772edc3-mountpoint-dir\") pod \"csi-hostpathplugin-k2sdm\" (UID: \"9b3f7b50-e118-4aca-986d-0f52e772edc3\") " pod="hostpath-provisioner/csi-hostpathplugin-k2sdm" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.465253 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a1a685af-f17e-4937-a7d6-edca2c96e842-serving-cert\") pod \"apiserver-7bbb656c7d-j2z2d\" (UID: \"a1a685af-f17e-4937-a7d6-edca2c96e842\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2z2d" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.465283 4732 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-2jkkc\" (UniqueName: \"kubernetes.io/projected/9a82c61a-7d7e-4401-963a-1f1fe908002c-kube-api-access-2jkkc\") pod \"auto-csr-approver-29585620-t897v\" (UID: \"9a82c61a-7d7e-4401-963a-1f1fe908002c\") " pod="openshift-infra/auto-csr-approver-29585620-t897v" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.465299 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gjnv\" (UniqueName: \"kubernetes.io/projected/ea530987-3884-4994-9574-b73fc76fcdde-kube-api-access-5gjnv\") pod \"apiserver-76f77b778f-xzg9j\" (UID: \"ea530987-3884-4994-9574-b73fc76fcdde\") " pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.465315 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a1a685af-f17e-4937-a7d6-edca2c96e842-encryption-config\") pod \"apiserver-7bbb656c7d-j2z2d\" (UID: \"a1a685af-f17e-4937-a7d6-edca2c96e842\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2z2d" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.465341 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc020ab9-6c58-4571-ad12-3e22c8472a85-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-t8m6h\" (UID: \"fc020ab9-6c58-4571-ad12-3e22c8472a85\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t8m6h" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.465376 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a1a685af-f17e-4937-a7d6-edca2c96e842-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-j2z2d\" (UID: \"a1a685af-f17e-4937-a7d6-edca2c96e842\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2z2d" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.465421 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea530987-3884-4994-9574-b73fc76fcdde-serving-cert\") pod \"apiserver-76f77b778f-xzg9j\" (UID: \"ea530987-3884-4994-9574-b73fc76fcdde\") " pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.465437 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rtgs\" (UniqueName: \"kubernetes.io/projected/1a7ade12-5bbc-4ece-a3b6-b85a31c36f16-kube-api-access-6rtgs\") pod \"dns-default-wbcwp\" (UID: \"1a7ade12-5bbc-4ece-a3b6-b85a31c36f16\") " pod="openshift-dns/dns-default-wbcwp" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.465474 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zw692\" (UniqueName: \"kubernetes.io/projected/8f520dcd-7f31-4d5b-bd4c-369a0ede6d70-kube-api-access-zw692\") pod \"package-server-manager-789f6589d5-8dxlz\" (UID: \"8f520dcd-7f31-4d5b-bd4c-369a0ede6d70\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8dxlz" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.465490 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a1a685af-f17e-4937-a7d6-edca2c96e842-audit-dir\") pod \"apiserver-7bbb656c7d-j2z2d\" (UID: \"a1a685af-f17e-4937-a7d6-edca2c96e842\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2z2d" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.465516 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10c1127e-ac61-4432-b5a8-828e3b84d61e-serving-cert\") pod 
\"kube-controller-manager-operator-78b949d7b-ntdkq\" (UID: \"10c1127e-ac61-4432-b5a8-828e3b84d61e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ntdkq" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.465534 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ea530987-3884-4994-9574-b73fc76fcdde-audit-dir\") pod \"apiserver-76f77b778f-xzg9j\" (UID: \"ea530987-3884-4994-9574-b73fc76fcdde\") " pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.465604 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea530987-3884-4994-9574-b73fc76fcdde-config\") pod \"apiserver-76f77b778f-xzg9j\" (UID: \"ea530987-3884-4994-9574-b73fc76fcdde\") " pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.465657 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9b3f7b50-e118-4aca-986d-0f52e772edc3-socket-dir\") pod \"csi-hostpathplugin-k2sdm\" (UID: \"9b3f7b50-e118-4aca-986d-0f52e772edc3\") " pod="hostpath-provisioner/csi-hostpathplugin-k2sdm" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.465683 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/9b3f7b50-e118-4aca-986d-0f52e772edc3-csi-data-dir\") pod \"csi-hostpathplugin-k2sdm\" (UID: \"9b3f7b50-e118-4aca-986d-0f52e772edc3\") " pod="hostpath-provisioner/csi-hostpathplugin-k2sdm" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.465736 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/e2570535-673c-495c-a5aa-392f14ceebb1-installation-pull-secrets\") pod \"image-registry-697d97f7c8-zhxz4\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.465752 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10c1127e-ac61-4432-b5a8-828e3b84d61e-config\") pod \"kube-controller-manager-operator-78b949d7b-ntdkq\" (UID: \"10c1127e-ac61-4432-b5a8-828e3b84d61e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ntdkq" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.465786 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9b3f7b50-e118-4aca-986d-0f52e772edc3-registration-dir\") pod \"csi-hostpathplugin-k2sdm\" (UID: \"9b3f7b50-e118-4aca-986d-0f52e772edc3\") " pod="hostpath-provisioner/csi-hostpathplugin-k2sdm" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.465840 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1a7ade12-5bbc-4ece-a3b6-b85a31c36f16-config-volume\") pod \"dns-default-wbcwp\" (UID: \"1a7ade12-5bbc-4ece-a3b6-b85a31c36f16\") " pod="openshift-dns/dns-default-wbcwp" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.465862 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bppl7\" (UniqueName: \"kubernetes.io/projected/93a3cb6d-2580-4709-a329-67820678af15-kube-api-access-bppl7\") pod \"migrator-59844c95c7-dcwb2\" (UID: \"93a3cb6d-2580-4709-a329-67820678af15\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dcwb2" Apr 02 13:41:31 crc kubenswrapper[4732]: 
I0402 13:41:31.465898 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e2570535-673c-495c-a5aa-392f14ceebb1-bound-sa-token\") pod \"image-registry-697d97f7c8-zhxz4\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.465918 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6g75v\" (UniqueName: \"kubernetes.io/projected/9b3f7b50-e118-4aca-986d-0f52e772edc3-kube-api-access-6g75v\") pod \"csi-hostpathplugin-k2sdm\" (UID: \"9b3f7b50-e118-4aca-986d-0f52e772edc3\") " pod="hostpath-provisioner/csi-hostpathplugin-k2sdm" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.466000 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gdbx\" (UniqueName: \"kubernetes.io/projected/e2570535-673c-495c-a5aa-392f14ceebb1-kube-api-access-5gdbx\") pod \"image-registry-697d97f7c8-zhxz4\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.466021 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/c314f9c9-f0e4-450b-a152-3be055d1bd46-node-bootstrap-token\") pod \"machine-config-server-tfwcw\" (UID: \"c314f9c9-f0e4-450b-a152-3be055d1bd46\") " pod="openshift-machine-config-operator/machine-config-server-tfwcw" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.466050 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f92dw\" (UniqueName: \"kubernetes.io/projected/a1a685af-f17e-4937-a7d6-edca2c96e842-kube-api-access-f92dw\") pod \"apiserver-7bbb656c7d-j2z2d\" (UID: 
\"a1a685af-f17e-4937-a7d6-edca2c96e842\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2z2d" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.466067 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/76af458d-5d8b-44eb-84f9-8a142a0a9169-cert\") pod \"ingress-canary-cbp4s\" (UID: \"76af458d-5d8b-44eb-84f9-8a142a0a9169\") " pod="openshift-ingress-canary/ingress-canary-cbp4s" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.466085 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9dpz\" (UniqueName: \"kubernetes.io/projected/fc020ab9-6c58-4571-ad12-3e22c8472a85-kube-api-access-g9dpz\") pod \"kube-storage-version-migrator-operator-b67b599dd-t8m6h\" (UID: \"fc020ab9-6c58-4571-ad12-3e22c8472a85\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t8m6h" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.466105 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e2570535-673c-495c-a5aa-392f14ceebb1-ca-trust-extracted\") pod \"image-registry-697d97f7c8-zhxz4\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.466135 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a1a685af-f17e-4937-a7d6-edca2c96e842-audit-policies\") pod \"apiserver-7bbb656c7d-j2z2d\" (UID: \"a1a685af-f17e-4937-a7d6-edca2c96e842\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2z2d" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.466160 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a1a685af-f17e-4937-a7d6-edca2c96e842-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-j2z2d\" (UID: \"a1a685af-f17e-4937-a7d6-edca2c96e842\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2z2d" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.466177 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ea530987-3884-4994-9574-b73fc76fcdde-encryption-config\") pod \"apiserver-76f77b778f-xzg9j\" (UID: \"ea530987-3884-4994-9574-b73fc76fcdde\") " pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.466252 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc020ab9-6c58-4571-ad12-3e22c8472a85-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-t8m6h\" (UID: \"fc020ab9-6c58-4571-ad12-3e22c8472a85\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t8m6h" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.467332 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ea530987-3884-4994-9574-b73fc76fcdde-node-pullsecrets\") pod \"apiserver-76f77b778f-xzg9j\" (UID: \"ea530987-3884-4994-9574-b73fc76fcdde\") " pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" Apr 02 13:41:31 crc kubenswrapper[4732]: E0402 13:41:31.468566 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-02 13:41:31.968552597 +0000 UTC m=+248.872960220 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhxz4" (UID: "e2570535-673c-495c-a5aa-392f14ceebb1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.469326 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e2570535-673c-495c-a5aa-392f14ceebb1-ca-trust-extracted\") pod \"image-registry-697d97f7c8-zhxz4\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.469875 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f520dcd-7f31-4d5b-bd4c-369a0ede6d70-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-8dxlz\" (UID: \"8f520dcd-7f31-4d5b-bd4c-369a0ede6d70\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8dxlz" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.470130 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ea530987-3884-4994-9574-b73fc76fcdde-audit-dir\") pod \"apiserver-76f77b778f-xzg9j\" (UID: \"ea530987-3884-4994-9574-b73fc76fcdde\") " pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.470137 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc020ab9-6c58-4571-ad12-3e22c8472a85-config\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-t8m6h\" (UID: \"fc020ab9-6c58-4571-ad12-3e22c8472a85\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t8m6h" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.475445 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2570535-673c-495c-a5aa-392f14ceebb1-trusted-ca\") pod \"image-registry-697d97f7c8-zhxz4\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.476004 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e2570535-673c-495c-a5aa-392f14ceebb1-installation-pull-secrets\") pod \"image-registry-697d97f7c8-zhxz4\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.476132 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea530987-3884-4994-9574-b73fc76fcdde-serving-cert\") pod \"apiserver-76f77b778f-xzg9j\" (UID: \"ea530987-3884-4994-9574-b73fc76fcdde\") " pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.477011 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e2570535-673c-495c-a5aa-392f14ceebb1-registry-tls\") pod \"image-registry-697d97f7c8-zhxz4\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.477674 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/ea530987-3884-4994-9574-b73fc76fcdde-trusted-ca-bundle\") pod \"apiserver-76f77b778f-xzg9j\" (UID: \"ea530987-3884-4994-9574-b73fc76fcdde\") " pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.479553 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/ea530987-3884-4994-9574-b73fc76fcdde-audit\") pod \"apiserver-76f77b778f-xzg9j\" (UID: \"ea530987-3884-4994-9574-b73fc76fcdde\") " pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.479675 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea530987-3884-4994-9574-b73fc76fcdde-config\") pod \"apiserver-76f77b778f-xzg9j\" (UID: \"ea530987-3884-4994-9574-b73fc76fcdde\") " pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.480250 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc020ab9-6c58-4571-ad12-3e22c8472a85-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-t8m6h\" (UID: \"fc020ab9-6c58-4571-ad12-3e22c8472a85\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t8m6h" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.480599 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e2570535-673c-495c-a5aa-392f14ceebb1-registry-certificates\") pod \"image-registry-697d97f7c8-zhxz4\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.481001 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" 
(UniqueName: \"kubernetes.io/configmap/ea530987-3884-4994-9574-b73fc76fcdde-image-import-ca\") pod \"apiserver-76f77b778f-xzg9j\" (UID: \"ea530987-3884-4994-9574-b73fc76fcdde\") " pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.481064 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ea530987-3884-4994-9574-b73fc76fcdde-etcd-client\") pod \"apiserver-76f77b778f-xzg9j\" (UID: \"ea530987-3884-4994-9574-b73fc76fcdde\") " pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.481302 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ea530987-3884-4994-9574-b73fc76fcdde-encryption-config\") pod \"apiserver-76f77b778f-xzg9j\" (UID: \"ea530987-3884-4994-9574-b73fc76fcdde\") " pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.486211 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ea530987-3884-4994-9574-b73fc76fcdde-etcd-serving-ca\") pod \"apiserver-76f77b778f-xzg9j\" (UID: \"ea530987-3884-4994-9574-b73fc76fcdde\") " pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.504954 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-8qffd" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.514727 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e2570535-673c-495c-a5aa-392f14ceebb1-bound-sa-token\") pod \"image-registry-697d97f7c8-zhxz4\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.518761 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m7sxt"] Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.538338 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9dpz\" (UniqueName: \"kubernetes.io/projected/fc020ab9-6c58-4571-ad12-3e22c8472a85-kube-api-access-g9dpz\") pod \"kube-storage-version-migrator-operator-b67b599dd-t8m6h\" (UID: \"fc020ab9-6c58-4571-ad12-3e22c8472a85\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t8m6h" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.559277 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-86zsp"] Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.567067 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.567242 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/10c1127e-ac61-4432-b5a8-828e3b84d61e-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-ntdkq\" (UID: \"10c1127e-ac61-4432-b5a8-828e3b84d61e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ntdkq" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.567304 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/9b3f7b50-e118-4aca-986d-0f52e772edc3-plugins-dir\") pod \"csi-hostpathplugin-k2sdm\" (UID: \"9b3f7b50-e118-4aca-986d-0f52e772edc3\") " pod="hostpath-provisioner/csi-hostpathplugin-k2sdm" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.567325 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a1a685af-f17e-4937-a7d6-edca2c96e842-etcd-client\") pod \"apiserver-7bbb656c7d-j2z2d\" (UID: \"a1a685af-f17e-4937-a7d6-edca2c96e842\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2z2d" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.567378 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/9b3f7b50-e118-4aca-986d-0f52e772edc3-mountpoint-dir\") pod \"csi-hostpathplugin-k2sdm\" (UID: \"9b3f7b50-e118-4aca-986d-0f52e772edc3\") " pod="hostpath-provisioner/csi-hostpathplugin-k2sdm" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.567400 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a1a685af-f17e-4937-a7d6-edca2c96e842-serving-cert\") pod \"apiserver-7bbb656c7d-j2z2d\" (UID: \"a1a685af-f17e-4937-a7d6-edca2c96e842\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2z2d" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.567429 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"encryption-config\" (UniqueName: \"kubernetes.io/secret/a1a685af-f17e-4937-a7d6-edca2c96e842-encryption-config\") pod \"apiserver-7bbb656c7d-j2z2d\" (UID: \"a1a685af-f17e-4937-a7d6-edca2c96e842\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2z2d" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.567554 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a1a685af-f17e-4937-a7d6-edca2c96e842-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-j2z2d\" (UID: \"a1a685af-f17e-4937-a7d6-edca2c96e842\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2z2d" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.567581 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rtgs\" (UniqueName: \"kubernetes.io/projected/1a7ade12-5bbc-4ece-a3b6-b85a31c36f16-kube-api-access-6rtgs\") pod \"dns-default-wbcwp\" (UID: \"1a7ade12-5bbc-4ece-a3b6-b85a31c36f16\") " pod="openshift-dns/dns-default-wbcwp" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.567605 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a1a685af-f17e-4937-a7d6-edca2c96e842-audit-dir\") pod \"apiserver-7bbb656c7d-j2z2d\" (UID: \"a1a685af-f17e-4937-a7d6-edca2c96e842\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2z2d" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.567646 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10c1127e-ac61-4432-b5a8-828e3b84d61e-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-ntdkq\" (UID: \"10c1127e-ac61-4432-b5a8-828e3b84d61e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ntdkq" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.567682 4732 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9b3f7b50-e118-4aca-986d-0f52e772edc3-socket-dir\") pod \"csi-hostpathplugin-k2sdm\" (UID: \"9b3f7b50-e118-4aca-986d-0f52e772edc3\") " pod="hostpath-provisioner/csi-hostpathplugin-k2sdm" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.567698 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/9b3f7b50-e118-4aca-986d-0f52e772edc3-csi-data-dir\") pod \"csi-hostpathplugin-k2sdm\" (UID: \"9b3f7b50-e118-4aca-986d-0f52e772edc3\") " pod="hostpath-provisioner/csi-hostpathplugin-k2sdm" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.567730 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10c1127e-ac61-4432-b5a8-828e3b84d61e-config\") pod \"kube-controller-manager-operator-78b949d7b-ntdkq\" (UID: \"10c1127e-ac61-4432-b5a8-828e3b84d61e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ntdkq" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.567760 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9b3f7b50-e118-4aca-986d-0f52e772edc3-registration-dir\") pod \"csi-hostpathplugin-k2sdm\" (UID: \"9b3f7b50-e118-4aca-986d-0f52e772edc3\") " pod="hostpath-provisioner/csi-hostpathplugin-k2sdm" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.567784 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1a7ade12-5bbc-4ece-a3b6-b85a31c36f16-config-volume\") pod \"dns-default-wbcwp\" (UID: \"1a7ade12-5bbc-4ece-a3b6-b85a31c36f16\") " pod="openshift-dns/dns-default-wbcwp" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.567808 4732 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-bppl7\" (UniqueName: \"kubernetes.io/projected/93a3cb6d-2580-4709-a329-67820678af15-kube-api-access-bppl7\") pod \"migrator-59844c95c7-dcwb2\" (UID: \"93a3cb6d-2580-4709-a329-67820678af15\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dcwb2" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.567831 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6g75v\" (UniqueName: \"kubernetes.io/projected/9b3f7b50-e118-4aca-986d-0f52e772edc3-kube-api-access-6g75v\") pod \"csi-hostpathplugin-k2sdm\" (UID: \"9b3f7b50-e118-4aca-986d-0f52e772edc3\") " pod="hostpath-provisioner/csi-hostpathplugin-k2sdm" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.567855 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/c314f9c9-f0e4-450b-a152-3be055d1bd46-node-bootstrap-token\") pod \"machine-config-server-tfwcw\" (UID: \"c314f9c9-f0e4-450b-a152-3be055d1bd46\") " pod="openshift-machine-config-operator/machine-config-server-tfwcw" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.567869 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f92dw\" (UniqueName: \"kubernetes.io/projected/a1a685af-f17e-4937-a7d6-edca2c96e842-kube-api-access-f92dw\") pod \"apiserver-7bbb656c7d-j2z2d\" (UID: \"a1a685af-f17e-4937-a7d6-edca2c96e842\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2z2d" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.567884 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/76af458d-5d8b-44eb-84f9-8a142a0a9169-cert\") pod \"ingress-canary-cbp4s\" (UID: \"76af458d-5d8b-44eb-84f9-8a142a0a9169\") " pod="openshift-ingress-canary/ingress-canary-cbp4s" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 
13:41:31.567902 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a1a685af-f17e-4937-a7d6-edca2c96e842-audit-policies\") pod \"apiserver-7bbb656c7d-j2z2d\" (UID: \"a1a685af-f17e-4937-a7d6-edca2c96e842\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2z2d" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.567917 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a1a685af-f17e-4937-a7d6-edca2c96e842-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-j2z2d\" (UID: \"a1a685af-f17e-4937-a7d6-edca2c96e842\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2z2d" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.567957 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjlds\" (UniqueName: \"kubernetes.io/projected/c314f9c9-f0e4-450b-a152-3be055d1bd46-kube-api-access-jjlds\") pod \"machine-config-server-tfwcw\" (UID: \"c314f9c9-f0e4-450b-a152-3be055d1bd46\") " pod="openshift-machine-config-operator/machine-config-server-tfwcw" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.567974 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1a7ade12-5bbc-4ece-a3b6-b85a31c36f16-metrics-tls\") pod \"dns-default-wbcwp\" (UID: \"1a7ade12-5bbc-4ece-a3b6-b85a31c36f16\") " pod="openshift-dns/dns-default-wbcwp" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.567990 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4xnx\" (UniqueName: \"kubernetes.io/projected/76af458d-5d8b-44eb-84f9-8a142a0a9169-kube-api-access-s4xnx\") pod \"ingress-canary-cbp4s\" (UID: \"76af458d-5d8b-44eb-84f9-8a142a0a9169\") " pod="openshift-ingress-canary/ingress-canary-cbp4s" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 
13:41:31.568006 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/c314f9c9-f0e4-450b-a152-3be055d1bd46-certs\") pod \"machine-config-server-tfwcw\" (UID: \"c314f9c9-f0e4-450b-a152-3be055d1bd46\") " pod="openshift-machine-config-operator/machine-config-server-tfwcw" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.568133 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9b3f7b50-e118-4aca-986d-0f52e772edc3-socket-dir\") pod \"csi-hostpathplugin-k2sdm\" (UID: \"9b3f7b50-e118-4aca-986d-0f52e772edc3\") " pod="hostpath-provisioner/csi-hostpathplugin-k2sdm" Apr 02 13:41:31 crc kubenswrapper[4732]: E0402 13:41:31.568211 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-02 13:41:32.068195551 +0000 UTC m=+248.972603104 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.568346 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/9b3f7b50-e118-4aca-986d-0f52e772edc3-plugins-dir\") pod \"csi-hostpathplugin-k2sdm\" (UID: \"9b3f7b50-e118-4aca-986d-0f52e772edc3\") " pod="hostpath-provisioner/csi-hostpathplugin-k2sdm" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.570498 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/9b3f7b50-e118-4aca-986d-0f52e772edc3-mountpoint-dir\") pod \"csi-hostpathplugin-k2sdm\" (UID: \"9b3f7b50-e118-4aca-986d-0f52e772edc3\") " pod="hostpath-provisioner/csi-hostpathplugin-k2sdm" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.572341 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a1a685af-f17e-4937-a7d6-edca2c96e842-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-j2z2d\" (UID: \"a1a685af-f17e-4937-a7d6-edca2c96e842\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2z2d" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.572466 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a1a685af-f17e-4937-a7d6-edca2c96e842-audit-dir\") pod \"apiserver-7bbb656c7d-j2z2d\" (UID: \"a1a685af-f17e-4937-a7d6-edca2c96e842\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2z2d" Apr 02 
13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.567919 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zw692\" (UniqueName: \"kubernetes.io/projected/8f520dcd-7f31-4d5b-bd4c-369a0ede6d70-kube-api-access-zw692\") pod \"package-server-manager-789f6589d5-8dxlz\" (UID: \"8f520dcd-7f31-4d5b-bd4c-369a0ede6d70\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8dxlz" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.572727 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/9b3f7b50-e118-4aca-986d-0f52e772edc3-csi-data-dir\") pod \"csi-hostpathplugin-k2sdm\" (UID: \"9b3f7b50-e118-4aca-986d-0f52e772edc3\") " pod="hostpath-provisioner/csi-hostpathplugin-k2sdm" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.573012 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9b3f7b50-e118-4aca-986d-0f52e772edc3-registration-dir\") pod \"csi-hostpathplugin-k2sdm\" (UID: \"9b3f7b50-e118-4aca-986d-0f52e772edc3\") " pod="hostpath-provisioner/csi-hostpathplugin-k2sdm" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.573613 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a1a685af-f17e-4937-a7d6-edca2c96e842-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-j2z2d\" (UID: \"a1a685af-f17e-4937-a7d6-edca2c96e842\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2z2d" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.574294 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a1a685af-f17e-4937-a7d6-edca2c96e842-audit-policies\") pod \"apiserver-7bbb656c7d-j2z2d\" (UID: \"a1a685af-f17e-4937-a7d6-edca2c96e842\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2z2d" 
Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.574861 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10c1127e-ac61-4432-b5a8-828e3b84d61e-config\") pod \"kube-controller-manager-operator-78b949d7b-ntdkq\" (UID: \"10c1127e-ac61-4432-b5a8-828e3b84d61e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ntdkq" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.577208 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/c314f9c9-f0e4-450b-a152-3be055d1bd46-certs\") pod \"machine-config-server-tfwcw\" (UID: \"c314f9c9-f0e4-450b-a152-3be055d1bd46\") " pod="openshift-machine-config-operator/machine-config-server-tfwcw" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.578481 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a1a685af-f17e-4937-a7d6-edca2c96e842-etcd-client\") pod \"apiserver-7bbb656c7d-j2z2d\" (UID: \"a1a685af-f17e-4937-a7d6-edca2c96e842\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2z2d" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.582244 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a1a685af-f17e-4937-a7d6-edca2c96e842-serving-cert\") pod \"apiserver-7bbb656c7d-j2z2d\" (UID: \"a1a685af-f17e-4937-a7d6-edca2c96e842\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2z2d" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.583358 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t8m6h" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.583551 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1a7ade12-5bbc-4ece-a3b6-b85a31c36f16-config-volume\") pod \"dns-default-wbcwp\" (UID: \"1a7ade12-5bbc-4ece-a3b6-b85a31c36f16\") " pod="openshift-dns/dns-default-wbcwp" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.584421 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a1a685af-f17e-4937-a7d6-edca2c96e842-encryption-config\") pod \"apiserver-7bbb656c7d-j2z2d\" (UID: \"a1a685af-f17e-4937-a7d6-edca2c96e842\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2z2d" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.591049 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/c314f9c9-f0e4-450b-a152-3be055d1bd46-node-bootstrap-token\") pod \"machine-config-server-tfwcw\" (UID: \"c314f9c9-f0e4-450b-a152-3be055d1bd46\") " pod="openshift-machine-config-operator/machine-config-server-tfwcw" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.592177 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10c1127e-ac61-4432-b5a8-828e3b84d61e-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-ntdkq\" (UID: \"10c1127e-ac61-4432-b5a8-828e3b84d61e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ntdkq" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.594979 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/76af458d-5d8b-44eb-84f9-8a142a0a9169-cert\") pod 
\"ingress-canary-cbp4s\" (UID: \"76af458d-5d8b-44eb-84f9-8a142a0a9169\") " pod="openshift-ingress-canary/ingress-canary-cbp4s" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.599369 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gjnv\" (UniqueName: \"kubernetes.io/projected/ea530987-3884-4994-9574-b73fc76fcdde-kube-api-access-5gjnv\") pod \"apiserver-76f77b778f-xzg9j\" (UID: \"ea530987-3884-4994-9574-b73fc76fcdde\") " pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.600405 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jkkc\" (UniqueName: \"kubernetes.io/projected/9a82c61a-7d7e-4401-963a-1f1fe908002c-kube-api-access-2jkkc\") pod \"auto-csr-approver-29585620-t897v\" (UID: \"9a82c61a-7d7e-4401-963a-1f1fe908002c\") " pod="openshift-infra/auto-csr-approver-29585620-t897v" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.603919 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-77jmk"] Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.607577 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1a7ade12-5bbc-4ece-a3b6-b85a31c36f16-metrics-tls\") pod \"dns-default-wbcwp\" (UID: \"1a7ade12-5bbc-4ece-a3b6-b85a31c36f16\") " pod="openshift-dns/dns-default-wbcwp" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.616193 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-zmhn7"] Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.625908 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gdbx\" (UniqueName: \"kubernetes.io/projected/e2570535-673c-495c-a5aa-392f14ceebb1-kube-api-access-5gdbx\") pod \"image-registry-697d97f7c8-zhxz4\" (UID: 
\"e2570535-673c-495c-a5aa-392f14ceebb1\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.651838 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-845fh"] Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.663776 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/10c1127e-ac61-4432-b5a8-828e3b84d61e-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-ntdkq\" (UID: \"10c1127e-ac61-4432-b5a8-828e3b84d61e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ntdkq" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.678682 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ntdkq" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.679295 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhxz4\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" Apr 02 13:41:31 crc kubenswrapper[4732]: E0402 13:41:31.679639 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-02 13:41:32.179599896 +0000 UTC m=+249.084007459 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhxz4" (UID: "e2570535-673c-495c-a5aa-392f14ceebb1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.696152 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rtgs\" (UniqueName: \"kubernetes.io/projected/1a7ade12-5bbc-4ece-a3b6-b85a31c36f16-kube-api-access-6rtgs\") pod \"dns-default-wbcwp\" (UID: \"1a7ade12-5bbc-4ece-a3b6-b85a31c36f16\") " pod="openshift-dns/dns-default-wbcwp" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.696683 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f92dw\" (UniqueName: \"kubernetes.io/projected/a1a685af-f17e-4937-a7d6-edca2c96e842-kube-api-access-f92dw\") pod \"apiserver-7bbb656c7d-j2z2d\" (UID: \"a1a685af-f17e-4937-a7d6-edca2c96e842\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2z2d" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.702712 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29585620-t897v" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.723335 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4xnx\" (UniqueName: \"kubernetes.io/projected/76af458d-5d8b-44eb-84f9-8a142a0a9169-kube-api-access-s4xnx\") pod \"ingress-canary-cbp4s\" (UID: \"76af458d-5d8b-44eb-84f9-8a142a0a9169\") " pod="openshift-ingress-canary/ingress-canary-cbp4s" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.726851 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-qdvzn"] Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.726899 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-szr89"] Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.750892 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjlds\" (UniqueName: \"kubernetes.io/projected/c314f9c9-f0e4-450b-a152-3be055d1bd46-kube-api-access-jjlds\") pod \"machine-config-server-tfwcw\" (UID: \"c314f9c9-f0e4-450b-a152-3be055d1bd46\") " pod="openshift-machine-config-operator/machine-config-server-tfwcw" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.780090 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 02 13:41:31 crc kubenswrapper[4732]: E0402 13:41:31.780436 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-04-02 13:41:32.280421411 +0000 UTC m=+249.184828964 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.782583 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bppl7\" (UniqueName: \"kubernetes.io/projected/93a3cb6d-2580-4709-a329-67820678af15-kube-api-access-bppl7\") pod \"migrator-59844c95c7-dcwb2\" (UID: \"93a3cb6d-2580-4709-a329-67820678af15\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dcwb2" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.815721 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8dxlz" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.815947 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-w7tht"] Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.819693 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6g75v\" (UniqueName: \"kubernetes.io/projected/9b3f7b50-e118-4aca-986d-0f52e772edc3-kube-api-access-6g75v\") pod \"csi-hostpathplugin-k2sdm\" (UID: \"9b3f7b50-e118-4aca-986d-0f52e772edc3\") " pod="hostpath-provisioner/csi-hostpathplugin-k2sdm" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.883522 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhxz4\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" Apr 02 13:41:31 crc kubenswrapper[4732]: E0402 13:41:31.884646 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-02 13:41:32.384633556 +0000 UTC m=+249.289041109 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhxz4" (UID: "e2570535-673c-495c-a5aa-392f14ceebb1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.895999 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.896546 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-m4dzs"] Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.907724 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-s6sm2"] Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.964275 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dcwb2" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.983057 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-wbcwp" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.984797 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.983232 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jbmtp" event={"ID":"2c662639-32a4-4f78-af37-9b1e65bab4e8","Type":"ContainerStarted","Data":"09340322997b1875ead4db97dff94bfa5238c8d8eedab284b0d9b75216685e5c"} Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.985170 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jbmtp" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.985293 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2z2d" Apr 02 13:41:31 crc kubenswrapper[4732]: E0402 13:41:31.985598 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-02 13:41:32.485575374 +0000 UTC m=+249.389982927 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.993189 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-86zsp" event={"ID":"c70b5281-74d8-44ff-8f4b-326a3d7192aa","Type":"ContainerStarted","Data":"bec58acb001e8a9e7a6f7d0fc19d800961db922cfa11c56d9ccf308011473970"} Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.993283 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-tfwcw" Apr 02 13:41:31 crc kubenswrapper[4732]: I0402 13:41:31.994706 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f998c" event={"ID":"36f26e27-d72e-42f2-9380-598616e5626b","Type":"ContainerStarted","Data":"95211b1e7130d0fbcce12dae1cc7a2122a3af90b313fb53bc07636fa6e8b2fdf"} Apr 02 13:41:32 crc kubenswrapper[4732]: I0402 13:41:31.995915 4732 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-jbmtp container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Apr 02 13:41:32 crc kubenswrapper[4732]: I0402 13:41:31.995946 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jbmtp" podUID="2c662639-32a4-4f78-af37-9b1e65bab4e8" 
containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Apr 02 13:41:32 crc kubenswrapper[4732]: I0402 13:41:32.007285 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" Apr 02 13:41:32 crc kubenswrapper[4732]: I0402 13:41:32.007952 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-zmhn7" event={"ID":"4e9d3578-0893-4852-80b6-999e5a7ccdc5","Type":"ContainerStarted","Data":"1c1e95fca7203f61f4ec05d0d87212001fd87798e74dd534c3e0449ee576df5f"} Apr 02 13:41:32 crc kubenswrapper[4732]: I0402 13:41:32.008190 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-cbp4s" Apr 02 13:41:32 crc kubenswrapper[4732]: I0402 13:41:32.011286 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-k2sdm" Apr 02 13:41:32 crc kubenswrapper[4732]: I0402 13:41:32.031516 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-95tg7" event={"ID":"d0bcf0e9-75b6-443e-afd8-e2fb6f807e90","Type":"ContainerStarted","Data":"14f469ff307311640c12eb90b518e9de9142f56c60b9bbd2a39f4c66c4f401ec"} Apr 02 13:41:32 crc kubenswrapper[4732]: I0402 13:41:32.042637 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sxw8j" event={"ID":"ac5f29d0-13ce-46eb-babc-70f32ac34feb","Type":"ContainerStarted","Data":"4c7538554767485594891f397f4b494e6ce3c084ffb823228023d2dc47a9b223"} Apr 02 13:41:32 crc kubenswrapper[4732]: I0402 13:41:32.043697 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m7sxt" 
event={"ID":"5425ff81-73c8-4fca-b208-9c9dbc6a949d","Type":"ContainerStarted","Data":"7745952e71a51f2222dc277443f3fd64a4070be085dbfb1e8c46baa5db23e834"} Apr 02 13:41:32 crc kubenswrapper[4732]: I0402 13:41:32.045495 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-54khq" event={"ID":"16ec121b-fdf2-452d-8963-08d6132f7c5c","Type":"ContainerStarted","Data":"3989604763ea02ad03b6f003c903eb3ed34bff775fbc928643ea3514f5743a30"} Apr 02 13:41:32 crc kubenswrapper[4732]: I0402 13:41:32.045517 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-54khq" event={"ID":"16ec121b-fdf2-452d-8963-08d6132f7c5c","Type":"ContainerStarted","Data":"ae54703e3a6fed5d4bcd093b60943f903041de0c2e8f7e509ce77149953c3ed9"} Apr 02 13:41:32 crc kubenswrapper[4732]: I0402 13:41:32.051036 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-54khq" Apr 02 13:41:32 crc kubenswrapper[4732]: I0402 13:41:32.058210 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-qdvzn" event={"ID":"49c54f44-4a94-4b19-b03d-8469355931d0","Type":"ContainerStarted","Data":"5d7c96cd3796affa47c79192fd90db9dd430495bc09857d38ed1246c24ac4673"} Apr 02 13:41:32 crc kubenswrapper[4732]: I0402 13:41:32.067601 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r98n2" event={"ID":"ff7c4e9d-5437-412b-867d-1e44dfc73df5","Type":"ContainerStarted","Data":"ad448a02d11bd76c4ff7a501a9c3149d4a50e0daca6f2733c22d2710cde1fbb4"} Apr 02 13:41:32 crc kubenswrapper[4732]: I0402 13:41:32.069927 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xl58c" 
event={"ID":"2d75d692-d062-4dfa-b7e7-ea53683bc549","Type":"ContainerStarted","Data":"e097189d001fd0e1d10dac8a61a2621cc51f4fe4a95bdd20e1e2c246f14c9844"} Apr 02 13:41:32 crc kubenswrapper[4732]: I0402 13:41:32.074963 4732 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-54khq container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Apr 02 13:41:32 crc kubenswrapper[4732]: I0402 13:41:32.075009 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-54khq" podUID="16ec121b-fdf2-452d-8963-08d6132f7c5c" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" Apr 02 13:41:32 crc kubenswrapper[4732]: W0402 13:41:32.078932 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbd2d545_1f43_45dd_8f7e_23ea76dbc0a4.slice/crio-d28e3f596f731689ff3b894186d8a97cd6de2f1ec27ad096902b928fdfd8af6e WatchSource:0}: Error finding container d28e3f596f731689ff3b894186d8a97cd6de2f1ec27ad096902b928fdfd8af6e: Status 404 returned error can't find the container with id d28e3f596f731689ff3b894186d8a97cd6de2f1ec27ad096902b928fdfd8af6e Apr 02 13:41:32 crc kubenswrapper[4732]: I0402 13:41:32.086482 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhxz4\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" Apr 02 13:41:32 crc kubenswrapper[4732]: E0402 13:41:32.089137 4732 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-02 13:41:32.589118222 +0000 UTC m=+249.493525845 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhxz4" (UID: "e2570535-673c-495c-a5aa-392f14ceebb1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:32 crc kubenswrapper[4732]: W0402 13:41:32.093182 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99e5508c_0d75_4f87_9c07_b53509e461aa.slice/crio-57b5ddb6afef1b202b4d135b6404a217d2103eeb512b85bddd0f568d4a52fc1a WatchSource:0}: Error finding container 57b5ddb6afef1b202b4d135b6404a217d2103eeb512b85bddd0f568d4a52fc1a: Status 404 returned error can't find the container with id 57b5ddb6afef1b202b4d135b6404a217d2103eeb512b85bddd0f568d4a52fc1a Apr 02 13:41:32 crc kubenswrapper[4732]: I0402 13:41:32.093316 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-845fh" event={"ID":"bb62d6fb-d819-4fc1-aa43-35fb1012a1ba","Type":"ContainerStarted","Data":"148e5ec73d1ea9dff74bb24855fae722dccca0644d862adb8f1c70dfd2386e30"} Apr 02 13:41:32 crc kubenswrapper[4732]: I0402 13:41:32.102206 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-6qzc4" event={"ID":"d8ff2a93-ff6b-4ef8-9109-de3c22a6f108","Type":"ContainerStarted","Data":"8ec1d039f9f0e47b80c6ad31a95ef70d3a5f3e7a14d48effe8471c953c31ed82"} Apr 02 13:41:32 crc kubenswrapper[4732]: 
I0402 13:41:32.107356 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fvjh7" event={"ID":"6d20f42f-7f98-4c97-a28e-17749d977819","Type":"ContainerStarted","Data":"9bb2204fb2beb263f43814e61f4fe18c934876f989ac9711c37eb67404c27142"} Apr 02 13:41:32 crc kubenswrapper[4732]: I0402 13:41:32.107433 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fvjh7" event={"ID":"6d20f42f-7f98-4c97-a28e-17749d977819","Type":"ContainerStarted","Data":"830ccafbe3add9ad96e9c1a56ccccf6a3a493408be62d22861acb2b52e8b838a"} Apr 02 13:41:32 crc kubenswrapper[4732]: I0402 13:41:32.111008 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-szr89" event={"ID":"1eee8837-1a56-40df-b564-bb65ad94d593","Type":"ContainerStarted","Data":"86fcf2665a717f1817353f93badee09adbcb4858bd7ff4a4f0d48f81d2b8f0d8"} Apr 02 13:41:32 crc kubenswrapper[4732]: I0402 13:41:32.114660 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-77jmk" event={"ID":"5f0e45b7-fac8-406c-bbe9-e92490d95fda","Type":"ContainerStarted","Data":"11089a66b3005574c418480523de14220776dc3d5e276d5ae6dfeda5f486c515"} Apr 02 13:41:32 crc kubenswrapper[4732]: I0402 13:41:32.124928 4732 generic.go:334] "Generic (PLEG): container finished" podID="f0c6d23c-1aff-4a2f-b92b-6cb07a0ca8b6" containerID="dacaf6b29c8d693399640dc190aae4180ae813bd668cf5c050192bd29575ca7f" exitCode=0 Apr 02 13:41:32 crc kubenswrapper[4732]: I0402 13:41:32.125126 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vmngg" event={"ID":"f0c6d23c-1aff-4a2f-b92b-6cb07a0ca8b6","Type":"ContainerDied","Data":"dacaf6b29c8d693399640dc190aae4180ae813bd668cf5c050192bd29575ca7f"} Apr 02 13:41:32 crc kubenswrapper[4732]: I0402 13:41:32.125214 4732 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vmngg" event={"ID":"f0c6d23c-1aff-4a2f-b92b-6cb07a0ca8b6","Type":"ContainerStarted","Data":"2b09fc546ad4840aba942f9e6ce561086e4641a9edd13c46a7b3c342095f0494"} Apr 02 13:41:32 crc kubenswrapper[4732]: I0402 13:41:32.142785 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6fp76" event={"ID":"e1e85ba5-52ed-4b2f-9901-b04090159f4c","Type":"ContainerStarted","Data":"d4a665c736db77740a8bf2b40fed9caea9e70b6ccd3cb21cad7759c840b3b271"} Apr 02 13:41:32 crc kubenswrapper[4732]: I0402 13:41:32.142844 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6fp76" event={"ID":"e1e85ba5-52ed-4b2f-9901-b04090159f4c","Type":"ContainerStarted","Data":"e36dd3dfe71433b4c7e26fbb8a2916663f1135578a69ac6ba9648c95b4e579df"} Apr 02 13:41:32 crc kubenswrapper[4732]: I0402 13:41:32.143663 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-76hwt" event={"ID":"4a34a201-8137-4efe-a99a-1ebd89e40c68","Type":"ContainerStarted","Data":"90fe6d65b20f3fd0bf4d34fdfba2175c7c517e0dbb295feeadae6b483e3ce60d"} Apr 02 13:41:32 crc kubenswrapper[4732]: I0402 13:41:32.144251 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-76hwt" event={"ID":"4a34a201-8137-4efe-a99a-1ebd89e40c68","Type":"ContainerStarted","Data":"a8505fd0eab6f5d0a8a174e9bb00f3e048266ef2c4f1df020bd933f130333ac0"} Apr 02 13:41:32 crc kubenswrapper[4732]: I0402 13:41:32.144281 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-76hwt" Apr 02 13:41:32 crc kubenswrapper[4732]: I0402 13:41:32.146424 4732 patch_prober.go:28] interesting pod/console-operator-58897d9998-76hwt 
container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/readyz\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body= Apr 02 13:41:32 crc kubenswrapper[4732]: I0402 13:41:32.146461 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-76hwt" podUID="4a34a201-8137-4efe-a99a-1ebd89e40c68" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.19:8443/readyz\": dial tcp 10.217.0.19:8443: connect: connection refused" Apr 02 13:41:32 crc kubenswrapper[4732]: I0402 13:41:32.187585 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 02 13:41:32 crc kubenswrapper[4732]: E0402 13:41:32.189427 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-02 13:41:32.689410383 +0000 UTC m=+249.593817936 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:32 crc kubenswrapper[4732]: I0402 13:41:32.208697 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-8qcmg"] Apr 02 13:41:32 crc kubenswrapper[4732]: I0402 13:41:32.231858 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-kpg9f"] Apr 02 13:41:32 crc kubenswrapper[4732]: I0402 13:41:32.273787 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2gdhw"] Apr 02 13:41:32 crc kubenswrapper[4732]: I0402 13:41:32.283745 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-t9dlm"] Apr 02 13:41:32 crc kubenswrapper[4732]: I0402 13:41:32.290400 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhxz4\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" Apr 02 13:41:32 crc kubenswrapper[4732]: E0402 13:41:32.290912 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-04-02 13:41:32.790900026 +0000 UTC m=+249.695307579 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhxz4" (UID: "e2570535-673c-495c-a5aa-392f14ceebb1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:32 crc kubenswrapper[4732]: W0402 13:41:32.325444 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3568fcc7_10bd_4972_9782_b97aa3c9c8a0.slice/crio-3de08b6ac2690532b55ee5160be23e80e62ef79892eab95ae25f36eaaa5f2484 WatchSource:0}: Error finding container 3de08b6ac2690532b55ee5160be23e80e62ef79892eab95ae25f36eaaa5f2484: Status 404 returned error can't find the container with id 3de08b6ac2690532b55ee5160be23e80e62ef79892eab95ae25f36eaaa5f2484 Apr 02 13:41:32 crc kubenswrapper[4732]: I0402 13:41:32.382279 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-8qffd"] Apr 02 13:41:32 crc kubenswrapper[4732]: I0402 13:41:32.394149 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 02 13:41:32 crc kubenswrapper[4732]: E0402 13:41:32.394370 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-04-02 13:41:32.894353631 +0000 UTC m=+249.798761184 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:32 crc kubenswrapper[4732]: I0402 13:41:32.394432 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhxz4\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" Apr 02 13:41:32 crc kubenswrapper[4732]: E0402 13:41:32.394716 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-02 13:41:32.8947082 +0000 UTC m=+249.799115753 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhxz4" (UID: "e2570535-673c-495c-a5aa-392f14ceebb1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:32 crc kubenswrapper[4732]: I0402 13:41:32.399402 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-dq9x9"] Apr 02 13:41:32 crc kubenswrapper[4732]: I0402 13:41:32.443601 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8dxlz"] Apr 02 13:41:32 crc kubenswrapper[4732]: I0402 13:41:32.467981 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29585610-wbzhk"] Apr 02 13:41:32 crc kubenswrapper[4732]: I0402 13:41:32.493211 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-brl6n"] Apr 02 13:41:32 crc kubenswrapper[4732]: I0402 13:41:32.495987 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 02 13:41:32 crc kubenswrapper[4732]: E0402 13:41:32.496268 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-02 13:41:32.996254274 +0000 UTC m=+249.900661827 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:32 crc kubenswrapper[4732]: I0402 13:41:32.497022 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ggc4c"] Apr 02 13:41:32 crc kubenswrapper[4732]: I0402 13:41:32.549134 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585620-t897v"] Apr 02 13:41:32 crc kubenswrapper[4732]: I0402 13:41:32.553991 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t8m6h"] Apr 02 13:41:32 crc kubenswrapper[4732]: I0402 13:41:32.564003 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ntdkq"] Apr 02 13:41:32 crc kubenswrapper[4732]: I0402 13:41:32.597310 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhxz4\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" Apr 02 13:41:32 crc kubenswrapper[4732]: E0402 13:41:32.600104 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-04-02 13:41:33.100085169 +0000 UTC m=+250.004492722 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhxz4" (UID: "e2570535-673c-495c-a5aa-392f14ceebb1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:32 crc kubenswrapper[4732]: W0402 13:41:32.625351 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d200061_d82a_4b89_9bea_83a1c7d9eca8.slice/crio-1c14d79aee05af9ce6377cb7a33da5da073c72dc0a1e624cb6151ff4da7b2b43 WatchSource:0}: Error finding container 1c14d79aee05af9ce6377cb7a33da5da073c72dc0a1e624cb6151ff4da7b2b43: Status 404 returned error can't find the container with id 1c14d79aee05af9ce6377cb7a33da5da073c72dc0a1e624cb6151ff4da7b2b43 Apr 02 13:41:32 crc kubenswrapper[4732]: I0402 13:41:32.698046 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 02 13:41:32 crc kubenswrapper[4732]: W0402 13:41:32.698081 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f520dcd_7f31_4d5b_bd4c_369a0ede6d70.slice/crio-d0d679d73260bc5906e284a49475357a0b6956959d5de7ceddc6017b6ee5134c WatchSource:0}: Error finding container d0d679d73260bc5906e284a49475357a0b6956959d5de7ceddc6017b6ee5134c: Status 404 returned error can't find the container with id 
d0d679d73260bc5906e284a49475357a0b6956959d5de7ceddc6017b6ee5134c Apr 02 13:41:32 crc kubenswrapper[4732]: E0402 13:41:32.698235 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-02 13:41:33.198191313 +0000 UTC m=+250.102598876 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:32 crc kubenswrapper[4732]: I0402 13:41:32.698569 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhxz4\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" Apr 02 13:41:32 crc kubenswrapper[4732]: E0402 13:41:32.698855 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-02 13:41:33.19884328 +0000 UTC m=+250.103250833 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhxz4" (UID: "e2570535-673c-495c-a5aa-392f14ceebb1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:32 crc kubenswrapper[4732]: I0402 13:41:32.715692 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-54khq" podStartSLOduration=172.715671835 podStartE2EDuration="2m52.715671835s" podCreationTimestamp="2026-04-02 13:38:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:41:32.71362217 +0000 UTC m=+249.618029753" watchObservedRunningTime="2026-04-02 13:41:32.715671835 +0000 UTC m=+249.620079398" Apr 02 13:41:32 crc kubenswrapper[4732]: I0402 13:41:32.752145 4732 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 02 13:41:32 crc kubenswrapper[4732]: I0402 13:41:32.799129 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 02 13:41:32 crc kubenswrapper[4732]: E0402 13:41:32.799924 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-04-02 13:41:33.299903991 +0000 UTC m=+250.204311544 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:32 crc kubenswrapper[4732]: I0402 13:41:32.872807 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-k2sdm"] Apr 02 13:41:32 crc kubenswrapper[4732]: I0402 13:41:32.880553 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jbmtp" podStartSLOduration=171.880532163 podStartE2EDuration="2m51.880532163s" podCreationTimestamp="2026-04-02 13:38:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:41:32.876505406 +0000 UTC m=+249.780912989" watchObservedRunningTime="2026-04-02 13:41:32.880532163 +0000 UTC m=+249.784939706" Apr 02 13:41:32 crc kubenswrapper[4732]: I0402 13:41:32.899232 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-cbp4s"] Apr 02 13:41:32 crc kubenswrapper[4732]: I0402 13:41:32.906373 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhxz4\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" Apr 02 13:41:32 crc 
kubenswrapper[4732]: E0402 13:41:32.906762 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-02 13:41:33.406749076 +0000 UTC m=+250.311156619 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhxz4" (UID: "e2570535-673c-495c-a5aa-392f14ceebb1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:32 crc kubenswrapper[4732]: I0402 13:41:32.922812 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-j2z2d"] Apr 02 13:41:32 crc kubenswrapper[4732]: I0402 13:41:32.928743 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wbcwp"] Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.008174 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 02 13:41:33 crc kubenswrapper[4732]: E0402 13:41:33.008479 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-02 13:41:33.508464735 +0000 UTC m=+250.412872288 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.023375 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-xzg9j"] Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.062383 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-dcwb2"] Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.110670 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhxz4\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" Apr 02 13:41:33 crc kubenswrapper[4732]: E0402 13:41:33.111649 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-02 13:41:33.611632042 +0000 UTC m=+250.516039595 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhxz4" (UID: "e2570535-673c-495c-a5aa-392f14ceebb1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.168791 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2z2d" event={"ID":"a1a685af-f17e-4937-a7d6-edca2c96e842","Type":"ContainerStarted","Data":"bfbf9c8cdb24d8db8bf5611d4fc850219ad80ab345e7cb7758a3434b9a866c53"} Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.183994 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-s6sm2" event={"ID":"99e5508c-0d75-4f87-9c07-b53509e461aa","Type":"ContainerStarted","Data":"9e2bd8a777276c58aff3c69d6184b247fdaf13ed2a075b2b5949f6250566deb5"} Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.184051 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-s6sm2" event={"ID":"99e5508c-0d75-4f87-9c07-b53509e461aa","Type":"ContainerStarted","Data":"57b5ddb6afef1b202b4d135b6404a217d2103eeb512b85bddd0f568d4a52fc1a"} Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.184907 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-s6sm2" Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.187250 4732 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-s6sm2 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/healthz\": dial tcp 10.217.0.22:8080: connect: connection refused" 
start-of-body= Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.187293 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-s6sm2" podUID="99e5508c-0d75-4f87-9c07-b53509e461aa" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.22:8080/healthz\": dial tcp 10.217.0.22:8080: connect: connection refused" Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.190333 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-dq9x9" event={"ID":"4d77c191-7d04-4381-838f-b7a355e7c2d4","Type":"ContainerStarted","Data":"66fdd9ebf76077e088115e7e3707900621ed21f82c654455ea29e90ede4618f3"} Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.202008 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-kpg9f" event={"ID":"3568fcc7-10bd-4972-9782-b97aa3c9c8a0","Type":"ContainerStarted","Data":"7198a89c32678d74373911a60bdb1c927cfa9e9067a4a26b9bb7d034d0e739c3"} Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.202062 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-kpg9f" event={"ID":"3568fcc7-10bd-4972-9782-b97aa3c9c8a0","Type":"ContainerStarted","Data":"3de08b6ac2690532b55ee5160be23e80e62ef79892eab95ae25f36eaaa5f2484"} Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.212257 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 02 13:41:33 crc kubenswrapper[4732]: E0402 13:41:33.212508 4732 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-02 13:41:33.712473048 +0000 UTC m=+250.616880601 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.241272 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ggc4c" event={"ID":"9d200061-d82a-4b89-9bea-83a1c7d9eca8","Type":"ContainerStarted","Data":"1c14d79aee05af9ce6377cb7a33da5da073c72dc0a1e624cb6151ff4da7b2b43"} Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.266556 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sxw8j" event={"ID":"ac5f29d0-13ce-46eb-babc-70f32ac34feb","Type":"ContainerStarted","Data":"a38446738c3b02bb950d259ff1359bb753417d4cfcd5b6717155d6ea4a15d11d"} Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.275458 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-zmhn7" event={"ID":"4e9d3578-0893-4852-80b6-999e5a7ccdc5","Type":"ContainerStarted","Data":"9853db89cbac9efc6c09ef369c6bf8b691f90eb8b2ac818d4200724a593933f0"} Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.276246 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-zmhn7" Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.277733 
4732 patch_prober.go:28] interesting pod/downloads-7954f5f757-zmhn7 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.39:8080/\": dial tcp 10.217.0.39:8080: connect: connection refused" start-of-body=
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.277786 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zmhn7" podUID="4e9d3578-0893-4852-80b6-999e5a7ccdc5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.39:8080/\": dial tcp 10.217.0.39:8080: connect: connection refused"
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.281316 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r98n2" event={"ID":"ff7c4e9d-5437-412b-867d-1e44dfc73df5","Type":"ContainerStarted","Data":"4fb56712a2f1e001912270393323570c7b63626f192f74c0fed77beab80d309a"}
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.283021 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-95tg7" podStartSLOduration=173.283004972 podStartE2EDuration="2m53.283004972s" podCreationTimestamp="2026-04-02 13:38:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:41:33.281992915 +0000 UTC m=+250.186400468" watchObservedRunningTime="2026-04-02 13:41:33.283004972 +0000 UTC m=+250.187412525"
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.290213 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-szr89" event={"ID":"1eee8837-1a56-40df-b564-bb65ad94d593","Type":"ContainerStarted","Data":"c05ff3d39e898965473bc56a7ee1ea55925094c47e041e9022851b40aed3057b"}
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.301640 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8dxlz" event={"ID":"8f520dcd-7f31-4d5b-bd4c-369a0ede6d70","Type":"ContainerStarted","Data":"d0d679d73260bc5906e284a49475357a0b6956959d5de7ceddc6017b6ee5134c"}
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.311227 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-cbp4s" event={"ID":"76af458d-5d8b-44eb-84f9-8a142a0a9169","Type":"ContainerStarted","Data":"b3db786f7241f99fdf9d510100eba444c733b4d957f7e85a8245d54922387e9d"}
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.318512 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhxz4\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4"
Apr 02 13:41:33 crc kubenswrapper[4732]: E0402 13:41:33.318915 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-02 13:41:33.818902171 +0000 UTC m=+250.723309714 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhxz4" (UID: "e2570535-673c-495c-a5aa-392f14ceebb1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.343041 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-8qcmg" event={"ID":"64a68003-b71d-4ac2-aaaf-76b67ed758cd","Type":"ContainerStarted","Data":"589182311f2c5ce767cc69a22a707d8a821dad9f1f96d1f97d12d6b7ee5c6bf1"}
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.343084 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-8qcmg" event={"ID":"64a68003-b71d-4ac2-aaaf-76b67ed758cd","Type":"ContainerStarted","Data":"2173f09982ebdeeb865e08c00753079d5caf7999c505d6644781ba08406193f1"}
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.353686 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-86zsp" event={"ID":"c70b5281-74d8-44ff-8f4b-326a3d7192aa","Type":"ContainerStarted","Data":"30a82c816feba70dd1579b0c688c30adec2675441e955350d7487435fabf989c"}
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.354134 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-86zsp"
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.362099 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-845fh" event={"ID":"bb62d6fb-d819-4fc1-aa43-35fb1012a1ba","Type":"ContainerStarted","Data":"77f31582a3d50f89a3db47a5f3250a9e4742a8af951525082895df89310f0cd0"}
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.362456 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-845fh"
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.374248 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-8qffd" event={"ID":"ff413ea9-0a19-4fc0-8067-9521bc9e472c","Type":"ContainerStarted","Data":"6a611b027bb3a92e173ce76db23efb026ee04dad64d2c0165936453605dce25c"}
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.374978 4732 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-845fh container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused" start-of-body=
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.375009 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-845fh" podUID="bb62d6fb-d819-4fc1-aa43-35fb1012a1ba" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused"
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.375110 4732 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-86zsp container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.6:6443/healthz\": dial tcp 10.217.0.6:6443: connect: connection refused" start-of-body=
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.375138 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-86zsp" podUID="c70b5281-74d8-44ff-8f4b-326a3d7192aa" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.6:6443/healthz\": dial tcp 10.217.0.6:6443: connect: connection refused"
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.395995 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-77jmk" event={"ID":"5f0e45b7-fac8-406c-bbe9-e92490d95fda","Type":"ContainerStarted","Data":"dad2d2415bc8e758502b7a08e58e44d4927ca428cbd63e8aa6a1a7f12461d0c9"}
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.396830 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-szr89" podStartSLOduration=172.396783999 podStartE2EDuration="2m52.396783999s" podCreationTimestamp="2026-04-02 13:38:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:41:33.393919803 +0000 UTC m=+250.298327366" watchObservedRunningTime="2026-04-02 13:41:33.396783999 +0000 UTC m=+250.301191552"
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.397343 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-77jmk"
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.399137 4732 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-77jmk container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.25:5443/healthz\": dial tcp 10.217.0.25:5443: connect: connection refused" start-of-body=
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.399181 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-77jmk" podUID="5f0e45b7-fac8-406c-bbe9-e92490d95fda" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.25:5443/healthz\": dial tcp 10.217.0.25:5443: connect: connection refused"
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.417495 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585620-t897v" event={"ID":"9a82c61a-7d7e-4401-963a-1f1fe908002c","Type":"ContainerStarted","Data":"e9e13dddf5e7bb8997e5ced0af1a4f10fa0c20db9f28cc0d660d72a182235460"}
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.420805 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Apr 02 13:41:33 crc kubenswrapper[4732]: E0402 13:41:33.422776 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-02 13:41:33.922760306 +0000 UTC m=+250.827167849 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.428915 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w7tht" event={"ID":"dd5add8c-a0d5-412d-925d-bd21c5893935","Type":"ContainerStarted","Data":"155ab18eb3b5183a497a303010e45260f32e09811587842a3a1d9abc5664c307"}
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.428969 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w7tht" event={"ID":"dd5add8c-a0d5-412d-925d-bd21c5893935","Type":"ContainerStarted","Data":"d7e9e0354e007da6d152351b1f3ce9148fd3321c1cc8b50669c91b36357096ed"}
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.439097 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" event={"ID":"ea530987-3884-4994-9574-b73fc76fcdde","Type":"ContainerStarted","Data":"98d365218263d02c3a66886f2016109cec8a6fc84a5db66cb0647bd906d66fd7"}
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.441075 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sxw8j" podStartSLOduration=173.441052019 podStartE2EDuration="2m53.441052019s" podCreationTimestamp="2026-04-02 13:38:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:41:33.438307957 +0000 UTC m=+250.342715530" watchObservedRunningTime="2026-04-02 13:41:33.441052019 +0000 UTC m=+250.345459572"
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.449932 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-qdvzn" event={"ID":"49c54f44-4a94-4b19-b03d-8469355931d0","Type":"ContainerStarted","Data":"46f5da7593589bd05c00cedd6eb69a24b8824f34adf32da57810f78efd3d8a0c"}
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.475896 4732 ???:1] "http: TLS handshake error from 192.168.126.11:36682: no serving certificate available for the kubelet"
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.478541 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6fp76" podStartSLOduration=173.47852333 podStartE2EDuration="2m53.47852333s" podCreationTimestamp="2026-04-02 13:38:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:41:33.477124983 +0000 UTC m=+250.381532556" watchObservedRunningTime="2026-04-02 13:41:33.47852333 +0000 UTC m=+250.382930883"
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.479562 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m7sxt" event={"ID":"5425ff81-73c8-4fca-b208-9c9dbc6a949d","Type":"ContainerStarted","Data":"543cf36fcde29b722d5119bf8091e93d966b4f0d31fd814907f5760a3a6e6d45"}
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.506501 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xl58c" event={"ID":"2d75d692-d062-4dfa-b7e7-ea53683bc549","Type":"ContainerStarted","Data":"a763c332f22d230643db60fd4fae992ae193dadd92d7c8719644d5c888f421d1"}
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.506871 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xl58c" event={"ID":"2d75d692-d062-4dfa-b7e7-ea53683bc549","Type":"ContainerStarted","Data":"89a7b9c7cb986e13a6cd1098a6c93616ca3e4421dc7acef9f6a659ff6fa23e4d"}
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.510731 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29585610-wbzhk" event={"ID":"67577cc5-6dee-4465-beee-ea424d976972","Type":"ContainerStarted","Data":"16cea3f04af541b5904b37a103b84d956f23b721989734df32189ea2aed43667"}
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.519317 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f998c" event={"ID":"36f26e27-d72e-42f2-9380-598616e5626b","Type":"ContainerStarted","Data":"8edceb65839c67f66eed8bc8e4d52eb4c06c7266fbc4da72f6aeb242445e8805"}
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.521858 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhxz4\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4"
Apr 02 13:41:33 crc kubenswrapper[4732]: E0402 13:41:33.522686 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-02 13:41:34.022670097 +0000 UTC m=+250.927077640 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhxz4" (UID: "e2570535-673c-495c-a5aa-392f14ceebb1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.543336 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vmngg" event={"ID":"f0c6d23c-1aff-4a2f-b92b-6cb07a0ca8b6","Type":"ContainerStarted","Data":"1dae3f93711eab60aaaafe1091230be9890464b600d59f4ec28b86c09c09a4eb"}
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.544328 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vmngg"
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.558764 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t8m6h" event={"ID":"fc020ab9-6c58-4571-ad12-3e22c8472a85","Type":"ContainerStarted","Data":"80b8ae2bd9e19f04b0a39b8f22d0be2fb2b266bc43cf446ad26b326051aaa0e7"}
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.561009 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-brl6n" event={"ID":"5cb3c06a-e3cf-4a60-b180-82759b9d55fc","Type":"ContainerStarted","Data":"0074f780bcde7f9c803e94a8a99c5131a2501fea2d0f047409284c745205680b"}
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.564332 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-76hwt" podStartSLOduration=173.564317848 podStartE2EDuration="2m53.564317848s" podCreationTimestamp="2026-04-02 13:38:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:41:33.562002866 +0000 UTC m=+250.466410419" watchObservedRunningTime="2026-04-02 13:41:33.564317848 +0000 UTC m=+250.468725401"
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.566584 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wbcwp" event={"ID":"1a7ade12-5bbc-4ece-a3b6-b85a31c36f16","Type":"ContainerStarted","Data":"77c0eec6527cdefc482fcd9f1c43ed81ef7631e5341b1be9f371785211edc29c"}
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.569692 4732 ???:1] "http: TLS handshake error from 192.168.126.11:36698: no serving certificate available for the kubelet"
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.573269 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-tfwcw" event={"ID":"c314f9c9-f0e4-450b-a152-3be055d1bd46","Type":"ContainerStarted","Data":"1bb34408ed7ba73fc4711de8660c7a48987663cd735d10b844b370e8ca2bf149"}
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.574835 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ntdkq" event={"ID":"10c1127e-ac61-4432-b5a8-828e3b84d61e","Type":"ContainerStarted","Data":"07a1b4ce85e5cd296654919b9d7ce18a00d07b8acad9b15478c517e69da6c6c5"}
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.576282 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-k2sdm" event={"ID":"9b3f7b50-e118-4aca-986d-0f52e772edc3","Type":"ContainerStarted","Data":"de29ab4d30d47abf87d65255575cc8e8c24ba16a3f8f565b4bde7fdeaaf1f5d0"}
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.582899 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-t9dlm" event={"ID":"446ace70-43fa-494c-8bce-13a5ea3ca452","Type":"ContainerStarted","Data":"88a27d8708f074a2e6a1613ee7305db0627c68ce38d3379e3fedb23ec4f106ff"}
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.585631 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-m4dzs" event={"ID":"cbd2d545-1f43-45dd-8f7e-23ea76dbc0a4","Type":"ContainerStarted","Data":"c7eedc6f02caca40d1448c1c6e2ce5b09f9d0cfca0886704d5fce80ccf8c6447"}
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.585675 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-m4dzs" event={"ID":"cbd2d545-1f43-45dd-8f7e-23ea76dbc0a4","Type":"ContainerStarted","Data":"d28e3f596f731689ff3b894186d8a97cd6de2f1ec27ad096902b928fdfd8af6e"}
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.601483 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fvjh7" event={"ID":"6d20f42f-7f98-4c97-a28e-17749d977819","Type":"ContainerStarted","Data":"4f675a80447ab36a999bddba20b47f3c2351046ffc85e0d7f350c0012d36f442"}
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.607808 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-86zsp" podStartSLOduration=173.607793907 podStartE2EDuration="2m53.607793907s" podCreationTimestamp="2026-04-02 13:38:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:41:33.602358713 +0000 UTC m=+250.506766276" watchObservedRunningTime="2026-04-02 13:41:33.607793907 +0000 UTC m=+250.512201460"
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.622473 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2gdhw" event={"ID":"803d2a64-1416-46cb-ae46-5a8462b057f9","Type":"ContainerStarted","Data":"c4d45e51907ae6ac3fd9f938f60ea1cf8ecef248bf3706c0f8b6d1fcee228218"}
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.622517 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2gdhw" event={"ID":"803d2a64-1416-46cb-ae46-5a8462b057f9","Type":"ContainerStarted","Data":"da0f47cb6054fb3cd3bf3977490370216a331082d33876fadf82fa64c5b1a341"}
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.622919 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Apr 02 13:41:33 crc kubenswrapper[4732]: E0402 13:41:33.624447 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-02 13:41:34.124406646 +0000 UTC m=+251.028814269 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.632591 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-6qzc4" event={"ID":"d8ff2a93-ff6b-4ef8-9109-de3c22a6f108","Type":"ContainerStarted","Data":"a0fcdb43795b25193020b8f207332cd4cb9e19653beac822bcb6225597f8f173"}
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.636187 4732 patch_prober.go:28] interesting pod/console-operator-58897d9998-76hwt container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/readyz\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body=
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.636224 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-76hwt" podUID="4a34a201-8137-4efe-a99a-1ebd89e40c68" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.19:8443/readyz\": dial tcp 10.217.0.19:8443: connect: connection refused"
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.640946 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-54khq"
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.672543 4732 ???:1] "http: TLS handshake error from 192.168.126.11:36714: no serving certificate available for the kubelet"
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.674834 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-s6sm2" podStartSLOduration=172.674816789 podStartE2EDuration="2m52.674816789s" podCreationTimestamp="2026-04-02 13:38:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:41:33.673562296 +0000 UTC m=+250.577969879" watchObservedRunningTime="2026-04-02 13:41:33.674816789 +0000 UTC m=+250.579224342"
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.676662 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-zmhn7" podStartSLOduration=173.676647707 podStartE2EDuration="2m53.676647707s" podCreationTimestamp="2026-04-02 13:38:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:41:33.636557267 +0000 UTC m=+250.540964830" watchObservedRunningTime="2026-04-02 13:41:33.676647707 +0000 UTC m=+250.581055260"
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.719322 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-8qcmg" podStartSLOduration=173.719304975 podStartE2EDuration="2m53.719304975s" podCreationTimestamp="2026-04-02 13:38:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:41:33.715268768 +0000 UTC m=+250.619676341" watchObservedRunningTime="2026-04-02 13:41:33.719304975 +0000 UTC m=+250.623712528"
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.724151 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhxz4\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4"
Apr 02 13:41:33 crc kubenswrapper[4732]: E0402 13:41:33.725561 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-02 13:41:34.22554948 +0000 UTC m=+251.129957023 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhxz4" (UID: "e2570535-673c-495c-a5aa-392f14ceebb1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.757797 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-6qzc4" podStartSLOduration=173.757781112 podStartE2EDuration="2m53.757781112s" podCreationTimestamp="2026-04-02 13:38:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:41:33.755210434 +0000 UTC m=+250.659618007" watchObservedRunningTime="2026-04-02 13:41:33.757781112 +0000 UTC m=+250.662188655"
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.801176 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t8m6h" podStartSLOduration=172.801153688 podStartE2EDuration="2m52.801153688s" podCreationTimestamp="2026-04-02 13:38:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:41:33.799956297 +0000 UTC m=+250.704363870" watchObservedRunningTime="2026-04-02 13:41:33.801153688 +0000 UTC m=+250.705561241"
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.805732 4732 ???:1] "http: TLS handshake error from 192.168.126.11:36716: no serving certificate available for the kubelet"
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.829002 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Apr 02 13:41:33 crc kubenswrapper[4732]: E0402 13:41:33.829314 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-02 13:41:34.329297572 +0000 UTC m=+251.233705125 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.908463 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f998c" podStartSLOduration=173.908446615 podStartE2EDuration="2m53.908446615s" podCreationTimestamp="2026-04-02 13:38:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:41:33.842758858 +0000 UTC m=+250.747166411" watchObservedRunningTime="2026-04-02 13:41:33.908446615 +0000 UTC m=+250.812854168"
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.932257 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhxz4\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4"
Apr 02 13:41:33 crc kubenswrapper[4732]: E0402 13:41:33.960875 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-02 13:41:34.46085908 +0000 UTC m=+251.365266633 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhxz4" (UID: "e2570535-673c-495c-a5aa-392f14ceebb1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.988082 4732 ???:1] "http: TLS handshake error from 192.168.126.11:36726: no serving certificate available for the kubelet"
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.993701 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-qdvzn" podStartSLOduration=173.993680128 podStartE2EDuration="2m53.993680128s" podCreationTimestamp="2026-04-02 13:38:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:41:33.930683783 +0000 UTC m=+250.835091356" watchObservedRunningTime="2026-04-02 13:41:33.993680128 +0000 UTC m=+250.898087681"
Apr 02 13:41:33 crc kubenswrapper[4732]: I0402 13:41:33.994197 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-845fh" podStartSLOduration=172.994191641 podStartE2EDuration="2m52.994191641s" podCreationTimestamp="2026-04-02 13:38:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:41:33.907559061 +0000 UTC m=+250.811966624" watchObservedRunningTime="2026-04-02 13:41:33.994191641 +0000 UTC m=+250.898599184"
Apr 02 13:41:34 crc kubenswrapper[4732]: I0402 13:41:34.004383 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vmngg" podStartSLOduration=174.00436494 podStartE2EDuration="2m54.00436494s" podCreationTimestamp="2026-04-02 13:38:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:41:33.978669841 +0000 UTC m=+250.883077394" watchObservedRunningTime="2026-04-02 13:41:34.00436494 +0000 UTC m=+250.908772493"
Apr 02 13:41:34 crc kubenswrapper[4732]: I0402 13:41:34.017499 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-tfwcw" podStartSLOduration=6.017478957 podStartE2EDuration="6.017478957s" podCreationTimestamp="2026-04-02 13:41:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:41:34.004146575 +0000 UTC m=+250.908554128" watchObservedRunningTime="2026-04-02 13:41:34.017478957 +0000 UTC m=+250.921886510"
Apr 02 13:41:34 crc kubenswrapper[4732]: I0402 13:41:34.037971 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2gdhw" podStartSLOduration=174.037953188 podStartE2EDuration="2m54.037953188s" podCreationTimestamp="2026-04-02 13:38:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:41:34.034256111 +0000 UTC m=+250.938663664" watchObservedRunningTime="2026-04-02 13:41:34.037953188 +0000 UTC m=+250.942360741"
Apr 02 13:41:34 crc kubenswrapper[4732]: I0402 13:41:34.043419 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Apr 02 13:41:34 crc kubenswrapper[4732]: E0402 13:41:34.044909 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-02 13:41:34.544890622 +0000 UTC m=+251.449298175 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Apr 02 13:41:34 crc kubenswrapper[4732]: I0402 13:41:34.044976 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhxz4\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4"
Apr 02 13:41:34 crc kubenswrapper[4732]: E0402 13:41:34.045378 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-02 13:41:34.545371364 +0000 UTC m=+251.449778917 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhxz4" (UID: "e2570535-673c-495c-a5aa-392f14ceebb1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Apr 02 13:41:34 crc kubenswrapper[4732]: I0402 13:41:34.056294 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jbmtp"
Apr 02 13:41:34 crc kubenswrapper[4732]: I0402 13:41:34.148569 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Apr 02 13:41:34 crc kubenswrapper[4732]: E0402 13:41:34.149187 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-02 13:41:34.649172628 +0000 UTC m=+251.553580181 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:34 crc kubenswrapper[4732]: I0402 13:41:34.250076 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhxz4\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" Apr 02 13:41:34 crc kubenswrapper[4732]: E0402 13:41:34.250460 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-02 13:41:34.750445696 +0000 UTC m=+251.654853249 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhxz4" (UID: "e2570535-673c-495c-a5aa-392f14ceebb1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:34 crc kubenswrapper[4732]: I0402 13:41:34.256077 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-m4dzs" podStartSLOduration=173.256062154 podStartE2EDuration="2m53.256062154s" podCreationTimestamp="2026-04-02 13:38:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:41:34.254049471 +0000 UTC m=+251.158457034" watchObservedRunningTime="2026-04-02 13:41:34.256062154 +0000 UTC m=+251.160469707" Apr 02 13:41:34 crc kubenswrapper[4732]: I0402 13:41:34.306823 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-6qzc4" Apr 02 13:41:34 crc kubenswrapper[4732]: I0402 13:41:34.321475 4732 patch_prober.go:28] interesting pod/router-default-5444994796-6qzc4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Apr 02 13:41:34 crc kubenswrapper[4732]: [-]has-synced failed: reason withheld Apr 02 13:41:34 crc kubenswrapper[4732]: [+]process-running ok Apr 02 13:41:34 crc kubenswrapper[4732]: healthz check failed Apr 02 13:41:34 crc kubenswrapper[4732]: I0402 13:41:34.321519 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6qzc4" podUID="d8ff2a93-ff6b-4ef8-9109-de3c22a6f108" 
containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 02 13:41:34 crc kubenswrapper[4732]: I0402 13:41:34.328205 4732 ???:1] "http: TLS handshake error from 192.168.126.11:36732: no serving certificate available for the kubelet" Apr 02 13:41:34 crc kubenswrapper[4732]: I0402 13:41:34.329573 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-77jmk" podStartSLOduration=173.329557017 podStartE2EDuration="2m53.329557017s" podCreationTimestamp="2026-04-02 13:38:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:41:34.328563181 +0000 UTC m=+251.232970744" watchObservedRunningTime="2026-04-02 13:41:34.329557017 +0000 UTC m=+251.233964560" Apr 02 13:41:34 crc kubenswrapper[4732]: I0402 13:41:34.351373 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m7sxt" podStartSLOduration=174.351355693 podStartE2EDuration="2m54.351355693s" podCreationTimestamp="2026-04-02 13:38:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:41:34.350597283 +0000 UTC m=+251.255004856" watchObservedRunningTime="2026-04-02 13:41:34.351355693 +0000 UTC m=+251.255763256" Apr 02 13:41:34 crc kubenswrapper[4732]: I0402 13:41:34.351884 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 02 13:41:34 crc kubenswrapper[4732]: E0402 13:41:34.352173 4732 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-02 13:41:34.852158994 +0000 UTC m=+251.756566547 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:34 crc kubenswrapper[4732]: I0402 13:41:34.382785 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fvjh7" podStartSLOduration=174.382764193 podStartE2EDuration="2m54.382764193s" podCreationTimestamp="2026-04-02 13:38:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:41:34.380183855 +0000 UTC m=+251.284591418" watchObservedRunningTime="2026-04-02 13:41:34.382764193 +0000 UTC m=+251.287171746" Apr 02 13:41:34 crc kubenswrapper[4732]: I0402 13:41:34.431227 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xl58c" podStartSLOduration=173.431215804 podStartE2EDuration="2m53.431215804s" podCreationTimestamp="2026-04-02 13:38:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:41:34.4291786 +0000 UTC m=+251.333586153" watchObservedRunningTime="2026-04-02 13:41:34.431215804 +0000 UTC m=+251.335623357" Apr 02 13:41:34 crc kubenswrapper[4732]: I0402 13:41:34.455677 
4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhxz4\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" Apr 02 13:41:34 crc kubenswrapper[4732]: E0402 13:41:34.456177 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-02 13:41:34.956151813 +0000 UTC m=+251.860559366 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhxz4" (UID: "e2570535-673c-495c-a5aa-392f14ceebb1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:34 crc kubenswrapper[4732]: I0402 13:41:34.559099 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 02 13:41:34 crc kubenswrapper[4732]: E0402 13:41:34.559770 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-02 13:41:35.059754952 +0000 UTC m=+251.964162505 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:34 crc kubenswrapper[4732]: I0402 13:41:34.574381 4732 ???:1] "http: TLS handshake error from 192.168.126.11:36746: no serving certificate available for the kubelet" Apr 02 13:41:34 crc kubenswrapper[4732]: I0402 13:41:34.661188 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhxz4\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" Apr 02 13:41:34 crc kubenswrapper[4732]: E0402 13:41:34.661543 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-02 13:41:35.161532113 +0000 UTC m=+252.065939666 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhxz4" (UID: "e2570535-673c-495c-a5aa-392f14ceebb1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:34 crc kubenswrapper[4732]: I0402 13:41:34.726710 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29585610-wbzhk" event={"ID":"67577cc5-6dee-4465-beee-ea424d976972","Type":"ContainerStarted","Data":"93bddc4cc6346d65d466d01bb6e7bdfa76ee89770919cb5fa9a55eb9760f977f"} Apr 02 13:41:34 crc kubenswrapper[4732]: I0402 13:41:34.763412 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 02 13:41:34 crc kubenswrapper[4732]: I0402 13:41:34.766644 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8dxlz" event={"ID":"8f520dcd-7f31-4d5b-bd4c-369a0ede6d70","Type":"ContainerStarted","Data":"73ec64ffc05b0b86bb482f54bae6a20e39593353e9175b03938f2f27c274198a"} Apr 02 13:41:34 crc kubenswrapper[4732]: E0402 13:41:34.779521 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-02 13:41:35.279484841 +0000 UTC m=+252.183892394 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:34 crc kubenswrapper[4732]: I0402 13:41:34.784524 4732 generic.go:334] "Generic (PLEG): container finished" podID="ea530987-3884-4994-9574-b73fc76fcdde" containerID="d0aa32342845cdf2f26edec7f7697e0928ef564195a539e052a57ddd8339f80f" exitCode=0 Apr 02 13:41:34 crc kubenswrapper[4732]: I0402 13:41:34.784631 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" event={"ID":"ea530987-3884-4994-9574-b73fc76fcdde","Type":"ContainerDied","Data":"d0aa32342845cdf2f26edec7f7697e0928ef564195a539e052a57ddd8339f80f"} Apr 02 13:41:34 crc kubenswrapper[4732]: I0402 13:41:34.807477 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dcwb2" event={"ID":"93a3cb6d-2580-4709-a329-67820678af15","Type":"ContainerStarted","Data":"84e2e55fa7a40b0db78ae14d7eb4068502f04c649077bce364258e13343f687f"} Apr 02 13:41:34 crc kubenswrapper[4732]: I0402 13:41:34.807542 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dcwb2" event={"ID":"93a3cb6d-2580-4709-a329-67820678af15","Type":"ContainerStarted","Data":"6ee62a9cb7876edcb4b07c6a7b70901dbbfb03152ce4ad118ec88203e55cbff2"} Apr 02 13:41:34 crc kubenswrapper[4732]: I0402 13:41:34.807555 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dcwb2" 
event={"ID":"93a3cb6d-2580-4709-a329-67820678af15","Type":"ContainerStarted","Data":"83c9e92a809bb2697b19bf9c734218a732f9618dac23182909b7910a47ed66e1"} Apr 02 13:41:34 crc kubenswrapper[4732]: I0402 13:41:34.820368 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-kpg9f" event={"ID":"3568fcc7-10bd-4972-9782-b97aa3c9c8a0","Type":"ContainerStarted","Data":"18c5da07bd98ced271b3514055696e7cefa2f8713f76ba4bb21bc03a7ddf2f81"} Apr 02 13:41:34 crc kubenswrapper[4732]: I0402 13:41:34.848413 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r98n2" event={"ID":"ff7c4e9d-5437-412b-867d-1e44dfc73df5","Type":"ContainerStarted","Data":"49269f233a5a1c01809d9dd9bfb70a94fc144fa2b76e6bb6e9cf327c63782a3e"} Apr 02 13:41:34 crc kubenswrapper[4732]: I0402 13:41:34.867540 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhxz4\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" Apr 02 13:41:34 crc kubenswrapper[4732]: E0402 13:41:34.869746 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-02 13:41:35.369733667 +0000 UTC m=+252.274141210 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhxz4" (UID: "e2570535-673c-495c-a5aa-392f14ceebb1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:34 crc kubenswrapper[4732]: I0402 13:41:34.875952 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2z2d" event={"ID":"a1a685af-f17e-4937-a7d6-edca2c96e842","Type":"ContainerStarted","Data":"7249bf341fb84c8f31b0d5dd5757fc386bd703008cb8dc1973574abf17796856"} Apr 02 13:41:34 crc kubenswrapper[4732]: I0402 13:41:34.885132 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ntdkq" event={"ID":"10c1127e-ac61-4432-b5a8-828e3b84d61e","Type":"ContainerStarted","Data":"9008b86327a829df7eb0928ebaae8f1272baf080e83af4a35d37bc520feb9f39"} Apr 02 13:41:34 crc kubenswrapper[4732]: I0402 13:41:34.953041 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-8qffd" event={"ID":"ff413ea9-0a19-4fc0-8067-9521bc9e472c","Type":"ContainerStarted","Data":"8dd8738dd3ea169cd17e37ae1c05670cfd6a37068c1635b806fcd8288e26172f"} Apr 02 13:41:34 crc kubenswrapper[4732]: I0402 13:41:34.973595 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 02 13:41:34 crc kubenswrapper[4732]: E0402 13:41:34.975491 4732 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-02 13:41:35.475464682 +0000 UTC m=+252.379872235 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:34 crc kubenswrapper[4732]: I0402 13:41:34.995637 4732 ???:1] "http: TLS handshake error from 192.168.126.11:36760: no serving certificate available for the kubelet" Apr 02 13:41:35 crc kubenswrapper[4732]: I0402 13:41:35.066458 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-t9dlm" event={"ID":"446ace70-43fa-494c-8bce-13a5ea3ca452","Type":"ContainerStarted","Data":"e67102bd7767bac2424e622410eeb90aa3fca735147efec87f749977b8ce02ac"} Apr 02 13:41:35 crc kubenswrapper[4732]: I0402 13:41:35.066510 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-t9dlm" event={"ID":"446ace70-43fa-494c-8bce-13a5ea3ca452","Type":"ContainerStarted","Data":"a5a79903dd134228fb4688978b9830e0c319dc1f93e433af3776599d189b4a02"} Apr 02 13:41:35 crc kubenswrapper[4732]: I0402 13:41:35.081256 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhxz4\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" Apr 02 13:41:35 crc kubenswrapper[4732]: E0402 13:41:35.081815 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-02 13:41:35.581799563 +0000 UTC m=+252.486207116 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhxz4" (UID: "e2570535-673c-495c-a5aa-392f14ceebb1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:35 crc kubenswrapper[4732]: I0402 13:41:35.082324 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-brl6n" event={"ID":"5cb3c06a-e3cf-4a60-b180-82759b9d55fc","Type":"ContainerStarted","Data":"02eaba1c5190d6d2ba1493c6ccd7cd1055d8a32472c005b0b4c1674c96732f1f"} Apr 02 13:41:35 crc kubenswrapper[4732]: I0402 13:41:35.124134 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-8qffd" podStartSLOduration=174.124117071 podStartE2EDuration="2m54.124117071s" podCreationTimestamp="2026-04-02 13:38:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:41:35.108096428 +0000 UTC m=+252.012504001" watchObservedRunningTime="2026-04-02 13:41:35.124117071 +0000 UTC m=+252.028524624" Apr 02 13:41:35 crc kubenswrapper[4732]: I0402 13:41:35.186878 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 02 13:41:35 crc kubenswrapper[4732]: E0402 13:41:35.188934 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-02 13:41:35.688915784 +0000 UTC m=+252.593323337 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:35 crc kubenswrapper[4732]: I0402 13:41:35.227977 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-cbp4s" event={"ID":"76af458d-5d8b-44eb-84f9-8a142a0a9169","Type":"ContainerStarted","Data":"ec817b82a8c1041441de3cc8b9ab2f7ee89d816be04a421de348add4372a6089"} Apr 02 13:41:35 crc kubenswrapper[4732]: I0402 13:41:35.277945 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w7tht" event={"ID":"dd5add8c-a0d5-412d-925d-bd21c5893935","Type":"ContainerStarted","Data":"bb16f5d114d19e24408a55938911f4db962ba653b723772e465793659b1179c9"} Apr 02 13:41:35 crc kubenswrapper[4732]: I0402 13:41:35.288489 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r98n2" podStartSLOduration=175.288472366 
podStartE2EDuration="2m55.288472366s" podCreationTimestamp="2026-04-02 13:38:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:41:35.287962352 +0000 UTC m=+252.192369905" watchObservedRunningTime="2026-04-02 13:41:35.288472366 +0000 UTC m=+252.192879919" Apr 02 13:41:35 crc kubenswrapper[4732]: I0402 13:41:35.289210 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhxz4\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" Apr 02 13:41:35 crc kubenswrapper[4732]: E0402 13:41:35.290812 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-02 13:41:35.790798807 +0000 UTC m=+252.695206360 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhxz4" (UID: "e2570535-673c-495c-a5aa-392f14ceebb1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:35 crc kubenswrapper[4732]: I0402 13:41:35.297371 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-tfwcw" event={"ID":"c314f9c9-f0e4-450b-a152-3be055d1bd46","Type":"ContainerStarted","Data":"7ad190743573e54dde54141cedf8d9f3a409676b75a9b8fba0e663eedf860eae"} Apr 02 13:41:35 crc kubenswrapper[4732]: I0402 13:41:35.308148 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-dq9x9" event={"ID":"4d77c191-7d04-4381-838f-b7a355e7c2d4","Type":"ContainerStarted","Data":"73e91406ab034efa71f59bdd70be88e46d545b72f509e189125711058eb982eb"} Apr 02 13:41:35 crc kubenswrapper[4732]: I0402 13:41:35.310298 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t8m6h" event={"ID":"fc020ab9-6c58-4571-ad12-3e22c8472a85","Type":"ContainerStarted","Data":"6414e232a13670a015e78773cfa4181799b39950e3f68a56932b2e5ff06545ad"} Apr 02 13:41:35 crc kubenswrapper[4732]: I0402 13:41:35.324937 4732 patch_prober.go:28] interesting pod/router-default-5444994796-6qzc4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Apr 02 13:41:35 crc kubenswrapper[4732]: [-]has-synced failed: reason withheld Apr 02 13:41:35 crc kubenswrapper[4732]: [+]process-running ok Apr 02 13:41:35 crc kubenswrapper[4732]: healthz 
check failed Apr 02 13:41:35 crc kubenswrapper[4732]: I0402 13:41:35.324992 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6qzc4" podUID="d8ff2a93-ff6b-4ef8-9109-de3c22a6f108" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 02 13:41:35 crc kubenswrapper[4732]: I0402 13:41:35.331446 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wbcwp" event={"ID":"1a7ade12-5bbc-4ece-a3b6-b85a31c36f16","Type":"ContainerStarted","Data":"987fe7ab909b9322a4e2d3488fc8337c2c5cf8a075e29964cb2a957cf7ece8ca"} Apr 02 13:41:35 crc kubenswrapper[4732]: I0402 13:41:35.332279 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-wbcwp" Apr 02 13:41:35 crc kubenswrapper[4732]: I0402 13:41:35.333556 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-kpg9f" podStartSLOduration=174.333538597 podStartE2EDuration="2m54.333538597s" podCreationTimestamp="2026-04-02 13:38:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:41:35.332214422 +0000 UTC m=+252.236621985" watchObservedRunningTime="2026-04-02 13:41:35.333538597 +0000 UTC m=+252.237946150" Apr 02 13:41:35 crc kubenswrapper[4732]: I0402 13:41:35.335112 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ggc4c" event={"ID":"9d200061-d82a-4b89-9bea-83a1c7d9eca8","Type":"ContainerStarted","Data":"4ca9f59a89c8df3c4c968c7f832ced26b454ecdbaa587b85749c12178f8f99f5"} Apr 02 13:41:35 crc kubenswrapper[4732]: I0402 13:41:35.335143 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ggc4c" Apr 02 13:41:35 crc kubenswrapper[4732]: 
I0402 13:41:35.336191 4732 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-s6sm2 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/healthz\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Apr 02 13:41:35 crc kubenswrapper[4732]: I0402 13:41:35.336226 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-s6sm2" podUID="99e5508c-0d75-4f87-9c07-b53509e461aa" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.22:8080/healthz\": dial tcp 10.217.0.22:8080: connect: connection refused" Apr 02 13:41:35 crc kubenswrapper[4732]: I0402 13:41:35.338994 4732 patch_prober.go:28] interesting pod/downloads-7954f5f757-zmhn7 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.39:8080/\": dial tcp 10.217.0.39:8080: connect: connection refused" start-of-body= Apr 02 13:41:35 crc kubenswrapper[4732]: I0402 13:41:35.339040 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zmhn7" podUID="4e9d3578-0893-4852-80b6-999e5a7ccdc5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.39:8080/\": dial tcp 10.217.0.39:8080: connect: connection refused" Apr 02 13:41:35 crc kubenswrapper[4732]: I0402 13:41:35.362572 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-845fh" Apr 02 13:41:35 crc kubenswrapper[4732]: I0402 13:41:35.386792 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-77jmk" Apr 02 13:41:35 crc kubenswrapper[4732]: I0402 13:41:35.390024 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 02 13:41:35 crc kubenswrapper[4732]: E0402 13:41:35.391797 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-02 13:41:35.891783337 +0000 UTC m=+252.796190890 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:35 crc kubenswrapper[4732]: I0402 13:41:35.403796 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-86zsp" Apr 02 13:41:35 crc kubenswrapper[4732]: I0402 13:41:35.430324 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29585610-wbzhk" podStartSLOduration=175.430306225 podStartE2EDuration="2m55.430306225s" podCreationTimestamp="2026-04-02 13:38:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:41:35.389304441 +0000 UTC m=+252.293711994" watchObservedRunningTime="2026-04-02 13:41:35.430306225 +0000 UTC m=+252.334713778" Apr 02 13:41:35 crc kubenswrapper[4732]: I0402 13:41:35.430806 4732 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-76hwt" Apr 02 13:41:35 crc kubenswrapper[4732]: I0402 13:41:35.451960 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ggc4c" Apr 02 13:41:35 crc kubenswrapper[4732]: I0402 13:41:35.492786 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhxz4\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" Apr 02 13:41:35 crc kubenswrapper[4732]: E0402 13:41:35.493435 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-02 13:41:35.993414554 +0000 UTC m=+252.897822107 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhxz4" (UID: "e2570535-673c-495c-a5aa-392f14ceebb1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:35 crc kubenswrapper[4732]: I0402 13:41:35.503560 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dcwb2" podStartSLOduration=174.503541151 podStartE2EDuration="2m54.503541151s" podCreationTimestamp="2026-04-02 13:38:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:41:35.501405325 +0000 UTC m=+252.405812868" watchObservedRunningTime="2026-04-02 13:41:35.503541151 +0000 UTC m=+252.407948704" Apr 02 13:41:35 crc kubenswrapper[4732]: I0402 13:41:35.540867 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ntdkq" podStartSLOduration=175.540850828 podStartE2EDuration="2m55.540850828s" podCreationTimestamp="2026-04-02 13:38:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:41:35.540240971 +0000 UTC m=+252.444648544" watchObservedRunningTime="2026-04-02 13:41:35.540850828 +0000 UTC m=+252.445258381" Apr 02 13:41:35 crc kubenswrapper[4732]: I0402 13:41:35.607406 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 02 13:41:35 crc kubenswrapper[4732]: E0402 13:41:35.607744 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-02 13:41:36.107726145 +0000 UTC m=+253.012133698 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:35 crc kubenswrapper[4732]: I0402 13:41:35.607778 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhxz4\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" Apr 02 13:41:35 crc kubenswrapper[4732]: E0402 13:41:35.608129 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-02 13:41:36.108122856 +0000 UTC m=+253.012530409 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhxz4" (UID: "e2570535-673c-495c-a5aa-392f14ceebb1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:35 crc kubenswrapper[4732]: I0402 13:41:35.681469 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ggc4c" podStartSLOduration=174.681452614 podStartE2EDuration="2m54.681452614s" podCreationTimestamp="2026-04-02 13:38:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:41:35.680552501 +0000 UTC m=+252.584960074" watchObservedRunningTime="2026-04-02 13:41:35.681452614 +0000 UTC m=+252.585860167" Apr 02 13:41:35 crc kubenswrapper[4732]: I0402 13:41:35.709190 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 02 13:41:35 crc kubenswrapper[4732]: E0402 13:41:35.709812 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-02 13:41:36.209794804 +0000 UTC m=+253.114202357 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:35 crc kubenswrapper[4732]: I0402 13:41:35.797847 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w7tht" podStartSLOduration=174.79780682 podStartE2EDuration="2m54.79780682s" podCreationTimestamp="2026-04-02 13:38:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:41:35.748829986 +0000 UTC m=+252.653237539" watchObservedRunningTime="2026-04-02 13:41:35.79780682 +0000 UTC m=+252.702214393" Apr 02 13:41:35 crc kubenswrapper[4732]: I0402 13:41:35.808356 4732 ???:1] "http: TLS handshake error from 192.168.126.11:36764: no serving certificate available for the kubelet" Apr 02 13:41:35 crc kubenswrapper[4732]: I0402 13:41:35.812059 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhxz4\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" Apr 02 13:41:35 crc kubenswrapper[4732]: E0402 13:41:35.812454 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-04-02 13:41:36.312440357 +0000 UTC m=+253.216847920 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhxz4" (UID: "e2570535-673c-495c-a5aa-392f14ceebb1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:35 crc kubenswrapper[4732]: I0402 13:41:35.839726 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-cbp4s" podStartSLOduration=7.839695538 podStartE2EDuration="7.839695538s" podCreationTimestamp="2026-04-02 13:41:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:41:35.798998682 +0000 UTC m=+252.703406235" watchObservedRunningTime="2026-04-02 13:41:35.839695538 +0000 UTC m=+252.744103091" Apr 02 13:41:35 crc kubenswrapper[4732]: I0402 13:41:35.883511 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-dq9x9" podStartSLOduration=175.883494315 podStartE2EDuration="2m55.883494315s" podCreationTimestamp="2026-04-02 13:38:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:41:35.842734148 +0000 UTC m=+252.747141701" watchObservedRunningTime="2026-04-02 13:41:35.883494315 +0000 UTC m=+252.787901868" Apr 02 13:41:35 crc kubenswrapper[4732]: I0402 13:41:35.884194 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-wbcwp" podStartSLOduration=7.884187314 podStartE2EDuration="7.884187314s" podCreationTimestamp="2026-04-02 
13:41:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:41:35.882695324 +0000 UTC m=+252.787102887" watchObservedRunningTime="2026-04-02 13:41:35.884187314 +0000 UTC m=+252.788594867" Apr 02 13:41:35 crc kubenswrapper[4732]: I0402 13:41:35.914161 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 02 13:41:35 crc kubenswrapper[4732]: E0402 13:41:35.914525 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-02 13:41:36.414491895 +0000 UTC m=+253.318899448 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:35 crc kubenswrapper[4732]: I0402 13:41:35.964689 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-t9dlm" podStartSLOduration=175.964671811 podStartE2EDuration="2m55.964671811s" podCreationTimestamp="2026-04-02 13:38:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:41:35.963211123 +0000 UTC m=+252.867618676" watchObservedRunningTime="2026-04-02 13:41:35.964671811 +0000 UTC m=+252.869079354" Apr 02 13:41:36 crc kubenswrapper[4732]: I0402 13:41:36.033234 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhxz4\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" Apr 02 13:41:36 crc kubenswrapper[4732]: E0402 13:41:36.033627 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-02 13:41:36.533600503 +0000 UTC m=+253.438008056 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhxz4" (UID: "e2570535-673c-495c-a5aa-392f14ceebb1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:36 crc kubenswrapper[4732]: I0402 13:41:36.130531 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vmngg" Apr 02 13:41:36 crc kubenswrapper[4732]: I0402 13:41:36.134663 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 02 13:41:36 crc kubenswrapper[4732]: E0402 13:41:36.135025 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-02 13:41:36.635006674 +0000 UTC m=+253.539414237 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:36 crc kubenswrapper[4732]: I0402 13:41:36.171433 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-brl6n" podStartSLOduration=175.171415837 podStartE2EDuration="2m55.171415837s" podCreationTimestamp="2026-04-02 13:38:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:41:36.080860733 +0000 UTC m=+252.985268296" watchObservedRunningTime="2026-04-02 13:41:36.171415837 +0000 UTC m=+253.075823390" Apr 02 13:41:36 crc kubenswrapper[4732]: I0402 13:41:36.239187 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhxz4\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" Apr 02 13:41:36 crc kubenswrapper[4732]: E0402 13:41:36.239552 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-02 13:41:36.739541848 +0000 UTC m=+253.643949401 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhxz4" (UID: "e2570535-673c-495c-a5aa-392f14ceebb1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:36 crc kubenswrapper[4732]: I0402 13:41:36.311936 4732 patch_prober.go:28] interesting pod/router-default-5444994796-6qzc4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Apr 02 13:41:36 crc kubenswrapper[4732]: [-]has-synced failed: reason withheld Apr 02 13:41:36 crc kubenswrapper[4732]: [+]process-running ok Apr 02 13:41:36 crc kubenswrapper[4732]: healthz check failed Apr 02 13:41:36 crc kubenswrapper[4732]: I0402 13:41:36.312249 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6qzc4" podUID="d8ff2a93-ff6b-4ef8-9109-de3c22a6f108" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 02 13:41:36 crc kubenswrapper[4732]: I0402 13:41:36.341058 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 02 13:41:36 crc kubenswrapper[4732]: E0402 13:41:36.341379 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-04-02 13:41:36.841364459 +0000 UTC m=+253.745772012 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:36 crc kubenswrapper[4732]: I0402 13:41:36.357975 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-8qffd" event={"ID":"ff413ea9-0a19-4fc0-8067-9521bc9e472c","Type":"ContainerStarted","Data":"11fc725eae5fd963b0efb95809e51ab4792a371dacaa69dccbd022ebe281e115"} Apr 02 13:41:36 crc kubenswrapper[4732]: I0402 13:41:36.380318 4732 generic.go:334] "Generic (PLEG): container finished" podID="a1a685af-f17e-4937-a7d6-edca2c96e842" containerID="7249bf341fb84c8f31b0d5dd5757fc386bd703008cb8dc1973574abf17796856" exitCode=0 Apr 02 13:41:36 crc kubenswrapper[4732]: I0402 13:41:36.380401 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2z2d" event={"ID":"a1a685af-f17e-4937-a7d6-edca2c96e842","Type":"ContainerDied","Data":"7249bf341fb84c8f31b0d5dd5757fc386bd703008cb8dc1973574abf17796856"} Apr 02 13:41:36 crc kubenswrapper[4732]: I0402 13:41:36.380425 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2z2d" event={"ID":"a1a685af-f17e-4937-a7d6-edca2c96e842","Type":"ContainerStarted","Data":"c15a8d655df1b63cec38fa2fa9fbc235cd8f9aff949a4d03b3219fc85d138c75"} Apr 02 13:41:36 crc kubenswrapper[4732]: I0402 13:41:36.387802 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wbcwp" 
event={"ID":"1a7ade12-5bbc-4ece-a3b6-b85a31c36f16","Type":"ContainerStarted","Data":"71b8802e05d1e61983f2b8267395cc9497d1bf5aaa3d551cae05e72b8331b1a8"} Apr 02 13:41:36 crc kubenswrapper[4732]: I0402 13:41:36.401593 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8dxlz" event={"ID":"8f520dcd-7f31-4d5b-bd4c-369a0ede6d70","Type":"ContainerStarted","Data":"0202cabbee8ec68f6c7d45b2f603b4fbd0eb98d1b08c71444b4109d633499274"} Apr 02 13:41:36 crc kubenswrapper[4732]: I0402 13:41:36.401851 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8dxlz" Apr 02 13:41:36 crc kubenswrapper[4732]: I0402 13:41:36.408690 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" event={"ID":"ea530987-3884-4994-9574-b73fc76fcdde","Type":"ContainerStarted","Data":"97d6c3ee10c0b8caf6d9b9655805d29e3f2e2a039f4b7e5e1c23bc1c3e534800"} Apr 02 13:41:36 crc kubenswrapper[4732]: I0402 13:41:36.410807 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-k2sdm" event={"ID":"9b3f7b50-e118-4aca-986d-0f52e772edc3","Type":"ContainerStarted","Data":"d435288a9ed7e93ef31d76744552269c1ff294d314b2c520c47bb96a51e14d80"} Apr 02 13:41:36 crc kubenswrapper[4732]: I0402 13:41:36.414011 4732 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-s6sm2 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/healthz\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Apr 02 13:41:36 crc kubenswrapper[4732]: I0402 13:41:36.414045 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-s6sm2" podUID="99e5508c-0d75-4f87-9c07-b53509e461aa" containerName="marketplace-operator" 
probeResult="failure" output="Get \"http://10.217.0.22:8080/healthz\": dial tcp 10.217.0.22:8080: connect: connection refused" Apr 02 13:41:36 crc kubenswrapper[4732]: I0402 13:41:36.415593 4732 patch_prober.go:28] interesting pod/downloads-7954f5f757-zmhn7 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.39:8080/\": dial tcp 10.217.0.39:8080: connect: connection refused" start-of-body= Apr 02 13:41:36 crc kubenswrapper[4732]: I0402 13:41:36.415699 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zmhn7" podUID="4e9d3578-0893-4852-80b6-999e5a7ccdc5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.39:8080/\": dial tcp 10.217.0.39:8080: connect: connection refused" Apr 02 13:41:36 crc kubenswrapper[4732]: I0402 13:41:36.421125 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2z2d" podStartSLOduration=175.421110167 podStartE2EDuration="2m55.421110167s" podCreationTimestamp="2026-04-02 13:38:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:41:36.420110631 +0000 UTC m=+253.324518184" watchObservedRunningTime="2026-04-02 13:41:36.421110167 +0000 UTC m=+253.325517720" Apr 02 13:41:36 crc kubenswrapper[4732]: I0402 13:41:36.445043 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhxz4\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" Apr 02 13:41:36 crc kubenswrapper[4732]: E0402 13:41:36.447479 4732 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-02 13:41:36.947451834 +0000 UTC m=+253.851859387 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhxz4" (UID: "e2570535-673c-495c-a5aa-392f14ceebb1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:36 crc kubenswrapper[4732]: I0402 13:41:36.476434 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8dxlz" podStartSLOduration=175.476420119 podStartE2EDuration="2m55.476420119s" podCreationTimestamp="2026-04-02 13:38:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:41:36.465120411 +0000 UTC m=+253.369527974" watchObservedRunningTime="2026-04-02 13:41:36.476420119 +0000 UTC m=+253.380827672" Apr 02 13:41:36 crc kubenswrapper[4732]: I0402 13:41:36.546302 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 02 13:41:36 crc kubenswrapper[4732]: E0402 13:41:36.548428 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-04-02 13:41:37.048407242 +0000 UTC m=+253.952814795 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:36 crc kubenswrapper[4732]: I0402 13:41:36.649104 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhxz4\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" Apr 02 13:41:36 crc kubenswrapper[4732]: E0402 13:41:36.649471 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-02 13:41:37.149455814 +0000 UTC m=+254.053863367 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhxz4" (UID: "e2570535-673c-495c-a5aa-392f14ceebb1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:36 crc kubenswrapper[4732]: I0402 13:41:36.729205 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4m6rj"] Apr 02 13:41:36 crc kubenswrapper[4732]: I0402 13:41:36.730438 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4m6rj" Apr 02 13:41:36 crc kubenswrapper[4732]: I0402 13:41:36.739138 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Apr 02 13:41:36 crc kubenswrapper[4732]: I0402 13:41:36.750246 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 02 13:41:36 crc kubenswrapper[4732]: E0402 13:41:36.750344 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-02 13:41:37.25032911 +0000 UTC m=+254.154736663 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:36 crc kubenswrapper[4732]: I0402 13:41:36.750526 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhxz4\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" Apr 02 13:41:36 crc kubenswrapper[4732]: E0402 13:41:36.750835 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-02 13:41:37.250823473 +0000 UTC m=+254.155231026 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhxz4" (UID: "e2570535-673c-495c-a5aa-392f14ceebb1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:36 crc kubenswrapper[4732]: I0402 13:41:36.832567 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4m6rj"] Apr 02 13:41:36 crc kubenswrapper[4732]: I0402 13:41:36.854104 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 02 13:41:36 crc kubenswrapper[4732]: E0402 13:41:36.854308 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-02 13:41:37.354281618 +0000 UTC m=+254.258689171 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:36 crc kubenswrapper[4732]: I0402 13:41:36.854422 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhxz4\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" Apr 02 13:41:36 crc kubenswrapper[4732]: I0402 13:41:36.854523 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf030ff0-459d-4453-975f-19ba4ff9641a-catalog-content\") pod \"community-operators-4m6rj\" (UID: \"cf030ff0-459d-4453-975f-19ba4ff9641a\") " pod="openshift-marketplace/community-operators-4m6rj" Apr 02 13:41:36 crc kubenswrapper[4732]: I0402 13:41:36.854634 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4z2m\" (UniqueName: \"kubernetes.io/projected/cf030ff0-459d-4453-975f-19ba4ff9641a-kube-api-access-r4z2m\") pod \"community-operators-4m6rj\" (UID: \"cf030ff0-459d-4453-975f-19ba4ff9641a\") " pod="openshift-marketplace/community-operators-4m6rj" Apr 02 13:41:36 crc kubenswrapper[4732]: I0402 13:41:36.854683 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/cf030ff0-459d-4453-975f-19ba4ff9641a-utilities\") pod \"community-operators-4m6rj\" (UID: \"cf030ff0-459d-4453-975f-19ba4ff9641a\") " pod="openshift-marketplace/community-operators-4m6rj" Apr 02 13:41:36 crc kubenswrapper[4732]: E0402 13:41:36.854691 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-02 13:41:37.354679829 +0000 UTC m=+254.259087382 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhxz4" (UID: "e2570535-673c-495c-a5aa-392f14ceebb1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:36 crc kubenswrapper[4732]: I0402 13:41:36.930402 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qlgxl"] Apr 02 13:41:36 crc kubenswrapper[4732]: I0402 13:41:36.931328 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qlgxl" Apr 02 13:41:36 crc kubenswrapper[4732]: I0402 13:41:36.946564 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Apr 02 13:41:36 crc kubenswrapper[4732]: I0402 13:41:36.956313 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 02 13:41:36 crc kubenswrapper[4732]: E0402 13:41:36.956446 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-02 13:41:37.456426508 +0000 UTC m=+254.360834061 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:36 crc kubenswrapper[4732]: I0402 13:41:36.956492 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf030ff0-459d-4453-975f-19ba4ff9641a-catalog-content\") pod \"community-operators-4m6rj\" (UID: \"cf030ff0-459d-4453-975f-19ba4ff9641a\") " pod="openshift-marketplace/community-operators-4m6rj" Apr 02 13:41:36 crc kubenswrapper[4732]: I0402 13:41:36.956564 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4z2m\" (UniqueName: \"kubernetes.io/projected/cf030ff0-459d-4453-975f-19ba4ff9641a-kube-api-access-r4z2m\") pod \"community-operators-4m6rj\" (UID: \"cf030ff0-459d-4453-975f-19ba4ff9641a\") " pod="openshift-marketplace/community-operators-4m6rj" Apr 02 13:41:36 crc kubenswrapper[4732]: I0402 13:41:36.956587 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf030ff0-459d-4453-975f-19ba4ff9641a-utilities\") pod \"community-operators-4m6rj\" (UID: \"cf030ff0-459d-4453-975f-19ba4ff9641a\") " pod="openshift-marketplace/community-operators-4m6rj" Apr 02 13:41:36 crc kubenswrapper[4732]: I0402 13:41:36.956686 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhxz4\" (UID: 
\"e2570535-673c-495c-a5aa-392f14ceebb1\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" Apr 02 13:41:36 crc kubenswrapper[4732]: E0402 13:41:36.956993 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-02 13:41:37.456982622 +0000 UTC m=+254.361390185 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhxz4" (UID: "e2570535-673c-495c-a5aa-392f14ceebb1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:36 crc kubenswrapper[4732]: I0402 13:41:36.957097 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf030ff0-459d-4453-975f-19ba4ff9641a-catalog-content\") pod \"community-operators-4m6rj\" (UID: \"cf030ff0-459d-4453-975f-19ba4ff9641a\") " pod="openshift-marketplace/community-operators-4m6rj" Apr 02 13:41:36 crc kubenswrapper[4732]: I0402 13:41:36.957135 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf030ff0-459d-4453-975f-19ba4ff9641a-utilities\") pod \"community-operators-4m6rj\" (UID: \"cf030ff0-459d-4453-975f-19ba4ff9641a\") " pod="openshift-marketplace/community-operators-4m6rj" Apr 02 13:41:36 crc kubenswrapper[4732]: I0402 13:41:36.978954 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qlgxl"] Apr 02 13:41:36 crc kubenswrapper[4732]: I0402 13:41:36.985842 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2z2d" Apr 02 13:41:36 crc kubenswrapper[4732]: I0402 13:41:36.986208 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2z2d" Apr 02 13:41:36 crc kubenswrapper[4732]: I0402 13:41:36.993445 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4z2m\" (UniqueName: \"kubernetes.io/projected/cf030ff0-459d-4453-975f-19ba4ff9641a-kube-api-access-r4z2m\") pod \"community-operators-4m6rj\" (UID: \"cf030ff0-459d-4453-975f-19ba4ff9641a\") " pod="openshift-marketplace/community-operators-4m6rj" Apr 02 13:41:37 crc kubenswrapper[4732]: I0402 13:41:37.064847 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4m6rj" Apr 02 13:41:37 crc kubenswrapper[4732]: I0402 13:41:37.071081 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 02 13:41:37 crc kubenswrapper[4732]: I0402 13:41:37.071264 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8zl9\" (UniqueName: \"kubernetes.io/projected/1827909b-49ea-4ba8-9995-f525d1d82f45-kube-api-access-m8zl9\") pod \"certified-operators-qlgxl\" (UID: \"1827909b-49ea-4ba8-9995-f525d1d82f45\") " pod="openshift-marketplace/certified-operators-qlgxl" Apr 02 13:41:37 crc kubenswrapper[4732]: I0402 13:41:37.071338 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1827909b-49ea-4ba8-9995-f525d1d82f45-utilities\") pod \"certified-operators-qlgxl\" (UID: 
\"1827909b-49ea-4ba8-9995-f525d1d82f45\") " pod="openshift-marketplace/certified-operators-qlgxl" Apr 02 13:41:37 crc kubenswrapper[4732]: I0402 13:41:37.071358 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1827909b-49ea-4ba8-9995-f525d1d82f45-catalog-content\") pod \"certified-operators-qlgxl\" (UID: \"1827909b-49ea-4ba8-9995-f525d1d82f45\") " pod="openshift-marketplace/certified-operators-qlgxl" Apr 02 13:41:37 crc kubenswrapper[4732]: E0402 13:41:37.071476 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-02 13:41:37.571460728 +0000 UTC m=+254.475868281 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:37 crc kubenswrapper[4732]: I0402 13:41:37.111766 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8ps5w"] Apr 02 13:41:37 crc kubenswrapper[4732]: I0402 13:41:37.112725 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8ps5w" Apr 02 13:41:37 crc kubenswrapper[4732]: I0402 13:41:37.161515 4732 ???:1] "http: TLS handshake error from 192.168.126.11:48806: no serving certificate available for the kubelet" Apr 02 13:41:37 crc kubenswrapper[4732]: I0402 13:41:37.167104 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8ps5w"] Apr 02 13:41:37 crc kubenswrapper[4732]: I0402 13:41:37.173213 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhxz4\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" Apr 02 13:41:37 crc kubenswrapper[4732]: I0402 13:41:37.173276 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1827909b-49ea-4ba8-9995-f525d1d82f45-utilities\") pod \"certified-operators-qlgxl\" (UID: \"1827909b-49ea-4ba8-9995-f525d1d82f45\") " pod="openshift-marketplace/certified-operators-qlgxl" Apr 02 13:41:37 crc kubenswrapper[4732]: I0402 13:41:37.173295 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1827909b-49ea-4ba8-9995-f525d1d82f45-catalog-content\") pod \"certified-operators-qlgxl\" (UID: \"1827909b-49ea-4ba8-9995-f525d1d82f45\") " pod="openshift-marketplace/certified-operators-qlgxl" Apr 02 13:41:37 crc kubenswrapper[4732]: I0402 13:41:37.173365 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8zl9\" (UniqueName: \"kubernetes.io/projected/1827909b-49ea-4ba8-9995-f525d1d82f45-kube-api-access-m8zl9\") pod \"certified-operators-qlgxl\" (UID: 
\"1827909b-49ea-4ba8-9995-f525d1d82f45\") " pod="openshift-marketplace/certified-operators-qlgxl" Apr 02 13:41:37 crc kubenswrapper[4732]: E0402 13:41:37.173899 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-02 13:41:37.673888796 +0000 UTC m=+254.578296339 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhxz4" (UID: "e2570535-673c-495c-a5aa-392f14ceebb1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:37 crc kubenswrapper[4732]: I0402 13:41:37.174840 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1827909b-49ea-4ba8-9995-f525d1d82f45-utilities\") pod \"certified-operators-qlgxl\" (UID: \"1827909b-49ea-4ba8-9995-f525d1d82f45\") " pod="openshift-marketplace/certified-operators-qlgxl" Apr 02 13:41:37 crc kubenswrapper[4732]: I0402 13:41:37.175071 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1827909b-49ea-4ba8-9995-f525d1d82f45-catalog-content\") pod \"certified-operators-qlgxl\" (UID: \"1827909b-49ea-4ba8-9995-f525d1d82f45\") " pod="openshift-marketplace/certified-operators-qlgxl" Apr 02 13:41:37 crc kubenswrapper[4732]: I0402 13:41:37.209573 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8zl9\" (UniqueName: \"kubernetes.io/projected/1827909b-49ea-4ba8-9995-f525d1d82f45-kube-api-access-m8zl9\") pod \"certified-operators-qlgxl\" (UID: 
\"1827909b-49ea-4ba8-9995-f525d1d82f45\") " pod="openshift-marketplace/certified-operators-qlgxl" Apr 02 13:41:37 crc kubenswrapper[4732]: I0402 13:41:37.247918 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qlgxl" Apr 02 13:41:37 crc kubenswrapper[4732]: I0402 13:41:37.277596 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 02 13:41:37 crc kubenswrapper[4732]: I0402 13:41:37.277911 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9058e533-24e2-44f1-8631-dd9bf6a37192-utilities\") pod \"community-operators-8ps5w\" (UID: \"9058e533-24e2-44f1-8631-dd9bf6a37192\") " pod="openshift-marketplace/community-operators-8ps5w" Apr 02 13:41:37 crc kubenswrapper[4732]: I0402 13:41:37.277949 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-729nf\" (UniqueName: \"kubernetes.io/projected/9058e533-24e2-44f1-8631-dd9bf6a37192-kube-api-access-729nf\") pod \"community-operators-8ps5w\" (UID: \"9058e533-24e2-44f1-8631-dd9bf6a37192\") " pod="openshift-marketplace/community-operators-8ps5w" Apr 02 13:41:37 crc kubenswrapper[4732]: I0402 13:41:37.278021 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9058e533-24e2-44f1-8631-dd9bf6a37192-catalog-content\") pod \"community-operators-8ps5w\" (UID: \"9058e533-24e2-44f1-8631-dd9bf6a37192\") " pod="openshift-marketplace/community-operators-8ps5w" Apr 02 13:41:37 crc kubenswrapper[4732]: E0402 13:41:37.278108 
4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-02 13:41:37.778095031 +0000 UTC m=+254.682502584 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:37 crc kubenswrapper[4732]: I0402 13:41:37.293549 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fd4xj"] Apr 02 13:41:37 crc kubenswrapper[4732]: I0402 13:41:37.294415 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fd4xj" Apr 02 13:41:37 crc kubenswrapper[4732]: I0402 13:41:37.312963 4732 patch_prober.go:28] interesting pod/router-default-5444994796-6qzc4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Apr 02 13:41:37 crc kubenswrapper[4732]: [-]has-synced failed: reason withheld Apr 02 13:41:37 crc kubenswrapper[4732]: [+]process-running ok Apr 02 13:41:37 crc kubenswrapper[4732]: healthz check failed Apr 02 13:41:37 crc kubenswrapper[4732]: I0402 13:41:37.313010 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6qzc4" podUID="d8ff2a93-ff6b-4ef8-9109-de3c22a6f108" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 02 13:41:37 crc kubenswrapper[4732]: I0402 13:41:37.324455 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fd4xj"] Apr 02 13:41:37 crc kubenswrapper[4732]: I0402 13:41:37.372531 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-54khq"] Apr 02 13:41:37 crc kubenswrapper[4732]: I0402 13:41:37.379897 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-54khq" podUID="16ec121b-fdf2-452d-8963-08d6132f7c5c" containerName="controller-manager" containerID="cri-o://3989604763ea02ad03b6f003c903eb3ed34bff775fbc928643ea3514f5743a30" gracePeriod=30 Apr 02 13:41:37 crc kubenswrapper[4732]: I0402 13:41:37.380654 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9058e533-24e2-44f1-8631-dd9bf6a37192-catalog-content\") pod \"community-operators-8ps5w\" (UID: 
\"9058e533-24e2-44f1-8631-dd9bf6a37192\") " pod="openshift-marketplace/community-operators-8ps5w" Apr 02 13:41:37 crc kubenswrapper[4732]: I0402 13:41:37.380709 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9058e533-24e2-44f1-8631-dd9bf6a37192-utilities\") pod \"community-operators-8ps5w\" (UID: \"9058e533-24e2-44f1-8631-dd9bf6a37192\") " pod="openshift-marketplace/community-operators-8ps5w" Apr 02 13:41:37 crc kubenswrapper[4732]: I0402 13:41:37.380741 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ac64cdf-a607-481a-9907-e6e72fc8b083-utilities\") pod \"certified-operators-fd4xj\" (UID: \"6ac64cdf-a607-481a-9907-e6e72fc8b083\") " pod="openshift-marketplace/certified-operators-fd4xj" Apr 02 13:41:37 crc kubenswrapper[4732]: I0402 13:41:37.380757 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ac64cdf-a607-481a-9907-e6e72fc8b083-catalog-content\") pod \"certified-operators-fd4xj\" (UID: \"6ac64cdf-a607-481a-9907-e6e72fc8b083\") " pod="openshift-marketplace/certified-operators-fd4xj" Apr 02 13:41:37 crc kubenswrapper[4732]: I0402 13:41:37.380777 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-729nf\" (UniqueName: \"kubernetes.io/projected/9058e533-24e2-44f1-8631-dd9bf6a37192-kube-api-access-729nf\") pod \"community-operators-8ps5w\" (UID: \"9058e533-24e2-44f1-8631-dd9bf6a37192\") " pod="openshift-marketplace/community-operators-8ps5w" Apr 02 13:41:37 crc kubenswrapper[4732]: I0402 13:41:37.380801 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhxz4\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" Apr 02 13:41:37 crc kubenswrapper[4732]: I0402 13:41:37.380838 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnp2h\" (UniqueName: \"kubernetes.io/projected/6ac64cdf-a607-481a-9907-e6e72fc8b083-kube-api-access-pnp2h\") pod \"certified-operators-fd4xj\" (UID: \"6ac64cdf-a607-481a-9907-e6e72fc8b083\") " pod="openshift-marketplace/certified-operators-fd4xj" Apr 02 13:41:37 crc kubenswrapper[4732]: I0402 13:41:37.381189 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9058e533-24e2-44f1-8631-dd9bf6a37192-catalog-content\") pod \"community-operators-8ps5w\" (UID: \"9058e533-24e2-44f1-8631-dd9bf6a37192\") " pod="openshift-marketplace/community-operators-8ps5w" Apr 02 13:41:37 crc kubenswrapper[4732]: I0402 13:41:37.381390 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9058e533-24e2-44f1-8631-dd9bf6a37192-utilities\") pod \"community-operators-8ps5w\" (UID: \"9058e533-24e2-44f1-8631-dd9bf6a37192\") " pod="openshift-marketplace/community-operators-8ps5w" Apr 02 13:41:37 crc kubenswrapper[4732]: E0402 13:41:37.381772 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-02 13:41:37.881752651 +0000 UTC m=+254.786160264 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhxz4" (UID: "e2570535-673c-495c-a5aa-392f14ceebb1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:37 crc kubenswrapper[4732]: I0402 13:41:37.440870 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-jbmtp"] Apr 02 13:41:37 crc kubenswrapper[4732]: I0402 13:41:37.441112 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jbmtp" podUID="2c662639-32a4-4f78-af37-9b1e65bab4e8" containerName="route-controller-manager" containerID="cri-o://09340322997b1875ead4db97dff94bfa5238c8d8eedab284b0d9b75216685e5c" gracePeriod=30 Apr 02 13:41:37 crc kubenswrapper[4732]: I0402 13:41:37.461513 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-729nf\" (UniqueName: \"kubernetes.io/projected/9058e533-24e2-44f1-8631-dd9bf6a37192-kube-api-access-729nf\") pod \"community-operators-8ps5w\" (UID: \"9058e533-24e2-44f1-8631-dd9bf6a37192\") " pod="openshift-marketplace/community-operators-8ps5w" Apr 02 13:41:37 crc kubenswrapper[4732]: I0402 13:41:37.482730 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 02 13:41:37 crc kubenswrapper[4732]: I0402 13:41:37.482944 4732 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ac64cdf-a607-481a-9907-e6e72fc8b083-utilities\") pod \"certified-operators-fd4xj\" (UID: \"6ac64cdf-a607-481a-9907-e6e72fc8b083\") " pod="openshift-marketplace/certified-operators-fd4xj" Apr 02 13:41:37 crc kubenswrapper[4732]: I0402 13:41:37.482965 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ac64cdf-a607-481a-9907-e6e72fc8b083-catalog-content\") pod \"certified-operators-fd4xj\" (UID: \"6ac64cdf-a607-481a-9907-e6e72fc8b083\") " pod="openshift-marketplace/certified-operators-fd4xj" Apr 02 13:41:37 crc kubenswrapper[4732]: I0402 13:41:37.483036 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnp2h\" (UniqueName: \"kubernetes.io/projected/6ac64cdf-a607-481a-9907-e6e72fc8b083-kube-api-access-pnp2h\") pod \"certified-operators-fd4xj\" (UID: \"6ac64cdf-a607-481a-9907-e6e72fc8b083\") " pod="openshift-marketplace/certified-operators-fd4xj" Apr 02 13:41:37 crc kubenswrapper[4732]: E0402 13:41:37.483220 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-02 13:41:37.983198803 +0000 UTC m=+254.887606356 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:37 crc kubenswrapper[4732]: I0402 13:41:37.483631 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ac64cdf-a607-481a-9907-e6e72fc8b083-utilities\") pod \"certified-operators-fd4xj\" (UID: \"6ac64cdf-a607-481a-9907-e6e72fc8b083\") " pod="openshift-marketplace/certified-operators-fd4xj" Apr 02 13:41:37 crc kubenswrapper[4732]: I0402 13:41:37.483698 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ac64cdf-a607-481a-9907-e6e72fc8b083-catalog-content\") pod \"certified-operators-fd4xj\" (UID: \"6ac64cdf-a607-481a-9907-e6e72fc8b083\") " pod="openshift-marketplace/certified-operators-fd4xj" Apr 02 13:41:37 crc kubenswrapper[4732]: I0402 13:41:37.516483 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" event={"ID":"ea530987-3884-4994-9574-b73fc76fcdde","Type":"ContainerStarted","Data":"ccff9c2b817177ce6f647ac40187504767ebaef3aa02d263c169e6f13c255e1c"} Apr 02 13:41:37 crc kubenswrapper[4732]: I0402 13:41:37.530573 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnp2h\" (UniqueName: \"kubernetes.io/projected/6ac64cdf-a607-481a-9907-e6e72fc8b083-kube-api-access-pnp2h\") pod \"certified-operators-fd4xj\" (UID: \"6ac64cdf-a607-481a-9907-e6e72fc8b083\") " pod="openshift-marketplace/certified-operators-fd4xj" Apr 02 13:41:37 crc kubenswrapper[4732]: I0402 
13:41:37.547462 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-k2sdm" event={"ID":"9b3f7b50-e118-4aca-986d-0f52e772edc3","Type":"ContainerStarted","Data":"f08865dfb8bbf55b158e21e0d1b9dfa253195446665dbf01bb2eb818fc0b8e94"} Apr 02 13:41:37 crc kubenswrapper[4732]: I0402 13:41:37.591425 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhxz4\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" Apr 02 13:41:37 crc kubenswrapper[4732]: E0402 13:41:37.592597 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-02 13:41:38.092582484 +0000 UTC m=+254.996990037 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhxz4" (UID: "e2570535-673c-495c-a5aa-392f14ceebb1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:37 crc kubenswrapper[4732]: I0402 13:41:37.603375 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" podStartSLOduration=177.603352629 podStartE2EDuration="2m57.603352629s" podCreationTimestamp="2026-04-02 13:38:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:41:37.572927655 +0000 UTC m=+254.477335228" watchObservedRunningTime="2026-04-02 13:41:37.603352629 +0000 UTC m=+254.507760182" Apr 02 13:41:37 crc kubenswrapper[4732]: I0402 13:41:37.607552 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4m6rj"] Apr 02 13:41:37 crc kubenswrapper[4732]: I0402 13:41:37.638831 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fd4xj" Apr 02 13:41:37 crc kubenswrapper[4732]: I0402 13:41:37.698735 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 02 13:41:37 crc kubenswrapper[4732]: E0402 13:41:37.703252 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-02 13:41:38.203229699 +0000 UTC m=+255.107637262 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:37 crc kubenswrapper[4732]: I0402 13:41:37.703773 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhxz4\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" Apr 02 13:41:37 crc kubenswrapper[4732]: E0402 13:41:37.705375 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2026-04-02 13:41:38.205362176 +0000 UTC m=+255.109769829 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhxz4" (UID: "e2570535-673c-495c-a5aa-392f14ceebb1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:37 crc kubenswrapper[4732]: I0402 13:41:37.755807 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8ps5w" Apr 02 13:41:37 crc kubenswrapper[4732]: I0402 13:41:37.805223 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 02 13:41:37 crc kubenswrapper[4732]: E0402 13:41:37.805554 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-02 13:41:38.305540024 +0000 UTC m=+255.209947577 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:37 crc kubenswrapper[4732]: I0402 13:41:37.906550 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhxz4\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" Apr 02 13:41:37 crc kubenswrapper[4732]: E0402 13:41:37.907168 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-02 13:41:38.40715565 +0000 UTC m=+255.311563203 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhxz4" (UID: "e2570535-673c-495c-a5aa-392f14ceebb1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:37 crc kubenswrapper[4732]: I0402 13:41:37.943010 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2z2d" Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.005156 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qlgxl"] Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.009686 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 02 13:41:38 crc kubenswrapper[4732]: E0402 13:41:38.010046 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-02 13:41:38.51003054 +0000 UTC m=+255.414438093 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.111604 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhxz4\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" Apr 02 13:41:38 crc kubenswrapper[4732]: E0402 13:41:38.113738 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-02 13:41:38.613715781 +0000 UTC m=+255.518123444 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhxz4" (UID: "e2570535-673c-495c-a5aa-392f14ceebb1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.128815 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-54khq" Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.158688 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jbmtp" Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.212677 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c662639-32a4-4f78-af37-9b1e65bab4e8-serving-cert\") pod \"2c662639-32a4-4f78-af37-9b1e65bab4e8\" (UID: \"2c662639-32a4-4f78-af37-9b1e65bab4e8\") " Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.212763 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8btn6\" (UniqueName: \"kubernetes.io/projected/16ec121b-fdf2-452d-8963-08d6132f7c5c-kube-api-access-8btn6\") pod \"16ec121b-fdf2-452d-8963-08d6132f7c5c\" (UID: \"16ec121b-fdf2-452d-8963-08d6132f7c5c\") " Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.212857 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.212886 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2c662639-32a4-4f78-af37-9b1e65bab4e8-client-ca\") pod \"2c662639-32a4-4f78-af37-9b1e65bab4e8\" (UID: \"2c662639-32a4-4f78-af37-9b1e65bab4e8\") " Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.212912 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/16ec121b-fdf2-452d-8963-08d6132f7c5c-serving-cert\") pod \"16ec121b-fdf2-452d-8963-08d6132f7c5c\" (UID: \"16ec121b-fdf2-452d-8963-08d6132f7c5c\") " Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.212942 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16ec121b-fdf2-452d-8963-08d6132f7c5c-config\") pod \"16ec121b-fdf2-452d-8963-08d6132f7c5c\" (UID: \"16ec121b-fdf2-452d-8963-08d6132f7c5c\") " Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.212982 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c662639-32a4-4f78-af37-9b1e65bab4e8-config\") pod \"2c662639-32a4-4f78-af37-9b1e65bab4e8\" (UID: \"2c662639-32a4-4f78-af37-9b1e65bab4e8\") " Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.213018 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78ttp\" (UniqueName: \"kubernetes.io/projected/2c662639-32a4-4f78-af37-9b1e65bab4e8-kube-api-access-78ttp\") pod \"2c662639-32a4-4f78-af37-9b1e65bab4e8\" (UID: \"2c662639-32a4-4f78-af37-9b1e65bab4e8\") " Apr 02 13:41:38 crc kubenswrapper[4732]: E0402 13:41:38.213044 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-02 13:41:38.713020116 +0000 UTC m=+255.617427669 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.213087 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/16ec121b-fdf2-452d-8963-08d6132f7c5c-client-ca\") pod \"16ec121b-fdf2-452d-8963-08d6132f7c5c\" (UID: \"16ec121b-fdf2-452d-8963-08d6132f7c5c\") " Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.213122 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/16ec121b-fdf2-452d-8963-08d6132f7c5c-proxy-ca-bundles\") pod \"16ec121b-fdf2-452d-8963-08d6132f7c5c\" (UID: \"16ec121b-fdf2-452d-8963-08d6132f7c5c\") " Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.213427 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhxz4\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" Apr 02 13:41:38 crc kubenswrapper[4732]: E0402 13:41:38.213753 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-02 13:41:38.713743785 +0000 UTC m=+255.618151338 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhxz4" (UID: "e2570535-673c-495c-a5aa-392f14ceebb1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.214701 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16ec121b-fdf2-452d-8963-08d6132f7c5c-client-ca" (OuterVolumeSpecName: "client-ca") pod "16ec121b-fdf2-452d-8963-08d6132f7c5c" (UID: "16ec121b-fdf2-452d-8963-08d6132f7c5c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.215094 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16ec121b-fdf2-452d-8963-08d6132f7c5c-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "16ec121b-fdf2-452d-8963-08d6132f7c5c" (UID: "16ec121b-fdf2-452d-8963-08d6132f7c5c"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.217275 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c662639-32a4-4f78-af37-9b1e65bab4e8-config" (OuterVolumeSpecName: "config") pod "2c662639-32a4-4f78-af37-9b1e65bab4e8" (UID: "2c662639-32a4-4f78-af37-9b1e65bab4e8"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.217308 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c662639-32a4-4f78-af37-9b1e65bab4e8-client-ca" (OuterVolumeSpecName: "client-ca") pod "2c662639-32a4-4f78-af37-9b1e65bab4e8" (UID: "2c662639-32a4-4f78-af37-9b1e65bab4e8"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.217926 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16ec121b-fdf2-452d-8963-08d6132f7c5c-config" (OuterVolumeSpecName: "config") pod "16ec121b-fdf2-452d-8963-08d6132f7c5c" (UID: "16ec121b-fdf2-452d-8963-08d6132f7c5c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.218970 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c662639-32a4-4f78-af37-9b1e65bab4e8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2c662639-32a4-4f78-af37-9b1e65bab4e8" (UID: "2c662639-32a4-4f78-af37-9b1e65bab4e8"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.219022 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16ec121b-fdf2-452d-8963-08d6132f7c5c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "16ec121b-fdf2-452d-8963-08d6132f7c5c" (UID: "16ec121b-fdf2-452d-8963-08d6132f7c5c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.234505 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c662639-32a4-4f78-af37-9b1e65bab4e8-kube-api-access-78ttp" (OuterVolumeSpecName: "kube-api-access-78ttp") pod "2c662639-32a4-4f78-af37-9b1e65bab4e8" (UID: "2c662639-32a4-4f78-af37-9b1e65bab4e8"). InnerVolumeSpecName "kube-api-access-78ttp". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.235899 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16ec121b-fdf2-452d-8963-08d6132f7c5c-kube-api-access-8btn6" (OuterVolumeSpecName: "kube-api-access-8btn6") pod "16ec121b-fdf2-452d-8963-08d6132f7c5c" (UID: "16ec121b-fdf2-452d-8963-08d6132f7c5c"). InnerVolumeSpecName "kube-api-access-8btn6". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.307600 4732 patch_prober.go:28] interesting pod/router-default-5444994796-6qzc4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Apr 02 13:41:38 crc kubenswrapper[4732]: [-]has-synced failed: reason withheld Apr 02 13:41:38 crc kubenswrapper[4732]: [+]process-running ok Apr 02 13:41:38 crc kubenswrapper[4732]: healthz check failed Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.307707 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6qzc4" podUID="d8ff2a93-ff6b-4ef8-9109-de3c22a6f108" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.314246 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 02 13:41:38 crc kubenswrapper[4732]: E0402 13:41:38.314347 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-02 13:41:38.814328904 +0000 UTC m=+255.718736457 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.314399 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhxz4\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.314484 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78ttp\" (UniqueName: \"kubernetes.io/projected/2c662639-32a4-4f78-af37-9b1e65bab4e8-kube-api-access-78ttp\") on node \"crc\" DevicePath \"\"" Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.314495 4732 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/16ec121b-fdf2-452d-8963-08d6132f7c5c-client-ca\") on node \"crc\" DevicePath \"\"" Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.314857 4732 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/16ec121b-fdf2-452d-8963-08d6132f7c5c-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.314879 4732 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c662639-32a4-4f78-af37-9b1e65bab4e8-serving-cert\") on node \"crc\" DevicePath \"\"" Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.314888 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8btn6\" (UniqueName: \"kubernetes.io/projected/16ec121b-fdf2-452d-8963-08d6132f7c5c-kube-api-access-8btn6\") on node \"crc\" DevicePath \"\"" Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.314896 4732 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2c662639-32a4-4f78-af37-9b1e65bab4e8-client-ca\") on node \"crc\" DevicePath \"\"" Apr 02 13:41:38 crc kubenswrapper[4732]: E0402 13:41:38.314914 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-02 13:41:38.814896779 +0000 UTC m=+255.719304332 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhxz4" (UID: "e2570535-673c-495c-a5aa-392f14ceebb1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.314952 4732 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16ec121b-fdf2-452d-8963-08d6132f7c5c-serving-cert\") on node \"crc\" DevicePath \"\"" Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.314964 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16ec121b-fdf2-452d-8963-08d6132f7c5c-config\") on node \"crc\" DevicePath \"\"" Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.314974 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c662639-32a4-4f78-af37-9b1e65bab4e8-config\") on node \"crc\" DevicePath \"\"" Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.356668 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-brl6n"] Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.356936 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-service-ca/service-ca-9c57cc56f-brl6n" podUID="5cb3c06a-e3cf-4a60-b180-82759b9d55fc" containerName="service-ca-controller" containerID="cri-o://02eaba1c5190d6d2ba1493c6ccd7cd1055d8a32472c005b0b4c1674c96732f1f" gracePeriod=30 Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.377876 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8ps5w"] Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 
13:41:38.416322 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Apr 02 13:41:38 crc kubenswrapper[4732]: E0402 13:41:38.416742 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-02 13:41:38.916720761 +0000 UTC m=+255.821128314 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.418209 4732 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.443350 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fd4xj"]
Apr 02 13:41:38 crc kubenswrapper[4732]: W0402 13:41:38.450137 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9058e533_24e2_44f1_8631_dd9bf6a37192.slice/crio-78737cb551255129fbb3203f7982eb1b28e6b2d57c02a9c97fd94c120fc1cf4f WatchSource:0}: Error finding container 78737cb551255129fbb3203f7982eb1b28e6b2d57c02a9c97fd94c120fc1cf4f: Status 404 returned error can't find the container with id 78737cb551255129fbb3203f7982eb1b28e6b2d57c02a9c97fd94c120fc1cf4f
Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.518189 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhxz4\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4"
Apr 02 13:41:38 crc kubenswrapper[4732]: E0402 13:41:38.518595 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-02 13:41:39.018584063 +0000 UTC m=+255.922991616 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhxz4" (UID: "e2570535-673c-495c-a5aa-392f14ceebb1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.554726 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8ps5w" event={"ID":"9058e533-24e2-44f1-8631-dd9bf6a37192","Type":"ContainerStarted","Data":"78737cb551255129fbb3203f7982eb1b28e6b2d57c02a9c97fd94c120fc1cf4f"}
Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.556546 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fd4xj" event={"ID":"6ac64cdf-a607-481a-9907-e6e72fc8b083","Type":"ContainerStarted","Data":"69eab28f470ca7f2b836a4317fe17162e04269abf8b52aec33aaa3bef380b973"}
Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.559056 4732 generic.go:334] "Generic (PLEG): container finished" podID="2c662639-32a4-4f78-af37-9b1e65bab4e8" containerID="09340322997b1875ead4db97dff94bfa5238c8d8eedab284b0d9b75216685e5c" exitCode=0
Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.559121 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jbmtp" event={"ID":"2c662639-32a4-4f78-af37-9b1e65bab4e8","Type":"ContainerDied","Data":"09340322997b1875ead4db97dff94bfa5238c8d8eedab284b0d9b75216685e5c"}
Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.559146 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jbmtp" event={"ID":"2c662639-32a4-4f78-af37-9b1e65bab4e8","Type":"ContainerDied","Data":"e10769e8ff494320d92a409f25730369c48a152f098c3c74207c6950e2854368"}
Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.559165 4732 scope.go:117] "RemoveContainer" containerID="09340322997b1875ead4db97dff94bfa5238c8d8eedab284b0d9b75216685e5c"
Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.559327 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jbmtp"
Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.567086 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qlgxl" event={"ID":"1827909b-49ea-4ba8-9995-f525d1d82f45","Type":"ContainerStarted","Data":"da97d71055b4b784da56de209f467f22027d238d2a64236a68128d6712f04476"}
Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.569178 4732 generic.go:334] "Generic (PLEG): container finished" podID="cf030ff0-459d-4453-975f-19ba4ff9641a" containerID="898bc99e31c66d55335c3093239515a3af51a99013591088cd8ecde3d9445f69" exitCode=0
Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.569220 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4m6rj" event={"ID":"cf030ff0-459d-4453-975f-19ba4ff9641a","Type":"ContainerDied","Data":"898bc99e31c66d55335c3093239515a3af51a99013591088cd8ecde3d9445f69"}
Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.569235 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4m6rj" event={"ID":"cf030ff0-459d-4453-975f-19ba4ff9641a","Type":"ContainerStarted","Data":"8e710b90b254ff538be0e156b50bf321fb22653a2383e4ccd08086e5cab87427"}
Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.572182 4732 generic.go:334] "Generic (PLEG): container finished" podID="16ec121b-fdf2-452d-8963-08d6132f7c5c" containerID="3989604763ea02ad03b6f003c903eb3ed34bff775fbc928643ea3514f5743a30" exitCode=0
Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.572850 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-54khq"
Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.572842 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-54khq" event={"ID":"16ec121b-fdf2-452d-8963-08d6132f7c5c","Type":"ContainerDied","Data":"3989604763ea02ad03b6f003c903eb3ed34bff775fbc928643ea3514f5743a30"}
Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.572907 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-54khq" event={"ID":"16ec121b-fdf2-452d-8963-08d6132f7c5c","Type":"ContainerDied","Data":"ae54703e3a6fed5d4bcd093b60943f903041de0c2e8f7e509ce77149953c3ed9"}
Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.579245 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2z2d"
Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.611741 4732 scope.go:117] "RemoveContainer" containerID="09340322997b1875ead4db97dff94bfa5238c8d8eedab284b0d9b75216685e5c"
Apr 02 13:41:38 crc kubenswrapper[4732]: E0402 13:41:38.612422 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09340322997b1875ead4db97dff94bfa5238c8d8eedab284b0d9b75216685e5c\": container with ID starting with 09340322997b1875ead4db97dff94bfa5238c8d8eedab284b0d9b75216685e5c not found: ID does not exist" containerID="09340322997b1875ead4db97dff94bfa5238c8d8eedab284b0d9b75216685e5c"
Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.612451 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09340322997b1875ead4db97dff94bfa5238c8d8eedab284b0d9b75216685e5c"} err="failed to get container status \"09340322997b1875ead4db97dff94bfa5238c8d8eedab284b0d9b75216685e5c\": rpc error: code = NotFound desc = could not find container \"09340322997b1875ead4db97dff94bfa5238c8d8eedab284b0d9b75216685e5c\": container with ID starting with 09340322997b1875ead4db97dff94bfa5238c8d8eedab284b0d9b75216685e5c not found: ID does not exist"
Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.612470 4732 scope.go:117] "RemoveContainer" containerID="3989604763ea02ad03b6f003c903eb3ed34bff775fbc928643ea3514f5743a30"
Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.618851 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Apr 02 13:41:38 crc kubenswrapper[4732]: E0402 13:41:38.620039 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-02 13:41:39.120022475 +0000 UTC m=+256.024430028 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.620783 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-jbmtp"]
Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.627162 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-jbmtp"]
Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.630912 4732 scope.go:117] "RemoveContainer" containerID="3989604763ea02ad03b6f003c903eb3ed34bff775fbc928643ea3514f5743a30"
Apr 02 13:41:38 crc kubenswrapper[4732]: E0402 13:41:38.632046 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3989604763ea02ad03b6f003c903eb3ed34bff775fbc928643ea3514f5743a30\": container with ID starting with 3989604763ea02ad03b6f003c903eb3ed34bff775fbc928643ea3514f5743a30 not found: ID does not exist" containerID="3989604763ea02ad03b6f003c903eb3ed34bff775fbc928643ea3514f5743a30"
Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.632180 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3989604763ea02ad03b6f003c903eb3ed34bff775fbc928643ea3514f5743a30"} err="failed to get container status \"3989604763ea02ad03b6f003c903eb3ed34bff775fbc928643ea3514f5743a30\": rpc error: code = NotFound desc = could not find container \"3989604763ea02ad03b6f003c903eb3ed34bff775fbc928643ea3514f5743a30\": container with ID starting with 3989604763ea02ad03b6f003c903eb3ed34bff775fbc928643ea3514f5743a30 not found: ID does not exist"
Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.638933 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-54khq"]
Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.640499 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-54khq"]
Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.687644 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16ec121b-fdf2-452d-8963-08d6132f7c5c" path="/var/lib/kubelet/pods/16ec121b-fdf2-452d-8963-08d6132f7c5c/volumes"
Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.688433 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c662639-32a4-4f78-af37-9b1e65bab4e8" path="/var/lib/kubelet/pods/2c662639-32a4-4f78-af37-9b1e65bab4e8/volumes"
Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.720637 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhxz4\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4"
Apr 02 13:41:38 crc kubenswrapper[4732]: E0402 13:41:38.721813 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-02 13:41:39.221797495 +0000 UTC m=+256.126205048 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhxz4" (UID: "e2570535-673c-495c-a5aa-392f14ceebb1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.821247 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Apr 02 13:41:38 crc kubenswrapper[4732]: E0402 13:41:38.821572 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-02 13:41:39.321558893 +0000 UTC m=+256.225966446 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.912218 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9lrnf"]
Apr 02 13:41:38 crc kubenswrapper[4732]: E0402 13:41:38.912609 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c662639-32a4-4f78-af37-9b1e65bab4e8" containerName="route-controller-manager"
Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.912653 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c662639-32a4-4f78-af37-9b1e65bab4e8" containerName="route-controller-manager"
Apr 02 13:41:38 crc kubenswrapper[4732]: E0402 13:41:38.912672 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16ec121b-fdf2-452d-8963-08d6132f7c5c" containerName="controller-manager"
Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.912684 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="16ec121b-fdf2-452d-8963-08d6132f7c5c" containerName="controller-manager"
Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.912874 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c662639-32a4-4f78-af37-9b1e65bab4e8" containerName="route-controller-manager"
Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.912907 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="16ec121b-fdf2-452d-8963-08d6132f7c5c" containerName="controller-manager"
Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.914300 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9lrnf"
Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.924687 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9lrnf"]
Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.925518 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Apr 02 13:41:38 crc kubenswrapper[4732]: I0402 13:41:38.925578 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhxz4\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4"
Apr 02 13:41:38 crc kubenswrapper[4732]: E0402 13:41:38.925975 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-02 13:41:39.425960182 +0000 UTC m=+256.330367755 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhxz4" (UID: "e2570535-673c-495c-a5aa-392f14ceebb1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.027159 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Apr 02 13:41:39 crc kubenswrapper[4732]: E0402 13:41:39.027411 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-02 13:41:39.527389934 +0000 UTC m=+256.431797487 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.027592 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51a0e365-014c-40e8-8749-7512f2c00758-utilities\") pod \"redhat-marketplace-9lrnf\" (UID: \"51a0e365-014c-40e8-8749-7512f2c00758\") " pod="openshift-marketplace/redhat-marketplace-9lrnf"
Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.028069 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjk9n\" (UniqueName: \"kubernetes.io/projected/51a0e365-014c-40e8-8749-7512f2c00758-kube-api-access-zjk9n\") pod \"redhat-marketplace-9lrnf\" (UID: \"51a0e365-014c-40e8-8749-7512f2c00758\") " pod="openshift-marketplace/redhat-marketplace-9lrnf"
Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.028180 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhxz4\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4"
Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.028210 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51a0e365-014c-40e8-8749-7512f2c00758-catalog-content\") pod \"redhat-marketplace-9lrnf\" (UID: \"51a0e365-014c-40e8-8749-7512f2c00758\") " pod="openshift-marketplace/redhat-marketplace-9lrnf"
Apr 02 13:41:39 crc kubenswrapper[4732]: E0402 13:41:39.028528 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-02 13:41:39.528517063 +0000 UTC m=+256.432924616 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhxz4" (UID: "e2570535-673c-495c-a5aa-392f14ceebb1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.061642 4732 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-04-02T13:41:38.418239841Z","Handler":null,"Name":""}
Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.066666 4732 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.066691 4732 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.130648 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.130851 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjk9n\" (UniqueName: \"kubernetes.io/projected/51a0e365-014c-40e8-8749-7512f2c00758-kube-api-access-zjk9n\") pod \"redhat-marketplace-9lrnf\" (UID: \"51a0e365-014c-40e8-8749-7512f2c00758\") " pod="openshift-marketplace/redhat-marketplace-9lrnf"
Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.130924 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51a0e365-014c-40e8-8749-7512f2c00758-catalog-content\") pod \"redhat-marketplace-9lrnf\" (UID: \"51a0e365-014c-40e8-8749-7512f2c00758\") " pod="openshift-marketplace/redhat-marketplace-9lrnf"
Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.130992 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51a0e365-014c-40e8-8749-7512f2c00758-utilities\") pod \"redhat-marketplace-9lrnf\" (UID: \"51a0e365-014c-40e8-8749-7512f2c00758\") " pod="openshift-marketplace/redhat-marketplace-9lrnf"
Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.131653 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51a0e365-014c-40e8-8749-7512f2c00758-utilities\") pod \"redhat-marketplace-9lrnf\" (UID: \"51a0e365-014c-40e8-8749-7512f2c00758\") " pod="openshift-marketplace/redhat-marketplace-9lrnf"
Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.131659 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51a0e365-014c-40e8-8749-7512f2c00758-catalog-content\") pod \"redhat-marketplace-9lrnf\" (UID: \"51a0e365-014c-40e8-8749-7512f2c00758\") " pod="openshift-marketplace/redhat-marketplace-9lrnf"
Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.156220 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjk9n\" (UniqueName: \"kubernetes.io/projected/51a0e365-014c-40e8-8749-7512f2c00758-kube-api-access-zjk9n\") pod \"redhat-marketplace-9lrnf\" (UID: \"51a0e365-014c-40e8-8749-7512f2c00758\") " pod="openshift-marketplace/redhat-marketplace-9lrnf"
Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.165286 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.232487 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhxz4\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4"
Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.236862 4732 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.236898 4732 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhxz4\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4"
Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.270945 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-brl6n"
Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.272122 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhxz4\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4"
Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.290293 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4"
Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.290721 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-z7v99"]
Apr 02 13:41:39 crc kubenswrapper[4732]: E0402 13:41:39.294429 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cb3c06a-e3cf-4a60-b180-82759b9d55fc" containerName="service-ca-controller"
Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.294455 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cb3c06a-e3cf-4a60-b180-82759b9d55fc" containerName="service-ca-controller"
Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.294546 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="5cb3c06a-e3cf-4a60-b180-82759b9d55fc" containerName="service-ca-controller"
Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.295439 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z7v99"
Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.310018 4732 patch_prober.go:28] interesting pod/router-default-5444994796-6qzc4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Apr 02 13:41:39 crc kubenswrapper[4732]: [-]has-synced failed: reason withheld
Apr 02 13:41:39 crc kubenswrapper[4732]: [+]process-running ok
Apr 02 13:41:39 crc kubenswrapper[4732]: healthz check failed
Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.310086 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6qzc4" podUID="d8ff2a93-ff6b-4ef8-9109-de3c22a6f108" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.328100 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z7v99"]
Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.332834 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.333521 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.334464 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5cb3c06a-e3cf-4a60-b180-82759b9d55fc-signing-cabundle\") pod \"5cb3c06a-e3cf-4a60-b180-82759b9d55fc\" (UID: \"5cb3c06a-e3cf-4a60-b180-82759b9d55fc\") "
Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.334538 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5cb3c06a-e3cf-4a60-b180-82759b9d55fc-signing-key\") pod \"5cb3c06a-e3cf-4a60-b180-82759b9d55fc\" (UID: \"5cb3c06a-e3cf-4a60-b180-82759b9d55fc\") "
Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.334568 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8qcb\" (UniqueName: \"kubernetes.io/projected/5cb3c06a-e3cf-4a60-b180-82759b9d55fc-kube-api-access-m8qcb\") pod \"5cb3c06a-e3cf-4a60-b180-82759b9d55fc\" (UID: \"5cb3c06a-e3cf-4a60-b180-82759b9d55fc\") "
Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.335147 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.335562 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5cb3c06a-e3cf-4a60-b180-82759b9d55fc-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "5cb3c06a-e3cf-4a60-b180-82759b9d55fc" (UID: "5cb3c06a-e3cf-4a60-b180-82759b9d55fc"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.335894 4732 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5cb3c06a-e3cf-4a60-b180-82759b9d55fc-signing-cabundle\") on node \"crc\" DevicePath \"\""
Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.336310 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.338221 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.339910 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cb3c06a-e3cf-4a60-b180-82759b9d55fc-kube-api-access-m8qcb" (OuterVolumeSpecName: "kube-api-access-m8qcb") pod "5cb3c06a-e3cf-4a60-b180-82759b9d55fc" (UID: "5cb3c06a-e3cf-4a60-b180-82759b9d55fc"). InnerVolumeSpecName "kube-api-access-m8qcb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.340504 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cb3c06a-e3cf-4a60-b180-82759b9d55fc-signing-key" (OuterVolumeSpecName: "signing-key") pod "5cb3c06a-e3cf-4a60-b180-82759b9d55fc" (UID: "5cb3c06a-e3cf-4a60-b180-82759b9d55fc"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.356371 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9lrnf"
Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.437439 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72981c60-e9a1-4e25-9b64-7493d6fdaab6-catalog-content\") pod \"redhat-marketplace-z7v99\" (UID: \"72981c60-e9a1-4e25-9b64-7493d6fdaab6\") " pod="openshift-marketplace/redhat-marketplace-z7v99"
Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.437803 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72981c60-e9a1-4e25-9b64-7493d6fdaab6-utilities\") pod \"redhat-marketplace-z7v99\" (UID: \"72981c60-e9a1-4e25-9b64-7493d6fdaab6\") " pod="openshift-marketplace/redhat-marketplace-z7v99"
Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.437851 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1e3fea23-4311-467e-b808-05ada9bc4e03-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"1e3fea23-4311-467e-b808-05ada9bc4e03\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.438005 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhqpl\" (UniqueName: \"kubernetes.io/projected/72981c60-e9a1-4e25-9b64-7493d6fdaab6-kube-api-access-nhqpl\") pod \"redhat-marketplace-z7v99\" (UID: \"72981c60-e9a1-4e25-9b64-7493d6fdaab6\") " pod="openshift-marketplace/redhat-marketplace-z7v99"
Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.438086 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1e3fea23-4311-467e-b808-05ada9bc4e03-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"1e3fea23-4311-467e-b808-05ada9bc4e03\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.438208 4732 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5cb3c06a-e3cf-4a60-b180-82759b9d55fc-signing-key\") on node \"crc\" DevicePath \"\""
Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.438221 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8qcb\" (UniqueName: \"kubernetes.io/projected/5cb3c06a-e3cf-4a60-b180-82759b9d55fc-kube-api-access-m8qcb\") on node \"crc\" DevicePath \"\""
Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.514001 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7bff847b8d-88b8t"]
Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.515341 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7bff847b8d-88b8t"
Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.516934 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d4c476667-t8fgh"]
Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.517513 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-d4c476667-t8fgh"
Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.518276 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.518608 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.518752 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.520332 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.521542 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.521821 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.522177 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.522373 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.522546 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.522572 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Apr 02
13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.522852 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.524100 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.526287 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d4c476667-t8fgh"] Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.531515 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.540489 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1e3fea23-4311-467e-b808-05ada9bc4e03-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"1e3fea23-4311-467e-b808-05ada9bc4e03\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.540541 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhqpl\" (UniqueName: \"kubernetes.io/projected/72981c60-e9a1-4e25-9b64-7493d6fdaab6-kube-api-access-nhqpl\") pod \"redhat-marketplace-z7v99\" (UID: \"72981c60-e9a1-4e25-9b64-7493d6fdaab6\") " pod="openshift-marketplace/redhat-marketplace-z7v99" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.540569 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 
13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.540592 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1e3fea23-4311-467e-b808-05ada9bc4e03-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"1e3fea23-4311-467e-b808-05ada9bc4e03\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.540638 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72981c60-e9a1-4e25-9b64-7493d6fdaab6-catalog-content\") pod \"redhat-marketplace-z7v99\" (UID: \"72981c60-e9a1-4e25-9b64-7493d6fdaab6\") " pod="openshift-marketplace/redhat-marketplace-z7v99" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.540660 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.540676 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.540699 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72981c60-e9a1-4e25-9b64-7493d6fdaab6-utilities\") pod \"redhat-marketplace-z7v99\" (UID: \"72981c60-e9a1-4e25-9b64-7493d6fdaab6\") " pod="openshift-marketplace/redhat-marketplace-z7v99" 
Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.540725 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.542222 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1e3fea23-4311-467e-b808-05ada9bc4e03-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"1e3fea23-4311-467e-b808-05ada9bc4e03\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.542313 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7bff847b8d-88b8t"] Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.543765 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72981c60-e9a1-4e25-9b64-7493d6fdaab6-catalog-content\") pod \"redhat-marketplace-z7v99\" (UID: \"72981c60-e9a1-4e25-9b64-7493d6fdaab6\") " pod="openshift-marketplace/redhat-marketplace-z7v99" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.543811 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.544145 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/72981c60-e9a1-4e25-9b64-7493d6fdaab6-utilities\") pod \"redhat-marketplace-z7v99\" (UID: \"72981c60-e9a1-4e25-9b64-7493d6fdaab6\") " pod="openshift-marketplace/redhat-marketplace-z7v99" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.544511 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.546880 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.547541 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.557160 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zhxz4"] Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.559800 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1e3fea23-4311-467e-b808-05ada9bc4e03-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: 
\"1e3fea23-4311-467e-b808-05ada9bc4e03\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Apr 02 13:41:39 crc kubenswrapper[4732]: W0402 13:41:39.570427 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2570535_673c_495c_a5aa_392f14ceebb1.slice/crio-17532bd5afe9e00601608767d5760acdc596adaab8fba875408b53a8ecf51ebf WatchSource:0}: Error finding container 17532bd5afe9e00601608767d5760acdc596adaab8fba875408b53a8ecf51ebf: Status 404 returned error can't find the container with id 17532bd5afe9e00601608767d5760acdc596adaab8fba875408b53a8ecf51ebf Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.578687 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhqpl\" (UniqueName: \"kubernetes.io/projected/72981c60-e9a1-4e25-9b64-7493d6fdaab6-kube-api-access-nhqpl\") pod \"redhat-marketplace-z7v99\" (UID: \"72981c60-e9a1-4e25-9b64-7493d6fdaab6\") " pod="openshift-marketplace/redhat-marketplace-z7v99" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.578771 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.580278 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.584153 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.584477 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.585425 4732 generic.go:334] "Generic (PLEG): container finished" podID="1827909b-49ea-4ba8-9995-f525d1d82f45" containerID="7dcbae5ed3cdbf1bd8f3cfa18f78e2c48935aa93532bd6393fda2191c7650c52" exitCode=0 Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.585560 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qlgxl" event={"ID":"1827909b-49ea-4ba8-9995-f525d1d82f45","Type":"ContainerDied","Data":"7dcbae5ed3cdbf1bd8f3cfa18f78e2c48935aa93532bd6393fda2191c7650c52"} Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.597758 4732 generic.go:334] "Generic (PLEG): container finished" podID="5cb3c06a-e3cf-4a60-b180-82759b9d55fc" containerID="02eaba1c5190d6d2ba1493c6ccd7cd1055d8a32472c005b0b4c1674c96732f1f" exitCode=0 Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.597841 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-brl6n" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.597866 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-brl6n" event={"ID":"5cb3c06a-e3cf-4a60-b180-82759b9d55fc","Type":"ContainerDied","Data":"02eaba1c5190d6d2ba1493c6ccd7cd1055d8a32472c005b0b4c1674c96732f1f"} Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.597906 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-brl6n" event={"ID":"5cb3c06a-e3cf-4a60-b180-82759b9d55fc","Type":"ContainerDied","Data":"0074f780bcde7f9c803e94a8a99c5131a2501fea2d0f047409284c745205680b"} Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.597929 4732 scope.go:117] "RemoveContainer" containerID="02eaba1c5190d6d2ba1493c6ccd7cd1055d8a32472c005b0b4c1674c96732f1f" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.611088 4732 generic.go:334] "Generic (PLEG): container finished" podID="9058e533-24e2-44f1-8631-dd9bf6a37192" containerID="ecbe9a47fc07081a457ade95202e51e947d903d2cac58f72e34bc96e96791f2c" exitCode=0 Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.611147 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8ps5w" event={"ID":"9058e533-24e2-44f1-8631-dd9bf6a37192","Type":"ContainerDied","Data":"ecbe9a47fc07081a457ade95202e51e947d903d2cac58f72e34bc96e96791f2c"} Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.615798 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.616339 4732 generic.go:334] "Generic (PLEG): container finished" podID="6ac64cdf-a607-481a-9907-e6e72fc8b083" containerID="946618f3b3e5eb08548d122e42d9d04ac4bbc3030f5ff340eb464f99d279f52e" exitCode=0 Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.616794 4732 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fd4xj" event={"ID":"6ac64cdf-a607-481a-9907-e6e72fc8b083","Type":"ContainerDied","Data":"946618f3b3e5eb08548d122e42d9d04ac4bbc3030f5ff340eb464f99d279f52e"} Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.617907 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z7v99" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.620207 4732 generic.go:334] "Generic (PLEG): container finished" podID="67577cc5-6dee-4465-beee-ea424d976972" containerID="93bddc4cc6346d65d466d01bb6e7bdfa76ee89770919cb5fa9a55eb9760f977f" exitCode=0 Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.620283 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29585610-wbzhk" event={"ID":"67577cc5-6dee-4465-beee-ea424d976972","Type":"ContainerDied","Data":"93bddc4cc6346d65d466d01bb6e7bdfa76ee89770919cb5fa9a55eb9760f977f"} Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.634524 4732 scope.go:117] "RemoveContainer" containerID="02eaba1c5190d6d2ba1493c6ccd7cd1055d8a32472c005b0b4c1674c96732f1f" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.635194 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-k2sdm" event={"ID":"9b3f7b50-e118-4aca-986d-0f52e772edc3","Type":"ContainerStarted","Data":"7d8875ae3abba56c90860781077115d4a5a207c65918c70f051ecfbf9068d2a3"} Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.635251 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-k2sdm" event={"ID":"9b3f7b50-e118-4aca-986d-0f52e772edc3","Type":"ContainerStarted","Data":"3fcf719eb0950e4bf986d67b6a155e5b179e62e02ea988dc44fa6612bbe1eea9"} Apr 02 13:41:39 crc kubenswrapper[4732]: E0402 13:41:39.635498 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"02eaba1c5190d6d2ba1493c6ccd7cd1055d8a32472c005b0b4c1674c96732f1f\": container with ID starting with 02eaba1c5190d6d2ba1493c6ccd7cd1055d8a32472c005b0b4c1674c96732f1f not found: ID does not exist" containerID="02eaba1c5190d6d2ba1493c6ccd7cd1055d8a32472c005b0b4c1674c96732f1f" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.635568 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02eaba1c5190d6d2ba1493c6ccd7cd1055d8a32472c005b0b4c1674c96732f1f"} err="failed to get container status \"02eaba1c5190d6d2ba1493c6ccd7cd1055d8a32472c005b0b4c1674c96732f1f\": rpc error: code = NotFound desc = could not find container \"02eaba1c5190d6d2ba1493c6ccd7cd1055d8a32472c005b0b4c1674c96732f1f\": container with ID starting with 02eaba1c5190d6d2ba1493c6ccd7cd1055d8a32472c005b0b4c1674c96732f1f not found: ID does not exist" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.642223 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcf8da11-8bbb-4de3-bbe5-69869bf4829c-config\") pod \"route-controller-manager-d4c476667-t8fgh\" (UID: \"dcf8da11-8bbb-4de3-bbe5-69869bf4829c\") " pod="openshift-route-controller-manager/route-controller-manager-d4c476667-t8fgh" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.642273 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1678e6e2-ca55-46d4-a56a-281280da2ccc-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"1678e6e2-ca55-46d4-a56a-281280da2ccc\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.642294 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9kd9\" (UniqueName: 
\"kubernetes.io/projected/dcf8da11-8bbb-4de3-bbe5-69869bf4829c-kube-api-access-g9kd9\") pod \"route-controller-manager-d4c476667-t8fgh\" (UID: \"dcf8da11-8bbb-4de3-bbe5-69869bf4829c\") " pod="openshift-route-controller-manager/route-controller-manager-d4c476667-t8fgh" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.642316 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dmwn\" (UniqueName: \"kubernetes.io/projected/c039ba90-db23-4d59-8109-2b83616af131-kube-api-access-8dmwn\") pod \"controller-manager-7bff847b8d-88b8t\" (UID: \"c039ba90-db23-4d59-8109-2b83616af131\") " pod="openshift-controller-manager/controller-manager-7bff847b8d-88b8t" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.642332 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c039ba90-db23-4d59-8109-2b83616af131-serving-cert\") pod \"controller-manager-7bff847b8d-88b8t\" (UID: \"c039ba90-db23-4d59-8109-2b83616af131\") " pod="openshift-controller-manager/controller-manager-7bff847b8d-88b8t" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.642355 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dcf8da11-8bbb-4de3-bbe5-69869bf4829c-serving-cert\") pod \"route-controller-manager-d4c476667-t8fgh\" (UID: \"dcf8da11-8bbb-4de3-bbe5-69869bf4829c\") " pod="openshift-route-controller-manager/route-controller-manager-d4c476667-t8fgh" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.642385 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/386bd92b-c67e-4cc6-8a47-6f8d6e799bc7-metrics-certs\") pod \"network-metrics-daemon-crx2z\" (UID: \"386bd92b-c67e-4cc6-8a47-6f8d6e799bc7\") " pod="openshift-multus/network-metrics-daemon-crx2z" 
Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.642404 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dcf8da11-8bbb-4de3-bbe5-69869bf4829c-client-ca\") pod \"route-controller-manager-d4c476667-t8fgh\" (UID: \"dcf8da11-8bbb-4de3-bbe5-69869bf4829c\") " pod="openshift-route-controller-manager/route-controller-manager-d4c476667-t8fgh" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.642436 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c039ba90-db23-4d59-8109-2b83616af131-client-ca\") pod \"controller-manager-7bff847b8d-88b8t\" (UID: \"c039ba90-db23-4d59-8109-2b83616af131\") " pod="openshift-controller-manager/controller-manager-7bff847b8d-88b8t" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.642479 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c039ba90-db23-4d59-8109-2b83616af131-config\") pod \"controller-manager-7bff847b8d-88b8t\" (UID: \"c039ba90-db23-4d59-8109-2b83616af131\") " pod="openshift-controller-manager/controller-manager-7bff847b8d-88b8t" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.642499 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1678e6e2-ca55-46d4-a56a-281280da2ccc-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"1678e6e2-ca55-46d4-a56a-281280da2ccc\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.642516 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/c039ba90-db23-4d59-8109-2b83616af131-proxy-ca-bundles\") pod \"controller-manager-7bff847b8d-88b8t\" (UID: \"c039ba90-db23-4d59-8109-2b83616af131\") " pod="openshift-controller-manager/controller-manager-7bff847b8d-88b8t" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.646573 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/386bd92b-c67e-4cc6-8a47-6f8d6e799bc7-metrics-certs\") pod \"network-metrics-daemon-crx2z\" (UID: \"386bd92b-c67e-4cc6-8a47-6f8d6e799bc7\") " pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.650498 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.710287 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-brl6n"] Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.726377 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-brl6n"] Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.745961 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-88db4f5f6-zjhx6"] Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.746500 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1678e6e2-ca55-46d4-a56a-281280da2ccc-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"1678e6e2-ca55-46d4-a56a-281280da2ccc\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.746532 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/c039ba90-db23-4d59-8109-2b83616af131-proxy-ca-bundles\") pod \"controller-manager-7bff847b8d-88b8t\" (UID: \"c039ba90-db23-4d59-8109-2b83616af131\") " pod="openshift-controller-manager/controller-manager-7bff847b8d-88b8t" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.746563 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcf8da11-8bbb-4de3-bbe5-69869bf4829c-config\") pod \"route-controller-manager-d4c476667-t8fgh\" (UID: \"dcf8da11-8bbb-4de3-bbe5-69869bf4829c\") " pod="openshift-route-controller-manager/route-controller-manager-d4c476667-t8fgh" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.746588 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9kd9\" (UniqueName: \"kubernetes.io/projected/dcf8da11-8bbb-4de3-bbe5-69869bf4829c-kube-api-access-g9kd9\") pod \"route-controller-manager-d4c476667-t8fgh\" (UID: \"dcf8da11-8bbb-4de3-bbe5-69869bf4829c\") " pod="openshift-route-controller-manager/route-controller-manager-d4c476667-t8fgh" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.746605 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1678e6e2-ca55-46d4-a56a-281280da2ccc-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"1678e6e2-ca55-46d4-a56a-281280da2ccc\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.746634 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dmwn\" (UniqueName: \"kubernetes.io/projected/c039ba90-db23-4d59-8109-2b83616af131-kube-api-access-8dmwn\") pod \"controller-manager-7bff847b8d-88b8t\" (UID: \"c039ba90-db23-4d59-8109-2b83616af131\") " pod="openshift-controller-manager/controller-manager-7bff847b8d-88b8t" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 
13:41:39.746658 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c039ba90-db23-4d59-8109-2b83616af131-serving-cert\") pod \"controller-manager-7bff847b8d-88b8t\" (UID: \"c039ba90-db23-4d59-8109-2b83616af131\") " pod="openshift-controller-manager/controller-manager-7bff847b8d-88b8t" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.746695 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dcf8da11-8bbb-4de3-bbe5-69869bf4829c-serving-cert\") pod \"route-controller-manager-d4c476667-t8fgh\" (UID: \"dcf8da11-8bbb-4de3-bbe5-69869bf4829c\") " pod="openshift-route-controller-manager/route-controller-manager-d4c476667-t8fgh" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.746717 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dcf8da11-8bbb-4de3-bbe5-69869bf4829c-client-ca\") pod \"route-controller-manager-d4c476667-t8fgh\" (UID: \"dcf8da11-8bbb-4de3-bbe5-69869bf4829c\") " pod="openshift-route-controller-manager/route-controller-manager-d4c476667-t8fgh" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.746764 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c039ba90-db23-4d59-8109-2b83616af131-client-ca\") pod \"controller-manager-7bff847b8d-88b8t\" (UID: \"c039ba90-db23-4d59-8109-2b83616af131\") " pod="openshift-controller-manager/controller-manager-7bff847b8d-88b8t" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.746816 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c039ba90-db23-4d59-8109-2b83616af131-config\") pod \"controller-manager-7bff847b8d-88b8t\" (UID: \"c039ba90-db23-4d59-8109-2b83616af131\") " 
pod="openshift-controller-manager/controller-manager-7bff847b8d-88b8t" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.747150 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-88db4f5f6-zjhx6" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.750262 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c039ba90-db23-4d59-8109-2b83616af131-client-ca\") pod \"controller-manager-7bff847b8d-88b8t\" (UID: \"c039ba90-db23-4d59-8109-2b83616af131\") " pod="openshift-controller-manager/controller-manager-7bff847b8d-88b8t" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.751341 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dcf8da11-8bbb-4de3-bbe5-69869bf4829c-client-ca\") pod \"route-controller-manager-d4c476667-t8fgh\" (UID: \"dcf8da11-8bbb-4de3-bbe5-69869bf4829c\") " pod="openshift-route-controller-manager/route-controller-manager-d4c476667-t8fgh" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.751985 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c039ba90-db23-4d59-8109-2b83616af131-config\") pod \"controller-manager-7bff847b8d-88b8t\" (UID: \"c039ba90-db23-4d59-8109-2b83616af131\") " pod="openshift-controller-manager/controller-manager-7bff847b8d-88b8t" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.752372 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.752721 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.752937 4732 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca"/"kube-root-ca.crt" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.753189 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.753360 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.754421 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1678e6e2-ca55-46d4-a56a-281280da2ccc-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"1678e6e2-ca55-46d4-a56a-281280da2ccc\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.756743 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c039ba90-db23-4d59-8109-2b83616af131-proxy-ca-bundles\") pod \"controller-manager-7bff847b8d-88b8t\" (UID: \"c039ba90-db23-4d59-8109-2b83616af131\") " pod="openshift-controller-manager/controller-manager-7bff847b8d-88b8t" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.757715 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcf8da11-8bbb-4de3-bbe5-69869bf4829c-config\") pod \"route-controller-manager-d4c476667-t8fgh\" (UID: \"dcf8da11-8bbb-4de3-bbe5-69869bf4829c\") " pod="openshift-route-controller-manager/route-controller-manager-d4c476667-t8fgh" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.760802 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dcf8da11-8bbb-4de3-bbe5-69869bf4829c-serving-cert\") pod \"route-controller-manager-d4c476667-t8fgh\" (UID: \"dcf8da11-8bbb-4de3-bbe5-69869bf4829c\") " 
pod="openshift-route-controller-manager/route-controller-manager-d4c476667-t8fgh" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.775929 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-k2sdm" podStartSLOduration=11.775907241 podStartE2EDuration="11.775907241s" podCreationTimestamp="2026-04-02 13:41:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:41:39.738970924 +0000 UTC m=+256.643378487" watchObservedRunningTime="2026-04-02 13:41:39.775907241 +0000 UTC m=+256.680314794" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.780855 4732 ???:1] "http: TLS handshake error from 192.168.126.11:48818: no serving certificate available for the kubelet" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.780887 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c039ba90-db23-4d59-8109-2b83616af131-serving-cert\") pod \"controller-manager-7bff847b8d-88b8t\" (UID: \"c039ba90-db23-4d59-8109-2b83616af131\") " pod="openshift-controller-manager/controller-manager-7bff847b8d-88b8t" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.787767 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-88db4f5f6-zjhx6"] Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.788585 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1678e6e2-ca55-46d4-a56a-281280da2ccc-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"1678e6e2-ca55-46d4-a56a-281280da2ccc\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.790759 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9kd9\" (UniqueName: 
\"kubernetes.io/projected/dcf8da11-8bbb-4de3-bbe5-69869bf4829c-kube-api-access-g9kd9\") pod \"route-controller-manager-d4c476667-t8fgh\" (UID: \"dcf8da11-8bbb-4de3-bbe5-69869bf4829c\") " pod="openshift-route-controller-manager/route-controller-manager-d4c476667-t8fgh" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.798276 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.806451 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dmwn\" (UniqueName: \"kubernetes.io/projected/c039ba90-db23-4d59-8109-2b83616af131-kube-api-access-8dmwn\") pod \"controller-manager-7bff847b8d-88b8t\" (UID: \"c039ba90-db23-4d59-8109-2b83616af131\") " pod="openshift-controller-manager/controller-manager-7bff847b8d-88b8t" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.808355 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.819345 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.820276 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9lrnf"] Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.848429 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ee74166d-3c00-4e8f-8b0d-cb782afa86fd-signing-cabundle\") pod \"service-ca-88db4f5f6-zjhx6\" (UID: \"ee74166d-3c00-4e8f-8b0d-cb782afa86fd\") " pod="openshift-service-ca/service-ca-88db4f5f6-zjhx6" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.848484 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ee74166d-3c00-4e8f-8b0d-cb782afa86fd-signing-key\") pod \"service-ca-88db4f5f6-zjhx6\" (UID: \"ee74166d-3c00-4e8f-8b0d-cb782afa86fd\") " pod="openshift-service-ca/service-ca-88db4f5f6-zjhx6" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.848561 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxwwp\" (UniqueName: \"kubernetes.io/projected/ee74166d-3c00-4e8f-8b0d-cb782afa86fd-kube-api-access-pxwwp\") pod \"service-ca-88db4f5f6-zjhx6\" (UID: \"ee74166d-3c00-4e8f-8b0d-cb782afa86fd\") " pod="openshift-service-ca/service-ca-88db4f5f6-zjhx6" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.856163 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7bff847b8d-88b8t" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.864791 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-d4c476667-t8fgh" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.895165 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lvckz"] Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.896747 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lvckz" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.900803 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-crx2z" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.904269 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.905650 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lvckz"] Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.920984 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.923535 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z7v99"] Apr 02 13:41:39 crc kubenswrapper[4732]: W0402 13:41:39.941850 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72981c60_e9a1_4e25_9b64_7493d6fdaab6.slice/crio-d4d772450856dfbebd55c6fec998caee6c1d412523c56bc65e0766342de36120 WatchSource:0}: Error finding container d4d772450856dfbebd55c6fec998caee6c1d412523c56bc65e0766342de36120: Status 404 returned error can't find the container with id d4d772450856dfbebd55c6fec998caee6c1d412523c56bc65e0766342de36120 Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.949329 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33708fee-32a5-4418-81d0-226813150db7-catalog-content\") pod \"redhat-operators-lvckz\" (UID: \"33708fee-32a5-4418-81d0-226813150db7\") " pod="openshift-marketplace/redhat-operators-lvckz" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.949365 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxwwp\" (UniqueName: \"kubernetes.io/projected/ee74166d-3c00-4e8f-8b0d-cb782afa86fd-kube-api-access-pxwwp\") pod \"service-ca-88db4f5f6-zjhx6\" (UID: \"ee74166d-3c00-4e8f-8b0d-cb782afa86fd\") " pod="openshift-service-ca/service-ca-88db4f5f6-zjhx6" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.949418 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ee74166d-3c00-4e8f-8b0d-cb782afa86fd-signing-cabundle\") pod \"service-ca-88db4f5f6-zjhx6\" (UID: \"ee74166d-3c00-4e8f-8b0d-cb782afa86fd\") " 
pod="openshift-service-ca/service-ca-88db4f5f6-zjhx6" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.949448 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tcvq\" (UniqueName: \"kubernetes.io/projected/33708fee-32a5-4418-81d0-226813150db7-kube-api-access-4tcvq\") pod \"redhat-operators-lvckz\" (UID: \"33708fee-32a5-4418-81d0-226813150db7\") " pod="openshift-marketplace/redhat-operators-lvckz" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.949474 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ee74166d-3c00-4e8f-8b0d-cb782afa86fd-signing-key\") pod \"service-ca-88db4f5f6-zjhx6\" (UID: \"ee74166d-3c00-4e8f-8b0d-cb782afa86fd\") " pod="openshift-service-ca/service-ca-88db4f5f6-zjhx6" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.949513 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33708fee-32a5-4418-81d0-226813150db7-utilities\") pod \"redhat-operators-lvckz\" (UID: \"33708fee-32a5-4418-81d0-226813150db7\") " pod="openshift-marketplace/redhat-operators-lvckz" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.950547 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ee74166d-3c00-4e8f-8b0d-cb782afa86fd-signing-cabundle\") pod \"service-ca-88db4f5f6-zjhx6\" (UID: \"ee74166d-3c00-4e8f-8b0d-cb782afa86fd\") " pod="openshift-service-ca/service-ca-88db4f5f6-zjhx6" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.957955 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ee74166d-3c00-4e8f-8b0d-cb782afa86fd-signing-key\") pod \"service-ca-88db4f5f6-zjhx6\" (UID: \"ee74166d-3c00-4e8f-8b0d-cb782afa86fd\") " 
pod="openshift-service-ca/service-ca-88db4f5f6-zjhx6" Apr 02 13:41:39 crc kubenswrapper[4732]: I0402 13:41:39.967899 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxwwp\" (UniqueName: \"kubernetes.io/projected/ee74166d-3c00-4e8f-8b0d-cb782afa86fd-kube-api-access-pxwwp\") pod \"service-ca-88db4f5f6-zjhx6\" (UID: \"ee74166d-3c00-4e8f-8b0d-cb782afa86fd\") " pod="openshift-service-ca/service-ca-88db4f5f6-zjhx6" Apr 02 13:41:40 crc kubenswrapper[4732]: I0402 13:41:40.021228 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Apr 02 13:41:40 crc kubenswrapper[4732]: I0402 13:41:40.050258 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tcvq\" (UniqueName: \"kubernetes.io/projected/33708fee-32a5-4418-81d0-226813150db7-kube-api-access-4tcvq\") pod \"redhat-operators-lvckz\" (UID: \"33708fee-32a5-4418-81d0-226813150db7\") " pod="openshift-marketplace/redhat-operators-lvckz" Apr 02 13:41:40 crc kubenswrapper[4732]: I0402 13:41:40.050369 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33708fee-32a5-4418-81d0-226813150db7-utilities\") pod \"redhat-operators-lvckz\" (UID: \"33708fee-32a5-4418-81d0-226813150db7\") " pod="openshift-marketplace/redhat-operators-lvckz" Apr 02 13:41:40 crc kubenswrapper[4732]: I0402 13:41:40.050977 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33708fee-32a5-4418-81d0-226813150db7-utilities\") pod \"redhat-operators-lvckz\" (UID: \"33708fee-32a5-4418-81d0-226813150db7\") " pod="openshift-marketplace/redhat-operators-lvckz" Apr 02 13:41:40 crc kubenswrapper[4732]: I0402 13:41:40.051037 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/33708fee-32a5-4418-81d0-226813150db7-catalog-content\") pod \"redhat-operators-lvckz\" (UID: \"33708fee-32a5-4418-81d0-226813150db7\") " pod="openshift-marketplace/redhat-operators-lvckz" Apr 02 13:41:40 crc kubenswrapper[4732]: I0402 13:41:40.051335 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33708fee-32a5-4418-81d0-226813150db7-catalog-content\") pod \"redhat-operators-lvckz\" (UID: \"33708fee-32a5-4418-81d0-226813150db7\") " pod="openshift-marketplace/redhat-operators-lvckz" Apr 02 13:41:40 crc kubenswrapper[4732]: W0402 13:41:40.051772 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod1e3fea23_4311_467e_b808_05ada9bc4e03.slice/crio-16e0763307c769d8e44e6666f30f5c22dc3418c0bfcb0973d050503557c1fbf6 WatchSource:0}: Error finding container 16e0763307c769d8e44e6666f30f5c22dc3418c0bfcb0973d050503557c1fbf6: Status 404 returned error can't find the container with id 16e0763307c769d8e44e6666f30f5c22dc3418c0bfcb0973d050503557c1fbf6 Apr 02 13:41:40 crc kubenswrapper[4732]: I0402 13:41:40.071097 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tcvq\" (UniqueName: \"kubernetes.io/projected/33708fee-32a5-4418-81d0-226813150db7-kube-api-access-4tcvq\") pod \"redhat-operators-lvckz\" (UID: \"33708fee-32a5-4418-81d0-226813150db7\") " pod="openshift-marketplace/redhat-operators-lvckz" Apr 02 13:41:40 crc kubenswrapper[4732]: I0402 13:41:40.106590 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-88db4f5f6-zjhx6" Apr 02 13:41:40 crc kubenswrapper[4732]: I0402 13:41:40.238718 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lvckz" Apr 02 13:41:40 crc kubenswrapper[4732]: I0402 13:41:40.285830 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vgjz4"] Apr 02 13:41:40 crc kubenswrapper[4732]: I0402 13:41:40.288677 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vgjz4" Apr 02 13:41:40 crc kubenswrapper[4732]: I0402 13:41:40.302103 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vgjz4"] Apr 02 13:41:40 crc kubenswrapper[4732]: I0402 13:41:40.328854 4732 patch_prober.go:28] interesting pod/router-default-5444994796-6qzc4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Apr 02 13:41:40 crc kubenswrapper[4732]: [-]has-synced failed: reason withheld Apr 02 13:41:40 crc kubenswrapper[4732]: [+]process-running ok Apr 02 13:41:40 crc kubenswrapper[4732]: healthz check failed Apr 02 13:41:40 crc kubenswrapper[4732]: I0402 13:41:40.328908 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6qzc4" podUID="d8ff2a93-ff6b-4ef8-9109-de3c22a6f108" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 02 13:41:40 crc kubenswrapper[4732]: I0402 13:41:40.359414 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50d43b2d-24ec-439f-a418-3673791eb1b1-utilities\") pod \"redhat-operators-vgjz4\" (UID: \"50d43b2d-24ec-439f-a418-3673791eb1b1\") " pod="openshift-marketplace/redhat-operators-vgjz4" Apr 02 13:41:40 crc kubenswrapper[4732]: I0402 13:41:40.359784 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-wxmng\" (UniqueName: \"kubernetes.io/projected/50d43b2d-24ec-439f-a418-3673791eb1b1-kube-api-access-wxmng\") pod \"redhat-operators-vgjz4\" (UID: \"50d43b2d-24ec-439f-a418-3673791eb1b1\") " pod="openshift-marketplace/redhat-operators-vgjz4" Apr 02 13:41:40 crc kubenswrapper[4732]: I0402 13:41:40.359850 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50d43b2d-24ec-439f-a418-3673791eb1b1-catalog-content\") pod \"redhat-operators-vgjz4\" (UID: \"50d43b2d-24ec-439f-a418-3673791eb1b1\") " pod="openshift-marketplace/redhat-operators-vgjz4" Apr 02 13:41:40 crc kubenswrapper[4732]: W0402 13:41:40.429469 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-1ec329d9282ec79e266b2cdc12fe752fe66c9f8aaf98721ea7d5e8bdc7344a28 WatchSource:0}: Error finding container 1ec329d9282ec79e266b2cdc12fe752fe66c9f8aaf98721ea7d5e8bdc7344a28: Status 404 returned error can't find the container with id 1ec329d9282ec79e266b2cdc12fe752fe66c9f8aaf98721ea7d5e8bdc7344a28 Apr 02 13:41:40 crc kubenswrapper[4732]: I0402 13:41:40.464589 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50d43b2d-24ec-439f-a418-3673791eb1b1-catalog-content\") pod \"redhat-operators-vgjz4\" (UID: \"50d43b2d-24ec-439f-a418-3673791eb1b1\") " pod="openshift-marketplace/redhat-operators-vgjz4" Apr 02 13:41:40 crc kubenswrapper[4732]: I0402 13:41:40.464659 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50d43b2d-24ec-439f-a418-3673791eb1b1-utilities\") pod \"redhat-operators-vgjz4\" (UID: \"50d43b2d-24ec-439f-a418-3673791eb1b1\") " pod="openshift-marketplace/redhat-operators-vgjz4" Apr 02 13:41:40 crc 
kubenswrapper[4732]: I0402 13:41:40.464698 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxmng\" (UniqueName: \"kubernetes.io/projected/50d43b2d-24ec-439f-a418-3673791eb1b1-kube-api-access-wxmng\") pod \"redhat-operators-vgjz4\" (UID: \"50d43b2d-24ec-439f-a418-3673791eb1b1\") " pod="openshift-marketplace/redhat-operators-vgjz4" Apr 02 13:41:40 crc kubenswrapper[4732]: I0402 13:41:40.465412 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50d43b2d-24ec-439f-a418-3673791eb1b1-catalog-content\") pod \"redhat-operators-vgjz4\" (UID: \"50d43b2d-24ec-439f-a418-3673791eb1b1\") " pod="openshift-marketplace/redhat-operators-vgjz4" Apr 02 13:41:40 crc kubenswrapper[4732]: I0402 13:41:40.465535 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50d43b2d-24ec-439f-a418-3673791eb1b1-utilities\") pod \"redhat-operators-vgjz4\" (UID: \"50d43b2d-24ec-439f-a418-3673791eb1b1\") " pod="openshift-marketplace/redhat-operators-vgjz4" Apr 02 13:41:40 crc kubenswrapper[4732]: I0402 13:41:40.482719 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxmng\" (UniqueName: \"kubernetes.io/projected/50d43b2d-24ec-439f-a418-3673791eb1b1-kube-api-access-wxmng\") pod \"redhat-operators-vgjz4\" (UID: \"50d43b2d-24ec-439f-a418-3673791eb1b1\") " pod="openshift-marketplace/redhat-operators-vgjz4" Apr 02 13:41:40 crc kubenswrapper[4732]: I0402 13:41:40.648058 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"1e3fea23-4311-467e-b808-05ada9bc4e03","Type":"ContainerStarted","Data":"16e0763307c769d8e44e6666f30f5c22dc3418c0bfcb0973d050503557c1fbf6"} Apr 02 13:41:40 crc kubenswrapper[4732]: I0402 13:41:40.657470 4732 generic.go:334] "Generic (PLEG): container finished" 
podID="51a0e365-014c-40e8-8749-7512f2c00758" containerID="07f5d8e0c009a92486cfbb49db1e9a3555daf3090c6f9416221f132da132091b" exitCode=0 Apr 02 13:41:40 crc kubenswrapper[4732]: I0402 13:41:40.657570 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9lrnf" event={"ID":"51a0e365-014c-40e8-8749-7512f2c00758","Type":"ContainerDied","Data":"07f5d8e0c009a92486cfbb49db1e9a3555daf3090c6f9416221f132da132091b"} Apr 02 13:41:40 crc kubenswrapper[4732]: I0402 13:41:40.659147 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9lrnf" event={"ID":"51a0e365-014c-40e8-8749-7512f2c00758","Type":"ContainerStarted","Data":"a5b64c74cd0af37c8f6db5871a77443a73c101f6fd86bf5106af224fbcade0e1"} Apr 02 13:41:40 crc kubenswrapper[4732]: I0402 13:41:40.668214 4732 generic.go:334] "Generic (PLEG): container finished" podID="72981c60-e9a1-4e25-9b64-7493d6fdaab6" containerID="193dd2b4bb23f9d733807fda8e1d414d8610bf2650f00c987038316f39b50247" exitCode=0 Apr 02 13:41:40 crc kubenswrapper[4732]: I0402 13:41:40.668274 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z7v99" event={"ID":"72981c60-e9a1-4e25-9b64-7493d6fdaab6","Type":"ContainerDied","Data":"193dd2b4bb23f9d733807fda8e1d414d8610bf2650f00c987038316f39b50247"} Apr 02 13:41:40 crc kubenswrapper[4732]: I0402 13:41:40.668299 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z7v99" event={"ID":"72981c60-e9a1-4e25-9b64-7493d6fdaab6","Type":"ContainerStarted","Data":"d4d772450856dfbebd55c6fec998caee6c1d412523c56bc65e0766342de36120"} Apr 02 13:41:40 crc kubenswrapper[4732]: I0402 13:41:40.672205 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" 
event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"1ec329d9282ec79e266b2cdc12fe752fe66c9f8aaf98721ea7d5e8bdc7344a28"} Apr 02 13:41:40 crc kubenswrapper[4732]: I0402 13:41:40.711192 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5cb3c06a-e3cf-4a60-b180-82759b9d55fc" path="/var/lib/kubelet/pods/5cb3c06a-e3cf-4a60-b180-82759b9d55fc/volumes" Apr 02 13:41:40 crc kubenswrapper[4732]: I0402 13:41:40.712310 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Apr 02 13:41:40 crc kubenswrapper[4732]: I0402 13:41:40.713000 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" event={"ID":"e2570535-673c-495c-a5aa-392f14ceebb1","Type":"ContainerStarted","Data":"cebfc68889bf68f1dee6260dd61cc2f0f7ebd22790507b1bca9eb8eb36c9a1ed"} Apr 02 13:41:40 crc kubenswrapper[4732]: I0402 13:41:40.713020 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" event={"ID":"e2570535-673c-495c-a5aa-392f14ceebb1","Type":"ContainerStarted","Data":"17532bd5afe9e00601608767d5760acdc596adaab8fba875408b53a8ecf51ebf"} Apr 02 13:41:40 crc kubenswrapper[4732]: I0402 13:41:40.724973 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vgjz4" Apr 02 13:41:40 crc kubenswrapper[4732]: W0402 13:41:40.725863 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-4fcddf4dffd00f8bed03bbf95e99f9be5a3bb42630b7ead023f5c50c241478ee WatchSource:0}: Error finding container 4fcddf4dffd00f8bed03bbf95e99f9be5a3bb42630b7ead023f5c50c241478ee: Status 404 returned error can't find the container with id 4fcddf4dffd00f8bed03bbf95e99f9be5a3bb42630b7ead023f5c50c241478ee Apr 02 13:41:40 crc kubenswrapper[4732]: I0402 13:41:40.743494 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d4c476667-t8fgh"] Apr 02 13:41:40 crc kubenswrapper[4732]: I0402 13:41:40.745230 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" podStartSLOduration=180.745207903 podStartE2EDuration="3m0.745207903s" podCreationTimestamp="2026-04-02 13:38:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:41:40.729685243 +0000 UTC m=+257.634092806" watchObservedRunningTime="2026-04-02 13:41:40.745207903 +0000 UTC m=+257.649615456" Apr 02 13:41:40 crc kubenswrapper[4732]: I0402 13:41:40.762542 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Apr 02 13:41:40 crc kubenswrapper[4732]: I0402 13:41:40.782470 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lvckz"] Apr 02 13:41:40 crc kubenswrapper[4732]: I0402 13:41:40.850782 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7bff847b8d-88b8t"] Apr 02 13:41:40 crc kubenswrapper[4732]: I0402 
13:41:40.858245 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-88db4f5f6-zjhx6"] Apr 02 13:41:40 crc kubenswrapper[4732]: I0402 13:41:40.877860 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-crx2z"] Apr 02 13:41:40 crc kubenswrapper[4732]: W0402 13:41:40.920143 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod386bd92b_c67e_4cc6_8a47_6f8d6e799bc7.slice/crio-0737eaa546ede4a6e70de5bb34f732d792695320b6f607235844e8cf74ffe56e WatchSource:0}: Error finding container 0737eaa546ede4a6e70de5bb34f732d792695320b6f607235844e8cf74ffe56e: Status 404 returned error can't find the container with id 0737eaa546ede4a6e70de5bb34f732d792695320b6f607235844e8cf74ffe56e Apr 02 13:41:40 crc kubenswrapper[4732]: W0402 13:41:40.931365 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc039ba90_db23_4d59_8109_2b83616af131.slice/crio-340c689f412eee8c2c4338160d0682cbc1f30a4f66e1e6b756d93e699ce234e6 WatchSource:0}: Error finding container 340c689f412eee8c2c4338160d0682cbc1f30a4f66e1e6b756d93e699ce234e6: Status 404 returned error can't find the container with id 340c689f412eee8c2c4338160d0682cbc1f30a4f66e1e6b756d93e699ce234e6 Apr 02 13:41:40 crc kubenswrapper[4732]: W0402 13:41:40.939678 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee74166d_3c00_4e8f_8b0d_cb782afa86fd.slice/crio-ad37f24d17713bf1861087dfed2f7489bcd27741c0d6efced9d9d61742abca5d WatchSource:0}: Error finding container ad37f24d17713bf1861087dfed2f7489bcd27741c0d6efced9d9d61742abca5d: Status 404 returned error can't find the container with id ad37f24d17713bf1861087dfed2f7489bcd27741c0d6efced9d9d61742abca5d Apr 02 13:41:41 crc kubenswrapper[4732]: I0402 13:41:41.012913 4732 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29585610-wbzhk" Apr 02 13:41:41 crc kubenswrapper[4732]: I0402 13:41:41.024655 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vgjz4"] Apr 02 13:41:41 crc kubenswrapper[4732]: I0402 13:41:41.033493 4732 patch_prober.go:28] interesting pod/downloads-7954f5f757-zmhn7 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.39:8080/\": dial tcp 10.217.0.39:8080: connect: connection refused" start-of-body= Apr 02 13:41:41 crc kubenswrapper[4732]: I0402 13:41:41.033548 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-zmhn7" podUID="4e9d3578-0893-4852-80b6-999e5a7ccdc5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.39:8080/\": dial tcp 10.217.0.39:8080: connect: connection refused" Apr 02 13:41:41 crc kubenswrapper[4732]: I0402 13:41:41.033493 4732 patch_prober.go:28] interesting pod/downloads-7954f5f757-zmhn7 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.39:8080/\": dial tcp 10.217.0.39:8080: connect: connection refused" start-of-body= Apr 02 13:41:41 crc kubenswrapper[4732]: I0402 13:41:41.033597 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zmhn7" podUID="4e9d3578-0893-4852-80b6-999e5a7ccdc5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.39:8080/\": dial tcp 10.217.0.39:8080: connect: connection refused" Apr 02 13:41:41 crc kubenswrapper[4732]: I0402 13:41:41.079955 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/67577cc5-6dee-4465-beee-ea424d976972-secret-volume\") pod 
\"67577cc5-6dee-4465-beee-ea424d976972\" (UID: \"67577cc5-6dee-4465-beee-ea424d976972\") " Apr 02 13:41:41 crc kubenswrapper[4732]: I0402 13:41:41.080025 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/67577cc5-6dee-4465-beee-ea424d976972-config-volume\") pod \"67577cc5-6dee-4465-beee-ea424d976972\" (UID: \"67577cc5-6dee-4465-beee-ea424d976972\") " Apr 02 13:41:41 crc kubenswrapper[4732]: I0402 13:41:41.080175 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqcj6\" (UniqueName: \"kubernetes.io/projected/67577cc5-6dee-4465-beee-ea424d976972-kube-api-access-rqcj6\") pod \"67577cc5-6dee-4465-beee-ea424d976972\" (UID: \"67577cc5-6dee-4465-beee-ea424d976972\") " Apr 02 13:41:41 crc kubenswrapper[4732]: I0402 13:41:41.081941 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67577cc5-6dee-4465-beee-ea424d976972-config-volume" (OuterVolumeSpecName: "config-volume") pod "67577cc5-6dee-4465-beee-ea424d976972" (UID: "67577cc5-6dee-4465-beee-ea424d976972"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:41:41 crc kubenswrapper[4732]: I0402 13:41:41.087439 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67577cc5-6dee-4465-beee-ea424d976972-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "67577cc5-6dee-4465-beee-ea424d976972" (UID: "67577cc5-6dee-4465-beee-ea424d976972"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 13:41:41 crc kubenswrapper[4732]: I0402 13:41:41.089193 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67577cc5-6dee-4465-beee-ea424d976972-kube-api-access-rqcj6" (OuterVolumeSpecName: "kube-api-access-rqcj6") pod "67577cc5-6dee-4465-beee-ea424d976972" (UID: "67577cc5-6dee-4465-beee-ea424d976972"). InnerVolumeSpecName "kube-api-access-rqcj6". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:41:41 crc kubenswrapper[4732]: W0402 13:41:41.115069 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50d43b2d_24ec_439f_a418_3673791eb1b1.slice/crio-c72853c3883ddf064d1087c7a54a67bc5173e895763e7bbc0616ecb63dbd4ecc WatchSource:0}: Error finding container c72853c3883ddf064d1087c7a54a67bc5173e895763e7bbc0616ecb63dbd4ecc: Status 404 returned error can't find the container with id c72853c3883ddf064d1087c7a54a67bc5173e895763e7bbc0616ecb63dbd4ecc Apr 02 13:41:41 crc kubenswrapper[4732]: I0402 13:41:41.182838 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqcj6\" (UniqueName: \"kubernetes.io/projected/67577cc5-6dee-4465-beee-ea424d976972-kube-api-access-rqcj6\") on node \"crc\" DevicePath \"\"" Apr 02 13:41:41 crc kubenswrapper[4732]: I0402 13:41:41.182876 4732 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/67577cc5-6dee-4465-beee-ea424d976972-secret-volume\") on node \"crc\" DevicePath \"\"" Apr 02 13:41:41 crc kubenswrapper[4732]: I0402 13:41:41.182888 4732 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/67577cc5-6dee-4465-beee-ea424d976972-config-volume\") on node \"crc\" DevicePath \"\"" Apr 02 13:41:41 crc kubenswrapper[4732]: I0402 13:41:41.261349 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-s6sm2" Apr 02 13:41:41 crc kubenswrapper[4732]: I0402 13:41:41.280688 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-dq9x9" Apr 02 13:41:41 crc kubenswrapper[4732]: I0402 13:41:41.280740 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-dq9x9" Apr 02 13:41:41 crc kubenswrapper[4732]: I0402 13:41:41.282197 4732 patch_prober.go:28] interesting pod/console-f9d7485db-dq9x9 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.29:8443/health\": dial tcp 10.217.0.29:8443: connect: connection refused" start-of-body= Apr 02 13:41:41 crc kubenswrapper[4732]: I0402 13:41:41.282243 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-dq9x9" podUID="4d77c191-7d04-4381-838f-b7a355e7c2d4" containerName="console" probeResult="failure" output="Get \"https://10.217.0.29:8443/health\": dial tcp 10.217.0.29:8443: connect: connection refused" Apr 02 13:41:41 crc kubenswrapper[4732]: I0402 13:41:41.325309 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-6qzc4" Apr 02 13:41:41 crc kubenswrapper[4732]: I0402 13:41:41.332193 4732 patch_prober.go:28] interesting pod/router-default-5444994796-6qzc4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Apr 02 13:41:41 crc kubenswrapper[4732]: [-]has-synced failed: reason withheld Apr 02 13:41:41 crc kubenswrapper[4732]: [+]process-running ok Apr 02 13:41:41 crc kubenswrapper[4732]: healthz check failed Apr 02 13:41:41 crc kubenswrapper[4732]: I0402 13:41:41.332249 4732 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-6qzc4" podUID="d8ff2a93-ff6b-4ef8-9109-de3c22a6f108" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 02 13:41:41 crc kubenswrapper[4732]: I0402 13:41:41.737150 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"8f7228ab8d80ddfa91227d679718519eb6bb35b4031d8b526a028698bcbcdc31"} Apr 02 13:41:41 crc kubenswrapper[4732]: I0402 13:41:41.737516 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"2693d2f7b1bac3946cae65f3a90e38c3a761d686662951aefc509b0b855e61ff"} Apr 02 13:41:41 crc kubenswrapper[4732]: I0402 13:41:41.749625 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-crx2z" event={"ID":"386bd92b-c67e-4cc6-8a47-6f8d6e799bc7","Type":"ContainerStarted","Data":"0737eaa546ede4a6e70de5bb34f732d792695320b6f607235844e8cf74ffe56e"} Apr 02 13:41:41 crc kubenswrapper[4732]: I0402 13:41:41.782589 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"1678e6e2-ca55-46d4-a56a-281280da2ccc","Type":"ContainerStarted","Data":"968c846760d53bf36b006b3ab21bc5259f64a871e95f27fc2efb002be7e15325"} Apr 02 13:41:41 crc kubenswrapper[4732]: I0402 13:41:41.782666 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"1678e6e2-ca55-46d4-a56a-281280da2ccc","Type":"ContainerStarted","Data":"f980b897120d7695e17f23215205320caed59545049eb719a51ae8cfe4d2f55e"} Apr 02 13:41:41 crc kubenswrapper[4732]: I0402 13:41:41.788223 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"80952a8c54f8db82337f510ac69743d245f86d83c1a692a124cfe15b9cff783b"} Apr 02 13:41:41 crc kubenswrapper[4732]: I0402 13:41:41.812295 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.812275001 podStartE2EDuration="2.812275001s" podCreationTimestamp="2026-04-02 13:41:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:41:41.810002941 +0000 UTC m=+258.714410504" watchObservedRunningTime="2026-04-02 13:41:41.812275001 +0000 UTC m=+258.716682564" Apr 02 13:41:41 crc kubenswrapper[4732]: I0402 13:41:41.834042 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-88db4f5f6-zjhx6" event={"ID":"ee74166d-3c00-4e8f-8b0d-cb782afa86fd","Type":"ContainerStarted","Data":"27c785fa6baac2a7ca54a00254c701c6714429533343fa9cfb604d6f23f4118a"} Apr 02 13:41:41 crc kubenswrapper[4732]: I0402 13:41:41.834098 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-88db4f5f6-zjhx6" event={"ID":"ee74166d-3c00-4e8f-8b0d-cb782afa86fd","Type":"ContainerStarted","Data":"ad37f24d17713bf1861087dfed2f7489bcd27741c0d6efced9d9d61742abca5d"} Apr 02 13:41:41 crc kubenswrapper[4732]: I0402 13:41:41.847067 4732 generic.go:334] "Generic (PLEG): container finished" podID="50d43b2d-24ec-439f-a418-3673791eb1b1" containerID="8e4ab18fa6d5568cc39926566d16f55002ef34dcf5f6d4f7c14f444d167dcb69" exitCode=0 Apr 02 13:41:41 crc kubenswrapper[4732]: I0402 13:41:41.847121 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vgjz4" 
event={"ID":"50d43b2d-24ec-439f-a418-3673791eb1b1","Type":"ContainerDied","Data":"8e4ab18fa6d5568cc39926566d16f55002ef34dcf5f6d4f7c14f444d167dcb69"} Apr 02 13:41:41 crc kubenswrapper[4732]: I0402 13:41:41.847142 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vgjz4" event={"ID":"50d43b2d-24ec-439f-a418-3673791eb1b1","Type":"ContainerStarted","Data":"c72853c3883ddf064d1087c7a54a67bc5173e895763e7bbc0616ecb63dbd4ecc"} Apr 02 13:41:41 crc kubenswrapper[4732]: I0402 13:41:41.865189 4732 generic.go:334] "Generic (PLEG): container finished" podID="33708fee-32a5-4418-81d0-226813150db7" containerID="966e60f520d1c8037b0fde5c77272cfcf79e1f1b7e6ec9164a02737e3ffa1266" exitCode=0 Apr 02 13:41:41 crc kubenswrapper[4732]: I0402 13:41:41.865275 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lvckz" event={"ID":"33708fee-32a5-4418-81d0-226813150db7","Type":"ContainerDied","Data":"966e60f520d1c8037b0fde5c77272cfcf79e1f1b7e6ec9164a02737e3ffa1266"} Apr 02 13:41:41 crc kubenswrapper[4732]: I0402 13:41:41.865300 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lvckz" event={"ID":"33708fee-32a5-4418-81d0-226813150db7","Type":"ContainerStarted","Data":"bec5d9172a678578a4a25802436f82279e97fe20393ed0ec28b58bbd4d7353be"} Apr 02 13:41:41 crc kubenswrapper[4732]: I0402 13:41:41.902281 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" Apr 02 13:41:41 crc kubenswrapper[4732]: I0402 13:41:41.902330 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" Apr 02 13:41:41 crc kubenswrapper[4732]: I0402 13:41:41.903731 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-88db4f5f6-zjhx6" podStartSLOduration=2.903714959 podStartE2EDuration="2.903714959s" 
podCreationTimestamp="2026-04-02 13:41:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:41:41.872032881 +0000 UTC m=+258.776440444" watchObservedRunningTime="2026-04-02 13:41:41.903714959 +0000 UTC m=+258.808122512" Apr 02 13:41:41 crc kubenswrapper[4732]: I0402 13:41:41.908682 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"ea5e8ea59a27681ffd90a0d31f4eb4e1a46624a0ce34418502dde6cddbbc3a94"} Apr 02 13:41:41 crc kubenswrapper[4732]: I0402 13:41:41.908828 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"4fcddf4dffd00f8bed03bbf95e99f9be5a3bb42630b7ead023f5c50c241478ee"} Apr 02 13:41:41 crc kubenswrapper[4732]: I0402 13:41:41.909480 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:41:41 crc kubenswrapper[4732]: I0402 13:41:41.927945 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" Apr 02 13:41:41 crc kubenswrapper[4732]: I0402 13:41:41.934750 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-d4c476667-t8fgh" event={"ID":"dcf8da11-8bbb-4de3-bbe5-69869bf4829c","Type":"ContainerStarted","Data":"574ebdb08baf59a41093e85e00cad943cfebcc0093f4627d5e3b85709f8f98e0"} Apr 02 13:41:41 crc kubenswrapper[4732]: I0402 13:41:41.934785 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-d4c476667-t8fgh" 
event={"ID":"dcf8da11-8bbb-4de3-bbe5-69869bf4829c","Type":"ContainerStarted","Data":"f1106e35a25bad6f69ce140890c33fed5bc2d17ea354d195802cbbe915e76120"} Apr 02 13:41:41 crc kubenswrapper[4732]: I0402 13:41:41.935419 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-d4c476667-t8fgh" Apr 02 13:41:41 crc kubenswrapper[4732]: I0402 13:41:41.962748 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29585610-wbzhk" event={"ID":"67577cc5-6dee-4465-beee-ea424d976972","Type":"ContainerDied","Data":"16cea3f04af541b5904b37a103b84d956f23b721989734df32189ea2aed43667"} Apr 02 13:41:41 crc kubenswrapper[4732]: I0402 13:41:41.962782 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16cea3f04af541b5904b37a103b84d956f23b721989734df32189ea2aed43667" Apr 02 13:41:41 crc kubenswrapper[4732]: I0402 13:41:41.962866 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29585610-wbzhk" Apr 02 13:41:41 crc kubenswrapper[4732]: I0402 13:41:41.968410 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-d4c476667-t8fgh" podStartSLOduration=4.968392838 podStartE2EDuration="4.968392838s" podCreationTimestamp="2026-04-02 13:41:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:41:41.967070643 +0000 UTC m=+258.871478206" watchObservedRunningTime="2026-04-02 13:41:41.968392838 +0000 UTC m=+258.872800391" Apr 02 13:41:41 crc kubenswrapper[4732]: I0402 13:41:41.979352 4732 generic.go:334] "Generic (PLEG): container finished" podID="1e3fea23-4311-467e-b808-05ada9bc4e03" containerID="edee49ee77db3a0ec8cee4b5d10b2e0b9ee139199a6a92806ce27f82acfc74b8" exitCode=0 Apr 02 13:41:41 crc kubenswrapper[4732]: I0402 13:41:41.979409 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"1e3fea23-4311-467e-b808-05ada9bc4e03","Type":"ContainerDied","Data":"edee49ee77db3a0ec8cee4b5d10b2e0b9ee139199a6a92806ce27f82acfc74b8"} Apr 02 13:41:41 crc kubenswrapper[4732]: I0402 13:41:41.995334 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7bff847b8d-88b8t" event={"ID":"c039ba90-db23-4d59-8109-2b83616af131","Type":"ContainerStarted","Data":"40d63b200b0f432a2e8204e506158c90f6be2f635a07c7f689d9bcb4b003c038"} Apr 02 13:41:41 crc kubenswrapper[4732]: I0402 13:41:41.995365 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7bff847b8d-88b8t" event={"ID":"c039ba90-db23-4d59-8109-2b83616af131","Type":"ContainerStarted","Data":"340c689f412eee8c2c4338160d0682cbc1f30a4f66e1e6b756d93e699ce234e6"} Apr 02 13:41:41 crc 
kubenswrapper[4732]: I0402 13:41:41.995378 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" Apr 02 13:41:41 crc kubenswrapper[4732]: I0402 13:41:41.995624 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7bff847b8d-88b8t" Apr 02 13:41:42 crc kubenswrapper[4732]: I0402 13:41:42.010500 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7bff847b8d-88b8t" Apr 02 13:41:42 crc kubenswrapper[4732]: I0402 13:41:42.191659 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-d4c476667-t8fgh" Apr 02 13:41:42 crc kubenswrapper[4732]: I0402 13:41:42.227720 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7bff847b8d-88b8t" podStartSLOduration=5.227698003 podStartE2EDuration="5.227698003s" podCreationTimestamp="2026-04-02 13:41:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:41:42.081474918 +0000 UTC m=+258.985882481" watchObservedRunningTime="2026-04-02 13:41:42.227698003 +0000 UTC m=+259.132105566" Apr 02 13:41:42 crc kubenswrapper[4732]: I0402 13:41:42.315997 4732 patch_prober.go:28] interesting pod/router-default-5444994796-6qzc4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Apr 02 13:41:42 crc kubenswrapper[4732]: [-]has-synced failed: reason withheld Apr 02 13:41:42 crc kubenswrapper[4732]: [+]process-running ok Apr 02 13:41:42 crc kubenswrapper[4732]: healthz check failed Apr 02 13:41:42 crc kubenswrapper[4732]: I0402 13:41:42.316051 4732 prober.go:107] 
"Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6qzc4" podUID="d8ff2a93-ff6b-4ef8-9109-de3c22a6f108" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 02 13:41:43 crc kubenswrapper[4732]: I0402 13:41:43.022290 4732 generic.go:334] "Generic (PLEG): container finished" podID="1678e6e2-ca55-46d4-a56a-281280da2ccc" containerID="968c846760d53bf36b006b3ab21bc5259f64a871e95f27fc2efb002be7e15325" exitCode=0 Apr 02 13:41:43 crc kubenswrapper[4732]: I0402 13:41:43.022333 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"1678e6e2-ca55-46d4-a56a-281280da2ccc","Type":"ContainerDied","Data":"968c846760d53bf36b006b3ab21bc5259f64a871e95f27fc2efb002be7e15325"} Apr 02 13:41:43 crc kubenswrapper[4732]: I0402 13:41:43.028757 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-crx2z" event={"ID":"386bd92b-c67e-4cc6-8a47-6f8d6e799bc7","Type":"ContainerStarted","Data":"d7889af58bfd936108076b46b0089bf2b30c2f8907b588ce619d9f78f233adee"} Apr 02 13:41:43 crc kubenswrapper[4732]: I0402 13:41:43.028803 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-crx2z" event={"ID":"386bd92b-c67e-4cc6-8a47-6f8d6e799bc7","Type":"ContainerStarted","Data":"92a0f96a2302e5d6da99991479a2af036e62d895c0a50708015d24683fb5dfd9"} Apr 02 13:41:43 crc kubenswrapper[4732]: I0402 13:41:43.034269 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" Apr 02 13:41:43 crc kubenswrapper[4732]: I0402 13:41:43.075640 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-crx2z" podStartSLOduration=183.075605028 podStartE2EDuration="3m3.075605028s" podCreationTimestamp="2026-04-02 13:38:40 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:41:43.063042566 +0000 UTC m=+259.967450119" watchObservedRunningTime="2026-04-02 13:41:43.075605028 +0000 UTC m=+259.980012581" Apr 02 13:41:43 crc kubenswrapper[4732]: I0402 13:41:43.318848 4732 patch_prober.go:28] interesting pod/router-default-5444994796-6qzc4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Apr 02 13:41:43 crc kubenswrapper[4732]: [-]has-synced failed: reason withheld Apr 02 13:41:43 crc kubenswrapper[4732]: [+]process-running ok Apr 02 13:41:43 crc kubenswrapper[4732]: healthz check failed Apr 02 13:41:43 crc kubenswrapper[4732]: I0402 13:41:43.318916 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6qzc4" podUID="d8ff2a93-ff6b-4ef8-9109-de3c22a6f108" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 02 13:41:44 crc kubenswrapper[4732]: I0402 13:41:44.310703 4732 patch_prober.go:28] interesting pod/router-default-5444994796-6qzc4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Apr 02 13:41:44 crc kubenswrapper[4732]: [-]has-synced failed: reason withheld Apr 02 13:41:44 crc kubenswrapper[4732]: [+]process-running ok Apr 02 13:41:44 crc kubenswrapper[4732]: healthz check failed Apr 02 13:41:44 crc kubenswrapper[4732]: I0402 13:41:44.311080 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6qzc4" podUID="d8ff2a93-ff6b-4ef8-9109-de3c22a6f108" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 02 13:41:44 crc kubenswrapper[4732]: I0402 13:41:44.934350 4732 ???:1] "http: TLS handshake error 
from 192.168.126.11:48822: no serving certificate available for the kubelet" Apr 02 13:41:45 crc kubenswrapper[4732]: I0402 13:41:45.309536 4732 patch_prober.go:28] interesting pod/router-default-5444994796-6qzc4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Apr 02 13:41:45 crc kubenswrapper[4732]: [-]has-synced failed: reason withheld Apr 02 13:41:45 crc kubenswrapper[4732]: [+]process-running ok Apr 02 13:41:45 crc kubenswrapper[4732]: healthz check failed Apr 02 13:41:45 crc kubenswrapper[4732]: I0402 13:41:45.309605 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6qzc4" podUID="d8ff2a93-ff6b-4ef8-9109-de3c22a6f108" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 02 13:41:45 crc kubenswrapper[4732]: I0402 13:41:45.579488 4732 ???:1] "http: TLS handshake error from 192.168.126.11:48828: no serving certificate available for the kubelet" Apr 02 13:41:46 crc kubenswrapper[4732]: I0402 13:41:46.027489 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-68b6f48864-m96db"] Apr 02 13:41:46 crc kubenswrapper[4732]: E0402 13:41:46.027967 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67577cc5-6dee-4465-beee-ea424d976972" containerName="collect-profiles" Apr 02 13:41:46 crc kubenswrapper[4732]: I0402 13:41:46.027986 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="67577cc5-6dee-4465-beee-ea424d976972" containerName="collect-profiles" Apr 02 13:41:46 crc kubenswrapper[4732]: I0402 13:41:46.028085 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="67577cc5-6dee-4465-beee-ea424d976972" containerName="collect-profiles" Apr 02 13:41:46 crc kubenswrapper[4732]: I0402 13:41:46.029416 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-68b6f48864-m96db" Apr 02 13:41:46 crc kubenswrapper[4732]: I0402 13:41:46.036154 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-68b6f48864-m96db"] Apr 02 13:41:46 crc kubenswrapper[4732]: I0402 13:41:46.084373 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdbtg\" (UniqueName: \"kubernetes.io/projected/a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828-kube-api-access-gdbtg\") pod \"console-68b6f48864-m96db\" (UID: \"a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828\") " pod="openshift-console/console-68b6f48864-m96db" Apr 02 13:41:46 crc kubenswrapper[4732]: I0402 13:41:46.084439 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828-oauth-serving-cert\") pod \"console-68b6f48864-m96db\" (UID: \"a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828\") " pod="openshift-console/console-68b6f48864-m96db" Apr 02 13:41:46 crc kubenswrapper[4732]: I0402 13:41:46.084473 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828-service-ca\") pod \"console-68b6f48864-m96db\" (UID: \"a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828\") " pod="openshift-console/console-68b6f48864-m96db" Apr 02 13:41:46 crc kubenswrapper[4732]: I0402 13:41:46.084497 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828-console-config\") pod \"console-68b6f48864-m96db\" (UID: \"a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828\") " pod="openshift-console/console-68b6f48864-m96db" Apr 02 13:41:46 crc kubenswrapper[4732]: I0402 13:41:46.084526 4732 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828-console-oauth-config\") pod \"console-68b6f48864-m96db\" (UID: \"a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828\") " pod="openshift-console/console-68b6f48864-m96db" Apr 02 13:41:46 crc kubenswrapper[4732]: I0402 13:41:46.084544 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828-trusted-ca-bundle\") pod \"console-68b6f48864-m96db\" (UID: \"a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828\") " pod="openshift-console/console-68b6f48864-m96db" Apr 02 13:41:46 crc kubenswrapper[4732]: I0402 13:41:46.084625 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828-console-serving-cert\") pod \"console-68b6f48864-m96db\" (UID: \"a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828\") " pod="openshift-console/console-68b6f48864-m96db" Apr 02 13:41:46 crc kubenswrapper[4732]: I0402 13:41:46.185693 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828-console-serving-cert\") pod \"console-68b6f48864-m96db\" (UID: \"a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828\") " pod="openshift-console/console-68b6f48864-m96db" Apr 02 13:41:46 crc kubenswrapper[4732]: I0402 13:41:46.188047 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdbtg\" (UniqueName: \"kubernetes.io/projected/a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828-kube-api-access-gdbtg\") pod \"console-68b6f48864-m96db\" (UID: \"a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828\") " pod="openshift-console/console-68b6f48864-m96db" Apr 02 13:41:46 crc kubenswrapper[4732]: 
I0402 13:41:46.188077 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828-oauth-serving-cert\") pod \"console-68b6f48864-m96db\" (UID: \"a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828\") " pod="openshift-console/console-68b6f48864-m96db" Apr 02 13:41:46 crc kubenswrapper[4732]: I0402 13:41:46.188102 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828-service-ca\") pod \"console-68b6f48864-m96db\" (UID: \"a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828\") " pod="openshift-console/console-68b6f48864-m96db" Apr 02 13:41:46 crc kubenswrapper[4732]: I0402 13:41:46.188122 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828-console-config\") pod \"console-68b6f48864-m96db\" (UID: \"a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828\") " pod="openshift-console/console-68b6f48864-m96db" Apr 02 13:41:46 crc kubenswrapper[4732]: I0402 13:41:46.188149 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828-console-oauth-config\") pod \"console-68b6f48864-m96db\" (UID: \"a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828\") " pod="openshift-console/console-68b6f48864-m96db" Apr 02 13:41:46 crc kubenswrapper[4732]: I0402 13:41:46.188167 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828-trusted-ca-bundle\") pod \"console-68b6f48864-m96db\" (UID: \"a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828\") " pod="openshift-console/console-68b6f48864-m96db" Apr 02 13:41:46 crc kubenswrapper[4732]: I0402 13:41:46.189177 4732 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828-service-ca\") pod \"console-68b6f48864-m96db\" (UID: \"a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828\") " pod="openshift-console/console-68b6f48864-m96db" Apr 02 13:41:46 crc kubenswrapper[4732]: I0402 13:41:46.189177 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828-console-config\") pod \"console-68b6f48864-m96db\" (UID: \"a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828\") " pod="openshift-console/console-68b6f48864-m96db" Apr 02 13:41:46 crc kubenswrapper[4732]: I0402 13:41:46.189630 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828-trusted-ca-bundle\") pod \"console-68b6f48864-m96db\" (UID: \"a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828\") " pod="openshift-console/console-68b6f48864-m96db" Apr 02 13:41:46 crc kubenswrapper[4732]: I0402 13:41:46.189858 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828-oauth-serving-cert\") pod \"console-68b6f48864-m96db\" (UID: \"a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828\") " pod="openshift-console/console-68b6f48864-m96db" Apr 02 13:41:46 crc kubenswrapper[4732]: I0402 13:41:46.194345 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828-console-oauth-config\") pod \"console-68b6f48864-m96db\" (UID: \"a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828\") " pod="openshift-console/console-68b6f48864-m96db" Apr 02 13:41:46 crc kubenswrapper[4732]: I0402 13:41:46.214223 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-gdbtg\" (UniqueName: \"kubernetes.io/projected/a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828-kube-api-access-gdbtg\") pod \"console-68b6f48864-m96db\" (UID: \"a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828\") " pod="openshift-console/console-68b6f48864-m96db" Apr 02 13:41:46 crc kubenswrapper[4732]: I0402 13:41:46.222307 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828-console-serving-cert\") pod \"console-68b6f48864-m96db\" (UID: \"a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828\") " pod="openshift-console/console-68b6f48864-m96db" Apr 02 13:41:46 crc kubenswrapper[4732]: I0402 13:41:46.314260 4732 patch_prober.go:28] interesting pod/router-default-5444994796-6qzc4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Apr 02 13:41:46 crc kubenswrapper[4732]: [-]has-synced failed: reason withheld Apr 02 13:41:46 crc kubenswrapper[4732]: [+]process-running ok Apr 02 13:41:46 crc kubenswrapper[4732]: healthz check failed Apr 02 13:41:46 crc kubenswrapper[4732]: I0402 13:41:46.314374 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6qzc4" podUID="d8ff2a93-ff6b-4ef8-9109-de3c22a6f108" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 02 13:41:46 crc kubenswrapper[4732]: I0402 13:41:46.330294 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-86zsp"] Apr 02 13:41:46 crc kubenswrapper[4732]: I0402 13:41:46.402487 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-68b6f48864-m96db" Apr 02 13:41:46 crc kubenswrapper[4732]: I0402 13:41:46.995258 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-wbcwp" Apr 02 13:41:47 crc kubenswrapper[4732]: I0402 13:41:47.324139 4732 patch_prober.go:28] interesting pod/router-default-5444994796-6qzc4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Apr 02 13:41:47 crc kubenswrapper[4732]: [-]has-synced failed: reason withheld Apr 02 13:41:47 crc kubenswrapper[4732]: [+]process-running ok Apr 02 13:41:47 crc kubenswrapper[4732]: healthz check failed Apr 02 13:41:47 crc kubenswrapper[4732]: I0402 13:41:47.324214 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6qzc4" podUID="d8ff2a93-ff6b-4ef8-9109-de3c22a6f108" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 02 13:41:48 crc kubenswrapper[4732]: I0402 13:41:48.306846 4732 patch_prober.go:28] interesting pod/router-default-5444994796-6qzc4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Apr 02 13:41:48 crc kubenswrapper[4732]: [-]has-synced failed: reason withheld Apr 02 13:41:48 crc kubenswrapper[4732]: [+]process-running ok Apr 02 13:41:48 crc kubenswrapper[4732]: healthz check failed Apr 02 13:41:48 crc kubenswrapper[4732]: I0402 13:41:48.307153 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6qzc4" podUID="d8ff2a93-ff6b-4ef8-9109-de3c22a6f108" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 02 13:41:48 crc kubenswrapper[4732]: I0402 13:41:48.869365 4732 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-85455b8986-kh7t7"] Apr 02 13:41:48 crc kubenswrapper[4732]: I0402 13:41:48.870054 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-85455b8986-kh7t7" Apr 02 13:41:48 crc kubenswrapper[4732]: I0402 13:41:48.874529 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-85455b8986-kh7t7"] Apr 02 13:41:48 crc kubenswrapper[4732]: I0402 13:41:48.940177 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lxzt\" (UniqueName: \"kubernetes.io/projected/0664f762-59d2-4c95-8e8f-698af0b15611-kube-api-access-8lxzt\") pod \"image-registry-85455b8986-kh7t7\" (UID: \"0664f762-59d2-4c95-8e8f-698af0b15611\") " pod="openshift-image-registry/image-registry-85455b8986-kh7t7" Apr 02 13:41:48 crc kubenswrapper[4732]: I0402 13:41:48.940217 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0664f762-59d2-4c95-8e8f-698af0b15611-bound-sa-token\") pod \"image-registry-85455b8986-kh7t7\" (UID: \"0664f762-59d2-4c95-8e8f-698af0b15611\") " pod="openshift-image-registry/image-registry-85455b8986-kh7t7" Apr 02 13:41:48 crc kubenswrapper[4732]: I0402 13:41:48.940264 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-85455b8986-kh7t7\" (UID: \"0664f762-59d2-4c95-8e8f-698af0b15611\") " pod="openshift-image-registry/image-registry-85455b8986-kh7t7" Apr 02 13:41:48 crc kubenswrapper[4732]: I0402 13:41:48.940283 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/0664f762-59d2-4c95-8e8f-698af0b15611-registry-tls\") pod \"image-registry-85455b8986-kh7t7\" (UID: \"0664f762-59d2-4c95-8e8f-698af0b15611\") " pod="openshift-image-registry/image-registry-85455b8986-kh7t7" Apr 02 13:41:48 crc kubenswrapper[4732]: I0402 13:41:48.940298 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0664f762-59d2-4c95-8e8f-698af0b15611-registry-certificates\") pod \"image-registry-85455b8986-kh7t7\" (UID: \"0664f762-59d2-4c95-8e8f-698af0b15611\") " pod="openshift-image-registry/image-registry-85455b8986-kh7t7" Apr 02 13:41:48 crc kubenswrapper[4732]: I0402 13:41:48.940325 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0664f762-59d2-4c95-8e8f-698af0b15611-ca-trust-extracted\") pod \"image-registry-85455b8986-kh7t7\" (UID: \"0664f762-59d2-4c95-8e8f-698af0b15611\") " pod="openshift-image-registry/image-registry-85455b8986-kh7t7" Apr 02 13:41:48 crc kubenswrapper[4732]: I0402 13:41:48.940342 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0664f762-59d2-4c95-8e8f-698af0b15611-trusted-ca\") pod \"image-registry-85455b8986-kh7t7\" (UID: \"0664f762-59d2-4c95-8e8f-698af0b15611\") " pod="openshift-image-registry/image-registry-85455b8986-kh7t7" Apr 02 13:41:48 crc kubenswrapper[4732]: I0402 13:41:48.940381 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0664f762-59d2-4c95-8e8f-698af0b15611-installation-pull-secrets\") pod \"image-registry-85455b8986-kh7t7\" (UID: \"0664f762-59d2-4c95-8e8f-698af0b15611\") " pod="openshift-image-registry/image-registry-85455b8986-kh7t7" Apr 02 13:41:48 
crc kubenswrapper[4732]: I0402 13:41:48.964643 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-85455b8986-kh7t7\" (UID: \"0664f762-59d2-4c95-8e8f-698af0b15611\") " pod="openshift-image-registry/image-registry-85455b8986-kh7t7" Apr 02 13:41:49 crc kubenswrapper[4732]: I0402 13:41:49.041035 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0664f762-59d2-4c95-8e8f-698af0b15611-installation-pull-secrets\") pod \"image-registry-85455b8986-kh7t7\" (UID: \"0664f762-59d2-4c95-8e8f-698af0b15611\") " pod="openshift-image-registry/image-registry-85455b8986-kh7t7" Apr 02 13:41:49 crc kubenswrapper[4732]: I0402 13:41:49.041092 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lxzt\" (UniqueName: \"kubernetes.io/projected/0664f762-59d2-4c95-8e8f-698af0b15611-kube-api-access-8lxzt\") pod \"image-registry-85455b8986-kh7t7\" (UID: \"0664f762-59d2-4c95-8e8f-698af0b15611\") " pod="openshift-image-registry/image-registry-85455b8986-kh7t7" Apr 02 13:41:49 crc kubenswrapper[4732]: I0402 13:41:49.041110 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0664f762-59d2-4c95-8e8f-698af0b15611-bound-sa-token\") pod \"image-registry-85455b8986-kh7t7\" (UID: \"0664f762-59d2-4c95-8e8f-698af0b15611\") " pod="openshift-image-registry/image-registry-85455b8986-kh7t7" Apr 02 13:41:49 crc kubenswrapper[4732]: I0402 13:41:49.041146 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0664f762-59d2-4c95-8e8f-698af0b15611-registry-tls\") pod \"image-registry-85455b8986-kh7t7\" (UID: 
\"0664f762-59d2-4c95-8e8f-698af0b15611\") " pod="openshift-image-registry/image-registry-85455b8986-kh7t7" Apr 02 13:41:49 crc kubenswrapper[4732]: I0402 13:41:49.041161 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0664f762-59d2-4c95-8e8f-698af0b15611-registry-certificates\") pod \"image-registry-85455b8986-kh7t7\" (UID: \"0664f762-59d2-4c95-8e8f-698af0b15611\") " pod="openshift-image-registry/image-registry-85455b8986-kh7t7" Apr 02 13:41:49 crc kubenswrapper[4732]: I0402 13:41:49.041183 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0664f762-59d2-4c95-8e8f-698af0b15611-ca-trust-extracted\") pod \"image-registry-85455b8986-kh7t7\" (UID: \"0664f762-59d2-4c95-8e8f-698af0b15611\") " pod="openshift-image-registry/image-registry-85455b8986-kh7t7" Apr 02 13:41:49 crc kubenswrapper[4732]: I0402 13:41:49.041200 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0664f762-59d2-4c95-8e8f-698af0b15611-trusted-ca\") pod \"image-registry-85455b8986-kh7t7\" (UID: \"0664f762-59d2-4c95-8e8f-698af0b15611\") " pod="openshift-image-registry/image-registry-85455b8986-kh7t7" Apr 02 13:41:49 crc kubenswrapper[4732]: I0402 13:41:49.042328 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0664f762-59d2-4c95-8e8f-698af0b15611-trusted-ca\") pod \"image-registry-85455b8986-kh7t7\" (UID: \"0664f762-59d2-4c95-8e8f-698af0b15611\") " pod="openshift-image-registry/image-registry-85455b8986-kh7t7" Apr 02 13:41:49 crc kubenswrapper[4732]: I0402 13:41:49.043067 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/0664f762-59d2-4c95-8e8f-698af0b15611-ca-trust-extracted\") pod \"image-registry-85455b8986-kh7t7\" (UID: \"0664f762-59d2-4c95-8e8f-698af0b15611\") " pod="openshift-image-registry/image-registry-85455b8986-kh7t7" Apr 02 13:41:49 crc kubenswrapper[4732]: I0402 13:41:49.044016 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0664f762-59d2-4c95-8e8f-698af0b15611-registry-certificates\") pod \"image-registry-85455b8986-kh7t7\" (UID: \"0664f762-59d2-4c95-8e8f-698af0b15611\") " pod="openshift-image-registry/image-registry-85455b8986-kh7t7" Apr 02 13:41:49 crc kubenswrapper[4732]: I0402 13:41:49.062065 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lxzt\" (UniqueName: \"kubernetes.io/projected/0664f762-59d2-4c95-8e8f-698af0b15611-kube-api-access-8lxzt\") pod \"image-registry-85455b8986-kh7t7\" (UID: \"0664f762-59d2-4c95-8e8f-698af0b15611\") " pod="openshift-image-registry/image-registry-85455b8986-kh7t7" Apr 02 13:41:49 crc kubenswrapper[4732]: I0402 13:41:49.063242 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0664f762-59d2-4c95-8e8f-698af0b15611-installation-pull-secrets\") pod \"image-registry-85455b8986-kh7t7\" (UID: \"0664f762-59d2-4c95-8e8f-698af0b15611\") " pod="openshift-image-registry/image-registry-85455b8986-kh7t7" Apr 02 13:41:49 crc kubenswrapper[4732]: I0402 13:41:49.064530 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0664f762-59d2-4c95-8e8f-698af0b15611-registry-tls\") pod \"image-registry-85455b8986-kh7t7\" (UID: \"0664f762-59d2-4c95-8e8f-698af0b15611\") " pod="openshift-image-registry/image-registry-85455b8986-kh7t7" Apr 02 13:41:49 crc kubenswrapper[4732]: I0402 13:41:49.082668 4732 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0664f762-59d2-4c95-8e8f-698af0b15611-bound-sa-token\") pod \"image-registry-85455b8986-kh7t7\" (UID: \"0664f762-59d2-4c95-8e8f-698af0b15611\") " pod="openshift-image-registry/image-registry-85455b8986-kh7t7" Apr 02 13:41:49 crc kubenswrapper[4732]: I0402 13:41:49.193691 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-85455b8986-kh7t7" Apr 02 13:41:49 crc kubenswrapper[4732]: I0402 13:41:49.308662 4732 patch_prober.go:28] interesting pod/router-default-5444994796-6qzc4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Apr 02 13:41:49 crc kubenswrapper[4732]: [-]has-synced failed: reason withheld Apr 02 13:41:49 crc kubenswrapper[4732]: [+]process-running ok Apr 02 13:41:49 crc kubenswrapper[4732]: healthz check failed Apr 02 13:41:49 crc kubenswrapper[4732]: I0402 13:41:49.308723 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6qzc4" podUID="d8ff2a93-ff6b-4ef8-9109-de3c22a6f108" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 02 13:41:50 crc kubenswrapper[4732]: I0402 13:41:50.307833 4732 patch_prober.go:28] interesting pod/router-default-5444994796-6qzc4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Apr 02 13:41:50 crc kubenswrapper[4732]: [-]has-synced failed: reason withheld Apr 02 13:41:50 crc kubenswrapper[4732]: [+]process-running ok Apr 02 13:41:50 crc kubenswrapper[4732]: healthz check failed Apr 02 13:41:50 crc kubenswrapper[4732]: I0402 13:41:50.307915 4732 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-6qzc4" podUID="d8ff2a93-ff6b-4ef8-9109-de3c22a6f108" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 02 13:41:51 crc kubenswrapper[4732]: I0402 13:41:51.033554 4732 patch_prober.go:28] interesting pod/downloads-7954f5f757-zmhn7 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.39:8080/\": dial tcp 10.217.0.39:8080: connect: connection refused" start-of-body= Apr 02 13:41:51 crc kubenswrapper[4732]: I0402 13:41:51.033655 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zmhn7" podUID="4e9d3578-0893-4852-80b6-999e5a7ccdc5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.39:8080/\": dial tcp 10.217.0.39:8080: connect: connection refused" Apr 02 13:41:51 crc kubenswrapper[4732]: I0402 13:41:51.033666 4732 patch_prober.go:28] interesting pod/downloads-7954f5f757-zmhn7 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.39:8080/\": dial tcp 10.217.0.39:8080: connect: connection refused" start-of-body= Apr 02 13:41:51 crc kubenswrapper[4732]: I0402 13:41:51.033753 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-zmhn7" podUID="4e9d3578-0893-4852-80b6-999e5a7ccdc5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.39:8080/\": dial tcp 10.217.0.39:8080: connect: connection refused" Apr 02 13:41:51 crc kubenswrapper[4732]: I0402 13:41:51.281108 4732 patch_prober.go:28] interesting pod/console-f9d7485db-dq9x9 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.29:8443/health\": dial tcp 10.217.0.29:8443: connect: connection refused" start-of-body= Apr 02 13:41:51 crc kubenswrapper[4732]: I0402 13:41:51.281159 4732 prober.go:107] 
"Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-dq9x9" podUID="4d77c191-7d04-4381-838f-b7a355e7c2d4" containerName="console" probeResult="failure" output="Get \"https://10.217.0.29:8443/health\": dial tcp 10.217.0.29:8443: connect: connection refused" Apr 02 13:41:51 crc kubenswrapper[4732]: I0402 13:41:51.308123 4732 patch_prober.go:28] interesting pod/router-default-5444994796-6qzc4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Apr 02 13:41:51 crc kubenswrapper[4732]: [-]has-synced failed: reason withheld Apr 02 13:41:51 crc kubenswrapper[4732]: [+]process-running ok Apr 02 13:41:51 crc kubenswrapper[4732]: healthz check failed Apr 02 13:41:51 crc kubenswrapper[4732]: I0402 13:41:51.308182 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6qzc4" podUID="d8ff2a93-ff6b-4ef8-9109-de3c22a6f108" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 02 13:41:52 crc kubenswrapper[4732]: I0402 13:41:52.309309 4732 patch_prober.go:28] interesting pod/router-default-5444994796-6qzc4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Apr 02 13:41:52 crc kubenswrapper[4732]: [-]has-synced failed: reason withheld Apr 02 13:41:52 crc kubenswrapper[4732]: [+]process-running ok Apr 02 13:41:52 crc kubenswrapper[4732]: healthz check failed Apr 02 13:41:52 crc kubenswrapper[4732]: I0402 13:41:52.309778 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6qzc4" podUID="d8ff2a93-ff6b-4ef8-9109-de3c22a6f108" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 02 13:41:53 crc kubenswrapper[4732]: I0402 
13:41:53.307957 4732 patch_prober.go:28] interesting pod/router-default-5444994796-6qzc4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Apr 02 13:41:53 crc kubenswrapper[4732]: [-]has-synced failed: reason withheld Apr 02 13:41:53 crc kubenswrapper[4732]: [+]process-running ok Apr 02 13:41:53 crc kubenswrapper[4732]: healthz check failed Apr 02 13:41:53 crc kubenswrapper[4732]: I0402 13:41:53.308018 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6qzc4" podUID="d8ff2a93-ff6b-4ef8-9109-de3c22a6f108" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 02 13:41:54 crc kubenswrapper[4732]: I0402 13:41:54.283236 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Apr 02 13:41:54 crc kubenswrapper[4732]: I0402 13:41:54.311052 4732 patch_prober.go:28] interesting pod/router-default-5444994796-6qzc4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Apr 02 13:41:54 crc kubenswrapper[4732]: [+]has-synced ok Apr 02 13:41:54 crc kubenswrapper[4732]: [+]process-running ok Apr 02 13:41:54 crc kubenswrapper[4732]: healthz check failed Apr 02 13:41:54 crc kubenswrapper[4732]: I0402 13:41:54.311113 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6qzc4" podUID="d8ff2a93-ff6b-4ef8-9109-de3c22a6f108" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 02 13:41:54 crc kubenswrapper[4732]: I0402 13:41:54.330519 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/1678e6e2-ca55-46d4-a56a-281280da2ccc-kube-api-access\") pod \"1678e6e2-ca55-46d4-a56a-281280da2ccc\" (UID: \"1678e6e2-ca55-46d4-a56a-281280da2ccc\") " Apr 02 13:41:54 crc kubenswrapper[4732]: I0402 13:41:54.331825 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1678e6e2-ca55-46d4-a56a-281280da2ccc-kubelet-dir\") pod \"1678e6e2-ca55-46d4-a56a-281280da2ccc\" (UID: \"1678e6e2-ca55-46d4-a56a-281280da2ccc\") " Apr 02 13:41:54 crc kubenswrapper[4732]: I0402 13:41:54.331980 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1678e6e2-ca55-46d4-a56a-281280da2ccc-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "1678e6e2-ca55-46d4-a56a-281280da2ccc" (UID: "1678e6e2-ca55-46d4-a56a-281280da2ccc"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 13:41:54 crc kubenswrapper[4732]: I0402 13:41:54.332407 4732 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1678e6e2-ca55-46d4-a56a-281280da2ccc-kubelet-dir\") on node \"crc\" DevicePath \"\"" Apr 02 13:41:54 crc kubenswrapper[4732]: I0402 13:41:54.336805 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1678e6e2-ca55-46d4-a56a-281280da2ccc-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1678e6e2-ca55-46d4-a56a-281280da2ccc" (UID: "1678e6e2-ca55-46d4-a56a-281280da2ccc"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:41:54 crc kubenswrapper[4732]: I0402 13:41:54.434145 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1678e6e2-ca55-46d4-a56a-281280da2ccc-kube-api-access\") on node \"crc\" DevicePath \"\"" Apr 02 13:41:54 crc kubenswrapper[4732]: I0402 13:41:54.708425 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zhxz4"] Apr 02 13:41:54 crc kubenswrapper[4732]: I0402 13:41:54.714066 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" Apr 02 13:41:54 crc kubenswrapper[4732]: I0402 13:41:54.725799 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-d9c747dbb-jqfln"] Apr 02 13:41:54 crc kubenswrapper[4732]: E0402 13:41:54.726063 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1678e6e2-ca55-46d4-a56a-281280da2ccc" containerName="pruner" Apr 02 13:41:54 crc kubenswrapper[4732]: I0402 13:41:54.726079 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="1678e6e2-ca55-46d4-a56a-281280da2ccc" containerName="pruner" Apr 02 13:41:54 crc kubenswrapper[4732]: I0402 13:41:54.726254 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="1678e6e2-ca55-46d4-a56a-281280da2ccc" containerName="pruner" Apr 02 13:41:54 crc kubenswrapper[4732]: I0402 13:41:54.726755 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-d9c747dbb-jqfln" Apr 02 13:41:54 crc kubenswrapper[4732]: I0402 13:41:54.732755 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-d9c747dbb-jqfln"] Apr 02 13:41:54 crc kubenswrapper[4732]: I0402 13:41:54.740184 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-564q9\" (UniqueName: \"kubernetes.io/projected/57f69fbf-b90f-411e-8e0c-70a25bb01566-kube-api-access-564q9\") pod \"image-registry-d9c747dbb-jqfln\" (UID: \"57f69fbf-b90f-411e-8e0c-70a25bb01566\") " pod="openshift-image-registry/image-registry-d9c747dbb-jqfln" Apr 02 13:41:54 crc kubenswrapper[4732]: I0402 13:41:54.740254 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/57f69fbf-b90f-411e-8e0c-70a25bb01566-ca-trust-extracted\") pod \"image-registry-d9c747dbb-jqfln\" (UID: \"57f69fbf-b90f-411e-8e0c-70a25bb01566\") " pod="openshift-image-registry/image-registry-d9c747dbb-jqfln" Apr 02 13:41:54 crc kubenswrapper[4732]: I0402 13:41:54.740277 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/57f69fbf-b90f-411e-8e0c-70a25bb01566-registry-tls\") pod \"image-registry-d9c747dbb-jqfln\" (UID: \"57f69fbf-b90f-411e-8e0c-70a25bb01566\") " pod="openshift-image-registry/image-registry-d9c747dbb-jqfln" Apr 02 13:41:54 crc kubenswrapper[4732]: I0402 13:41:54.740314 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-d9c747dbb-jqfln\" (UID: \"57f69fbf-b90f-411e-8e0c-70a25bb01566\") " 
pod="openshift-image-registry/image-registry-d9c747dbb-jqfln" Apr 02 13:41:54 crc kubenswrapper[4732]: I0402 13:41:54.740434 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/57f69fbf-b90f-411e-8e0c-70a25bb01566-bound-sa-token\") pod \"image-registry-d9c747dbb-jqfln\" (UID: \"57f69fbf-b90f-411e-8e0c-70a25bb01566\") " pod="openshift-image-registry/image-registry-d9c747dbb-jqfln" Apr 02 13:41:54 crc kubenswrapper[4732]: I0402 13:41:54.740457 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/57f69fbf-b90f-411e-8e0c-70a25bb01566-installation-pull-secrets\") pod \"image-registry-d9c747dbb-jqfln\" (UID: \"57f69fbf-b90f-411e-8e0c-70a25bb01566\") " pod="openshift-image-registry/image-registry-d9c747dbb-jqfln" Apr 02 13:41:54 crc kubenswrapper[4732]: I0402 13:41:54.740479 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/57f69fbf-b90f-411e-8e0c-70a25bb01566-registry-certificates\") pod \"image-registry-d9c747dbb-jqfln\" (UID: \"57f69fbf-b90f-411e-8e0c-70a25bb01566\") " pod="openshift-image-registry/image-registry-d9c747dbb-jqfln" Apr 02 13:41:54 crc kubenswrapper[4732]: I0402 13:41:54.740497 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/57f69fbf-b90f-411e-8e0c-70a25bb01566-trusted-ca\") pod \"image-registry-d9c747dbb-jqfln\" (UID: \"57f69fbf-b90f-411e-8e0c-70a25bb01566\") " pod="openshift-image-registry/image-registry-d9c747dbb-jqfln" Apr 02 13:41:54 crc kubenswrapper[4732]: I0402 13:41:54.775941 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-d9c747dbb-jqfln\" (UID: \"57f69fbf-b90f-411e-8e0c-70a25bb01566\") " pod="openshift-image-registry/image-registry-d9c747dbb-jqfln" Apr 02 13:41:54 crc kubenswrapper[4732]: I0402 13:41:54.842222 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-564q9\" (UniqueName: \"kubernetes.io/projected/57f69fbf-b90f-411e-8e0c-70a25bb01566-kube-api-access-564q9\") pod \"image-registry-d9c747dbb-jqfln\" (UID: \"57f69fbf-b90f-411e-8e0c-70a25bb01566\") " pod="openshift-image-registry/image-registry-d9c747dbb-jqfln" Apr 02 13:41:54 crc kubenswrapper[4732]: I0402 13:41:54.842512 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/57f69fbf-b90f-411e-8e0c-70a25bb01566-ca-trust-extracted\") pod \"image-registry-d9c747dbb-jqfln\" (UID: \"57f69fbf-b90f-411e-8e0c-70a25bb01566\") " pod="openshift-image-registry/image-registry-d9c747dbb-jqfln" Apr 02 13:41:54 crc kubenswrapper[4732]: I0402 13:41:54.842544 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/57f69fbf-b90f-411e-8e0c-70a25bb01566-registry-tls\") pod \"image-registry-d9c747dbb-jqfln\" (UID: \"57f69fbf-b90f-411e-8e0c-70a25bb01566\") " pod="openshift-image-registry/image-registry-d9c747dbb-jqfln" Apr 02 13:41:54 crc kubenswrapper[4732]: I0402 13:41:54.842593 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/57f69fbf-b90f-411e-8e0c-70a25bb01566-bound-sa-token\") pod \"image-registry-d9c747dbb-jqfln\" (UID: \"57f69fbf-b90f-411e-8e0c-70a25bb01566\") " pod="openshift-image-registry/image-registry-d9c747dbb-jqfln" Apr 02 13:41:54 crc kubenswrapper[4732]: I0402 13:41:54.842629 4732 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/57f69fbf-b90f-411e-8e0c-70a25bb01566-installation-pull-secrets\") pod \"image-registry-d9c747dbb-jqfln\" (UID: \"57f69fbf-b90f-411e-8e0c-70a25bb01566\") " pod="openshift-image-registry/image-registry-d9c747dbb-jqfln" Apr 02 13:41:54 crc kubenswrapper[4732]: I0402 13:41:54.842655 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/57f69fbf-b90f-411e-8e0c-70a25bb01566-registry-certificates\") pod \"image-registry-d9c747dbb-jqfln\" (UID: \"57f69fbf-b90f-411e-8e0c-70a25bb01566\") " pod="openshift-image-registry/image-registry-d9c747dbb-jqfln" Apr 02 13:41:54 crc kubenswrapper[4732]: I0402 13:41:54.842674 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/57f69fbf-b90f-411e-8e0c-70a25bb01566-trusted-ca\") pod \"image-registry-d9c747dbb-jqfln\" (UID: \"57f69fbf-b90f-411e-8e0c-70a25bb01566\") " pod="openshift-image-registry/image-registry-d9c747dbb-jqfln" Apr 02 13:41:54 crc kubenswrapper[4732]: I0402 13:41:54.959088 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/57f69fbf-b90f-411e-8e0c-70a25bb01566-ca-trust-extracted\") pod \"image-registry-d9c747dbb-jqfln\" (UID: \"57f69fbf-b90f-411e-8e0c-70a25bb01566\") " pod="openshift-image-registry/image-registry-d9c747dbb-jqfln" Apr 02 13:41:54 crc kubenswrapper[4732]: I0402 13:41:54.959750 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/57f69fbf-b90f-411e-8e0c-70a25bb01566-registry-tls\") pod \"image-registry-d9c747dbb-jqfln\" (UID: \"57f69fbf-b90f-411e-8e0c-70a25bb01566\") " pod="openshift-image-registry/image-registry-d9c747dbb-jqfln" Apr 02 13:41:54 crc kubenswrapper[4732]: 
I0402 13:41:54.959916 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/57f69fbf-b90f-411e-8e0c-70a25bb01566-registry-certificates\") pod \"image-registry-d9c747dbb-jqfln\" (UID: \"57f69fbf-b90f-411e-8e0c-70a25bb01566\") " pod="openshift-image-registry/image-registry-d9c747dbb-jqfln" Apr 02 13:41:54 crc kubenswrapper[4732]: I0402 13:41:54.959956 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/57f69fbf-b90f-411e-8e0c-70a25bb01566-trusted-ca\") pod \"image-registry-d9c747dbb-jqfln\" (UID: \"57f69fbf-b90f-411e-8e0c-70a25bb01566\") " pod="openshift-image-registry/image-registry-d9c747dbb-jqfln" Apr 02 13:41:54 crc kubenswrapper[4732]: I0402 13:41:54.959941 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/57f69fbf-b90f-411e-8e0c-70a25bb01566-bound-sa-token\") pod \"image-registry-d9c747dbb-jqfln\" (UID: \"57f69fbf-b90f-411e-8e0c-70a25bb01566\") " pod="openshift-image-registry/image-registry-d9c747dbb-jqfln" Apr 02 13:41:54 crc kubenswrapper[4732]: I0402 13:41:54.959794 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-564q9\" (UniqueName: \"kubernetes.io/projected/57f69fbf-b90f-411e-8e0c-70a25bb01566-kube-api-access-564q9\") pod \"image-registry-d9c747dbb-jqfln\" (UID: \"57f69fbf-b90f-411e-8e0c-70a25bb01566\") " pod="openshift-image-registry/image-registry-d9c747dbb-jqfln" Apr 02 13:41:54 crc kubenswrapper[4732]: I0402 13:41:54.965859 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/57f69fbf-b90f-411e-8e0c-70a25bb01566-installation-pull-secrets\") pod \"image-registry-d9c747dbb-jqfln\" (UID: \"57f69fbf-b90f-411e-8e0c-70a25bb01566\") " pod="openshift-image-registry/image-registry-d9c747dbb-jqfln" 
Apr 02 13:41:55 crc kubenswrapper[4732]: I0402 13:41:55.221714 4732 ???:1] "http: TLS handshake error from 192.168.126.11:52120: no serving certificate available for the kubelet" Apr 02 13:41:55 crc kubenswrapper[4732]: I0402 13:41:55.239965 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"1678e6e2-ca55-46d4-a56a-281280da2ccc","Type":"ContainerDied","Data":"f980b897120d7695e17f23215205320caed59545049eb719a51ae8cfe4d2f55e"} Apr 02 13:41:55 crc kubenswrapper[4732]: I0402 13:41:55.240005 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f980b897120d7695e17f23215205320caed59545049eb719a51ae8cfe4d2f55e" Apr 02 13:41:55 crc kubenswrapper[4732]: I0402 13:41:55.240046 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Apr 02 13:41:55 crc kubenswrapper[4732]: I0402 13:41:55.258542 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-d9c747dbb-jqfln" Apr 02 13:41:55 crc kubenswrapper[4732]: I0402 13:41:55.310538 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-6qzc4" Apr 02 13:41:55 crc kubenswrapper[4732]: I0402 13:41:55.313976 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-6qzc4" Apr 02 13:41:56 crc kubenswrapper[4732]: I0402 13:41:56.894749 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7bff847b8d-88b8t"] Apr 02 13:41:56 crc kubenswrapper[4732]: I0402 13:41:56.895231 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7bff847b8d-88b8t" podUID="c039ba90-db23-4d59-8109-2b83616af131" containerName="controller-manager" containerID="cri-o://40d63b200b0f432a2e8204e506158c90f6be2f635a07c7f689d9bcb4b003c038" gracePeriod=30 Apr 02 13:41:56 crc kubenswrapper[4732]: I0402 13:41:56.909046 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d4c476667-t8fgh"] Apr 02 13:41:56 crc kubenswrapper[4732]: I0402 13:41:56.909368 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-d4c476667-t8fgh" podUID="dcf8da11-8bbb-4de3-bbe5-69869bf4829c" containerName="route-controller-manager" containerID="cri-o://574ebdb08baf59a41093e85e00cad943cfebcc0093f4627d5e3b85709f8f98e0" gracePeriod=30 Apr 02 13:41:58 crc kubenswrapper[4732]: I0402 13:41:58.257067 4732 generic.go:334] "Generic (PLEG): container finished" podID="c039ba90-db23-4d59-8109-2b83616af131" containerID="40d63b200b0f432a2e8204e506158c90f6be2f635a07c7f689d9bcb4b003c038" exitCode=0 Apr 02 13:41:58 crc kubenswrapper[4732]: I0402 13:41:58.257123 4732 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7bff847b8d-88b8t" event={"ID":"c039ba90-db23-4d59-8109-2b83616af131","Type":"ContainerDied","Data":"40d63b200b0f432a2e8204e506158c90f6be2f635a07c7f689d9bcb4b003c038"} Apr 02 13:41:59 crc kubenswrapper[4732]: I0402 13:41:59.267714 4732 generic.go:334] "Generic (PLEG): container finished" podID="dcf8da11-8bbb-4de3-bbe5-69869bf4829c" containerID="574ebdb08baf59a41093e85e00cad943cfebcc0093f4627d5e3b85709f8f98e0" exitCode=0 Apr 02 13:41:59 crc kubenswrapper[4732]: I0402 13:41:59.267756 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-d4c476667-t8fgh" event={"ID":"dcf8da11-8bbb-4de3-bbe5-69869bf4829c","Type":"ContainerDied","Data":"574ebdb08baf59a41093e85e00cad943cfebcc0093f4627d5e3b85709f8f98e0"} Apr 02 13:41:59 crc kubenswrapper[4732]: I0402 13:41:59.857549 4732 patch_prober.go:28] interesting pod/controller-manager-7bff847b8d-88b8t container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: connect: connection refused" start-of-body= Apr 02 13:41:59 crc kubenswrapper[4732]: I0402 13:41:59.857720 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-7bff847b8d-88b8t" podUID="c039ba90-db23-4d59-8109-2b83616af131" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: connect: connection refused" Apr 02 13:41:59 crc kubenswrapper[4732]: I0402 13:41:59.865805 4732 patch_prober.go:28] interesting pod/route-controller-manager-d4c476667-t8fgh container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.53:8443/healthz\": dial tcp 10.217.0.53:8443: connect: connection refused" 
start-of-body= Apr 02 13:41:59 crc kubenswrapper[4732]: I0402 13:41:59.865874 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-d4c476667-t8fgh" podUID="dcf8da11-8bbb-4de3-bbe5-69869bf4829c" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.53:8443/healthz\": dial tcp 10.217.0.53:8443: connect: connection refused" Apr 02 13:41:59 crc kubenswrapper[4732]: I0402 13:41:59.891309 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Apr 02 13:42:00 crc kubenswrapper[4732]: I0402 13:42:00.010058 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1e3fea23-4311-467e-b808-05ada9bc4e03-kube-api-access\") pod \"1e3fea23-4311-467e-b808-05ada9bc4e03\" (UID: \"1e3fea23-4311-467e-b808-05ada9bc4e03\") " Apr 02 13:42:00 crc kubenswrapper[4732]: I0402 13:42:00.010275 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1e3fea23-4311-467e-b808-05ada9bc4e03-kubelet-dir\") pod \"1e3fea23-4311-467e-b808-05ada9bc4e03\" (UID: \"1e3fea23-4311-467e-b808-05ada9bc4e03\") " Apr 02 13:42:00 crc kubenswrapper[4732]: I0402 13:42:00.010753 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1e3fea23-4311-467e-b808-05ada9bc4e03-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "1e3fea23-4311-467e-b808-05ada9bc4e03" (UID: "1e3fea23-4311-467e-b808-05ada9bc4e03"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 13:42:00 crc kubenswrapper[4732]: I0402 13:42:00.015873 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e3fea23-4311-467e-b808-05ada9bc4e03-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1e3fea23-4311-467e-b808-05ada9bc4e03" (UID: "1e3fea23-4311-467e-b808-05ada9bc4e03"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:42:00 crc kubenswrapper[4732]: I0402 13:42:00.112176 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1e3fea23-4311-467e-b808-05ada9bc4e03-kube-api-access\") on node \"crc\" DevicePath \"\"" Apr 02 13:42:00 crc kubenswrapper[4732]: I0402 13:42:00.112230 4732 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1e3fea23-4311-467e-b808-05ada9bc4e03-kubelet-dir\") on node \"crc\" DevicePath \"\"" Apr 02 13:42:00 crc kubenswrapper[4732]: I0402 13:42:00.147108 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29585622-lvtgw"] Apr 02 13:42:00 crc kubenswrapper[4732]: E0402 13:42:00.149829 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e3fea23-4311-467e-b808-05ada9bc4e03" containerName="pruner" Apr 02 13:42:00 crc kubenswrapper[4732]: I0402 13:42:00.149985 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e3fea23-4311-467e-b808-05ada9bc4e03" containerName="pruner" Apr 02 13:42:00 crc kubenswrapper[4732]: I0402 13:42:00.150267 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e3fea23-4311-467e-b808-05ada9bc4e03" containerName="pruner" Apr 02 13:42:00 crc kubenswrapper[4732]: I0402 13:42:00.150952 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29585622-lvtgw" Apr 02 13:42:00 crc kubenswrapper[4732]: I0402 13:42:00.153521 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-69g42" Apr 02 13:42:00 crc kubenswrapper[4732]: I0402 13:42:00.155729 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-10-crc"] Apr 02 13:42:00 crc kubenswrapper[4732]: I0402 13:42:00.156406 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-10-crc" Apr 02 13:42:00 crc kubenswrapper[4732]: I0402 13:42:00.158310 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Apr 02 13:42:00 crc kubenswrapper[4732]: I0402 13:42:00.158691 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Apr 02 13:42:00 crc kubenswrapper[4732]: I0402 13:42:00.158947 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585622-lvtgw"] Apr 02 13:42:00 crc kubenswrapper[4732]: I0402 13:42:00.176819 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-10-crc"] Apr 02 13:42:00 crc kubenswrapper[4732]: I0402 13:42:00.273594 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"1e3fea23-4311-467e-b808-05ada9bc4e03","Type":"ContainerDied","Data":"16e0763307c769d8e44e6666f30f5c22dc3418c0bfcb0973d050503557c1fbf6"} Apr 02 13:42:00 crc kubenswrapper[4732]: I0402 13:42:00.273680 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16e0763307c769d8e44e6666f30f5c22dc3418c0bfcb0973d050503557c1fbf6" Apr 02 13:42:00 crc kubenswrapper[4732]: I0402 13:42:00.273656 4732 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Apr 02 13:42:00 crc kubenswrapper[4732]: I0402 13:42:00.314324 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/34ac97a0-6ef7-4d42-84e9-8926e28a822d-kubelet-dir\") pod \"revision-pruner-10-crc\" (UID: \"34ac97a0-6ef7-4d42-84e9-8926e28a822d\") " pod="openshift-kube-controller-manager/revision-pruner-10-crc" Apr 02 13:42:00 crc kubenswrapper[4732]: I0402 13:42:00.314417 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brhqn\" (UniqueName: \"kubernetes.io/projected/65013539-f3b4-4513-881a-14408a922424-kube-api-access-brhqn\") pod \"auto-csr-approver-29585622-lvtgw\" (UID: \"65013539-f3b4-4513-881a-14408a922424\") " pod="openshift-infra/auto-csr-approver-29585622-lvtgw" Apr 02 13:42:00 crc kubenswrapper[4732]: I0402 13:42:00.314631 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/34ac97a0-6ef7-4d42-84e9-8926e28a822d-kube-api-access\") pod \"revision-pruner-10-crc\" (UID: \"34ac97a0-6ef7-4d42-84e9-8926e28a822d\") " pod="openshift-kube-controller-manager/revision-pruner-10-crc" Apr 02 13:42:00 crc kubenswrapper[4732]: I0402 13:42:00.415739 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brhqn\" (UniqueName: \"kubernetes.io/projected/65013539-f3b4-4513-881a-14408a922424-kube-api-access-brhqn\") pod \"auto-csr-approver-29585622-lvtgw\" (UID: \"65013539-f3b4-4513-881a-14408a922424\") " pod="openshift-infra/auto-csr-approver-29585622-lvtgw" Apr 02 13:42:00 crc kubenswrapper[4732]: I0402 13:42:00.415815 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/34ac97a0-6ef7-4d42-84e9-8926e28a822d-kube-api-access\") pod \"revision-pruner-10-crc\" (UID: \"34ac97a0-6ef7-4d42-84e9-8926e28a822d\") " pod="openshift-kube-controller-manager/revision-pruner-10-crc" Apr 02 13:42:00 crc kubenswrapper[4732]: I0402 13:42:00.415861 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/34ac97a0-6ef7-4d42-84e9-8926e28a822d-kubelet-dir\") pod \"revision-pruner-10-crc\" (UID: \"34ac97a0-6ef7-4d42-84e9-8926e28a822d\") " pod="openshift-kube-controller-manager/revision-pruner-10-crc" Apr 02 13:42:00 crc kubenswrapper[4732]: I0402 13:42:00.415935 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/34ac97a0-6ef7-4d42-84e9-8926e28a822d-kubelet-dir\") pod \"revision-pruner-10-crc\" (UID: \"34ac97a0-6ef7-4d42-84e9-8926e28a822d\") " pod="openshift-kube-controller-manager/revision-pruner-10-crc" Apr 02 13:42:00 crc kubenswrapper[4732]: I0402 13:42:00.434465 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brhqn\" (UniqueName: \"kubernetes.io/projected/65013539-f3b4-4513-881a-14408a922424-kube-api-access-brhqn\") pod \"auto-csr-approver-29585622-lvtgw\" (UID: \"65013539-f3b4-4513-881a-14408a922424\") " pod="openshift-infra/auto-csr-approver-29585622-lvtgw" Apr 02 13:42:00 crc kubenswrapper[4732]: I0402 13:42:00.440272 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/34ac97a0-6ef7-4d42-84e9-8926e28a822d-kube-api-access\") pod \"revision-pruner-10-crc\" (UID: \"34ac97a0-6ef7-4d42-84e9-8926e28a822d\") " pod="openshift-kube-controller-manager/revision-pruner-10-crc" Apr 02 13:42:00 crc kubenswrapper[4732]: I0402 13:42:00.478346 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29585622-lvtgw" Apr 02 13:42:00 crc kubenswrapper[4732]: I0402 13:42:00.487655 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-10-crc" Apr 02 13:42:01 crc kubenswrapper[4732]: I0402 13:42:01.066922 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-zmhn7" Apr 02 13:42:02 crc kubenswrapper[4732]: I0402 13:42:01.282034 4732 patch_prober.go:28] interesting pod/console-f9d7485db-dq9x9 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.29:8443/health\": dial tcp 10.217.0.29:8443: connect: connection refused" start-of-body= Apr 02 13:42:02 crc kubenswrapper[4732]: I0402 13:42:01.282100 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-dq9x9" podUID="4d77c191-7d04-4381-838f-b7a355e7c2d4" containerName="console" probeResult="failure" output="Get \"https://10.217.0.29:8443/health\": dial tcp 10.217.0.29:8443: connect: connection refused" Apr 02 13:42:02 crc kubenswrapper[4732]: I0402 13:42:01.925290 4732 patch_prober.go:28] interesting pod/machine-config-daemon-6vtmw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 02 13:42:02 crc kubenswrapper[4732]: I0402 13:42:01.925701 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 02 13:42:02 crc kubenswrapper[4732]: I0402 13:42:01.956740 4732 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openshift-kube-controller-manager/installer-10-crc"] Apr 02 13:42:02 crc kubenswrapper[4732]: I0402 13:42:01.957829 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-10-crc" Apr 02 13:42:02 crc kubenswrapper[4732]: I0402 13:42:01.982056 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-10-crc"] Apr 02 13:42:02 crc kubenswrapper[4732]: I0402 13:42:02.137320 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/377f12b4-3628-432e-8132-f75725645672-var-lock\") pod \"installer-10-crc\" (UID: \"377f12b4-3628-432e-8132-f75725645672\") " pod="openshift-kube-controller-manager/installer-10-crc" Apr 02 13:42:02 crc kubenswrapper[4732]: I0402 13:42:02.137369 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/377f12b4-3628-432e-8132-f75725645672-kubelet-dir\") pod \"installer-10-crc\" (UID: \"377f12b4-3628-432e-8132-f75725645672\") " pod="openshift-kube-controller-manager/installer-10-crc" Apr 02 13:42:02 crc kubenswrapper[4732]: I0402 13:42:02.137417 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/377f12b4-3628-432e-8132-f75725645672-kube-api-access\") pod \"installer-10-crc\" (UID: \"377f12b4-3628-432e-8132-f75725645672\") " pod="openshift-kube-controller-manager/installer-10-crc" Apr 02 13:42:02 crc kubenswrapper[4732]: I0402 13:42:02.238521 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/377f12b4-3628-432e-8132-f75725645672-var-lock\") pod \"installer-10-crc\" (UID: \"377f12b4-3628-432e-8132-f75725645672\") " 
pod="openshift-kube-controller-manager/installer-10-crc" Apr 02 13:42:02 crc kubenswrapper[4732]: I0402 13:42:02.238575 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/377f12b4-3628-432e-8132-f75725645672-kubelet-dir\") pod \"installer-10-crc\" (UID: \"377f12b4-3628-432e-8132-f75725645672\") " pod="openshift-kube-controller-manager/installer-10-crc" Apr 02 13:42:02 crc kubenswrapper[4732]: I0402 13:42:02.238626 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/377f12b4-3628-432e-8132-f75725645672-var-lock\") pod \"installer-10-crc\" (UID: \"377f12b4-3628-432e-8132-f75725645672\") " pod="openshift-kube-controller-manager/installer-10-crc" Apr 02 13:42:02 crc kubenswrapper[4732]: I0402 13:42:02.238645 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/377f12b4-3628-432e-8132-f75725645672-kube-api-access\") pod \"installer-10-crc\" (UID: \"377f12b4-3628-432e-8132-f75725645672\") " pod="openshift-kube-controller-manager/installer-10-crc" Apr 02 13:42:02 crc kubenswrapper[4732]: I0402 13:42:02.238714 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/377f12b4-3628-432e-8132-f75725645672-kubelet-dir\") pod \"installer-10-crc\" (UID: \"377f12b4-3628-432e-8132-f75725645672\") " pod="openshift-kube-controller-manager/installer-10-crc" Apr 02 13:42:02 crc kubenswrapper[4732]: I0402 13:42:02.257337 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/377f12b4-3628-432e-8132-f75725645672-kube-api-access\") pod \"installer-10-crc\" (UID: \"377f12b4-3628-432e-8132-f75725645672\") " pod="openshift-kube-controller-manager/installer-10-crc" Apr 02 13:42:02 crc kubenswrapper[4732]: 
I0402 13:42:02.296770 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-10-crc" Apr 02 13:42:09 crc kubenswrapper[4732]: I0402 13:42:09.857062 4732 patch_prober.go:28] interesting pod/controller-manager-7bff847b8d-88b8t container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: connect: connection refused" start-of-body= Apr 02 13:42:09 crc kubenswrapper[4732]: I0402 13:42:09.857681 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-7bff847b8d-88b8t" podUID="c039ba90-db23-4d59-8109-2b83616af131" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: connect: connection refused" Apr 02 13:42:10 crc kubenswrapper[4732]: I0402 13:42:10.866895 4732 patch_prober.go:28] interesting pod/route-controller-manager-d4c476667-t8fgh container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.53:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Apr 02 13:42:10 crc kubenswrapper[4732]: I0402 13:42:10.867345 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-d4c476667-t8fgh" podUID="dcf8da11-8bbb-4de3-bbe5-69869bf4829c" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.53:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Apr 02 13:42:11 crc kubenswrapper[4732]: I0402 13:42:11.281203 4732 patch_prober.go:28] interesting pod/console-f9d7485db-dq9x9 container/console namespace/openshift-console: Startup probe 
status=failure output="Get \"https://10.217.0.29:8443/health\": dial tcp 10.217.0.29:8443: connect: connection refused" start-of-body= Apr 02 13:42:11 crc kubenswrapper[4732]: I0402 13:42:11.281281 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-dq9x9" podUID="4d77c191-7d04-4381-838f-b7a355e7c2d4" containerName="console" probeResult="failure" output="Get \"https://10.217.0.29:8443/health\": dial tcp 10.217.0.29:8443: connect: connection refused" Apr 02 13:42:11 crc kubenswrapper[4732]: I0402 13:42:11.384127 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-86zsp" podUID="c70b5281-74d8-44ff-8f4b-326a3d7192aa" containerName="oauth-openshift" containerID="cri-o://30a82c816feba70dd1579b0c688c30adec2675441e955350d7487435fabf989c" gracePeriod=15 Apr 02 13:42:11 crc kubenswrapper[4732]: I0402 13:42:11.819602 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8dxlz" Apr 02 13:42:12 crc kubenswrapper[4732]: I0402 13:42:12.353084 4732 generic.go:334] "Generic (PLEG): container finished" podID="c70b5281-74d8-44ff-8f4b-326a3d7192aa" containerID="30a82c816feba70dd1579b0c688c30adec2675441e955350d7487435fabf989c" exitCode=0 Apr 02 13:42:12 crc kubenswrapper[4732]: I0402 13:42:12.353150 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-86zsp" event={"ID":"c70b5281-74d8-44ff-8f4b-326a3d7192aa","Type":"ContainerDied","Data":"30a82c816feba70dd1579b0c688c30adec2675441e955350d7487435fabf989c"} Apr 02 13:42:12 crc kubenswrapper[4732]: I0402 13:42:12.466513 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-7-crc"] Apr 02 13:42:12 crc kubenswrapper[4732]: I0402 13:42:12.467828 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-7-crc" Apr 02 13:42:12 crc kubenswrapper[4732]: I0402 13:42:12.473497 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler"/"installer-sa-dockercfg-5vhrm" Apr 02 13:42:12 crc kubenswrapper[4732]: I0402 13:42:12.476921 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler"/"kube-root-ca.crt" Apr 02 13:42:12 crc kubenswrapper[4732]: I0402 13:42:12.478346 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-7-crc"] Apr 02 13:42:12 crc kubenswrapper[4732]: I0402 13:42:12.484440 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c78649c4-5d4a-4dac-a180-6ee450fd150f-kube-api-access\") pod \"installer-7-crc\" (UID: \"c78649c4-5d4a-4dac-a180-6ee450fd150f\") " pod="openshift-kube-scheduler/installer-7-crc" Apr 02 13:42:12 crc kubenswrapper[4732]: I0402 13:42:12.484690 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c78649c4-5d4a-4dac-a180-6ee450fd150f-var-lock\") pod \"installer-7-crc\" (UID: \"c78649c4-5d4a-4dac-a180-6ee450fd150f\") " pod="openshift-kube-scheduler/installer-7-crc" Apr 02 13:42:12 crc kubenswrapper[4732]: I0402 13:42:12.484842 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c78649c4-5d4a-4dac-a180-6ee450fd150f-kubelet-dir\") pod \"installer-7-crc\" (UID: \"c78649c4-5d4a-4dac-a180-6ee450fd150f\") " pod="openshift-kube-scheduler/installer-7-crc" Apr 02 13:42:12 crc kubenswrapper[4732]: I0402 13:42:12.585818 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/c78649c4-5d4a-4dac-a180-6ee450fd150f-kube-api-access\") pod \"installer-7-crc\" (UID: \"c78649c4-5d4a-4dac-a180-6ee450fd150f\") " pod="openshift-kube-scheduler/installer-7-crc" Apr 02 13:42:12 crc kubenswrapper[4732]: I0402 13:42:12.585980 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c78649c4-5d4a-4dac-a180-6ee450fd150f-var-lock\") pod \"installer-7-crc\" (UID: \"c78649c4-5d4a-4dac-a180-6ee450fd150f\") " pod="openshift-kube-scheduler/installer-7-crc" Apr 02 13:42:12 crc kubenswrapper[4732]: I0402 13:42:12.586012 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c78649c4-5d4a-4dac-a180-6ee450fd150f-kubelet-dir\") pod \"installer-7-crc\" (UID: \"c78649c4-5d4a-4dac-a180-6ee450fd150f\") " pod="openshift-kube-scheduler/installer-7-crc" Apr 02 13:42:12 crc kubenswrapper[4732]: I0402 13:42:12.586128 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c78649c4-5d4a-4dac-a180-6ee450fd150f-kubelet-dir\") pod \"installer-7-crc\" (UID: \"c78649c4-5d4a-4dac-a180-6ee450fd150f\") " pod="openshift-kube-scheduler/installer-7-crc" Apr 02 13:42:12 crc kubenswrapper[4732]: I0402 13:42:12.586173 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c78649c4-5d4a-4dac-a180-6ee450fd150f-var-lock\") pod \"installer-7-crc\" (UID: \"c78649c4-5d4a-4dac-a180-6ee450fd150f\") " pod="openshift-kube-scheduler/installer-7-crc" Apr 02 13:42:12 crc kubenswrapper[4732]: I0402 13:42:12.608442 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c78649c4-5d4a-4dac-a180-6ee450fd150f-kube-api-access\") pod \"installer-7-crc\" (UID: \"c78649c4-5d4a-4dac-a180-6ee450fd150f\") " 
pod="openshift-kube-scheduler/installer-7-crc" Apr 02 13:42:12 crc kubenswrapper[4732]: I0402 13:42:12.802057 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-7-crc" Apr 02 13:42:14 crc kubenswrapper[4732]: I0402 13:42:14.714997 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Apr 02 13:42:14 crc kubenswrapper[4732]: I0402 13:42:14.716105 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Apr 02 13:42:14 crc kubenswrapper[4732]: I0402 13:42:14.718751 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Apr 02 13:42:14 crc kubenswrapper[4732]: I0402 13:42:14.718854 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Apr 02 13:42:14 crc kubenswrapper[4732]: I0402 13:42:14.727245 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Apr 02 13:42:14 crc kubenswrapper[4732]: I0402 13:42:14.920062 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/293d710f-0d74-455a-ace7-1dcaa32d9b7e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"293d710f-0d74-455a-ace7-1dcaa32d9b7e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Apr 02 13:42:14 crc kubenswrapper[4732]: I0402 13:42:14.920570 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/293d710f-0d74-455a-ace7-1dcaa32d9b7e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"293d710f-0d74-455a-ace7-1dcaa32d9b7e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Apr 02 13:42:15 crc kubenswrapper[4732]: I0402 
13:42:15.021485 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/293d710f-0d74-455a-ace7-1dcaa32d9b7e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"293d710f-0d74-455a-ace7-1dcaa32d9b7e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Apr 02 13:42:15 crc kubenswrapper[4732]: I0402 13:42:15.021565 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/293d710f-0d74-455a-ace7-1dcaa32d9b7e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"293d710f-0d74-455a-ace7-1dcaa32d9b7e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Apr 02 13:42:15 crc kubenswrapper[4732]: I0402 13:42:15.021646 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/293d710f-0d74-455a-ace7-1dcaa32d9b7e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"293d710f-0d74-455a-ace7-1dcaa32d9b7e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Apr 02 13:42:15 crc kubenswrapper[4732]: I0402 13:42:15.039684 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/293d710f-0d74-455a-ace7-1dcaa32d9b7e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"293d710f-0d74-455a-ace7-1dcaa32d9b7e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Apr 02 13:42:15 crc kubenswrapper[4732]: I0402 13:42:15.041574 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Apr 02 13:42:16 crc kubenswrapper[4732]: I0402 13:42:16.955178 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/installer-10-crc"] Apr 02 13:42:17 crc kubenswrapper[4732]: I0402 13:42:17.202661 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-68b6f48864-m96db"] Apr 02 13:42:17 crc kubenswrapper[4732]: I0402 13:42:17.559397 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-11-crc"] Apr 02 13:42:17 crc kubenswrapper[4732]: I0402 13:42:17.560651 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-11-crc" Apr 02 13:42:17 crc kubenswrapper[4732]: I0402 13:42:17.566458 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-11-crc"] Apr 02 13:42:17 crc kubenswrapper[4732]: I0402 13:42:17.756077 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d8ba08f5-0264-484a-a73a-a8659ce79e10-kube-api-access\") pod \"revision-pruner-11-crc\" (UID: \"d8ba08f5-0264-484a-a73a-a8659ce79e10\") " pod="openshift-kube-controller-manager/revision-pruner-11-crc" Apr 02 13:42:17 crc kubenswrapper[4732]: I0402 13:42:17.756170 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d8ba08f5-0264-484a-a73a-a8659ce79e10-kubelet-dir\") pod \"revision-pruner-11-crc\" (UID: \"d8ba08f5-0264-484a-a73a-a8659ce79e10\") " pod="openshift-kube-controller-manager/revision-pruner-11-crc" Apr 02 13:42:17 crc kubenswrapper[4732]: E0402 13:42:17.830386 4732 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image 
from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Apr 02 13:42:17 crc kubenswrapper[4732]: E0402 13:42:17.830530 4732 kuberuntime_manager.go:1274] "Unhandled Error" err=< Apr 02 13:42:17 crc kubenswrapper[4732]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Apr 02 13:42:17 crc kubenswrapper[4732]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2jkkc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29585620-t897v_openshift-infra(9a82c61a-7d7e-4401-963a-1f1fe908002c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Apr 02 13:42:17 crc kubenswrapper[4732]: > logger="UnhandledError" Apr 02 13:42:17 crc kubenswrapper[4732]: E0402 13:42:17.831721 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29585620-t897v" podUID="9a82c61a-7d7e-4401-963a-1f1fe908002c" Apr 02 13:42:17 crc 
kubenswrapper[4732]: I0402 13:42:17.857954 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d8ba08f5-0264-484a-a73a-a8659ce79e10-kube-api-access\") pod \"revision-pruner-11-crc\" (UID: \"d8ba08f5-0264-484a-a73a-a8659ce79e10\") " pod="openshift-kube-controller-manager/revision-pruner-11-crc" Apr 02 13:42:17 crc kubenswrapper[4732]: I0402 13:42:17.858035 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d8ba08f5-0264-484a-a73a-a8659ce79e10-kubelet-dir\") pod \"revision-pruner-11-crc\" (UID: \"d8ba08f5-0264-484a-a73a-a8659ce79e10\") " pod="openshift-kube-controller-manager/revision-pruner-11-crc" Apr 02 13:42:17 crc kubenswrapper[4732]: I0402 13:42:17.859052 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d8ba08f5-0264-484a-a73a-a8659ce79e10-kubelet-dir\") pod \"revision-pruner-11-crc\" (UID: \"d8ba08f5-0264-484a-a73a-a8659ce79e10\") " pod="openshift-kube-controller-manager/revision-pruner-11-crc" Apr 02 13:42:17 crc kubenswrapper[4732]: I0402 13:42:17.885792 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d8ba08f5-0264-484a-a73a-a8659ce79e10-kube-api-access\") pod \"revision-pruner-11-crc\" (UID: \"d8ba08f5-0264-484a-a73a-a8659ce79e10\") " pod="openshift-kube-controller-manager/revision-pruner-11-crc" Apr 02 13:42:18 crc kubenswrapper[4732]: I0402 13:42:18.181705 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-11-crc" Apr 02 13:42:18 crc kubenswrapper[4732]: E0402 13:42:18.383674 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29585620-t897v" podUID="9a82c61a-7d7e-4401-963a-1f1fe908002c" Apr 02 13:42:19 crc kubenswrapper[4732]: I0402 13:42:19.556419 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-11-crc"] Apr 02 13:42:19 crc kubenswrapper[4732]: I0402 13:42:19.557422 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-11-crc" Apr 02 13:42:19 crc kubenswrapper[4732]: I0402 13:42:19.565634 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-11-crc"] Apr 02 13:42:19 crc kubenswrapper[4732]: I0402 13:42:19.682764 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/536912b9-ad03-42a4-bce9-754227ecbf82-kube-api-access\") pod \"installer-11-crc\" (UID: \"536912b9-ad03-42a4-bce9-754227ecbf82\") " pod="openshift-kube-controller-manager/installer-11-crc" Apr 02 13:42:19 crc kubenswrapper[4732]: I0402 13:42:19.683074 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/536912b9-ad03-42a4-bce9-754227ecbf82-kubelet-dir\") pod \"installer-11-crc\" (UID: \"536912b9-ad03-42a4-bce9-754227ecbf82\") " pod="openshift-kube-controller-manager/installer-11-crc" Apr 02 13:42:19 crc kubenswrapper[4732]: I0402 13:42:19.683124 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" 
(UniqueName: \"kubernetes.io/host-path/536912b9-ad03-42a4-bce9-754227ecbf82-var-lock\") pod \"installer-11-crc\" (UID: \"536912b9-ad03-42a4-bce9-754227ecbf82\") " pod="openshift-kube-controller-manager/installer-11-crc" Apr 02 13:42:19 crc kubenswrapper[4732]: I0402 13:42:19.717538 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Apr 02 13:42:19 crc kubenswrapper[4732]: I0402 13:42:19.718567 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Apr 02 13:42:19 crc kubenswrapper[4732]: I0402 13:42:19.724835 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Apr 02 13:42:19 crc kubenswrapper[4732]: I0402 13:42:19.784675 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/536912b9-ad03-42a4-bce9-754227ecbf82-kubelet-dir\") pod \"installer-11-crc\" (UID: \"536912b9-ad03-42a4-bce9-754227ecbf82\") " pod="openshift-kube-controller-manager/installer-11-crc" Apr 02 13:42:19 crc kubenswrapper[4732]: I0402 13:42:19.784720 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/536912b9-ad03-42a4-bce9-754227ecbf82-var-lock\") pod \"installer-11-crc\" (UID: \"536912b9-ad03-42a4-bce9-754227ecbf82\") " pod="openshift-kube-controller-manager/installer-11-crc" Apr 02 13:42:19 crc kubenswrapper[4732]: I0402 13:42:19.784756 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/536912b9-ad03-42a4-bce9-754227ecbf82-kube-api-access\") pod \"installer-11-crc\" (UID: \"536912b9-ad03-42a4-bce9-754227ecbf82\") " pod="openshift-kube-controller-manager/installer-11-crc" Apr 02 13:42:19 crc kubenswrapper[4732]: I0402 13:42:19.784919 4732 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/536912b9-ad03-42a4-bce9-754227ecbf82-kubelet-dir\") pod \"installer-11-crc\" (UID: \"536912b9-ad03-42a4-bce9-754227ecbf82\") " pod="openshift-kube-controller-manager/installer-11-crc" Apr 02 13:42:19 crc kubenswrapper[4732]: I0402 13:42:19.785035 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/536912b9-ad03-42a4-bce9-754227ecbf82-var-lock\") pod \"installer-11-crc\" (UID: \"536912b9-ad03-42a4-bce9-754227ecbf82\") " pod="openshift-kube-controller-manager/installer-11-crc" Apr 02 13:42:19 crc kubenswrapper[4732]: I0402 13:42:19.797967 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" podUID="e2570535-673c-495c-a5aa-392f14ceebb1" containerName="registry" containerID="cri-o://cebfc68889bf68f1dee6260dd61cc2f0f7ebd22790507b1bca9eb8eb36c9a1ed" gracePeriod=30 Apr 02 13:42:19 crc kubenswrapper[4732]: I0402 13:42:19.804425 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/536912b9-ad03-42a4-bce9-754227ecbf82-kube-api-access\") pod \"installer-11-crc\" (UID: \"536912b9-ad03-42a4-bce9-754227ecbf82\") " pod="openshift-kube-controller-manager/installer-11-crc" Apr 02 13:42:19 crc kubenswrapper[4732]: I0402 13:42:19.829966 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 02 13:42:19 crc kubenswrapper[4732]: I0402 13:42:19.876482 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-11-crc" Apr 02 13:42:19 crc kubenswrapper[4732]: I0402 13:42:19.886459 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/85581866-f172-4f4e-8805-55c1f175201d-kube-api-access\") pod \"installer-9-crc\" (UID: \"85581866-f172-4f4e-8805-55c1f175201d\") " pod="openshift-kube-apiserver/installer-9-crc" Apr 02 13:42:19 crc kubenswrapper[4732]: I0402 13:42:19.886502 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/85581866-f172-4f4e-8805-55c1f175201d-var-lock\") pod \"installer-9-crc\" (UID: \"85581866-f172-4f4e-8805-55c1f175201d\") " pod="openshift-kube-apiserver/installer-9-crc" Apr 02 13:42:19 crc kubenswrapper[4732]: I0402 13:42:19.886554 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/85581866-f172-4f4e-8805-55c1f175201d-kubelet-dir\") pod \"installer-9-crc\" (UID: \"85581866-f172-4f4e-8805-55c1f175201d\") " pod="openshift-kube-apiserver/installer-9-crc" Apr 02 13:42:19 crc kubenswrapper[4732]: I0402 13:42:19.987781 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/85581866-f172-4f4e-8805-55c1f175201d-kube-api-access\") pod \"installer-9-crc\" (UID: \"85581866-f172-4f4e-8805-55c1f175201d\") " pod="openshift-kube-apiserver/installer-9-crc" Apr 02 13:42:19 crc kubenswrapper[4732]: I0402 13:42:19.987908 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/85581866-f172-4f4e-8805-55c1f175201d-var-lock\") pod \"installer-9-crc\" (UID: \"85581866-f172-4f4e-8805-55c1f175201d\") " 
pod="openshift-kube-apiserver/installer-9-crc" Apr 02 13:42:19 crc kubenswrapper[4732]: I0402 13:42:19.988002 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/85581866-f172-4f4e-8805-55c1f175201d-kubelet-dir\") pod \"installer-9-crc\" (UID: \"85581866-f172-4f4e-8805-55c1f175201d\") " pod="openshift-kube-apiserver/installer-9-crc" Apr 02 13:42:19 crc kubenswrapper[4732]: I0402 13:42:19.988297 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/85581866-f172-4f4e-8805-55c1f175201d-kubelet-dir\") pod \"installer-9-crc\" (UID: \"85581866-f172-4f4e-8805-55c1f175201d\") " pod="openshift-kube-apiserver/installer-9-crc" Apr 02 13:42:19 crc kubenswrapper[4732]: I0402 13:42:19.988404 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/85581866-f172-4f4e-8805-55c1f175201d-var-lock\") pod \"installer-9-crc\" (UID: \"85581866-f172-4f4e-8805-55c1f175201d\") " pod="openshift-kube-apiserver/installer-9-crc" Apr 02 13:42:20 crc kubenswrapper[4732]: I0402 13:42:20.004121 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/85581866-f172-4f4e-8805-55c1f175201d-kube-api-access\") pod \"installer-9-crc\" (UID: \"85581866-f172-4f4e-8805-55c1f175201d\") " pod="openshift-kube-apiserver/installer-9-crc" Apr 02 13:42:20 crc kubenswrapper[4732]: I0402 13:42:20.035436 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Apr 02 13:42:20 crc kubenswrapper[4732]: I0402 13:42:20.857484 4732 patch_prober.go:28] interesting pod/controller-manager-7bff847b8d-88b8t container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Apr 02 13:42:20 crc kubenswrapper[4732]: I0402 13:42:20.857567 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-7bff847b8d-88b8t" podUID="c039ba90-db23-4d59-8109-2b83616af131" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Apr 02 13:42:20 crc kubenswrapper[4732]: I0402 13:42:20.867523 4732 patch_prober.go:28] interesting pod/route-controller-manager-d4c476667-t8fgh container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.53:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Apr 02 13:42:20 crc kubenswrapper[4732]: I0402 13:42:20.867656 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-d4c476667-t8fgh" podUID="dcf8da11-8bbb-4de3-bbe5-69869bf4829c" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.53:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Apr 02 13:42:20 crc kubenswrapper[4732]: I0402 13:42:20.959695 4732 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-86zsp container/oauth-openshift 
namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.6:6443/healthz\": dial tcp 10.217.0.6:6443: connect: connection refused" start-of-body= Apr 02 13:42:20 crc kubenswrapper[4732]: I0402 13:42:20.959764 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-86zsp" podUID="c70b5281-74d8-44ff-8f4b-326a3d7192aa" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.6:6443/healthz\": dial tcp 10.217.0.6:6443: connect: connection refused" Apr 02 13:42:21 crc kubenswrapper[4732]: I0402 13:42:21.148684 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-d4c476667-t8fgh" Apr 02 13:42:21 crc kubenswrapper[4732]: I0402 13:42:21.154314 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7bff847b8d-88b8t" Apr 02 13:42:21 crc kubenswrapper[4732]: I0402 13:42:21.179654 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bfd65bd69-hd7v4"] Apr 02 13:42:21 crc kubenswrapper[4732]: E0402 13:42:21.179910 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c039ba90-db23-4d59-8109-2b83616af131" containerName="controller-manager" Apr 02 13:42:21 crc kubenswrapper[4732]: I0402 13:42:21.179925 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="c039ba90-db23-4d59-8109-2b83616af131" containerName="controller-manager" Apr 02 13:42:21 crc kubenswrapper[4732]: E0402 13:42:21.179940 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcf8da11-8bbb-4de3-bbe5-69869bf4829c" containerName="route-controller-manager" Apr 02 13:42:21 crc kubenswrapper[4732]: I0402 13:42:21.179946 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcf8da11-8bbb-4de3-bbe5-69869bf4829c" 
containerName="route-controller-manager" Apr 02 13:42:21 crc kubenswrapper[4732]: I0402 13:42:21.180044 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="c039ba90-db23-4d59-8109-2b83616af131" containerName="controller-manager" Apr 02 13:42:21 crc kubenswrapper[4732]: I0402 13:42:21.180056 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcf8da11-8bbb-4de3-bbe5-69869bf4829c" containerName="route-controller-manager" Apr 02 13:42:21 crc kubenswrapper[4732]: I0402 13:42:21.180429 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-bfd65bd69-hd7v4" Apr 02 13:42:21 crc kubenswrapper[4732]: I0402 13:42:21.191930 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bfd65bd69-hd7v4"] Apr 02 13:42:21 crc kubenswrapper[4732]: I0402 13:42:21.281047 4732 patch_prober.go:28] interesting pod/console-f9d7485db-dq9x9 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.29:8443/health\": dial tcp 10.217.0.29:8443: connect: connection refused" start-of-body= Apr 02 13:42:21 crc kubenswrapper[4732]: I0402 13:42:21.281110 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-dq9x9" podUID="4d77c191-7d04-4381-838f-b7a355e7c2d4" containerName="console" probeResult="failure" output="Get \"https://10.217.0.29:8443/health\": dial tcp 10.217.0.29:8443: connect: connection refused" Apr 02 13:42:21 crc kubenswrapper[4732]: I0402 13:42:21.315017 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9kd9\" (UniqueName: \"kubernetes.io/projected/dcf8da11-8bbb-4de3-bbe5-69869bf4829c-kube-api-access-g9kd9\") pod \"dcf8da11-8bbb-4de3-bbe5-69869bf4829c\" (UID: \"dcf8da11-8bbb-4de3-bbe5-69869bf4829c\") " Apr 02 13:42:21 crc kubenswrapper[4732]: I0402 13:42:21.315333 
4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c039ba90-db23-4d59-8109-2b83616af131-client-ca\") pod \"c039ba90-db23-4d59-8109-2b83616af131\" (UID: \"c039ba90-db23-4d59-8109-2b83616af131\") " Apr 02 13:42:21 crc kubenswrapper[4732]: I0402 13:42:21.315439 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c039ba90-db23-4d59-8109-2b83616af131-serving-cert\") pod \"c039ba90-db23-4d59-8109-2b83616af131\" (UID: \"c039ba90-db23-4d59-8109-2b83616af131\") " Apr 02 13:42:21 crc kubenswrapper[4732]: I0402 13:42:21.315540 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dcf8da11-8bbb-4de3-bbe5-69869bf4829c-serving-cert\") pod \"dcf8da11-8bbb-4de3-bbe5-69869bf4829c\" (UID: \"dcf8da11-8bbb-4de3-bbe5-69869bf4829c\") " Apr 02 13:42:21 crc kubenswrapper[4732]: I0402 13:42:21.315632 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dcf8da11-8bbb-4de3-bbe5-69869bf4829c-client-ca\") pod \"dcf8da11-8bbb-4de3-bbe5-69869bf4829c\" (UID: \"dcf8da11-8bbb-4de3-bbe5-69869bf4829c\") " Apr 02 13:42:21 crc kubenswrapper[4732]: I0402 13:42:21.315756 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcf8da11-8bbb-4de3-bbe5-69869bf4829c-config\") pod \"dcf8da11-8bbb-4de3-bbe5-69869bf4829c\" (UID: \"dcf8da11-8bbb-4de3-bbe5-69869bf4829c\") " Apr 02 13:42:21 crc kubenswrapper[4732]: I0402 13:42:21.315861 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c039ba90-db23-4d59-8109-2b83616af131-proxy-ca-bundles\") pod \"c039ba90-db23-4d59-8109-2b83616af131\" (UID: 
\"c039ba90-db23-4d59-8109-2b83616af131\") " Apr 02 13:42:21 crc kubenswrapper[4732]: I0402 13:42:21.315934 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dmwn\" (UniqueName: \"kubernetes.io/projected/c039ba90-db23-4d59-8109-2b83616af131-kube-api-access-8dmwn\") pod \"c039ba90-db23-4d59-8109-2b83616af131\" (UID: \"c039ba90-db23-4d59-8109-2b83616af131\") " Apr 02 13:42:21 crc kubenswrapper[4732]: I0402 13:42:21.315997 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c039ba90-db23-4d59-8109-2b83616af131-config\") pod \"c039ba90-db23-4d59-8109-2b83616af131\" (UID: \"c039ba90-db23-4d59-8109-2b83616af131\") " Apr 02 13:42:21 crc kubenswrapper[4732]: I0402 13:42:21.316161 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c039ba90-db23-4d59-8109-2b83616af131-client-ca" (OuterVolumeSpecName: "client-ca") pod "c039ba90-db23-4d59-8109-2b83616af131" (UID: "c039ba90-db23-4d59-8109-2b83616af131"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:42:21 crc kubenswrapper[4732]: I0402 13:42:21.316270 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c379e504-2a74-402a-a807-0865a0ada4ba-config\") pod \"route-controller-manager-bfd65bd69-hd7v4\" (UID: \"c379e504-2a74-402a-a807-0865a0ada4ba\") " pod="openshift-route-controller-manager/route-controller-manager-bfd65bd69-hd7v4" Apr 02 13:42:21 crc kubenswrapper[4732]: I0402 13:42:21.316702 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcf8da11-8bbb-4de3-bbe5-69869bf4829c-client-ca" (OuterVolumeSpecName: "client-ca") pod "dcf8da11-8bbb-4de3-bbe5-69869bf4829c" (UID: "dcf8da11-8bbb-4de3-bbe5-69869bf4829c"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:42:21 crc kubenswrapper[4732]: I0402 13:42:21.316735 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c039ba90-db23-4d59-8109-2b83616af131-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "c039ba90-db23-4d59-8109-2b83616af131" (UID: "c039ba90-db23-4d59-8109-2b83616af131"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:42:21 crc kubenswrapper[4732]: I0402 13:42:21.316921 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcf8da11-8bbb-4de3-bbe5-69869bf4829c-config" (OuterVolumeSpecName: "config") pod "dcf8da11-8bbb-4de3-bbe5-69869bf4829c" (UID: "dcf8da11-8bbb-4de3-bbe5-69869bf4829c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:42:21 crc kubenswrapper[4732]: I0402 13:42:21.317590 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c039ba90-db23-4d59-8109-2b83616af131-config" (OuterVolumeSpecName: "config") pod "c039ba90-db23-4d59-8109-2b83616af131" (UID: "c039ba90-db23-4d59-8109-2b83616af131"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:42:21 crc kubenswrapper[4732]: I0402 13:42:21.317725 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxlng\" (UniqueName: \"kubernetes.io/projected/c379e504-2a74-402a-a807-0865a0ada4ba-kube-api-access-rxlng\") pod \"route-controller-manager-bfd65bd69-hd7v4\" (UID: \"c379e504-2a74-402a-a807-0865a0ada4ba\") " pod="openshift-route-controller-manager/route-controller-manager-bfd65bd69-hd7v4" Apr 02 13:42:21 crc kubenswrapper[4732]: I0402 13:42:21.317786 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c379e504-2a74-402a-a807-0865a0ada4ba-client-ca\") pod \"route-controller-manager-bfd65bd69-hd7v4\" (UID: \"c379e504-2a74-402a-a807-0865a0ada4ba\") " pod="openshift-route-controller-manager/route-controller-manager-bfd65bd69-hd7v4" Apr 02 13:42:21 crc kubenswrapper[4732]: I0402 13:42:21.317866 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c379e504-2a74-402a-a807-0865a0ada4ba-serving-cert\") pod \"route-controller-manager-bfd65bd69-hd7v4\" (UID: \"c379e504-2a74-402a-a807-0865a0ada4ba\") " pod="openshift-route-controller-manager/route-controller-manager-bfd65bd69-hd7v4" Apr 02 13:42:21 crc kubenswrapper[4732]: I0402 13:42:21.317917 4732 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c039ba90-db23-4d59-8109-2b83616af131-client-ca\") on node \"crc\" DevicePath \"\"" Apr 02 13:42:21 crc kubenswrapper[4732]: I0402 13:42:21.317932 4732 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dcf8da11-8bbb-4de3-bbe5-69869bf4829c-client-ca\") on node \"crc\" DevicePath \"\"" Apr 02 13:42:21 crc kubenswrapper[4732]: I0402 
13:42:21.317945 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcf8da11-8bbb-4de3-bbe5-69869bf4829c-config\") on node \"crc\" DevicePath \"\"" Apr 02 13:42:21 crc kubenswrapper[4732]: I0402 13:42:21.317956 4732 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c039ba90-db23-4d59-8109-2b83616af131-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Apr 02 13:42:21 crc kubenswrapper[4732]: I0402 13:42:21.317969 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c039ba90-db23-4d59-8109-2b83616af131-config\") on node \"crc\" DevicePath \"\"" Apr 02 13:42:21 crc kubenswrapper[4732]: I0402 13:42:21.320836 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c039ba90-db23-4d59-8109-2b83616af131-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c039ba90-db23-4d59-8109-2b83616af131" (UID: "c039ba90-db23-4d59-8109-2b83616af131"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 13:42:21 crc kubenswrapper[4732]: I0402 13:42:21.320939 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcf8da11-8bbb-4de3-bbe5-69869bf4829c-kube-api-access-g9kd9" (OuterVolumeSpecName: "kube-api-access-g9kd9") pod "dcf8da11-8bbb-4de3-bbe5-69869bf4829c" (UID: "dcf8da11-8bbb-4de3-bbe5-69869bf4829c"). InnerVolumeSpecName "kube-api-access-g9kd9". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:42:21 crc kubenswrapper[4732]: I0402 13:42:21.321024 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c039ba90-db23-4d59-8109-2b83616af131-kube-api-access-8dmwn" (OuterVolumeSpecName: "kube-api-access-8dmwn") pod "c039ba90-db23-4d59-8109-2b83616af131" (UID: "c039ba90-db23-4d59-8109-2b83616af131"). 
InnerVolumeSpecName "kube-api-access-8dmwn". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:42:21 crc kubenswrapper[4732]: I0402 13:42:21.321789 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcf8da11-8bbb-4de3-bbe5-69869bf4829c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "dcf8da11-8bbb-4de3-bbe5-69869bf4829c" (UID: "dcf8da11-8bbb-4de3-bbe5-69869bf4829c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 13:42:21 crc kubenswrapper[4732]: I0402 13:42:21.398318 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7bff847b8d-88b8t" event={"ID":"c039ba90-db23-4d59-8109-2b83616af131","Type":"ContainerDied","Data":"340c689f412eee8c2c4338160d0682cbc1f30a4f66e1e6b756d93e699ce234e6"} Apr 02 13:42:21 crc kubenswrapper[4732]: I0402 13:42:21.398383 4732 scope.go:117] "RemoveContainer" containerID="40d63b200b0f432a2e8204e506158c90f6be2f635a07c7f689d9bcb4b003c038" Apr 02 13:42:21 crc kubenswrapper[4732]: I0402 13:42:21.398557 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7bff847b8d-88b8t" Apr 02 13:42:21 crc kubenswrapper[4732]: I0402 13:42:21.403363 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-d4c476667-t8fgh" event={"ID":"dcf8da11-8bbb-4de3-bbe5-69869bf4829c","Type":"ContainerDied","Data":"f1106e35a25bad6f69ce140890c33fed5bc2d17ea354d195802cbbe915e76120"} Apr 02 13:42:21 crc kubenswrapper[4732]: I0402 13:42:21.403422 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-d4c476667-t8fgh" Apr 02 13:42:21 crc kubenswrapper[4732]: I0402 13:42:21.418718 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxlng\" (UniqueName: \"kubernetes.io/projected/c379e504-2a74-402a-a807-0865a0ada4ba-kube-api-access-rxlng\") pod \"route-controller-manager-bfd65bd69-hd7v4\" (UID: \"c379e504-2a74-402a-a807-0865a0ada4ba\") " pod="openshift-route-controller-manager/route-controller-manager-bfd65bd69-hd7v4" Apr 02 13:42:21 crc kubenswrapper[4732]: I0402 13:42:21.419114 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c379e504-2a74-402a-a807-0865a0ada4ba-client-ca\") pod \"route-controller-manager-bfd65bd69-hd7v4\" (UID: \"c379e504-2a74-402a-a807-0865a0ada4ba\") " pod="openshift-route-controller-manager/route-controller-manager-bfd65bd69-hd7v4" Apr 02 13:42:21 crc kubenswrapper[4732]: I0402 13:42:21.419394 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c379e504-2a74-402a-a807-0865a0ada4ba-serving-cert\") pod \"route-controller-manager-bfd65bd69-hd7v4\" (UID: \"c379e504-2a74-402a-a807-0865a0ada4ba\") " pod="openshift-route-controller-manager/route-controller-manager-bfd65bd69-hd7v4" Apr 02 13:42:21 crc kubenswrapper[4732]: I0402 13:42:21.419702 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c379e504-2a74-402a-a807-0865a0ada4ba-config\") pod \"route-controller-manager-bfd65bd69-hd7v4\" (UID: \"c379e504-2a74-402a-a807-0865a0ada4ba\") " pod="openshift-route-controller-manager/route-controller-manager-bfd65bd69-hd7v4" Apr 02 13:42:21 crc kubenswrapper[4732]: I0402 13:42:21.420077 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8dmwn\" 
(UniqueName: \"kubernetes.io/projected/c039ba90-db23-4d59-8109-2b83616af131-kube-api-access-8dmwn\") on node \"crc\" DevicePath \"\"" Apr 02 13:42:21 crc kubenswrapper[4732]: I0402 13:42:21.420276 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9kd9\" (UniqueName: \"kubernetes.io/projected/dcf8da11-8bbb-4de3-bbe5-69869bf4829c-kube-api-access-g9kd9\") on node \"crc\" DevicePath \"\"" Apr 02 13:42:21 crc kubenswrapper[4732]: I0402 13:42:21.420472 4732 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c039ba90-db23-4d59-8109-2b83616af131-serving-cert\") on node \"crc\" DevicePath \"\"" Apr 02 13:42:21 crc kubenswrapper[4732]: I0402 13:42:21.420703 4732 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dcf8da11-8bbb-4de3-bbe5-69869bf4829c-serving-cert\") on node \"crc\" DevicePath \"\"" Apr 02 13:42:21 crc kubenswrapper[4732]: I0402 13:42:21.437058 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c379e504-2a74-402a-a807-0865a0ada4ba-config\") pod \"route-controller-manager-bfd65bd69-hd7v4\" (UID: \"c379e504-2a74-402a-a807-0865a0ada4ba\") " pod="openshift-route-controller-manager/route-controller-manager-bfd65bd69-hd7v4" Apr 02 13:42:21 crc kubenswrapper[4732]: I0402 13:42:21.438442 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c379e504-2a74-402a-a807-0865a0ada4ba-client-ca\") pod \"route-controller-manager-bfd65bd69-hd7v4\" (UID: \"c379e504-2a74-402a-a807-0865a0ada4ba\") " pod="openshift-route-controller-manager/route-controller-manager-bfd65bd69-hd7v4" Apr 02 13:42:21 crc kubenswrapper[4732]: I0402 13:42:21.449591 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/c379e504-2a74-402a-a807-0865a0ada4ba-serving-cert\") pod \"route-controller-manager-bfd65bd69-hd7v4\" (UID: \"c379e504-2a74-402a-a807-0865a0ada4ba\") " pod="openshift-route-controller-manager/route-controller-manager-bfd65bd69-hd7v4" Apr 02 13:42:21 crc kubenswrapper[4732]: I0402 13:42:21.457274 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7bff847b8d-88b8t"] Apr 02 13:42:21 crc kubenswrapper[4732]: I0402 13:42:21.460905 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7bff847b8d-88b8t"] Apr 02 13:42:21 crc kubenswrapper[4732]: I0402 13:42:21.462856 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxlng\" (UniqueName: \"kubernetes.io/projected/c379e504-2a74-402a-a807-0865a0ada4ba-kube-api-access-rxlng\") pod \"route-controller-manager-bfd65bd69-hd7v4\" (UID: \"c379e504-2a74-402a-a807-0865a0ada4ba\") " pod="openshift-route-controller-manager/route-controller-manager-bfd65bd69-hd7v4" Apr 02 13:42:21 crc kubenswrapper[4732]: I0402 13:42:21.466870 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d4c476667-t8fgh"] Apr 02 13:42:21 crc kubenswrapper[4732]: I0402 13:42:21.469787 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d4c476667-t8fgh"] Apr 02 13:42:21 crc kubenswrapper[4732]: I0402 13:42:21.501373 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-bfd65bd69-hd7v4" Apr 02 13:42:22 crc kubenswrapper[4732]: I0402 13:42:22.409036 4732 generic.go:334] "Generic (PLEG): container finished" podID="e2570535-673c-495c-a5aa-392f14ceebb1" containerID="cebfc68889bf68f1dee6260dd61cc2f0f7ebd22790507b1bca9eb8eb36c9a1ed" exitCode=0 Apr 02 13:42:22 crc kubenswrapper[4732]: I0402 13:42:22.409074 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" event={"ID":"e2570535-673c-495c-a5aa-392f14ceebb1","Type":"ContainerDied","Data":"cebfc68889bf68f1dee6260dd61cc2f0f7ebd22790507b1bca9eb8eb36c9a1ed"} Apr 02 13:42:22 crc kubenswrapper[4732]: I0402 13:42:22.687303 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c039ba90-db23-4d59-8109-2b83616af131" path="/var/lib/kubelet/pods/c039ba90-db23-4d59-8109-2b83616af131/volumes" Apr 02 13:42:22 crc kubenswrapper[4732]: I0402 13:42:22.688123 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcf8da11-8bbb-4de3-bbe5-69869bf4829c" path="/var/lib/kubelet/pods/dcf8da11-8bbb-4de3-bbe5-69869bf4829c/volumes" Apr 02 13:42:23 crc kubenswrapper[4732]: I0402 13:42:23.534273 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-86c47b858b-ktqm8"] Apr 02 13:42:23 crc kubenswrapper[4732]: I0402 13:42:23.535413 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-86c47b858b-ktqm8" Apr 02 13:42:23 crc kubenswrapper[4732]: I0402 13:42:23.537199 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Apr 02 13:42:23 crc kubenswrapper[4732]: I0402 13:42:23.537706 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Apr 02 13:42:23 crc kubenswrapper[4732]: I0402 13:42:23.538305 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Apr 02 13:42:23 crc kubenswrapper[4732]: I0402 13:42:23.538983 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Apr 02 13:42:23 crc kubenswrapper[4732]: I0402 13:42:23.539083 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Apr 02 13:42:23 crc kubenswrapper[4732]: I0402 13:42:23.539640 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Apr 02 13:42:23 crc kubenswrapper[4732]: I0402 13:42:23.542713 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-86c47b858b-ktqm8"] Apr 02 13:42:23 crc kubenswrapper[4732]: I0402 13:42:23.544545 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Apr 02 13:42:23 crc kubenswrapper[4732]: I0402 13:42:23.651204 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28e25a15-5386-4a3e-a772-6aba6096471e-serving-cert\") pod \"controller-manager-86c47b858b-ktqm8\" (UID: \"28e25a15-5386-4a3e-a772-6aba6096471e\") " 
pod="openshift-controller-manager/controller-manager-86c47b858b-ktqm8" Apr 02 13:42:23 crc kubenswrapper[4732]: I0402 13:42:23.651280 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/28e25a15-5386-4a3e-a772-6aba6096471e-proxy-ca-bundles\") pod \"controller-manager-86c47b858b-ktqm8\" (UID: \"28e25a15-5386-4a3e-a772-6aba6096471e\") " pod="openshift-controller-manager/controller-manager-86c47b858b-ktqm8" Apr 02 13:42:23 crc kubenswrapper[4732]: I0402 13:42:23.651366 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4959w\" (UniqueName: \"kubernetes.io/projected/28e25a15-5386-4a3e-a772-6aba6096471e-kube-api-access-4959w\") pod \"controller-manager-86c47b858b-ktqm8\" (UID: \"28e25a15-5386-4a3e-a772-6aba6096471e\") " pod="openshift-controller-manager/controller-manager-86c47b858b-ktqm8" Apr 02 13:42:23 crc kubenswrapper[4732]: I0402 13:42:23.651399 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/28e25a15-5386-4a3e-a772-6aba6096471e-client-ca\") pod \"controller-manager-86c47b858b-ktqm8\" (UID: \"28e25a15-5386-4a3e-a772-6aba6096471e\") " pod="openshift-controller-manager/controller-manager-86c47b858b-ktqm8" Apr 02 13:42:23 crc kubenswrapper[4732]: I0402 13:42:23.651418 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28e25a15-5386-4a3e-a772-6aba6096471e-config\") pod \"controller-manager-86c47b858b-ktqm8\" (UID: \"28e25a15-5386-4a3e-a772-6aba6096471e\") " pod="openshift-controller-manager/controller-manager-86c47b858b-ktqm8" Apr 02 13:42:23 crc kubenswrapper[4732]: I0402 13:42:23.752572 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4959w\" 
(UniqueName: \"kubernetes.io/projected/28e25a15-5386-4a3e-a772-6aba6096471e-kube-api-access-4959w\") pod \"controller-manager-86c47b858b-ktqm8\" (UID: \"28e25a15-5386-4a3e-a772-6aba6096471e\") " pod="openshift-controller-manager/controller-manager-86c47b858b-ktqm8" Apr 02 13:42:23 crc kubenswrapper[4732]: I0402 13:42:23.752649 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/28e25a15-5386-4a3e-a772-6aba6096471e-client-ca\") pod \"controller-manager-86c47b858b-ktqm8\" (UID: \"28e25a15-5386-4a3e-a772-6aba6096471e\") " pod="openshift-controller-manager/controller-manager-86c47b858b-ktqm8" Apr 02 13:42:23 crc kubenswrapper[4732]: I0402 13:42:23.752679 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28e25a15-5386-4a3e-a772-6aba6096471e-config\") pod \"controller-manager-86c47b858b-ktqm8\" (UID: \"28e25a15-5386-4a3e-a772-6aba6096471e\") " pod="openshift-controller-manager/controller-manager-86c47b858b-ktqm8" Apr 02 13:42:23 crc kubenswrapper[4732]: I0402 13:42:23.752715 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28e25a15-5386-4a3e-a772-6aba6096471e-serving-cert\") pod \"controller-manager-86c47b858b-ktqm8\" (UID: \"28e25a15-5386-4a3e-a772-6aba6096471e\") " pod="openshift-controller-manager/controller-manager-86c47b858b-ktqm8" Apr 02 13:42:23 crc kubenswrapper[4732]: I0402 13:42:23.752762 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/28e25a15-5386-4a3e-a772-6aba6096471e-proxy-ca-bundles\") pod \"controller-manager-86c47b858b-ktqm8\" (UID: \"28e25a15-5386-4a3e-a772-6aba6096471e\") " pod="openshift-controller-manager/controller-manager-86c47b858b-ktqm8" Apr 02 13:42:23 crc kubenswrapper[4732]: I0402 13:42:23.754029 4732 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/28e25a15-5386-4a3e-a772-6aba6096471e-client-ca\") pod \"controller-manager-86c47b858b-ktqm8\" (UID: \"28e25a15-5386-4a3e-a772-6aba6096471e\") " pod="openshift-controller-manager/controller-manager-86c47b858b-ktqm8" Apr 02 13:42:23 crc kubenswrapper[4732]: I0402 13:42:23.754146 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/28e25a15-5386-4a3e-a772-6aba6096471e-proxy-ca-bundles\") pod \"controller-manager-86c47b858b-ktqm8\" (UID: \"28e25a15-5386-4a3e-a772-6aba6096471e\") " pod="openshift-controller-manager/controller-manager-86c47b858b-ktqm8" Apr 02 13:42:23 crc kubenswrapper[4732]: I0402 13:42:23.754284 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28e25a15-5386-4a3e-a772-6aba6096471e-config\") pod \"controller-manager-86c47b858b-ktqm8\" (UID: \"28e25a15-5386-4a3e-a772-6aba6096471e\") " pod="openshift-controller-manager/controller-manager-86c47b858b-ktqm8" Apr 02 13:42:23 crc kubenswrapper[4732]: I0402 13:42:23.765345 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28e25a15-5386-4a3e-a772-6aba6096471e-serving-cert\") pod \"controller-manager-86c47b858b-ktqm8\" (UID: \"28e25a15-5386-4a3e-a772-6aba6096471e\") " pod="openshift-controller-manager/controller-manager-86c47b858b-ktqm8" Apr 02 13:42:23 crc kubenswrapper[4732]: I0402 13:42:23.769003 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4959w\" (UniqueName: \"kubernetes.io/projected/28e25a15-5386-4a3e-a772-6aba6096471e-kube-api-access-4959w\") pod \"controller-manager-86c47b858b-ktqm8\" (UID: \"28e25a15-5386-4a3e-a772-6aba6096471e\") " pod="openshift-controller-manager/controller-manager-86c47b858b-ktqm8" Apr 02 
13:42:23 crc kubenswrapper[4732]: I0402 13:42:23.857980 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-86c47b858b-ktqm8" Apr 02 13:42:29 crc kubenswrapper[4732]: I0402 13:42:29.292940 4732 patch_prober.go:28] interesting pod/image-registry-697d97f7c8-zhxz4 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="Get \"https://10.217.0.15:5000/healthz\": dial tcp 10.217.0.15:5000: connect: connection refused" start-of-body= Apr 02 13:42:29 crc kubenswrapper[4732]: I0402 13:42:29.293487 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" podUID="e2570535-673c-495c-a5aa-392f14ceebb1" containerName="registry" probeResult="failure" output="Get \"https://10.217.0.15:5000/healthz\": dial tcp 10.217.0.15:5000: connect: connection refused" Apr 02 13:42:31 crc kubenswrapper[4732]: W0402 13:42:31.173554 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6ca8cc0_a0bc_47a8_bd4a_d4b69a3f0828.slice/crio-00fb02962ffae3b81c5144caecff72c5ddef198999b02158944d81f60d844b61 WatchSource:0}: Error finding container 00fb02962ffae3b81c5144caecff72c5ddef198999b02158944d81f60d844b61: Status 404 returned error can't find the container with id 00fb02962ffae3b81c5144caecff72c5ddef198999b02158944d81f60d844b61 Apr 02 13:42:31 crc kubenswrapper[4732]: I0402 13:42:31.250953 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-86zsp" Apr 02 13:42:31 crc kubenswrapper[4732]: I0402 13:42:31.286690 4732 patch_prober.go:28] interesting pod/console-f9d7485db-dq9x9 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.29:8443/health\": dial tcp 10.217.0.29:8443: connect: connection refused" start-of-body= Apr 02 13:42:31 crc kubenswrapper[4732]: I0402 13:42:31.287019 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-dq9x9" podUID="4d77c191-7d04-4381-838f-b7a355e7c2d4" containerName="console" probeResult="failure" output="Get \"https://10.217.0.29:8443/health\": dial tcp 10.217.0.29:8443: connect: connection refused" Apr 02 13:42:31 crc kubenswrapper[4732]: I0402 13:42:31.367017 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c70b5281-74d8-44ff-8f4b-326a3d7192aa-v4-0-config-system-ocp-branding-template\") pod \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\" (UID: \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\") " Apr 02 13:42:31 crc kubenswrapper[4732]: I0402 13:42:31.367086 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c70b5281-74d8-44ff-8f4b-326a3d7192aa-v4-0-config-user-idp-0-file-data\") pod \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\" (UID: \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\") " Apr 02 13:42:31 crc kubenswrapper[4732]: I0402 13:42:31.367110 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c70b5281-74d8-44ff-8f4b-326a3d7192aa-v4-0-config-user-template-login\") pod \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\" (UID: \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\") " Apr 02 13:42:31 crc kubenswrapper[4732]: I0402 
13:42:31.367128 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c70b5281-74d8-44ff-8f4b-326a3d7192aa-v4-0-config-system-cliconfig\") pod \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\" (UID: \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\") " Apr 02 13:42:31 crc kubenswrapper[4732]: I0402 13:42:31.367145 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c70b5281-74d8-44ff-8f4b-326a3d7192aa-v4-0-config-system-session\") pod \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\" (UID: \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\") " Apr 02 13:42:31 crc kubenswrapper[4732]: I0402 13:42:31.367162 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c70b5281-74d8-44ff-8f4b-326a3d7192aa-v4-0-config-user-template-provider-selection\") pod \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\" (UID: \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\") " Apr 02 13:42:31 crc kubenswrapper[4732]: I0402 13:42:31.367178 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c70b5281-74d8-44ff-8f4b-326a3d7192aa-audit-dir\") pod \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\" (UID: \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\") " Apr 02 13:42:31 crc kubenswrapper[4732]: I0402 13:42:31.367197 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c70b5281-74d8-44ff-8f4b-326a3d7192aa-v4-0-config-system-router-certs\") pod \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\" (UID: \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\") " Apr 02 13:42:31 crc kubenswrapper[4732]: I0402 13:42:31.367215 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c70b5281-74d8-44ff-8f4b-326a3d7192aa-audit-policies\") pod \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\" (UID: \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\") " Apr 02 13:42:31 crc kubenswrapper[4732]: I0402 13:42:31.367244 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmhxj\" (UniqueName: \"kubernetes.io/projected/c70b5281-74d8-44ff-8f4b-326a3d7192aa-kube-api-access-tmhxj\") pod \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\" (UID: \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\") " Apr 02 13:42:31 crc kubenswrapper[4732]: I0402 13:42:31.367259 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c70b5281-74d8-44ff-8f4b-326a3d7192aa-v4-0-config-user-template-error\") pod \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\" (UID: \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\") " Apr 02 13:42:31 crc kubenswrapper[4732]: I0402 13:42:31.367294 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c70b5281-74d8-44ff-8f4b-326a3d7192aa-v4-0-config-system-serving-cert\") pod \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\" (UID: \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\") " Apr 02 13:42:31 crc kubenswrapper[4732]: I0402 13:42:31.367335 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c70b5281-74d8-44ff-8f4b-326a3d7192aa-v4-0-config-system-service-ca\") pod \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\" (UID: \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\") " Apr 02 13:42:31 crc kubenswrapper[4732]: I0402 13:42:31.367355 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c70b5281-74d8-44ff-8f4b-326a3d7192aa-v4-0-config-system-trusted-ca-bundle\") pod \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\" (UID: \"c70b5281-74d8-44ff-8f4b-326a3d7192aa\") " Apr 02 13:42:31 crc kubenswrapper[4732]: I0402 13:42:31.368992 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c70b5281-74d8-44ff-8f4b-326a3d7192aa-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "c70b5281-74d8-44ff-8f4b-326a3d7192aa" (UID: "c70b5281-74d8-44ff-8f4b-326a3d7192aa"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:42:31 crc kubenswrapper[4732]: I0402 13:42:31.369094 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c70b5281-74d8-44ff-8f4b-326a3d7192aa-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "c70b5281-74d8-44ff-8f4b-326a3d7192aa" (UID: "c70b5281-74d8-44ff-8f4b-326a3d7192aa"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 13:42:31 crc kubenswrapper[4732]: I0402 13:42:31.369205 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c70b5281-74d8-44ff-8f4b-326a3d7192aa-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "c70b5281-74d8-44ff-8f4b-326a3d7192aa" (UID: "c70b5281-74d8-44ff-8f4b-326a3d7192aa"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:42:31 crc kubenswrapper[4732]: I0402 13:42:31.369736 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c70b5281-74d8-44ff-8f4b-326a3d7192aa-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "c70b5281-74d8-44ff-8f4b-326a3d7192aa" (UID: "c70b5281-74d8-44ff-8f4b-326a3d7192aa"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:42:31 crc kubenswrapper[4732]: I0402 13:42:31.369825 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c70b5281-74d8-44ff-8f4b-326a3d7192aa-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "c70b5281-74d8-44ff-8f4b-326a3d7192aa" (UID: "c70b5281-74d8-44ff-8f4b-326a3d7192aa"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:42:31 crc kubenswrapper[4732]: I0402 13:42:31.400339 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c70b5281-74d8-44ff-8f4b-326a3d7192aa-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "c70b5281-74d8-44ff-8f4b-326a3d7192aa" (UID: "c70b5281-74d8-44ff-8f4b-326a3d7192aa"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 13:42:31 crc kubenswrapper[4732]: I0402 13:42:31.409891 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c70b5281-74d8-44ff-8f4b-326a3d7192aa-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "c70b5281-74d8-44ff-8f4b-326a3d7192aa" (UID: "c70b5281-74d8-44ff-8f4b-326a3d7192aa"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 13:42:31 crc kubenswrapper[4732]: I0402 13:42:31.414889 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c70b5281-74d8-44ff-8f4b-326a3d7192aa-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "c70b5281-74d8-44ff-8f4b-326a3d7192aa" (UID: "c70b5281-74d8-44ff-8f4b-326a3d7192aa"). 
InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 13:42:31 crc kubenswrapper[4732]: I0402 13:42:31.415262 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c70b5281-74d8-44ff-8f4b-326a3d7192aa-kube-api-access-tmhxj" (OuterVolumeSpecName: "kube-api-access-tmhxj") pod "c70b5281-74d8-44ff-8f4b-326a3d7192aa" (UID: "c70b5281-74d8-44ff-8f4b-326a3d7192aa"). InnerVolumeSpecName "kube-api-access-tmhxj". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:42:31 crc kubenswrapper[4732]: I0402 13:42:31.415995 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c70b5281-74d8-44ff-8f4b-326a3d7192aa-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "c70b5281-74d8-44ff-8f4b-326a3d7192aa" (UID: "c70b5281-74d8-44ff-8f4b-326a3d7192aa"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 13:42:31 crc kubenswrapper[4732]: I0402 13:42:31.416281 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c70b5281-74d8-44ff-8f4b-326a3d7192aa-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "c70b5281-74d8-44ff-8f4b-326a3d7192aa" (UID: "c70b5281-74d8-44ff-8f4b-326a3d7192aa"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 13:42:31 crc kubenswrapper[4732]: I0402 13:42:31.416477 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c70b5281-74d8-44ff-8f4b-326a3d7192aa-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "c70b5281-74d8-44ff-8f4b-326a3d7192aa" (UID: "c70b5281-74d8-44ff-8f4b-326a3d7192aa"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 13:42:31 crc kubenswrapper[4732]: I0402 13:42:31.416745 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c70b5281-74d8-44ff-8f4b-326a3d7192aa-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "c70b5281-74d8-44ff-8f4b-326a3d7192aa" (UID: "c70b5281-74d8-44ff-8f4b-326a3d7192aa"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 13:42:31 crc kubenswrapper[4732]: I0402 13:42:31.416863 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c70b5281-74d8-44ff-8f4b-326a3d7192aa-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "c70b5281-74d8-44ff-8f4b-326a3d7192aa" (UID: "c70b5281-74d8-44ff-8f4b-326a3d7192aa"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 13:42:31 crc kubenswrapper[4732]: I0402 13:42:31.442247 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-85455b8986-kh7t7"] Apr 02 13:42:31 crc kubenswrapper[4732]: I0402 13:42:31.459734 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68b6f48864-m96db" event={"ID":"a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828","Type":"ContainerStarted","Data":"00fb02962ffae3b81c5144caecff72c5ddef198999b02158944d81f60d844b61"} Apr 02 13:42:31 crc kubenswrapper[4732]: I0402 13:42:31.461270 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-86zsp" event={"ID":"c70b5281-74d8-44ff-8f4b-326a3d7192aa","Type":"ContainerDied","Data":"bec58acb001e8a9e7a6f7d0fc19d800961db922cfa11c56d9ccf308011473970"} Apr 02 13:42:31 crc kubenswrapper[4732]: I0402 13:42:31.461334 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-86zsp" Apr 02 13:42:31 crc kubenswrapper[4732]: I0402 13:42:31.468564 4732 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c70b5281-74d8-44ff-8f4b-326a3d7192aa-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Apr 02 13:42:31 crc kubenswrapper[4732]: I0402 13:42:31.468595 4732 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c70b5281-74d8-44ff-8f4b-326a3d7192aa-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Apr 02 13:42:31 crc kubenswrapper[4732]: I0402 13:42:31.468605 4732 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c70b5281-74d8-44ff-8f4b-326a3d7192aa-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Apr 02 13:42:31 crc kubenswrapper[4732]: I0402 13:42:31.468633 4732 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c70b5281-74d8-44ff-8f4b-326a3d7192aa-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Apr 02 13:42:31 crc kubenswrapper[4732]: I0402 13:42:31.468649 4732 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c70b5281-74d8-44ff-8f4b-326a3d7192aa-audit-dir\") on node \"crc\" DevicePath \"\"" Apr 02 13:42:31 crc kubenswrapper[4732]: I0402 13:42:31.468660 4732 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c70b5281-74d8-44ff-8f4b-326a3d7192aa-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Apr 02 13:42:31 crc kubenswrapper[4732]: I0402 13:42:31.468670 4732 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" 
(UniqueName: \"kubernetes.io/configmap/c70b5281-74d8-44ff-8f4b-326a3d7192aa-audit-policies\") on node \"crc\" DevicePath \"\""
Apr 02 13:42:31 crc kubenswrapper[4732]: I0402 13:42:31.468682 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmhxj\" (UniqueName: \"kubernetes.io/projected/c70b5281-74d8-44ff-8f4b-326a3d7192aa-kube-api-access-tmhxj\") on node \"crc\" DevicePath \"\""
Apr 02 13:42:31 crc kubenswrapper[4732]: I0402 13:42:31.468693 4732 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c70b5281-74d8-44ff-8f4b-326a3d7192aa-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Apr 02 13:42:31 crc kubenswrapper[4732]: I0402 13:42:31.468701 4732 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c70b5281-74d8-44ff-8f4b-326a3d7192aa-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Apr 02 13:42:31 crc kubenswrapper[4732]: I0402 13:42:31.468736 4732 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c70b5281-74d8-44ff-8f4b-326a3d7192aa-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Apr 02 13:42:31 crc kubenswrapper[4732]: I0402 13:42:31.468746 4732 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c70b5281-74d8-44ff-8f4b-326a3d7192aa-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Apr 02 13:42:31 crc kubenswrapper[4732]: I0402 13:42:31.468754 4732 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c70b5281-74d8-44ff-8f4b-326a3d7192aa-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Apr 02 13:42:31 crc kubenswrapper[4732]: I0402 13:42:31.468763 4732 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c70b5281-74d8-44ff-8f4b-326a3d7192aa-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Apr 02 13:42:31 crc kubenswrapper[4732]: I0402 13:42:31.489789 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-86zsp"]
Apr 02 13:42:31 crc kubenswrapper[4732]: I0402 13:42:31.492366 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-86zsp"]
Apr 02 13:42:31 crc kubenswrapper[4732]: I0402 13:42:31.923987 4732 patch_prober.go:28] interesting pod/machine-config-daemon-6vtmw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Apr 02 13:42:31 crc kubenswrapper[4732]: I0402 13:42:31.924046 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Apr 02 13:42:31 crc kubenswrapper[4732]: I0402 13:42:31.959893 4732 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-86zsp container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.6:6443/healthz\": context deadline exceeded" start-of-body=
Apr 02 13:42:31 crc kubenswrapper[4732]: I0402 13:42:31.959949 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-86zsp" podUID="c70b5281-74d8-44ff-8f4b-326a3d7192aa" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.6:6443/healthz\": context deadline exceeded"
Apr 02 13:42:32 crc kubenswrapper[4732]: I0402 13:42:32.695248 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c70b5281-74d8-44ff-8f4b-326a3d7192aa" path="/var/lib/kubelet/pods/c70b5281-74d8-44ff-8f4b-326a3d7192aa/volumes"
Apr 02 13:42:33 crc kubenswrapper[4732]: E0402 13:42:33.249030 4732 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Apr 02 13:42:33 crc kubenswrapper[4732]: E0402 13:42:33.249676 4732 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m8zl9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-qlgxl_openshift-marketplace(1827909b-49ea-4ba8-9995-f525d1d82f45): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Apr 02 13:42:33 crc kubenswrapper[4732]: E0402 13:42:33.250847 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-qlgxl" podUID="1827909b-49ea-4ba8-9995-f525d1d82f45"
Apr 02 13:42:36 crc kubenswrapper[4732]: I0402 13:42:36.201152 4732 ???:1] "http: TLS handshake error from 192.168.126.11:57512: no serving certificate available for the kubelet"
Apr 02 13:42:36 crc kubenswrapper[4732]: I0402 13:42:36.640722 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-5f47bfd98d-lmc96"]
Apr 02 13:42:36 crc kubenswrapper[4732]: E0402 13:42:36.640947 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c70b5281-74d8-44ff-8f4b-326a3d7192aa" containerName="oauth-openshift"
Apr 02 13:42:36 crc kubenswrapper[4732]: I0402 13:42:36.640957 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="c70b5281-74d8-44ff-8f4b-326a3d7192aa" containerName="oauth-openshift"
Apr 02 13:42:36 crc kubenswrapper[4732]: I0402 13:42:36.641053 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="c70b5281-74d8-44ff-8f4b-326a3d7192aa" containerName="oauth-openshift"
Apr 02 13:42:36 crc kubenswrapper[4732]: I0402 13:42:36.641416 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5f47bfd98d-lmc96"
Apr 02 13:42:36 crc kubenswrapper[4732]: I0402 13:42:36.650057 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Apr 02 13:42:36 crc kubenswrapper[4732]: I0402 13:42:36.650318 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Apr 02 13:42:36 crc kubenswrapper[4732]: I0402 13:42:36.650669 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Apr 02 13:42:36 crc kubenswrapper[4732]: I0402 13:42:36.652407 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Apr 02 13:42:36 crc kubenswrapper[4732]: I0402 13:42:36.654786 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Apr 02 13:42:36 crc kubenswrapper[4732]: I0402 13:42:36.654844 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Apr 02 13:42:36 crc kubenswrapper[4732]: I0402 13:42:36.654907 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Apr 02 13:42:36 crc kubenswrapper[4732]: I0402 13:42:36.654913 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Apr 02 13:42:36 crc kubenswrapper[4732]: I0402 13:42:36.655018 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Apr 02 13:42:36 crc kubenswrapper[4732]: I0402 13:42:36.655046 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Apr 02 13:42:36 crc kubenswrapper[4732]: I0402 13:42:36.656289 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Apr 02 13:42:36 crc kubenswrapper[4732]: I0402 13:42:36.657293 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Apr 02 13:42:36 crc kubenswrapper[4732]: I0402 13:42:36.663164 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Apr 02 13:42:36 crc kubenswrapper[4732]: I0402 13:42:36.664365 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5f47bfd98d-lmc96"]
Apr 02 13:42:36 crc kubenswrapper[4732]: I0402 13:42:36.666638 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Apr 02 13:42:36 crc kubenswrapper[4732]: I0402 13:42:36.667250 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Apr 02 13:42:36 crc kubenswrapper[4732]: I0402 13:42:36.753604 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3632e96a-6031-4b71-9651-bfef4302cdc1-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5f47bfd98d-lmc96\" (UID: \"3632e96a-6031-4b71-9651-bfef4302cdc1\") " pod="openshift-authentication/oauth-openshift-5f47bfd98d-lmc96"
Apr 02 13:42:36 crc kubenswrapper[4732]: I0402 13:42:36.753999 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3632e96a-6031-4b71-9651-bfef4302cdc1-v4-0-config-user-template-error\") pod \"oauth-openshift-5f47bfd98d-lmc96\" (UID: \"3632e96a-6031-4b71-9651-bfef4302cdc1\") " pod="openshift-authentication/oauth-openshift-5f47bfd98d-lmc96"
Apr 02 13:42:36 crc kubenswrapper[4732]: I0402 13:42:36.754039 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3632e96a-6031-4b71-9651-bfef4302cdc1-v4-0-config-system-service-ca\") pod \"oauth-openshift-5f47bfd98d-lmc96\" (UID: \"3632e96a-6031-4b71-9651-bfef4302cdc1\") " pod="openshift-authentication/oauth-openshift-5f47bfd98d-lmc96"
Apr 02 13:42:36 crc kubenswrapper[4732]: I0402 13:42:36.754063 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3632e96a-6031-4b71-9651-bfef4302cdc1-v4-0-config-system-session\") pod \"oauth-openshift-5f47bfd98d-lmc96\" (UID: \"3632e96a-6031-4b71-9651-bfef4302cdc1\") " pod="openshift-authentication/oauth-openshift-5f47bfd98d-lmc96"
Apr 02 13:42:36 crc kubenswrapper[4732]: I0402 13:42:36.754101 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlgk2\" (UniqueName: \"kubernetes.io/projected/3632e96a-6031-4b71-9651-bfef4302cdc1-kube-api-access-hlgk2\") pod \"oauth-openshift-5f47bfd98d-lmc96\" (UID: \"3632e96a-6031-4b71-9651-bfef4302cdc1\") " pod="openshift-authentication/oauth-openshift-5f47bfd98d-lmc96"
Apr 02 13:42:36 crc kubenswrapper[4732]: I0402 13:42:36.754157 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3632e96a-6031-4b71-9651-bfef4302cdc1-v4-0-config-system-router-certs\") pod \"oauth-openshift-5f47bfd98d-lmc96\" (UID: \"3632e96a-6031-4b71-9651-bfef4302cdc1\") " pod="openshift-authentication/oauth-openshift-5f47bfd98d-lmc96"
Apr 02 13:42:36 crc kubenswrapper[4732]: I0402 13:42:36.754217 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3632e96a-6031-4b71-9651-bfef4302cdc1-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5f47bfd98d-lmc96\" (UID: \"3632e96a-6031-4b71-9651-bfef4302cdc1\") " pod="openshift-authentication/oauth-openshift-5f47bfd98d-lmc96"
Apr 02 13:42:36 crc kubenswrapper[4732]: I0402 13:42:36.754252 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3632e96a-6031-4b71-9651-bfef4302cdc1-audit-dir\") pod \"oauth-openshift-5f47bfd98d-lmc96\" (UID: \"3632e96a-6031-4b71-9651-bfef4302cdc1\") " pod="openshift-authentication/oauth-openshift-5f47bfd98d-lmc96"
Apr 02 13:42:36 crc kubenswrapper[4732]: I0402 13:42:36.754276 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3632e96a-6031-4b71-9651-bfef4302cdc1-audit-policies\") pod \"oauth-openshift-5f47bfd98d-lmc96\" (UID: \"3632e96a-6031-4b71-9651-bfef4302cdc1\") " pod="openshift-authentication/oauth-openshift-5f47bfd98d-lmc96"
Apr 02 13:42:36 crc kubenswrapper[4732]: I0402 13:42:36.754318 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3632e96a-6031-4b71-9651-bfef4302cdc1-v4-0-config-user-template-login\") pod \"oauth-openshift-5f47bfd98d-lmc96\" (UID: \"3632e96a-6031-4b71-9651-bfef4302cdc1\") " pod="openshift-authentication/oauth-openshift-5f47bfd98d-lmc96"
Apr 02 13:42:36 crc kubenswrapper[4732]: I0402 13:42:36.754341 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3632e96a-6031-4b71-9651-bfef4302cdc1-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5f47bfd98d-lmc96\" (UID: \"3632e96a-6031-4b71-9651-bfef4302cdc1\") " pod="openshift-authentication/oauth-openshift-5f47bfd98d-lmc96"
Apr 02 13:42:36 crc kubenswrapper[4732]: I0402 13:42:36.754368 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3632e96a-6031-4b71-9651-bfef4302cdc1-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5f47bfd98d-lmc96\" (UID: \"3632e96a-6031-4b71-9651-bfef4302cdc1\") " pod="openshift-authentication/oauth-openshift-5f47bfd98d-lmc96"
Apr 02 13:42:36 crc kubenswrapper[4732]: I0402 13:42:36.754395 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3632e96a-6031-4b71-9651-bfef4302cdc1-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5f47bfd98d-lmc96\" (UID: \"3632e96a-6031-4b71-9651-bfef4302cdc1\") " pod="openshift-authentication/oauth-openshift-5f47bfd98d-lmc96"
Apr 02 13:42:36 crc kubenswrapper[4732]: I0402 13:42:36.754417 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3632e96a-6031-4b71-9651-bfef4302cdc1-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5f47bfd98d-lmc96\" (UID: \"3632e96a-6031-4b71-9651-bfef4302cdc1\") " pod="openshift-authentication/oauth-openshift-5f47bfd98d-lmc96"
Apr 02 13:42:36 crc kubenswrapper[4732]: I0402 13:42:36.855452 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3632e96a-6031-4b71-9651-bfef4302cdc1-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5f47bfd98d-lmc96\" (UID: \"3632e96a-6031-4b71-9651-bfef4302cdc1\") " pod="openshift-authentication/oauth-openshift-5f47bfd98d-lmc96"
Apr 02 13:42:36 crc kubenswrapper[4732]: I0402 13:42:36.855511 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3632e96a-6031-4b71-9651-bfef4302cdc1-v4-0-config-user-template-error\") pod \"oauth-openshift-5f47bfd98d-lmc96\" (UID: \"3632e96a-6031-4b71-9651-bfef4302cdc1\") " pod="openshift-authentication/oauth-openshift-5f47bfd98d-lmc96"
Apr 02 13:42:36 crc kubenswrapper[4732]: I0402 13:42:36.855546 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3632e96a-6031-4b71-9651-bfef4302cdc1-v4-0-config-system-service-ca\") pod \"oauth-openshift-5f47bfd98d-lmc96\" (UID: \"3632e96a-6031-4b71-9651-bfef4302cdc1\") " pod="openshift-authentication/oauth-openshift-5f47bfd98d-lmc96"
Apr 02 13:42:36 crc kubenswrapper[4732]: I0402 13:42:36.855569 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3632e96a-6031-4b71-9651-bfef4302cdc1-v4-0-config-system-session\") pod \"oauth-openshift-5f47bfd98d-lmc96\" (UID: \"3632e96a-6031-4b71-9651-bfef4302cdc1\") " pod="openshift-authentication/oauth-openshift-5f47bfd98d-lmc96"
Apr 02 13:42:36 crc kubenswrapper[4732]: I0402 13:42:36.855629 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlgk2\" (UniqueName: \"kubernetes.io/projected/3632e96a-6031-4b71-9651-bfef4302cdc1-kube-api-access-hlgk2\") pod \"oauth-openshift-5f47bfd98d-lmc96\" (UID: \"3632e96a-6031-4b71-9651-bfef4302cdc1\") " pod="openshift-authentication/oauth-openshift-5f47bfd98d-lmc96"
Apr 02 13:42:36 crc kubenswrapper[4732]: I0402 13:42:36.855657 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3632e96a-6031-4b71-9651-bfef4302cdc1-v4-0-config-system-router-certs\") pod \"oauth-openshift-5f47bfd98d-lmc96\" (UID: \"3632e96a-6031-4b71-9651-bfef4302cdc1\") " pod="openshift-authentication/oauth-openshift-5f47bfd98d-lmc96"
Apr 02 13:42:36 crc kubenswrapper[4732]: I0402 13:42:36.855701 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3632e96a-6031-4b71-9651-bfef4302cdc1-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5f47bfd98d-lmc96\" (UID: \"3632e96a-6031-4b71-9651-bfef4302cdc1\") " pod="openshift-authentication/oauth-openshift-5f47bfd98d-lmc96"
Apr 02 13:42:36 crc kubenswrapper[4732]: I0402 13:42:36.855724 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3632e96a-6031-4b71-9651-bfef4302cdc1-audit-dir\") pod \"oauth-openshift-5f47bfd98d-lmc96\" (UID: \"3632e96a-6031-4b71-9651-bfef4302cdc1\") " pod="openshift-authentication/oauth-openshift-5f47bfd98d-lmc96"
Apr 02 13:42:36 crc kubenswrapper[4732]: I0402 13:42:36.855741 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3632e96a-6031-4b71-9651-bfef4302cdc1-audit-policies\") pod \"oauth-openshift-5f47bfd98d-lmc96\" (UID: \"3632e96a-6031-4b71-9651-bfef4302cdc1\") " pod="openshift-authentication/oauth-openshift-5f47bfd98d-lmc96"
Apr 02 13:42:36 crc kubenswrapper[4732]: I0402 13:42:36.855763 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3632e96a-6031-4b71-9651-bfef4302cdc1-v4-0-config-user-template-login\") pod \"oauth-openshift-5f47bfd98d-lmc96\" (UID: \"3632e96a-6031-4b71-9651-bfef4302cdc1\") " pod="openshift-authentication/oauth-openshift-5f47bfd98d-lmc96"
Apr 02 13:42:36 crc kubenswrapper[4732]: I0402 13:42:36.855779 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3632e96a-6031-4b71-9651-bfef4302cdc1-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5f47bfd98d-lmc96\" (UID: \"3632e96a-6031-4b71-9651-bfef4302cdc1\") " pod="openshift-authentication/oauth-openshift-5f47bfd98d-lmc96"
Apr 02 13:42:36 crc kubenswrapper[4732]: I0402 13:42:36.855795 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3632e96a-6031-4b71-9651-bfef4302cdc1-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5f47bfd98d-lmc96\" (UID: \"3632e96a-6031-4b71-9651-bfef4302cdc1\") " pod="openshift-authentication/oauth-openshift-5f47bfd98d-lmc96"
Apr 02 13:42:36 crc kubenswrapper[4732]: I0402 13:42:36.855814 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3632e96a-6031-4b71-9651-bfef4302cdc1-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5f47bfd98d-lmc96\" (UID: \"3632e96a-6031-4b71-9651-bfef4302cdc1\") " pod="openshift-authentication/oauth-openshift-5f47bfd98d-lmc96"
Apr 02 13:42:36 crc kubenswrapper[4732]: I0402 13:42:36.855832 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3632e96a-6031-4b71-9651-bfef4302cdc1-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5f47bfd98d-lmc96\" (UID: \"3632e96a-6031-4b71-9651-bfef4302cdc1\") " pod="openshift-authentication/oauth-openshift-5f47bfd98d-lmc96"
Apr 02 13:42:36 crc kubenswrapper[4732]: I0402 13:42:36.856430 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3632e96a-6031-4b71-9651-bfef4302cdc1-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5f47bfd98d-lmc96\" (UID: \"3632e96a-6031-4b71-9651-bfef4302cdc1\") " pod="openshift-authentication/oauth-openshift-5f47bfd98d-lmc96"
Apr 02 13:42:36 crc kubenswrapper[4732]: I0402 13:42:36.856522 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3632e96a-6031-4b71-9651-bfef4302cdc1-audit-dir\") pod \"oauth-openshift-5f47bfd98d-lmc96\" (UID: \"3632e96a-6031-4b71-9651-bfef4302cdc1\") " pod="openshift-authentication/oauth-openshift-5f47bfd98d-lmc96"
Apr 02 13:42:36 crc kubenswrapper[4732]: I0402 13:42:36.857141 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3632e96a-6031-4b71-9651-bfef4302cdc1-v4-0-config-system-service-ca\") pod \"oauth-openshift-5f47bfd98d-lmc96\" (UID: \"3632e96a-6031-4b71-9651-bfef4302cdc1\") " pod="openshift-authentication/oauth-openshift-5f47bfd98d-lmc96"
Apr 02 13:42:36 crc kubenswrapper[4732]: I0402 13:42:36.857997 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3632e96a-6031-4b71-9651-bfef4302cdc1-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5f47bfd98d-lmc96\" (UID: \"3632e96a-6031-4b71-9651-bfef4302cdc1\") " pod="openshift-authentication/oauth-openshift-5f47bfd98d-lmc96"
Apr 02 13:42:36 crc kubenswrapper[4732]: I0402 13:42:36.858376 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3632e96a-6031-4b71-9651-bfef4302cdc1-audit-policies\") pod \"oauth-openshift-5f47bfd98d-lmc96\" (UID: \"3632e96a-6031-4b71-9651-bfef4302cdc1\") " pod="openshift-authentication/oauth-openshift-5f47bfd98d-lmc96"
Apr 02 13:42:36 crc kubenswrapper[4732]: I0402 13:42:36.862123 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3632e96a-6031-4b71-9651-bfef4302cdc1-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5f47bfd98d-lmc96\" (UID: \"3632e96a-6031-4b71-9651-bfef4302cdc1\") " pod="openshift-authentication/oauth-openshift-5f47bfd98d-lmc96"
Apr 02 13:42:36 crc kubenswrapper[4732]: I0402 13:42:36.869990 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3632e96a-6031-4b71-9651-bfef4302cdc1-v4-0-config-user-template-error\") pod \"oauth-openshift-5f47bfd98d-lmc96\" (UID: \"3632e96a-6031-4b71-9651-bfef4302cdc1\") " pod="openshift-authentication/oauth-openshift-5f47bfd98d-lmc96"
Apr 02 13:42:36 crc kubenswrapper[4732]: I0402 13:42:36.873074 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3632e96a-6031-4b71-9651-bfef4302cdc1-v4-0-config-user-template-login\") pod \"oauth-openshift-5f47bfd98d-lmc96\" (UID: \"3632e96a-6031-4b71-9651-bfef4302cdc1\") " pod="openshift-authentication/oauth-openshift-5f47bfd98d-lmc96"
Apr 02 13:42:36 crc kubenswrapper[4732]: I0402 13:42:36.873498 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3632e96a-6031-4b71-9651-bfef4302cdc1-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5f47bfd98d-lmc96\" (UID: \"3632e96a-6031-4b71-9651-bfef4302cdc1\") " pod="openshift-authentication/oauth-openshift-5f47bfd98d-lmc96"
Apr 02 13:42:36 crc kubenswrapper[4732]: I0402 13:42:36.887452 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3632e96a-6031-4b71-9651-bfef4302cdc1-v4-0-config-system-router-certs\") pod \"oauth-openshift-5f47bfd98d-lmc96\" (UID: \"3632e96a-6031-4b71-9651-bfef4302cdc1\") " pod="openshift-authentication/oauth-openshift-5f47bfd98d-lmc96"
Apr 02 13:42:36 crc kubenswrapper[4732]: I0402 13:42:36.889033 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3632e96a-6031-4b71-9651-bfef4302cdc1-v4-0-config-system-session\") pod \"oauth-openshift-5f47bfd98d-lmc96\" (UID: \"3632e96a-6031-4b71-9651-bfef4302cdc1\") " pod="openshift-authentication/oauth-openshift-5f47bfd98d-lmc96"
Apr 02 13:42:36 crc kubenswrapper[4732]: I0402 13:42:36.889238 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlgk2\" (UniqueName: \"kubernetes.io/projected/3632e96a-6031-4b71-9651-bfef4302cdc1-kube-api-access-hlgk2\") pod \"oauth-openshift-5f47bfd98d-lmc96\" (UID: \"3632e96a-6031-4b71-9651-bfef4302cdc1\") " pod="openshift-authentication/oauth-openshift-5f47bfd98d-lmc96"
Apr 02 13:42:36 crc kubenswrapper[4732]: I0402 13:42:36.889993 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3632e96a-6031-4b71-9651-bfef4302cdc1-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5f47bfd98d-lmc96\" (UID: \"3632e96a-6031-4b71-9651-bfef4302cdc1\") " pod="openshift-authentication/oauth-openshift-5f47bfd98d-lmc96"
Apr 02 13:42:36 crc kubenswrapper[4732]: I0402 13:42:36.890235 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3632e96a-6031-4b71-9651-bfef4302cdc1-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5f47bfd98d-lmc96\" (UID: \"3632e96a-6031-4b71-9651-bfef4302cdc1\") " pod="openshift-authentication/oauth-openshift-5f47bfd98d-lmc96"
Apr 02 13:42:36 crc kubenswrapper[4732]: I0402 13:42:36.903230 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bfd65bd69-hd7v4"]
Apr 02 13:42:36 crc kubenswrapper[4732]: I0402 13:42:36.974325 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5f47bfd98d-lmc96"
Apr 02 13:42:37 crc kubenswrapper[4732]: E0402 13:42:37.428700 4732 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18"
Apr 02 13:42:37 crc kubenswrapper[4732]: E0402 13:42:37.428846 4732 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4tcvq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-lvckz_openshift-marketplace(33708fee-32a5-4418-81d0-226813150db7): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Apr 02 13:42:37 crc kubenswrapper[4732]: E0402 13:42:37.430153 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-lvckz" podUID="33708fee-32a5-4418-81d0-226813150db7"
Apr 02 13:42:37 crc kubenswrapper[4732]: E0402 13:42:37.436856 4732 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Apr 02 13:42:37 crc kubenswrapper[4732]: E0402 13:42:37.437017 4732 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pnp2h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-fd4xj_openshift-marketplace(6ac64cdf-a607-481a-9907-e6e72fc8b083): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Apr 02 13:42:37 crc kubenswrapper[4732]: E0402 13:42:37.438184 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-fd4xj" podUID="6ac64cdf-a607-481a-9907-e6e72fc8b083"
Apr 02 13:42:37 crc kubenswrapper[4732]: E0402 13:42:37.465748 4732 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18"
Apr 02 13:42:37 crc kubenswrapper[4732]: E0402 13:42:37.465914 4732 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wxmng,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-vgjz4_openshift-marketplace(50d43b2d-24ec-439f-a418-3673791eb1b1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Apr 02 13:42:37 crc kubenswrapper[4732]: E0402 13:42:37.467484 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-vgjz4" podUID="50d43b2d-24ec-439f-a418-3673791eb1b1"
Apr 02 13:42:39 crc kubenswrapper[4732]: E0402 13:42:39.414527 4732 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Apr 02 13:42:39 crc kubenswrapper[4732]: E0402 13:42:39.415159 4732 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r4z2m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-4m6rj_openshift-marketplace(cf030ff0-459d-4453-975f-19ba4ff9641a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Apr 02 13:42:39 crc kubenswrapper[4732]: E0402 13:42:39.416688 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-4m6rj" podUID="cf030ff0-459d-4453-975f-19ba4ff9641a" Apr 02 13:42:39 crc kubenswrapper[4732]: I0402 13:42:39.645610 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-d9c747dbb-jqfln"] Apr 02 13:42:41 crc kubenswrapper[4732]: I0402 13:42:41.281295 4732 patch_prober.go:28] interesting pod/console-f9d7485db-dq9x9 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.29:8443/health\": dial tcp 10.217.0.29:8443: connect: connection refused" start-of-body= Apr 02 13:42:41 crc kubenswrapper[4732]: I0402 13:42:41.281555 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-dq9x9" podUID="4d77c191-7d04-4381-838f-b7a355e7c2d4" containerName="console" probeResult="failure" output="Get \"https://10.217.0.29:8443/health\": dial tcp 10.217.0.29:8443: connect: connection refused" Apr 02 13:42:42 crc kubenswrapper[4732]: I0402 13:42:42.536561 4732 generic.go:334] "Generic (PLEG): container finished" podID="fc020ab9-6c58-4571-ad12-3e22c8472a85" containerID="6414e232a13670a015e78773cfa4181799b39950e3f68a56932b2e5ff06545ad" exitCode=0 Apr 02 13:42:42 crc kubenswrapper[4732]: I0402 13:42:42.536662 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t8m6h" 
event={"ID":"fc020ab9-6c58-4571-ad12-3e22c8472a85","Type":"ContainerDied","Data":"6414e232a13670a015e78773cfa4181799b39950e3f68a56932b2e5ff06545ad"} Apr 02 13:42:42 crc kubenswrapper[4732]: I0402 13:42:42.537275 4732 scope.go:117] "RemoveContainer" containerID="6414e232a13670a015e78773cfa4181799b39950e3f68a56932b2e5ff06545ad" Apr 02 13:42:43 crc kubenswrapper[4732]: I0402 13:42:43.543331 4732 generic.go:334] "Generic (PLEG): container finished" podID="d0bcf0e9-75b6-443e-afd8-e2fb6f807e90" containerID="14f469ff307311640c12eb90b518e9de9142f56c60b9bbd2a39f4c66c4f401ec" exitCode=0 Apr 02 13:42:43 crc kubenswrapper[4732]: I0402 13:42:43.543405 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-95tg7" event={"ID":"d0bcf0e9-75b6-443e-afd8-e2fb6f807e90","Type":"ContainerDied","Data":"14f469ff307311640c12eb90b518e9de9142f56c60b9bbd2a39f4c66c4f401ec"} Apr 02 13:42:43 crc kubenswrapper[4732]: I0402 13:42:43.543883 4732 scope.go:117] "RemoveContainer" containerID="14f469ff307311640c12eb90b518e9de9142f56c60b9bbd2a39f4c66c4f401ec" Apr 02 13:42:43 crc kubenswrapper[4732]: I0402 13:42:43.546343 4732 generic.go:334] "Generic (PLEG): container finished" podID="ac5f29d0-13ce-46eb-babc-70f32ac34feb" containerID="a38446738c3b02bb950d259ff1359bb753417d4cfcd5b6717155d6ea4a15d11d" exitCode=0 Apr 02 13:42:43 crc kubenswrapper[4732]: I0402 13:42:43.546368 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sxw8j" event={"ID":"ac5f29d0-13ce-46eb-babc-70f32ac34feb","Type":"ContainerDied","Data":"a38446738c3b02bb950d259ff1359bb753417d4cfcd5b6717155d6ea4a15d11d"} Apr 02 13:42:43 crc kubenswrapper[4732]: I0402 13:42:43.546744 4732 scope.go:117] "RemoveContainer" containerID="a38446738c3b02bb950d259ff1359bb753417d4cfcd5b6717155d6ea4a15d11d" Apr 02 13:42:43 crc kubenswrapper[4732]: I0402 13:42:43.864506 4732 scope.go:117] 
"RemoveContainer" containerID="574ebdb08baf59a41093e85e00cad943cfebcc0093f4627d5e3b85709f8f98e0" Apr 02 13:42:44 crc kubenswrapper[4732]: I0402 13:42:44.150098 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585622-lvtgw"] Apr 02 13:42:44 crc kubenswrapper[4732]: I0402 13:42:44.294991 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-10-crc"] Apr 02 13:42:44 crc kubenswrapper[4732]: I0402 13:42:44.303198 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-11-crc"] Apr 02 13:42:44 crc kubenswrapper[4732]: I0402 13:42:44.311633 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/installer-10-crc"] Apr 02 13:42:44 crc kubenswrapper[4732]: I0402 13:42:44.331675 4732 patch_prober.go:28] interesting pod/image-registry-697d97f7c8-zhxz4 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="Get \"https://10.217.0.15:5000/healthz\": dial tcp 10.217.0.15:5000: i/o timeout" start-of-body= Apr 02 13:42:44 crc kubenswrapper[4732]: I0402 13:42:44.331726 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" podUID="e2570535-673c-495c-a5aa-392f14ceebb1" containerName="registry" probeResult="failure" output="Get \"https://10.217.0.15:5000/healthz\": dial tcp 10.217.0.15:5000: i/o timeout" Apr 02 13:42:44 crc kubenswrapper[4732]: W0402 13:42:44.387754 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0664f762_59d2_4c95_8e8f_698af0b15611.slice/crio-e2727925452645522fe09d85f2fdfb444185bd45a32d35c8636aebabef125bc3 WatchSource:0}: Error finding container e2727925452645522fe09d85f2fdfb444185bd45a32d35c8636aebabef125bc3: Status 404 returned error can't find the container with id 
e2727925452645522fe09d85f2fdfb444185bd45a32d35c8636aebabef125bc3 Apr 02 13:42:44 crc kubenswrapper[4732]: E0402 13:42:44.420079 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-fd4xj" podUID="6ac64cdf-a607-481a-9907-e6e72fc8b083" Apr 02 13:42:44 crc kubenswrapper[4732]: E0402 13:42:44.420438 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-4m6rj" podUID="cf030ff0-459d-4453-975f-19ba4ff9641a" Apr 02 13:42:44 crc kubenswrapper[4732]: E0402 13:42:44.420579 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-lvckz" podUID="33708fee-32a5-4418-81d0-226813150db7" Apr 02 13:42:44 crc kubenswrapper[4732]: E0402 13:42:44.420753 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-vgjz4" podUID="50d43b2d-24ec-439f-a418-3673791eb1b1" Apr 02 13:42:44 crc kubenswrapper[4732]: I0402 13:42:44.438251 4732 scope.go:117] "RemoveContainer" containerID="30a82c816feba70dd1579b0c688c30adec2675441e955350d7487435fabf989c" Apr 02 13:42:44 crc kubenswrapper[4732]: I0402 13:42:44.446767 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" Apr 02 13:42:44 crc kubenswrapper[4732]: I0402 13:42:44.552816 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-11-crc" event={"ID":"d8ba08f5-0264-484a-a73a-a8659ce79e10","Type":"ContainerStarted","Data":"df57ac8179fa62ea769a42fe4092e643a87561375af704e81c8011a359aee99c"} Apr 02 13:42:44 crc kubenswrapper[4732]: I0402 13:42:44.553625 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-85455b8986-kh7t7" event={"ID":"0664f762-59d2-4c95-8e8f-698af0b15611","Type":"ContainerStarted","Data":"e2727925452645522fe09d85f2fdfb444185bd45a32d35c8636aebabef125bc3"} Apr 02 13:42:44 crc kubenswrapper[4732]: I0402 13:42:44.554566 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585622-lvtgw" event={"ID":"65013539-f3b4-4513-881a-14408a922424","Type":"ContainerStarted","Data":"fcc913ed751fad1711a437b19556ca5c9a9cf874e1130911ecfa456a5ddae367"} Apr 02 13:42:44 crc kubenswrapper[4732]: I0402 13:42:44.556030 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" event={"ID":"e2570535-673c-495c-a5aa-392f14ceebb1","Type":"ContainerDied","Data":"17532bd5afe9e00601608767d5760acdc596adaab8fba875408b53a8ecf51ebf"} Apr 02 13:42:44 crc kubenswrapper[4732]: I0402 13:42:44.556091 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-zhxz4" Apr 02 13:42:44 crc kubenswrapper[4732]: I0402 13:42:44.559936 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-d9c747dbb-jqfln" event={"ID":"57f69fbf-b90f-411e-8e0c-70a25bb01566","Type":"ContainerStarted","Data":"bef1f039b81372c2a0160ad8768c3351ed910c6e18562099f965ed401a3b2142"} Apr 02 13:42:44 crc kubenswrapper[4732]: I0402 13:42:44.561687 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-10-crc" event={"ID":"34ac97a0-6ef7-4d42-84e9-8926e28a822d","Type":"ContainerStarted","Data":"3b54d41a7194329f721b5e5f726ebb7814762a26e10a1e76fcbe823328d2349e"} Apr 02 13:42:44 crc kubenswrapper[4732]: I0402 13:42:44.591757 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-7-crc"] Apr 02 13:42:44 crc kubenswrapper[4732]: I0402 13:42:44.613735 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gdbx\" (UniqueName: \"kubernetes.io/projected/e2570535-673c-495c-a5aa-392f14ceebb1-kube-api-access-5gdbx\") pod \"e2570535-673c-495c-a5aa-392f14ceebb1\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " Apr 02 13:42:44 crc kubenswrapper[4732]: I0402 13:42:44.614506 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e2570535-673c-495c-a5aa-392f14ceebb1-bound-sa-token\") pod \"e2570535-673c-495c-a5aa-392f14ceebb1\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " Apr 02 13:42:44 crc kubenswrapper[4732]: I0402 13:42:44.614535 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e2570535-673c-495c-a5aa-392f14ceebb1-installation-pull-secrets\") pod \"e2570535-673c-495c-a5aa-392f14ceebb1\" (UID: 
\"e2570535-673c-495c-a5aa-392f14ceebb1\") " Apr 02 13:42:44 crc kubenswrapper[4732]: I0402 13:42:44.614557 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2570535-673c-495c-a5aa-392f14ceebb1-trusted-ca\") pod \"e2570535-673c-495c-a5aa-392f14ceebb1\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " Apr 02 13:42:44 crc kubenswrapper[4732]: I0402 13:42:44.614698 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"e2570535-673c-495c-a5aa-392f14ceebb1\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " Apr 02 13:42:44 crc kubenswrapper[4732]: I0402 13:42:44.614750 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e2570535-673c-495c-a5aa-392f14ceebb1-registry-tls\") pod \"e2570535-673c-495c-a5aa-392f14ceebb1\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " Apr 02 13:42:44 crc kubenswrapper[4732]: I0402 13:42:44.614842 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e2570535-673c-495c-a5aa-392f14ceebb1-registry-certificates\") pod \"e2570535-673c-495c-a5aa-392f14ceebb1\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " Apr 02 13:42:44 crc kubenswrapper[4732]: I0402 13:42:44.614870 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e2570535-673c-495c-a5aa-392f14ceebb1-ca-trust-extracted\") pod \"e2570535-673c-495c-a5aa-392f14ceebb1\" (UID: \"e2570535-673c-495c-a5aa-392f14ceebb1\") " Apr 02 13:42:44 crc kubenswrapper[4732]: I0402 13:42:44.626177 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/e2570535-673c-495c-a5aa-392f14ceebb1-kube-api-access-5gdbx" (OuterVolumeSpecName: "kube-api-access-5gdbx") pod "e2570535-673c-495c-a5aa-392f14ceebb1" (UID: "e2570535-673c-495c-a5aa-392f14ceebb1"). InnerVolumeSpecName "kube-api-access-5gdbx". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:42:44 crc kubenswrapper[4732]: I0402 13:42:44.626292 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2570535-673c-495c-a5aa-392f14ceebb1-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "e2570535-673c-495c-a5aa-392f14ceebb1" (UID: "e2570535-673c-495c-a5aa-392f14ceebb1"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:42:44 crc kubenswrapper[4732]: I0402 13:42:44.631186 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Apr 02 13:42:44 crc kubenswrapper[4732]: I0402 13:42:44.647483 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2570535-673c-495c-a5aa-392f14ceebb1-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "e2570535-673c-495c-a5aa-392f14ceebb1" (UID: "e2570535-673c-495c-a5aa-392f14ceebb1"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 13:42:44 crc kubenswrapper[4732]: I0402 13:42:44.655091 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2570535-673c-495c-a5aa-392f14ceebb1-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "e2570535-673c-495c-a5aa-392f14ceebb1" (UID: "e2570535-673c-495c-a5aa-392f14ceebb1"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:42:44 crc kubenswrapper[4732]: I0402 13:42:44.655405 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2570535-673c-495c-a5aa-392f14ceebb1-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "e2570535-673c-495c-a5aa-392f14ceebb1" (UID: "e2570535-673c-495c-a5aa-392f14ceebb1"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:42:44 crc kubenswrapper[4732]: I0402 13:42:44.657509 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2570535-673c-495c-a5aa-392f14ceebb1-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "e2570535-673c-495c-a5aa-392f14ceebb1" (UID: "e2570535-673c-495c-a5aa-392f14ceebb1"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 13:42:44 crc kubenswrapper[4732]: I0402 13:42:44.658726 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2570535-673c-495c-a5aa-392f14ceebb1-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "e2570535-673c-495c-a5aa-392f14ceebb1" (UID: "e2570535-673c-495c-a5aa-392f14ceebb1"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:42:44 crc kubenswrapper[4732]: I0402 13:42:44.659016 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "e2570535-673c-495c-a5aa-392f14ceebb1" (UID: "e2570535-673c-495c-a5aa-392f14ceebb1"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Apr 02 13:42:44 crc kubenswrapper[4732]: W0402 13:42:44.667897 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podc78649c4_5d4a_4dac_a180_6ee450fd150f.slice/crio-dee30ab317976ac46c6ccd04c66dd3ae5297afb2c97933449128c811ffb49903 WatchSource:0}: Error finding container dee30ab317976ac46c6ccd04c66dd3ae5297afb2c97933449128c811ffb49903: Status 404 returned error can't find the container with id dee30ab317976ac46c6ccd04c66dd3ae5297afb2c97933449128c811ffb49903 Apr 02 13:42:44 crc kubenswrapper[4732]: I0402 13:42:44.722403 4732 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e2570535-673c-495c-a5aa-392f14ceebb1-registry-tls\") on node \"crc\" DevicePath \"\"" Apr 02 13:42:44 crc kubenswrapper[4732]: I0402 13:42:44.722447 4732 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e2570535-673c-495c-a5aa-392f14ceebb1-registry-certificates\") on node \"crc\" DevicePath \"\"" Apr 02 13:42:44 crc kubenswrapper[4732]: I0402 13:42:44.722505 4732 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e2570535-673c-495c-a5aa-392f14ceebb1-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Apr 02 13:42:44 crc kubenswrapper[4732]: I0402 13:42:44.722536 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gdbx\" (UniqueName: \"kubernetes.io/projected/e2570535-673c-495c-a5aa-392f14ceebb1-kube-api-access-5gdbx\") on node \"crc\" DevicePath \"\"" Apr 02 13:42:44 crc kubenswrapper[4732]: I0402 13:42:44.722922 4732 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e2570535-673c-495c-a5aa-392f14ceebb1-bound-sa-token\") on node \"crc\" DevicePath \"\"" Apr 02 13:42:44 crc kubenswrapper[4732]: I0402 13:42:44.722936 4732 
reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e2570535-673c-495c-a5aa-392f14ceebb1-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Apr 02 13:42:44 crc kubenswrapper[4732]: I0402 13:42:44.722950 4732 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2570535-673c-495c-a5aa-392f14ceebb1-trusted-ca\") on node \"crc\" DevicePath \"\"" Apr 02 13:42:44 crc kubenswrapper[4732]: I0402 13:42:44.830343 4732 scope.go:117] "RemoveContainer" containerID="cebfc68889bf68f1dee6260dd61cc2f0f7ebd22790507b1bca9eb8eb36c9a1ed" Apr 02 13:42:44 crc kubenswrapper[4732]: I0402 13:42:44.914299 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zhxz4"] Apr 02 13:42:44 crc kubenswrapper[4732]: I0402 13:42:44.916761 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zhxz4"] Apr 02 13:42:45 crc kubenswrapper[4732]: I0402 13:42:45.131228 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5f47bfd98d-lmc96"] Apr 02 13:42:45 crc kubenswrapper[4732]: I0402 13:42:45.135565 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Apr 02 13:42:45 crc kubenswrapper[4732]: I0402 13:42:45.142287 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bfd65bd69-hd7v4"] Apr 02 13:42:45 crc kubenswrapper[4732]: I0402 13:42:45.164212 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-86c47b858b-ktqm8"] Apr 02 13:42:45 crc kubenswrapper[4732]: W0402 13:42:45.187237 4732 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc379e504_2a74_402a_a807_0865a0ada4ba.slice/crio-de69d9113ebe1477bb8867cb7401dc830f8ec90c6d932f0db02d240462de1267 WatchSource:0}: Error finding container de69d9113ebe1477bb8867cb7401dc830f8ec90c6d932f0db02d240462de1267: Status 404 returned error can't find the container with id de69d9113ebe1477bb8867cb7401dc830f8ec90c6d932f0db02d240462de1267 Apr 02 13:42:45 crc kubenswrapper[4732]: W0402 13:42:45.189248 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28e25a15_5386_4a3e_a772_6aba6096471e.slice/crio-00f630d0a24fce94b9715039a314e494d99934dcc71ebd092b78054f5e78c689 WatchSource:0}: Error finding container 00f630d0a24fce94b9715039a314e494d99934dcc71ebd092b78054f5e78c689: Status 404 returned error can't find the container with id 00f630d0a24fce94b9715039a314e494d99934dcc71ebd092b78054f5e78c689 Apr 02 13:42:45 crc kubenswrapper[4732]: I0402 13:42:45.217128 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-11-crc"] Apr 02 13:42:45 crc kubenswrapper[4732]: W0402 13:42:45.247941 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod536912b9_ad03_42a4_bce9_754227ecbf82.slice/crio-a2743df34f20dc28efec2cd4b35b2691ac15e3a08e2667d8a67c1cfa8b626837 WatchSource:0}: Error finding container a2743df34f20dc28efec2cd4b35b2691ac15e3a08e2667d8a67c1cfa8b626837: Status 404 returned error can't find the container with id a2743df34f20dc28efec2cd4b35b2691ac15e3a08e2667d8a67c1cfa8b626837 Apr 02 13:42:45 crc kubenswrapper[4732]: I0402 13:42:45.573460 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86c47b858b-ktqm8" event={"ID":"28e25a15-5386-4a3e-a772-6aba6096471e","Type":"ContainerStarted","Data":"00f630d0a24fce94b9715039a314e494d99934dcc71ebd092b78054f5e78c689"} Apr 02 13:42:45 crc 
kubenswrapper[4732]: I0402 13:42:45.577412 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"85581866-f172-4f4e-8805-55c1f175201d","Type":"ContainerStarted","Data":"ff321f3dcfa693e047dd8803f74b4a338b473cfba76c14e20b65f450f95ac865"} Apr 02 13:42:45 crc kubenswrapper[4732]: I0402 13:42:45.578540 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"293d710f-0d74-455a-ace7-1dcaa32d9b7e","Type":"ContainerStarted","Data":"04767beffe000397e9fddeeaa02e9baf592f7c081aa24fe8c6151a4cdb4e1a41"} Apr 02 13:42:45 crc kubenswrapper[4732]: I0402 13:42:45.581510 4732 generic.go:334] "Generic (PLEG): container finished" podID="36f26e27-d72e-42f2-9380-598616e5626b" containerID="8edceb65839c67f66eed8bc8e4d52eb4c06c7266fbc4da72f6aeb242445e8805" exitCode=0 Apr 02 13:42:45 crc kubenswrapper[4732]: I0402 13:42:45.581573 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f998c" event={"ID":"36f26e27-d72e-42f2-9380-598616e5626b","Type":"ContainerDied","Data":"8edceb65839c67f66eed8bc8e4d52eb4c06c7266fbc4da72f6aeb242445e8805"} Apr 02 13:42:45 crc kubenswrapper[4732]: I0402 13:42:45.582094 4732 scope.go:117] "RemoveContainer" containerID="8edceb65839c67f66eed8bc8e4d52eb4c06c7266fbc4da72f6aeb242445e8805" Apr 02 13:42:45 crc kubenswrapper[4732]: I0402 13:42:45.583512 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5f47bfd98d-lmc96" event={"ID":"3632e96a-6031-4b71-9651-bfef4302cdc1","Type":"ContainerStarted","Data":"d7ce39c953a1f10676d74012bf01437d47390568b365a2a47992b74e479fb944"} Apr 02 13:42:45 crc kubenswrapper[4732]: I0402 13:42:45.596826 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-bfd65bd69-hd7v4" 
event={"ID":"c379e504-2a74-402a-a807-0865a0ada4ba","Type":"ContainerStarted","Data":"de69d9113ebe1477bb8867cb7401dc830f8ec90c6d932f0db02d240462de1267"} Apr 02 13:42:45 crc kubenswrapper[4732]: I0402 13:42:45.606548 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-10-crc" event={"ID":"377f12b4-3628-432e-8132-f75725645672","Type":"ContainerStarted","Data":"7560b3db871dd6db25752f2dbe4b7295b1899afc99f6616ae7cb1caca595a3ae"} Apr 02 13:42:45 crc kubenswrapper[4732]: I0402 13:42:45.607681 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-7-crc" event={"ID":"c78649c4-5d4a-4dac-a180-6ee450fd150f","Type":"ContainerStarted","Data":"dee30ab317976ac46c6ccd04c66dd3ae5297afb2c97933449128c811ffb49903"} Apr 02 13:42:45 crc kubenswrapper[4732]: I0402 13:42:45.608801 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-11-crc" event={"ID":"536912b9-ad03-42a4-bce9-754227ecbf82","Type":"ContainerStarted","Data":"a2743df34f20dc28efec2cd4b35b2691ac15e3a08e2667d8a67c1cfa8b626837"} Apr 02 13:42:46 crc kubenswrapper[4732]: E0402 13:42:46.350250 4732 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Apr 02 13:42:46 crc kubenswrapper[4732]: E0402 13:42:46.350743 4732 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zjk9n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-9lrnf_openshift-marketplace(51a0e365-014c-40e8-8749-7512f2c00758): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Apr 02 13:42:46 crc kubenswrapper[4732]: E0402 13:42:46.352852 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-9lrnf" podUID="51a0e365-014c-40e8-8749-7512f2c00758" Apr 02 13:42:46 crc 
kubenswrapper[4732]: I0402 13:42:46.623127 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-10-crc" event={"ID":"377f12b4-3628-432e-8132-f75725645672","Type":"ContainerStarted","Data":"e56a5c1770b0ed982a28969862361ec3aa0a550013a431010760bdba8ce138c4"} Apr 02 13:42:46 crc kubenswrapper[4732]: I0402 13:42:46.625299 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-85455b8986-kh7t7" event={"ID":"0664f762-59d2-4c95-8e8f-698af0b15611","Type":"ContainerStarted","Data":"7e14aa9bb1ec7d5d53bd39ca3de09280350f61d5807a65f17ec8cff13a157b62"} Apr 02 13:42:46 crc kubenswrapper[4732]: I0402 13:42:46.626920 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-bfd65bd69-hd7v4" event={"ID":"c379e504-2a74-402a-a807-0865a0ada4ba","Type":"ContainerStarted","Data":"c0501f1a3aaf796bc15ec35cecad20d83c066c22c2d1d3117e0ffc5604ad0f31"} Apr 02 13:42:46 crc kubenswrapper[4732]: I0402 13:42:46.628969 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"85581866-f172-4f4e-8805-55c1f175201d","Type":"ContainerStarted","Data":"e6374bf349adff5413cc0d8aed38175d79c4c9b0979472cafc760c99b1204cf2"} Apr 02 13:42:46 crc kubenswrapper[4732]: I0402 13:42:46.637640 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5f47bfd98d-lmc96" event={"ID":"3632e96a-6031-4b71-9651-bfef4302cdc1","Type":"ContainerStarted","Data":"276b3da2bcfbb90722f1b91f5cbf1d7d4a6c7697580e81f7f299a39892fba1fe"} Apr 02 13:42:46 crc kubenswrapper[4732]: I0402 13:42:46.640000 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"293d710f-0d74-455a-ace7-1dcaa32d9b7e","Type":"ContainerStarted","Data":"cd6c91874c8a335b52156c576147b2ae9f33aef4dd90600b84e3ea859663e6f1"} Apr 02 13:42:46 crc 
kubenswrapper[4732]: I0402 13:42:46.646334 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-10-crc" event={"ID":"34ac97a0-6ef7-4d42-84e9-8926e28a822d","Type":"ContainerStarted","Data":"e317b45e0782b66638d2821abb3eaf3f34c0ae712c571e45720b02978806d7bd"} Apr 02 13:42:46 crc kubenswrapper[4732]: I0402 13:42:46.652428 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sxw8j" event={"ID":"ac5f29d0-13ce-46eb-babc-70f32ac34feb","Type":"ContainerStarted","Data":"9761cb07847682e3685770b3c83e8bc7ca190526437027a9b1df7f60053b3296"} Apr 02 13:42:46 crc kubenswrapper[4732]: I0402 13:42:46.656565 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t8m6h" event={"ID":"fc020ab9-6c58-4571-ad12-3e22c8472a85","Type":"ContainerStarted","Data":"418b78c96578df3f70a9fe7080a4416e8fdf4289594d3bcab9123e9d2edef193"} Apr 02 13:42:46 crc kubenswrapper[4732]: I0402 13:42:46.662406 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f998c" event={"ID":"36f26e27-d72e-42f2-9380-598616e5626b","Type":"ContainerStarted","Data":"431cefe2b139e6ba73b61339faf9240f4e1b3a7c3145a0f8889eeb88b815f38a"} Apr 02 13:42:46 crc kubenswrapper[4732]: I0402 13:42:46.664396 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-d9c747dbb-jqfln" event={"ID":"57f69fbf-b90f-411e-8e0c-70a25bb01566","Type":"ContainerStarted","Data":"0ed979adaba6f9d50a6af5062245f390e4ff9c86cf2c4cb22731d7cc664ad8ff"} Apr 02 13:42:46 crc kubenswrapper[4732]: I0402 13:42:46.664760 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-d9c747dbb-jqfln" Apr 02 13:42:46 crc kubenswrapper[4732]: I0402 13:42:46.666560 4732 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-11-crc" event={"ID":"536912b9-ad03-42a4-bce9-754227ecbf82","Type":"ContainerStarted","Data":"34982d3e5cd4a3198e55e7a07fba2bad10f53ca737a09a58e0bb8620786d7555"} Apr 02 13:42:46 crc kubenswrapper[4732]: I0402 13:42:46.667957 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86c47b858b-ktqm8" event={"ID":"28e25a15-5386-4a3e-a772-6aba6096471e","Type":"ContainerStarted","Data":"71318aec4005acc7bb0146248defa3792899a3c86b676740054654f0c884a5b3"} Apr 02 13:42:46 crc kubenswrapper[4732]: I0402 13:42:46.669367 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68b6f48864-m96db" event={"ID":"a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828","Type":"ContainerStarted","Data":"6c36131e366d0172e2e61f924497f33b1ecfc3b51ebfdbbb9cc2bae1361382ce"} Apr 02 13:42:46 crc kubenswrapper[4732]: I0402 13:42:46.670783 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-11-crc" event={"ID":"d8ba08f5-0264-484a-a73a-a8659ce79e10","Type":"ContainerStarted","Data":"3d25971f239eb7e525bf6569daa562afa55bd16b255ab484731b7bba9d60ce25"} Apr 02 13:42:46 crc kubenswrapper[4732]: I0402 13:42:46.678417 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-95tg7" event={"ID":"d0bcf0e9-75b6-443e-afd8-e2fb6f807e90","Type":"ContainerStarted","Data":"15824fd96d99c0c5e109d64efea32a2f8e0b2f69c29932d75189e8e6639aeb10"} Apr 02 13:42:46 crc kubenswrapper[4732]: I0402 13:42:46.699644 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2570535-673c-495c-a5aa-392f14ceebb1" path="/var/lib/kubelet/pods/e2570535-673c-495c-a5aa-392f14ceebb1/volumes" Apr 02 13:42:46 crc kubenswrapper[4732]: E0402 13:42:46.700112 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-9lrnf" podUID="51a0e365-014c-40e8-8749-7512f2c00758" Apr 02 13:42:46 crc kubenswrapper[4732]: I0402 13:42:46.735640 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-68b6f48864-m96db" podStartSLOduration=60.735599272 podStartE2EDuration="1m0.735599272s" podCreationTimestamp="2026-04-02 13:41:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:42:46.732991104 +0000 UTC m=+323.637398667" watchObservedRunningTime="2026-04-02 13:42:46.735599272 +0000 UTC m=+323.640006835" Apr 02 13:42:46 crc kubenswrapper[4732]: I0402 13:42:46.776687 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-7-crc" event={"ID":"c78649c4-5d4a-4dac-a180-6ee450fd150f","Type":"ContainerStarted","Data":"8b5be373b9699d8e3dd7d6b2036f18d8b0bd699e3262f684d778f605e1141f6b"} Apr 02 13:42:46 crc kubenswrapper[4732]: I0402 13:42:46.797735 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-d9c747dbb-jqfln" podStartSLOduration=52.797716464 podStartE2EDuration="52.797716464s" podCreationTimestamp="2026-04-02 13:41:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:42:46.797055727 +0000 UTC m=+323.701463300" watchObservedRunningTime="2026-04-02 13:42:46.797716464 +0000 UTC m=+323.702124017" Apr 02 13:42:47 crc kubenswrapper[4732]: E0402 13:42:47.155432 4732 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" 
image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Apr 02 13:42:47 crc kubenswrapper[4732]: E0402 13:42:47.155592 4732 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nhqpl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-z7v99_openshift-marketplace(72981c60-e9a1-4e25-9b64-7493d6fdaab6): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Apr 02 
13:42:47 crc kubenswrapper[4732]: E0402 13:42:47.156984 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-z7v99" podUID="72981c60-e9a1-4e25-9b64-7493d6fdaab6" Apr 02 13:42:47 crc kubenswrapper[4732]: I0402 13:42:47.707279 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-10-crc_377f12b4-3628-432e-8132-f75725645672/installer/0.log" Apr 02 13:42:47 crc kubenswrapper[4732]: I0402 13:42:47.707551 4732 generic.go:334] "Generic (PLEG): container finished" podID="377f12b4-3628-432e-8132-f75725645672" containerID="e56a5c1770b0ed982a28969862361ec3aa0a550013a431010760bdba8ce138c4" exitCode=1 Apr 02 13:42:47 crc kubenswrapper[4732]: I0402 13:42:47.707624 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-10-crc" event={"ID":"377f12b4-3628-432e-8132-f75725645672","Type":"ContainerDied","Data":"e56a5c1770b0ed982a28969862361ec3aa0a550013a431010760bdba8ce138c4"} Apr 02 13:42:47 crc kubenswrapper[4732]: I0402 13:42:47.709106 4732 generic.go:334] "Generic (PLEG): container finished" podID="34ac97a0-6ef7-4d42-84e9-8926e28a822d" containerID="e317b45e0782b66638d2821abb3eaf3f34c0ae712c571e45720b02978806d7bd" exitCode=0 Apr 02 13:42:47 crc kubenswrapper[4732]: I0402 13:42:47.709185 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-10-crc" event={"ID":"34ac97a0-6ef7-4d42-84e9-8926e28a822d","Type":"ContainerDied","Data":"e317b45e0782b66638d2821abb3eaf3f34c0ae712c571e45720b02978806d7bd"} Apr 02 13:42:47 crc kubenswrapper[4732]: I0402 13:42:47.710871 4732 generic.go:334] "Generic (PLEG): container finished" podID="d8ba08f5-0264-484a-a73a-a8659ce79e10" 
containerID="3d25971f239eb7e525bf6569daa562afa55bd16b255ab484731b7bba9d60ce25" exitCode=0 Apr 02 13:42:47 crc kubenswrapper[4732]: I0402 13:42:47.710933 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-11-crc" event={"ID":"d8ba08f5-0264-484a-a73a-a8659ce79e10","Type":"ContainerDied","Data":"3d25971f239eb7e525bf6569daa562afa55bd16b255ab484731b7bba9d60ce25"} Apr 02 13:42:47 crc kubenswrapper[4732]: I0402 13:42:47.713819 4732 generic.go:334] "Generic (PLEG): container finished" podID="293d710f-0d74-455a-ace7-1dcaa32d9b7e" containerID="cd6c91874c8a335b52156c576147b2ae9f33aef4dd90600b84e3ea859663e6f1" exitCode=0 Apr 02 13:42:47 crc kubenswrapper[4732]: I0402 13:42:47.713950 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"293d710f-0d74-455a-ace7-1dcaa32d9b7e","Type":"ContainerDied","Data":"cd6c91874c8a335b52156c576147b2ae9f33aef4dd90600b84e3ea859663e6f1"} Apr 02 13:42:47 crc kubenswrapper[4732]: I0402 13:42:47.714768 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-bfd65bd69-hd7v4" podUID="c379e504-2a74-402a-a807-0865a0ada4ba" containerName="route-controller-manager" containerID="cri-o://c0501f1a3aaf796bc15ec35cecad20d83c066c22c2d1d3117e0ffc5604ad0f31" gracePeriod=30 Apr 02 13:42:47 crc kubenswrapper[4732]: I0402 13:42:47.747280 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-85455b8986-kh7t7" podStartSLOduration=59.747263708 podStartE2EDuration="59.747263708s" podCreationTimestamp="2026-04-02 13:41:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:42:47.743670223 +0000 UTC m=+324.648077786" watchObservedRunningTime="2026-04-02 13:42:47.747263708 +0000 UTC m=+324.651671261" Apr 02 
13:42:47 crc kubenswrapper[4732]: I0402 13:42:47.845726 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=28.845688293 podStartE2EDuration="28.845688293s" podCreationTimestamp="2026-04-02 13:42:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:42:47.84407313 +0000 UTC m=+324.748480683" watchObservedRunningTime="2026-04-02 13:42:47.845688293 +0000 UTC m=+324.750095876" Apr 02 13:42:47 crc kubenswrapper[4732]: I0402 13:42:47.893433 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-bfd65bd69-hd7v4" podStartSLOduration=30.893410896 podStartE2EDuration="30.893410896s" podCreationTimestamp="2026-04-02 13:42:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:42:47.891025992 +0000 UTC m=+324.795433565" watchObservedRunningTime="2026-04-02 13:42:47.893410896 +0000 UTC m=+324.797818449" Apr 02 13:42:47 crc kubenswrapper[4732]: I0402 13:42:47.942474 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-7-crc" podStartSLOduration=35.942457174 podStartE2EDuration="35.942457174s" podCreationTimestamp="2026-04-02 13:42:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:42:47.938691233 +0000 UTC m=+324.843098806" watchObservedRunningTime="2026-04-02 13:42:47.942457174 +0000 UTC m=+324.846864717" Apr 02 13:42:48 crc kubenswrapper[4732]: E0402 13:42:48.007583 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-z7v99" podUID="72981c60-e9a1-4e25-9b64-7493d6fdaab6" Apr 02 13:42:48 crc kubenswrapper[4732]: I0402 13:42:48.016423 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-11-crc" podStartSLOduration=29.016405206 podStartE2EDuration="29.016405206s" podCreationTimestamp="2026-04-02 13:42:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:42:47.990519976 +0000 UTC m=+324.894927529" watchObservedRunningTime="2026-04-02 13:42:48.016405206 +0000 UTC m=+324.920812759" Apr 02 13:42:48 crc kubenswrapper[4732]: I0402 13:42:48.017448 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-5f47bfd98d-lmc96" podStartSLOduration=12.017440543 podStartE2EDuration="12.017440543s" podCreationTimestamp="2026-04-02 13:42:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:42:48.015093311 +0000 UTC m=+324.919500894" watchObservedRunningTime="2026-04-02 13:42:48.017440543 +0000 UTC m=+324.921848106" Apr 02 13:42:48 crc kubenswrapper[4732]: I0402 13:42:48.040200 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-86c47b858b-ktqm8" podStartSLOduration=32.04018155 podStartE2EDuration="32.04018155s" podCreationTimestamp="2026-04-02 13:42:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:42:48.038229128 +0000 UTC m=+324.942636691" watchObservedRunningTime="2026-04-02 13:42:48.04018155 +0000 UTC m=+324.944589113" Apr 02 13:42:48 crc kubenswrapper[4732]: E0402 13:42:48.082847 4732 
log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Apr 02 13:42:48 crc kubenswrapper[4732]: E0402 13:42:48.083022 4732 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-729nf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-8ps5w_openshift-marketplace(9058e533-24e2-44f1-8631-dd9bf6a37192): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Apr 02 13:42:48 crc kubenswrapper[4732]: E0402 13:42:48.084179 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-8ps5w" podUID="9058e533-24e2-44f1-8631-dd9bf6a37192" Apr 02 13:42:48 crc kubenswrapper[4732]: I0402 13:42:48.113692 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-10-crc_377f12b4-3628-432e-8132-f75725645672/installer/0.log" Apr 02 13:42:48 crc kubenswrapper[4732]: I0402 13:42:48.114100 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-10-crc" Apr 02 13:42:48 crc kubenswrapper[4732]: I0402 13:42:48.176037 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/377f12b4-3628-432e-8132-f75725645672-kube-api-access\") pod \"377f12b4-3628-432e-8132-f75725645672\" (UID: \"377f12b4-3628-432e-8132-f75725645672\") " Apr 02 13:42:48 crc kubenswrapper[4732]: I0402 13:42:48.176102 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/377f12b4-3628-432e-8132-f75725645672-var-lock\") pod \"377f12b4-3628-432e-8132-f75725645672\" (UID: \"377f12b4-3628-432e-8132-f75725645672\") " Apr 02 13:42:48 crc kubenswrapper[4732]: I0402 13:42:48.176163 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/377f12b4-3628-432e-8132-f75725645672-kubelet-dir\") pod \"377f12b4-3628-432e-8132-f75725645672\" (UID: \"377f12b4-3628-432e-8132-f75725645672\") " Apr 02 13:42:48 crc kubenswrapper[4732]: I0402 13:42:48.176319 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/377f12b4-3628-432e-8132-f75725645672-var-lock" (OuterVolumeSpecName: "var-lock") pod "377f12b4-3628-432e-8132-f75725645672" (UID: "377f12b4-3628-432e-8132-f75725645672"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 13:42:48 crc kubenswrapper[4732]: I0402 13:42:48.176387 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/377f12b4-3628-432e-8132-f75725645672-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "377f12b4-3628-432e-8132-f75725645672" (UID: "377f12b4-3628-432e-8132-f75725645672"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 13:42:48 crc kubenswrapper[4732]: I0402 13:42:48.178821 4732 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/377f12b4-3628-432e-8132-f75725645672-kubelet-dir\") on node \"crc\" DevicePath \"\"" Apr 02 13:42:48 crc kubenswrapper[4732]: I0402 13:42:48.178842 4732 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/377f12b4-3628-432e-8132-f75725645672-var-lock\") on node \"crc\" DevicePath \"\"" Apr 02 13:42:48 crc kubenswrapper[4732]: I0402 13:42:48.192674 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/377f12b4-3628-432e-8132-f75725645672-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "377f12b4-3628-432e-8132-f75725645672" (UID: "377f12b4-3628-432e-8132-f75725645672"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:42:48 crc kubenswrapper[4732]: I0402 13:42:48.244636 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-bfd65bd69-hd7v4" Apr 02 13:42:48 crc kubenswrapper[4732]: I0402 13:42:48.280252 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c379e504-2a74-402a-a807-0865a0ada4ba-config\") pod \"c379e504-2a74-402a-a807-0865a0ada4ba\" (UID: \"c379e504-2a74-402a-a807-0865a0ada4ba\") " Apr 02 13:42:48 crc kubenswrapper[4732]: I0402 13:42:48.280309 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxlng\" (UniqueName: \"kubernetes.io/projected/c379e504-2a74-402a-a807-0865a0ada4ba-kube-api-access-rxlng\") pod \"c379e504-2a74-402a-a807-0865a0ada4ba\" (UID: \"c379e504-2a74-402a-a807-0865a0ada4ba\") " Apr 02 13:42:48 crc kubenswrapper[4732]: I0402 13:42:48.280340 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c379e504-2a74-402a-a807-0865a0ada4ba-serving-cert\") pod \"c379e504-2a74-402a-a807-0865a0ada4ba\" (UID: \"c379e504-2a74-402a-a807-0865a0ada4ba\") " Apr 02 13:42:48 crc kubenswrapper[4732]: I0402 13:42:48.280357 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c379e504-2a74-402a-a807-0865a0ada4ba-client-ca\") pod \"c379e504-2a74-402a-a807-0865a0ada4ba\" (UID: \"c379e504-2a74-402a-a807-0865a0ada4ba\") " Apr 02 13:42:48 crc kubenswrapper[4732]: I0402 13:42:48.280678 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/377f12b4-3628-432e-8132-f75725645672-kube-api-access\") on node \"crc\" DevicePath \"\"" Apr 02 13:42:48 crc kubenswrapper[4732]: I0402 
13:42:48.281861 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c379e504-2a74-402a-a807-0865a0ada4ba-config" (OuterVolumeSpecName: "config") pod "c379e504-2a74-402a-a807-0865a0ada4ba" (UID: "c379e504-2a74-402a-a807-0865a0ada4ba"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:42:48 crc kubenswrapper[4732]: I0402 13:42:48.281899 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c379e504-2a74-402a-a807-0865a0ada4ba-client-ca" (OuterVolumeSpecName: "client-ca") pod "c379e504-2a74-402a-a807-0865a0ada4ba" (UID: "c379e504-2a74-402a-a807-0865a0ada4ba"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:42:48 crc kubenswrapper[4732]: I0402 13:42:48.284843 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c379e504-2a74-402a-a807-0865a0ada4ba-kube-api-access-rxlng" (OuterVolumeSpecName: "kube-api-access-rxlng") pod "c379e504-2a74-402a-a807-0865a0ada4ba" (UID: "c379e504-2a74-402a-a807-0865a0ada4ba"). InnerVolumeSpecName "kube-api-access-rxlng". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:42:48 crc kubenswrapper[4732]: I0402 13:42:48.284976 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c379e504-2a74-402a-a807-0865a0ada4ba-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c379e504-2a74-402a-a807-0865a0ada4ba" (UID: "c379e504-2a74-402a-a807-0865a0ada4ba"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 13:42:48 crc kubenswrapper[4732]: I0402 13:42:48.381884 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxlng\" (UniqueName: \"kubernetes.io/projected/c379e504-2a74-402a-a807-0865a0ada4ba-kube-api-access-rxlng\") on node \"crc\" DevicePath \"\"" Apr 02 13:42:48 crc kubenswrapper[4732]: I0402 13:42:48.381920 4732 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c379e504-2a74-402a-a807-0865a0ada4ba-serving-cert\") on node \"crc\" DevicePath \"\"" Apr 02 13:42:48 crc kubenswrapper[4732]: I0402 13:42:48.381955 4732 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c379e504-2a74-402a-a807-0865a0ada4ba-client-ca\") on node \"crc\" DevicePath \"\"" Apr 02 13:42:48 crc kubenswrapper[4732]: I0402 13:42:48.381967 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c379e504-2a74-402a-a807-0865a0ada4ba-config\") on node \"crc\" DevicePath \"\"" Apr 02 13:42:48 crc kubenswrapper[4732]: I0402 13:42:48.722051 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585622-lvtgw" event={"ID":"65013539-f3b4-4513-881a-14408a922424","Type":"ContainerStarted","Data":"4f97db3ca878363f668def2b878f242e7584741f7042e02345e8ff27a6849eaa"} Apr 02 13:42:48 crc kubenswrapper[4732]: I0402 13:42:48.725799 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qlgxl" event={"ID":"1827909b-49ea-4ba8-9995-f525d1d82f45","Type":"ContainerStarted","Data":"cc2f5fdab330cb1eec128d2812084addca824200ee57c2d5d69ce92760196169"} Apr 02 13:42:48 crc kubenswrapper[4732]: I0402 13:42:48.727377 4732 generic.go:334] "Generic (PLEG): container finished" podID="c379e504-2a74-402a-a807-0865a0ada4ba" containerID="c0501f1a3aaf796bc15ec35cecad20d83c066c22c2d1d3117e0ffc5604ad0f31" 
exitCode=0 Apr 02 13:42:48 crc kubenswrapper[4732]: I0402 13:42:48.727466 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-bfd65bd69-hd7v4" Apr 02 13:42:48 crc kubenswrapper[4732]: I0402 13:42:48.727625 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-bfd65bd69-hd7v4" event={"ID":"c379e504-2a74-402a-a807-0865a0ada4ba","Type":"ContainerDied","Data":"c0501f1a3aaf796bc15ec35cecad20d83c066c22c2d1d3117e0ffc5604ad0f31"} Apr 02 13:42:48 crc kubenswrapper[4732]: I0402 13:42:48.727666 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-bfd65bd69-hd7v4" event={"ID":"c379e504-2a74-402a-a807-0865a0ada4ba","Type":"ContainerDied","Data":"de69d9113ebe1477bb8867cb7401dc830f8ec90c6d932f0db02d240462de1267"} Apr 02 13:42:48 crc kubenswrapper[4732]: I0402 13:42:48.727689 4732 scope.go:117] "RemoveContainer" containerID="c0501f1a3aaf796bc15ec35cecad20d83c066c22c2d1d3117e0ffc5604ad0f31" Apr 02 13:42:48 crc kubenswrapper[4732]: I0402 13:42:48.734313 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585620-t897v" event={"ID":"9a82c61a-7d7e-4401-963a-1f1fe908002c","Type":"ContainerStarted","Data":"4fc7dc4786a810a2f97f61576688118cfe6a02779cb70bcb97a45de56c11f618"} Apr 02 13:42:48 crc kubenswrapper[4732]: I0402 13:42:48.738134 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29585622-lvtgw" podStartSLOduration=45.073609585 podStartE2EDuration="48.738115211s" podCreationTimestamp="2026-04-02 13:42:00 +0000 UTC" firstStartedPulling="2026-04-02 13:42:44.482733449 +0000 UTC m=+321.387140992" lastFinishedPulling="2026-04-02 13:42:48.147239065 +0000 UTC m=+325.051646618" observedRunningTime="2026-04-02 13:42:48.733077227 +0000 UTC m=+325.637484780" 
watchObservedRunningTime="2026-04-02 13:42:48.738115211 +0000 UTC m=+325.642522764" Apr 02 13:42:48 crc kubenswrapper[4732]: I0402 13:42:48.741699 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-10-crc_377f12b4-3628-432e-8132-f75725645672/installer/0.log" Apr 02 13:42:48 crc kubenswrapper[4732]: I0402 13:42:48.742423 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-10-crc" Apr 02 13:42:48 crc kubenswrapper[4732]: I0402 13:42:48.742785 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-10-crc" event={"ID":"377f12b4-3628-432e-8132-f75725645672","Type":"ContainerDied","Data":"7560b3db871dd6db25752f2dbe4b7295b1899afc99f6616ae7cb1caca595a3ae"} Apr 02 13:42:48 crc kubenswrapper[4732]: I0402 13:42:48.752007 4732 scope.go:117] "RemoveContainer" containerID="c0501f1a3aaf796bc15ec35cecad20d83c066c22c2d1d3117e0ffc5604ad0f31" Apr 02 13:42:48 crc kubenswrapper[4732]: E0402 13:42:48.751993 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-8ps5w" podUID="9058e533-24e2-44f1-8631-dd9bf6a37192" Apr 02 13:42:48 crc kubenswrapper[4732]: E0402 13:42:48.757367 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0501f1a3aaf796bc15ec35cecad20d83c066c22c2d1d3117e0ffc5604ad0f31\": container with ID starting with c0501f1a3aaf796bc15ec35cecad20d83c066c22c2d1d3117e0ffc5604ad0f31 not found: ID does not exist" containerID="c0501f1a3aaf796bc15ec35cecad20d83c066c22c2d1d3117e0ffc5604ad0f31" Apr 02 13:42:48 crc kubenswrapper[4732]: I0402 13:42:48.757408 4732 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"c0501f1a3aaf796bc15ec35cecad20d83c066c22c2d1d3117e0ffc5604ad0f31"} err="failed to get container status \"c0501f1a3aaf796bc15ec35cecad20d83c066c22c2d1d3117e0ffc5604ad0f31\": rpc error: code = NotFound desc = could not find container \"c0501f1a3aaf796bc15ec35cecad20d83c066c22c2d1d3117e0ffc5604ad0f31\": container with ID starting with c0501f1a3aaf796bc15ec35cecad20d83c066c22c2d1d3117e0ffc5604ad0f31 not found: ID does not exist" Apr 02 13:42:48 crc kubenswrapper[4732]: I0402 13:42:48.757429 4732 scope.go:117] "RemoveContainer" containerID="e56a5c1770b0ed982a28969862361ec3aa0a550013a431010760bdba8ce138c4" Apr 02 13:42:48 crc kubenswrapper[4732]: I0402 13:42:48.775636 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bfd65bd69-hd7v4"] Apr 02 13:42:48 crc kubenswrapper[4732]: I0402 13:42:48.778508 4732 csr.go:261] certificate signing request csr-zcl4c is approved, waiting to be issued Apr 02 13:42:48 crc kubenswrapper[4732]: I0402 13:42:48.783123 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bfd65bd69-hd7v4"] Apr 02 13:42:48 crc kubenswrapper[4732]: I0402 13:42:48.786587 4732 csr.go:257] certificate signing request csr-zcl4c is issued Apr 02 13:42:48 crc kubenswrapper[4732]: I0402 13:42:48.815029 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/installer-10-crc"] Apr 02 13:42:48 crc kubenswrapper[4732]: I0402 13:42:48.826055 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-controller-manager/installer-10-crc"] Apr 02 13:42:48 crc kubenswrapper[4732]: I0402 13:42:48.827948 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29585620-t897v" podStartSLOduration=93.43055876 podStartE2EDuration="2m48.827937377s" podCreationTimestamp="2026-04-02 13:40:00 
+0000 UTC" firstStartedPulling="2026-04-02 13:41:32.752102828 +0000 UTC m=+249.656510381" lastFinishedPulling="2026-04-02 13:42:48.149481445 +0000 UTC m=+325.053888998" observedRunningTime="2026-04-02 13:42:48.824414263 +0000 UTC m=+325.728821816" watchObservedRunningTime="2026-04-02 13:42:48.827937377 +0000 UTC m=+325.732344930" Apr 02 13:42:48 crc kubenswrapper[4732]: I0402 13:42:48.989390 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-10-crc" Apr 02 13:42:49 crc kubenswrapper[4732]: I0402 13:42:49.100594 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/34ac97a0-6ef7-4d42-84e9-8926e28a822d-kubelet-dir\") pod \"34ac97a0-6ef7-4d42-84e9-8926e28a822d\" (UID: \"34ac97a0-6ef7-4d42-84e9-8926e28a822d\") " Apr 02 13:42:49 crc kubenswrapper[4732]: I0402 13:42:49.101003 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/34ac97a0-6ef7-4d42-84e9-8926e28a822d-kube-api-access\") pod \"34ac97a0-6ef7-4d42-84e9-8926e28a822d\" (UID: \"34ac97a0-6ef7-4d42-84e9-8926e28a822d\") " Apr 02 13:42:49 crc kubenswrapper[4732]: I0402 13:42:49.100729 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/34ac97a0-6ef7-4d42-84e9-8926e28a822d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "34ac97a0-6ef7-4d42-84e9-8926e28a822d" (UID: "34ac97a0-6ef7-4d42-84e9-8926e28a822d"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 13:42:49 crc kubenswrapper[4732]: I0402 13:42:49.101285 4732 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/34ac97a0-6ef7-4d42-84e9-8926e28a822d-kubelet-dir\") on node \"crc\" DevicePath \"\"" Apr 02 13:42:49 crc kubenswrapper[4732]: I0402 13:42:49.106746 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34ac97a0-6ef7-4d42-84e9-8926e28a822d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "34ac97a0-6ef7-4d42-84e9-8926e28a822d" (UID: "34ac97a0-6ef7-4d42-84e9-8926e28a822d"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:42:49 crc kubenswrapper[4732]: I0402 13:42:49.170471 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-11-crc" Apr 02 13:42:49 crc kubenswrapper[4732]: I0402 13:42:49.176445 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Apr 02 13:42:49 crc kubenswrapper[4732]: I0402 13:42:49.205910 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-85455b8986-kh7t7" Apr 02 13:42:49 crc kubenswrapper[4732]: I0402 13:42:49.205992 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d8ba08f5-0264-484a-a73a-a8659ce79e10-kube-api-access\") pod \"d8ba08f5-0264-484a-a73a-a8659ce79e10\" (UID: \"d8ba08f5-0264-484a-a73a-a8659ce79e10\") " Apr 02 13:42:49 crc kubenswrapper[4732]: I0402 13:42:49.206079 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d8ba08f5-0264-484a-a73a-a8659ce79e10-kubelet-dir\") pod \"d8ba08f5-0264-484a-a73a-a8659ce79e10\" (UID: \"d8ba08f5-0264-484a-a73a-a8659ce79e10\") " Apr 02 13:42:49 crc kubenswrapper[4732]: I0402 13:42:49.206128 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/293d710f-0d74-455a-ace7-1dcaa32d9b7e-kube-api-access\") pod \"293d710f-0d74-455a-ace7-1dcaa32d9b7e\" (UID: \"293d710f-0d74-455a-ace7-1dcaa32d9b7e\") " Apr 02 13:42:49 crc kubenswrapper[4732]: I0402 13:42:49.206146 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/293d710f-0d74-455a-ace7-1dcaa32d9b7e-kubelet-dir\") pod \"293d710f-0d74-455a-ace7-1dcaa32d9b7e\" (UID: \"293d710f-0d74-455a-ace7-1dcaa32d9b7e\") " Apr 02 13:42:49 crc kubenswrapper[4732]: I0402 13:42:49.206288 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/293d710f-0d74-455a-ace7-1dcaa32d9b7e-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "293d710f-0d74-455a-ace7-1dcaa32d9b7e" (UID: 
"293d710f-0d74-455a-ace7-1dcaa32d9b7e"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 13:42:49 crc kubenswrapper[4732]: I0402 13:42:49.206319 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d8ba08f5-0264-484a-a73a-a8659ce79e10-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d8ba08f5-0264-484a-a73a-a8659ce79e10" (UID: "d8ba08f5-0264-484a-a73a-a8659ce79e10"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 13:42:49 crc kubenswrapper[4732]: I0402 13:42:49.206943 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/34ac97a0-6ef7-4d42-84e9-8926e28a822d-kube-api-access\") on node \"crc\" DevicePath \"\"" Apr 02 13:42:49 crc kubenswrapper[4732]: I0402 13:42:49.206967 4732 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d8ba08f5-0264-484a-a73a-a8659ce79e10-kubelet-dir\") on node \"crc\" DevicePath \"\"" Apr 02 13:42:49 crc kubenswrapper[4732]: I0402 13:42:49.206981 4732 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/293d710f-0d74-455a-ace7-1dcaa32d9b7e-kubelet-dir\") on node \"crc\" DevicePath \"\"" Apr 02 13:42:49 crc kubenswrapper[4732]: I0402 13:42:49.212791 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8ba08f5-0264-484a-a73a-a8659ce79e10-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d8ba08f5-0264-484a-a73a-a8659ce79e10" (UID: "d8ba08f5-0264-484a-a73a-a8659ce79e10"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:42:49 crc kubenswrapper[4732]: I0402 13:42:49.212936 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/293d710f-0d74-455a-ace7-1dcaa32d9b7e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "293d710f-0d74-455a-ace7-1dcaa32d9b7e" (UID: "293d710f-0d74-455a-ace7-1dcaa32d9b7e"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:42:49 crc kubenswrapper[4732]: I0402 13:42:49.308715 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d8ba08f5-0264-484a-a73a-a8659ce79e10-kube-api-access\") on node \"crc\" DevicePath \"\"" Apr 02 13:42:49 crc kubenswrapper[4732]: I0402 13:42:49.308760 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/293d710f-0d74-455a-ace7-1dcaa32d9b7e-kube-api-access\") on node \"crc\" DevicePath \"\"" Apr 02 13:42:49 crc kubenswrapper[4732]: I0402 13:42:49.748378 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"293d710f-0d74-455a-ace7-1dcaa32d9b7e","Type":"ContainerDied","Data":"04767beffe000397e9fddeeaa02e9baf592f7c081aa24fe8c6151a4cdb4e1a41"} Apr 02 13:42:49 crc kubenswrapper[4732]: I0402 13:42:49.748404 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Apr 02 13:42:49 crc kubenswrapper[4732]: I0402 13:42:49.748416 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04767beffe000397e9fddeeaa02e9baf592f7c081aa24fe8c6151a4cdb4e1a41" Apr 02 13:42:49 crc kubenswrapper[4732]: I0402 13:42:49.749828 4732 generic.go:334] "Generic (PLEG): container finished" podID="9a82c61a-7d7e-4401-963a-1f1fe908002c" containerID="4fc7dc4786a810a2f97f61576688118cfe6a02779cb70bcb97a45de56c11f618" exitCode=0 Apr 02 13:42:49 crc kubenswrapper[4732]: I0402 13:42:49.749881 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585620-t897v" event={"ID":"9a82c61a-7d7e-4401-963a-1f1fe908002c","Type":"ContainerDied","Data":"4fc7dc4786a810a2f97f61576688118cfe6a02779cb70bcb97a45de56c11f618"} Apr 02 13:42:49 crc kubenswrapper[4732]: I0402 13:42:49.753392 4732 generic.go:334] "Generic (PLEG): container finished" podID="65013539-f3b4-4513-881a-14408a922424" containerID="4f97db3ca878363f668def2b878f242e7584741f7042e02345e8ff27a6849eaa" exitCode=0 Apr 02 13:42:49 crc kubenswrapper[4732]: I0402 13:42:49.753458 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585622-lvtgw" event={"ID":"65013539-f3b4-4513-881a-14408a922424","Type":"ContainerDied","Data":"4f97db3ca878363f668def2b878f242e7584741f7042e02345e8ff27a6849eaa"} Apr 02 13:42:49 crc kubenswrapper[4732]: I0402 13:42:49.754712 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-10-crc" event={"ID":"34ac97a0-6ef7-4d42-84e9-8926e28a822d","Type":"ContainerDied","Data":"3b54d41a7194329f721b5e5f726ebb7814762a26e10a1e76fcbe823328d2349e"} Apr 02 13:42:49 crc kubenswrapper[4732]: I0402 13:42:49.754736 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b54d41a7194329f721b5e5f726ebb7814762a26e10a1e76fcbe823328d2349e" 
Apr 02 13:42:49 crc kubenswrapper[4732]: I0402 13:42:49.754770 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-10-crc" Apr 02 13:42:49 crc kubenswrapper[4732]: I0402 13:42:49.764130 4732 generic.go:334] "Generic (PLEG): container finished" podID="1827909b-49ea-4ba8-9995-f525d1d82f45" containerID="cc2f5fdab330cb1eec128d2812084addca824200ee57c2d5d69ce92760196169" exitCode=0 Apr 02 13:42:49 crc kubenswrapper[4732]: I0402 13:42:49.764201 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qlgxl" event={"ID":"1827909b-49ea-4ba8-9995-f525d1d82f45","Type":"ContainerDied","Data":"cc2f5fdab330cb1eec128d2812084addca824200ee57c2d5d69ce92760196169"} Apr 02 13:42:49 crc kubenswrapper[4732]: I0402 13:42:49.767525 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-11-crc" event={"ID":"d8ba08f5-0264-484a-a73a-a8659ce79e10","Type":"ContainerDied","Data":"df57ac8179fa62ea769a42fe4092e643a87561375af704e81c8011a359aee99c"} Apr 02 13:42:49 crc kubenswrapper[4732]: I0402 13:42:49.767567 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df57ac8179fa62ea769a42fe4092e643a87561375af704e81c8011a359aee99c" Apr 02 13:42:49 crc kubenswrapper[4732]: I0402 13:42:49.767636 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-11-crc" Apr 02 13:42:49 crc kubenswrapper[4732]: I0402 13:42:49.788139 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-05 02:27:36.955717096 +0000 UTC Apr 02 13:42:49 crc kubenswrapper[4732]: I0402 13:42:49.788175 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 5916h44m47.167544672s for next certificate rotation Apr 02 13:42:50 crc kubenswrapper[4732]: I0402 13:42:50.687186 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="377f12b4-3628-432e-8132-f75725645672" path="/var/lib/kubelet/pods/377f12b4-3628-432e-8132-f75725645672/volumes" Apr 02 13:42:50 crc kubenswrapper[4732]: I0402 13:42:50.688148 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c379e504-2a74-402a-a807-0865a0ada4ba" path="/var/lib/kubelet/pods/c379e504-2a74-402a-a807-0865a0ada4ba/volumes" Apr 02 13:42:50 crc kubenswrapper[4732]: I0402 13:42:50.774998 4732 generic.go:334] "Generic (PLEG): container finished" podID="64a68003-b71d-4ac2-aaaf-76b67ed758cd" containerID="589182311f2c5ce767cc69a22a707d8a821dad9f1f96d1f97d12d6b7ee5c6bf1" exitCode=0 Apr 02 13:42:50 crc kubenswrapper[4732]: I0402 13:42:50.775085 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-8qcmg" event={"ID":"64a68003-b71d-4ac2-aaaf-76b67ed758cd","Type":"ContainerDied","Data":"589182311f2c5ce767cc69a22a707d8a821dad9f1f96d1f97d12d6b7ee5c6bf1"} Apr 02 13:42:50 crc kubenswrapper[4732]: I0402 13:42:50.775663 4732 scope.go:117] "RemoveContainer" containerID="589182311f2c5ce767cc69a22a707d8a821dad9f1f96d1f97d12d6b7ee5c6bf1" Apr 02 13:42:50 crc kubenswrapper[4732]: I0402 13:42:50.777867 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qlgxl" 
event={"ID":"1827909b-49ea-4ba8-9995-f525d1d82f45","Type":"ContainerStarted","Data":"379848342ab71ffa30dea645fdced0d379fcf457ee71fe018e018fae2d13234b"} Apr 02 13:42:50 crc kubenswrapper[4732]: I0402 13:42:50.783038 4732 generic.go:334] "Generic (PLEG): container finished" podID="f0c6d23c-1aff-4a2f-b92b-6cb07a0ca8b6" containerID="1dae3f93711eab60aaaafe1091230be9890464b600d59f4ec28b86c09c09a4eb" exitCode=0 Apr 02 13:42:50 crc kubenswrapper[4732]: I0402 13:42:50.783120 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vmngg" event={"ID":"f0c6d23c-1aff-4a2f-b92b-6cb07a0ca8b6","Type":"ContainerDied","Data":"1dae3f93711eab60aaaafe1091230be9890464b600d59f4ec28b86c09c09a4eb"} Apr 02 13:42:50 crc kubenswrapper[4732]: I0402 13:42:50.783533 4732 scope.go:117] "RemoveContainer" containerID="1dae3f93711eab60aaaafe1091230be9890464b600d59f4ec28b86c09c09a4eb" Apr 02 13:42:50 crc kubenswrapper[4732]: I0402 13:42:50.788344 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-10 12:22:22.328124435 +0000 UTC Apr 02 13:42:50 crc kubenswrapper[4732]: I0402 13:42:50.788386 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6046h39m31.539742987s for next certificate rotation Apr 02 13:42:50 crc kubenswrapper[4732]: I0402 13:42:50.816136 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qlgxl" podStartSLOduration=4.101177669 podStartE2EDuration="1m14.816118417s" podCreationTimestamp="2026-04-02 13:41:36 +0000 UTC" firstStartedPulling="2026-04-02 13:41:39.587399248 +0000 UTC m=+256.491806801" lastFinishedPulling="2026-04-02 13:42:50.302339996 +0000 UTC m=+327.206747549" observedRunningTime="2026-04-02 13:42:50.811028662 +0000 UTC m=+327.715436215" watchObservedRunningTime="2026-04-02 13:42:50.816118417 +0000 UTC m=+327.720525960" Apr 
02 13:42:51 crc kubenswrapper[4732]: I0402 13:42:51.199319 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585622-lvtgw" Apr 02 13:42:51 crc kubenswrapper[4732]: I0402 13:42:51.204927 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585620-t897v" Apr 02 13:42:51 crc kubenswrapper[4732]: I0402 13:42:51.242568 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brhqn\" (UniqueName: \"kubernetes.io/projected/65013539-f3b4-4513-881a-14408a922424-kube-api-access-brhqn\") pod \"65013539-f3b4-4513-881a-14408a922424\" (UID: \"65013539-f3b4-4513-881a-14408a922424\") " Apr 02 13:42:51 crc kubenswrapper[4732]: I0402 13:42:51.242705 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jkkc\" (UniqueName: \"kubernetes.io/projected/9a82c61a-7d7e-4401-963a-1f1fe908002c-kube-api-access-2jkkc\") pod \"9a82c61a-7d7e-4401-963a-1f1fe908002c\" (UID: \"9a82c61a-7d7e-4401-963a-1f1fe908002c\") " Apr 02 13:42:51 crc kubenswrapper[4732]: I0402 13:42:51.248572 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65013539-f3b4-4513-881a-14408a922424-kube-api-access-brhqn" (OuterVolumeSpecName: "kube-api-access-brhqn") pod "65013539-f3b4-4513-881a-14408a922424" (UID: "65013539-f3b4-4513-881a-14408a922424"). InnerVolumeSpecName "kube-api-access-brhqn". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:42:51 crc kubenswrapper[4732]: I0402 13:42:51.248945 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a82c61a-7d7e-4401-963a-1f1fe908002c-kube-api-access-2jkkc" (OuterVolumeSpecName: "kube-api-access-2jkkc") pod "9a82c61a-7d7e-4401-963a-1f1fe908002c" (UID: "9a82c61a-7d7e-4401-963a-1f1fe908002c"). 
InnerVolumeSpecName "kube-api-access-2jkkc". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:42:51 crc kubenswrapper[4732]: I0402 13:42:51.271355 4732 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication-operator/authentication-operator-69f744f599-8qcmg" Apr 02 13:42:51 crc kubenswrapper[4732]: I0402 13:42:51.281145 4732 patch_prober.go:28] interesting pod/console-f9d7485db-dq9x9 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.29:8443/health\": dial tcp 10.217.0.29:8443: connect: connection refused" start-of-body= Apr 02 13:42:51 crc kubenswrapper[4732]: I0402 13:42:51.281232 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-dq9x9" podUID="4d77c191-7d04-4381-838f-b7a355e7c2d4" containerName="console" probeResult="failure" output="Get \"https://10.217.0.29:8443/health\": dial tcp 10.217.0.29:8443: connect: connection refused" Apr 02 13:42:51 crc kubenswrapper[4732]: I0402 13:42:51.343909 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jkkc\" (UniqueName: \"kubernetes.io/projected/9a82c61a-7d7e-4401-963a-1f1fe908002c-kube-api-access-2jkkc\") on node \"crc\" DevicePath \"\"" Apr 02 13:42:51 crc kubenswrapper[4732]: I0402 13:42:51.343953 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brhqn\" (UniqueName: \"kubernetes.io/projected/65013539-f3b4-4513-881a-14408a922424-kube-api-access-brhqn\") on node \"crc\" DevicePath \"\"" Apr 02 13:42:51 crc kubenswrapper[4732]: I0402 13:42:51.696890 4732 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vmngg" Apr 02 13:42:51 crc kubenswrapper[4732]: I0402 13:42:51.696946 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vmngg" Apr 
02 13:42:51 crc kubenswrapper[4732]: I0402 13:42:51.789537 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vmngg" event={"ID":"f0c6d23c-1aff-4a2f-b92b-6cb07a0ca8b6","Type":"ContainerStarted","Data":"5fc9fd2398d4caa3e2e908f0f835abe587fa34ced36bc1f04365134fe24bd233"} Apr 02 13:42:51 crc kubenswrapper[4732]: I0402 13:42:51.790465 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vmngg" Apr 02 13:42:51 crc kubenswrapper[4732]: I0402 13:42:51.791415 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585620-t897v" event={"ID":"9a82c61a-7d7e-4401-963a-1f1fe908002c","Type":"ContainerDied","Data":"e9e13dddf5e7bb8997e5ced0af1a4f10fa0c20db9f28cc0d660d72a182235460"} Apr 02 13:42:51 crc kubenswrapper[4732]: I0402 13:42:51.791437 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9e13dddf5e7bb8997e5ced0af1a4f10fa0c20db9f28cc0d660d72a182235460" Apr 02 13:42:51 crc kubenswrapper[4732]: I0402 13:42:51.791475 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585620-t897v" Apr 02 13:42:51 crc kubenswrapper[4732]: I0402 13:42:51.798760 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585622-lvtgw" event={"ID":"65013539-f3b4-4513-881a-14408a922424","Type":"ContainerDied","Data":"fcc913ed751fad1711a437b19556ca5c9a9cf874e1130911ecfa456a5ddae367"} Apr 02 13:42:51 crc kubenswrapper[4732]: I0402 13:42:51.798810 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fcc913ed751fad1711a437b19556ca5c9a9cf874e1130911ecfa456a5ddae367" Apr 02 13:42:51 crc kubenswrapper[4732]: I0402 13:42:51.798773 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29585622-lvtgw" Apr 02 13:42:51 crc kubenswrapper[4732]: I0402 13:42:51.803830 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-8qcmg" event={"ID":"64a68003-b71d-4ac2-aaaf-76b67ed758cd","Type":"ContainerStarted","Data":"8b12c8bb2b25a02d4b91f429a7368fe366f924edc1c84d115e7554831b67c271"} Apr 02 13:42:52 crc kubenswrapper[4732]: I0402 13:42:52.558402 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7456d5dd7f-jpxrc"] Apr 02 13:42:52 crc kubenswrapper[4732]: E0402 13:42:52.559277 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2570535-673c-495c-a5aa-392f14ceebb1" containerName="registry" Apr 02 13:42:52 crc kubenswrapper[4732]: I0402 13:42:52.559308 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2570535-673c-495c-a5aa-392f14ceebb1" containerName="registry" Apr 02 13:42:52 crc kubenswrapper[4732]: E0402 13:42:52.559320 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65013539-f3b4-4513-881a-14408a922424" containerName="oc" Apr 02 13:42:52 crc kubenswrapper[4732]: I0402 13:42:52.559328 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="65013539-f3b4-4513-881a-14408a922424" containerName="oc" Apr 02 13:42:52 crc kubenswrapper[4732]: E0402 13:42:52.559337 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="293d710f-0d74-455a-ace7-1dcaa32d9b7e" containerName="pruner" Apr 02 13:42:52 crc kubenswrapper[4732]: I0402 13:42:52.559345 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="293d710f-0d74-455a-ace7-1dcaa32d9b7e" containerName="pruner" Apr 02 13:42:52 crc kubenswrapper[4732]: E0402 13:42:52.559356 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8ba08f5-0264-484a-a73a-a8659ce79e10" containerName="pruner" Apr 02 13:42:52 crc kubenswrapper[4732]: 
I0402 13:42:52.559362 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8ba08f5-0264-484a-a73a-a8659ce79e10" containerName="pruner" Apr 02 13:42:52 crc kubenswrapper[4732]: E0402 13:42:52.559376 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34ac97a0-6ef7-4d42-84e9-8926e28a822d" containerName="pruner" Apr 02 13:42:52 crc kubenswrapper[4732]: I0402 13:42:52.559382 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="34ac97a0-6ef7-4d42-84e9-8926e28a822d" containerName="pruner" Apr 02 13:42:52 crc kubenswrapper[4732]: E0402 13:42:52.559394 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="377f12b4-3628-432e-8132-f75725645672" containerName="installer" Apr 02 13:42:52 crc kubenswrapper[4732]: I0402 13:42:52.559401 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="377f12b4-3628-432e-8132-f75725645672" containerName="installer" Apr 02 13:42:52 crc kubenswrapper[4732]: E0402 13:42:52.559413 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c379e504-2a74-402a-a807-0865a0ada4ba" containerName="route-controller-manager" Apr 02 13:42:52 crc kubenswrapper[4732]: I0402 13:42:52.559421 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="c379e504-2a74-402a-a807-0865a0ada4ba" containerName="route-controller-manager" Apr 02 13:42:52 crc kubenswrapper[4732]: E0402 13:42:52.559430 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a82c61a-7d7e-4401-963a-1f1fe908002c" containerName="oc" Apr 02 13:42:52 crc kubenswrapper[4732]: I0402 13:42:52.559436 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a82c61a-7d7e-4401-963a-1f1fe908002c" containerName="oc" Apr 02 13:42:52 crc kubenswrapper[4732]: I0402 13:42:52.559558 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="293d710f-0d74-455a-ace7-1dcaa32d9b7e" containerName="pruner" Apr 02 13:42:52 crc kubenswrapper[4732]: I0402 13:42:52.559574 4732 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="377f12b4-3628-432e-8132-f75725645672" containerName="installer" Apr 02 13:42:52 crc kubenswrapper[4732]: I0402 13:42:52.559583 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="34ac97a0-6ef7-4d42-84e9-8926e28a822d" containerName="pruner" Apr 02 13:42:52 crc kubenswrapper[4732]: I0402 13:42:52.559594 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a82c61a-7d7e-4401-963a-1f1fe908002c" containerName="oc" Apr 02 13:42:52 crc kubenswrapper[4732]: I0402 13:42:52.559604 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2570535-673c-495c-a5aa-392f14ceebb1" containerName="registry" Apr 02 13:42:52 crc kubenswrapper[4732]: I0402 13:42:52.559629 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8ba08f5-0264-484a-a73a-a8659ce79e10" containerName="pruner" Apr 02 13:42:52 crc kubenswrapper[4732]: I0402 13:42:52.559636 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="65013539-f3b4-4513-881a-14408a922424" containerName="oc" Apr 02 13:42:52 crc kubenswrapper[4732]: I0402 13:42:52.559644 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="c379e504-2a74-402a-a807-0865a0ada4ba" containerName="route-controller-manager" Apr 02 13:42:52 crc kubenswrapper[4732]: I0402 13:42:52.560346 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7456d5dd7f-jpxrc" Apr 02 13:42:52 crc kubenswrapper[4732]: I0402 13:42:52.563116 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Apr 02 13:42:52 crc kubenswrapper[4732]: I0402 13:42:52.563123 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Apr 02 13:42:52 crc kubenswrapper[4732]: I0402 13:42:52.563131 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Apr 02 13:42:52 crc kubenswrapper[4732]: I0402 13:42:52.563499 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Apr 02 13:42:52 crc kubenswrapper[4732]: I0402 13:42:52.564310 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Apr 02 13:42:52 crc kubenswrapper[4732]: I0402 13:42:52.569761 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Apr 02 13:42:52 crc kubenswrapper[4732]: I0402 13:42:52.570680 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7456d5dd7f-jpxrc"] Apr 02 13:42:52 crc kubenswrapper[4732]: I0402 13:42:52.659928 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cdbf4139-8114-4f0b-a66d-90f5c1812574-config\") pod \"route-controller-manager-7456d5dd7f-jpxrc\" (UID: \"cdbf4139-8114-4f0b-a66d-90f5c1812574\") " pod="openshift-route-controller-manager/route-controller-manager-7456d5dd7f-jpxrc" Apr 02 13:42:52 crc kubenswrapper[4732]: I0402 13:42:52.659993 4732 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cdbf4139-8114-4f0b-a66d-90f5c1812574-client-ca\") pod \"route-controller-manager-7456d5dd7f-jpxrc\" (UID: \"cdbf4139-8114-4f0b-a66d-90f5c1812574\") " pod="openshift-route-controller-manager/route-controller-manager-7456d5dd7f-jpxrc" Apr 02 13:42:52 crc kubenswrapper[4732]: I0402 13:42:52.660019 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68bcr\" (UniqueName: \"kubernetes.io/projected/cdbf4139-8114-4f0b-a66d-90f5c1812574-kube-api-access-68bcr\") pod \"route-controller-manager-7456d5dd7f-jpxrc\" (UID: \"cdbf4139-8114-4f0b-a66d-90f5c1812574\") " pod="openshift-route-controller-manager/route-controller-manager-7456d5dd7f-jpxrc" Apr 02 13:42:52 crc kubenswrapper[4732]: I0402 13:42:52.660051 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cdbf4139-8114-4f0b-a66d-90f5c1812574-serving-cert\") pod \"route-controller-manager-7456d5dd7f-jpxrc\" (UID: \"cdbf4139-8114-4f0b-a66d-90f5c1812574\") " pod="openshift-route-controller-manager/route-controller-manager-7456d5dd7f-jpxrc" Apr 02 13:42:52 crc kubenswrapper[4732]: I0402 13:42:52.761313 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cdbf4139-8114-4f0b-a66d-90f5c1812574-serving-cert\") pod \"route-controller-manager-7456d5dd7f-jpxrc\" (UID: \"cdbf4139-8114-4f0b-a66d-90f5c1812574\") " pod="openshift-route-controller-manager/route-controller-manager-7456d5dd7f-jpxrc" Apr 02 13:42:52 crc kubenswrapper[4732]: I0402 13:42:52.761514 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cdbf4139-8114-4f0b-a66d-90f5c1812574-config\") pod 
\"route-controller-manager-7456d5dd7f-jpxrc\" (UID: \"cdbf4139-8114-4f0b-a66d-90f5c1812574\") " pod="openshift-route-controller-manager/route-controller-manager-7456d5dd7f-jpxrc" Apr 02 13:42:52 crc kubenswrapper[4732]: I0402 13:42:52.761572 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cdbf4139-8114-4f0b-a66d-90f5c1812574-client-ca\") pod \"route-controller-manager-7456d5dd7f-jpxrc\" (UID: \"cdbf4139-8114-4f0b-a66d-90f5c1812574\") " pod="openshift-route-controller-manager/route-controller-manager-7456d5dd7f-jpxrc" Apr 02 13:42:52 crc kubenswrapper[4732]: I0402 13:42:52.761593 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68bcr\" (UniqueName: \"kubernetes.io/projected/cdbf4139-8114-4f0b-a66d-90f5c1812574-kube-api-access-68bcr\") pod \"route-controller-manager-7456d5dd7f-jpxrc\" (UID: \"cdbf4139-8114-4f0b-a66d-90f5c1812574\") " pod="openshift-route-controller-manager/route-controller-manager-7456d5dd7f-jpxrc" Apr 02 13:42:52 crc kubenswrapper[4732]: I0402 13:42:52.762798 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cdbf4139-8114-4f0b-a66d-90f5c1812574-client-ca\") pod \"route-controller-manager-7456d5dd7f-jpxrc\" (UID: \"cdbf4139-8114-4f0b-a66d-90f5c1812574\") " pod="openshift-route-controller-manager/route-controller-manager-7456d5dd7f-jpxrc" Apr 02 13:42:52 crc kubenswrapper[4732]: I0402 13:42:52.763081 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cdbf4139-8114-4f0b-a66d-90f5c1812574-config\") pod \"route-controller-manager-7456d5dd7f-jpxrc\" (UID: \"cdbf4139-8114-4f0b-a66d-90f5c1812574\") " pod="openshift-route-controller-manager/route-controller-manager-7456d5dd7f-jpxrc" Apr 02 13:42:52 crc kubenswrapper[4732]: I0402 13:42:52.771650 4732 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cdbf4139-8114-4f0b-a66d-90f5c1812574-serving-cert\") pod \"route-controller-manager-7456d5dd7f-jpxrc\" (UID: \"cdbf4139-8114-4f0b-a66d-90f5c1812574\") " pod="openshift-route-controller-manager/route-controller-manager-7456d5dd7f-jpxrc" Apr 02 13:42:52 crc kubenswrapper[4732]: I0402 13:42:52.780381 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68bcr\" (UniqueName: \"kubernetes.io/projected/cdbf4139-8114-4f0b-a66d-90f5c1812574-kube-api-access-68bcr\") pod \"route-controller-manager-7456d5dd7f-jpxrc\" (UID: \"cdbf4139-8114-4f0b-a66d-90f5c1812574\") " pod="openshift-route-controller-manager/route-controller-manager-7456d5dd7f-jpxrc" Apr 02 13:42:52 crc kubenswrapper[4732]: I0402 13:42:52.895817 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7456d5dd7f-jpxrc" Apr 02 13:42:53 crc kubenswrapper[4732]: I0402 13:42:53.307078 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7456d5dd7f-jpxrc"] Apr 02 13:42:53 crc kubenswrapper[4732]: W0402 13:42:53.317023 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdbf4139_8114_4f0b_a66d_90f5c1812574.slice/crio-992b3a61c729f26815ae946dd2f55426ad1bb5745821d4b85d18253d220128d9 WatchSource:0}: Error finding container 992b3a61c729f26815ae946dd2f55426ad1bb5745821d4b85d18253d220128d9: Status 404 returned error can't find the container with id 992b3a61c729f26815ae946dd2f55426ad1bb5745821d4b85d18253d220128d9 Apr 02 13:42:53 crc kubenswrapper[4732]: I0402 13:42:53.816302 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7456d5dd7f-jpxrc" 
event={"ID":"cdbf4139-8114-4f0b-a66d-90f5c1812574","Type":"ContainerStarted","Data":"992b3a61c729f26815ae946dd2f55426ad1bb5745821d4b85d18253d220128d9"} Apr 02 13:42:53 crc kubenswrapper[4732]: I0402 13:42:53.820755 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vmngg" Apr 02 13:42:53 crc kubenswrapper[4732]: I0402 13:42:53.869291 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-86c47b858b-ktqm8" Apr 02 13:42:53 crc kubenswrapper[4732]: I0402 13:42:53.874158 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-86c47b858b-ktqm8" Apr 02 13:42:54 crc kubenswrapper[4732]: I0402 13:42:54.824914 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7456d5dd7f-jpxrc" event={"ID":"cdbf4139-8114-4f0b-a66d-90f5c1812574","Type":"ContainerStarted","Data":"39affb00167fa8867971495af6eafe8bb3b7936d14cf3fdc1bd4e185d0895f86"} Apr 02 13:42:54 crc kubenswrapper[4732]: I0402 13:42:54.825519 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7456d5dd7f-jpxrc" Apr 02 13:42:54 crc kubenswrapper[4732]: I0402 13:42:54.827664 4732 generic.go:334] "Generic (PLEG): container finished" podID="ff7c4e9d-5437-412b-867d-1e44dfc73df5" containerID="4fb56712a2f1e001912270393323570c7b63626f192f74c0fed77beab80d309a" exitCode=0 Apr 02 13:42:54 crc kubenswrapper[4732]: I0402 13:42:54.828637 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r98n2" event={"ID":"ff7c4e9d-5437-412b-867d-1e44dfc73df5","Type":"ContainerDied","Data":"4fb56712a2f1e001912270393323570c7b63626f192f74c0fed77beab80d309a"} Apr 02 13:42:54 crc kubenswrapper[4732]: I0402 
13:42:54.828926 4732 scope.go:117] "RemoveContainer" containerID="4fb56712a2f1e001912270393323570c7b63626f192f74c0fed77beab80d309a" Apr 02 13:42:54 crc kubenswrapper[4732]: I0402 13:42:54.833555 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7456d5dd7f-jpxrc" Apr 02 13:42:54 crc kubenswrapper[4732]: I0402 13:42:54.849373 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7456d5dd7f-jpxrc" podStartSLOduration=18.849343694 podStartE2EDuration="18.849343694s" podCreationTimestamp="2026-04-02 13:42:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:42:54.84094751 +0000 UTC m=+331.745355073" watchObservedRunningTime="2026-04-02 13:42:54.849343694 +0000 UTC m=+331.753751257" Apr 02 13:42:55 crc kubenswrapper[4732]: I0402 13:42:55.845194 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r98n2" event={"ID":"ff7c4e9d-5437-412b-867d-1e44dfc73df5","Type":"ContainerStarted","Data":"9599a5a67be6e29aad5874b1c1b22ea505060e6f21d5da40511603240eea4ff3"} Apr 02 13:42:56 crc kubenswrapper[4732]: I0402 13:42:56.403505 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-68b6f48864-m96db" Apr 02 13:42:56 crc kubenswrapper[4732]: I0402 13:42:56.403587 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-68b6f48864-m96db" Apr 02 13:42:56 crc kubenswrapper[4732]: I0402 13:42:56.405951 4732 patch_prober.go:28] interesting pod/console-68b6f48864-m96db container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.58:8443/health\": dial tcp 10.217.0.58:8443: connect: connection refused" 
start-of-body= Apr 02 13:42:56 crc kubenswrapper[4732]: I0402 13:42:56.406085 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-68b6f48864-m96db" podUID="a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828" containerName="console" probeResult="failure" output="Get \"https://10.217.0.58:8443/health\": dial tcp 10.217.0.58:8443: connect: connection refused" Apr 02 13:42:56 crc kubenswrapper[4732]: I0402 13:42:56.896315 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-86c47b858b-ktqm8"] Apr 02 13:42:56 crc kubenswrapper[4732]: I0402 13:42:56.896508 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-86c47b858b-ktqm8" podUID="28e25a15-5386-4a3e-a772-6aba6096471e" containerName="controller-manager" containerID="cri-o://71318aec4005acc7bb0146248defa3792899a3c86b676740054654f0c884a5b3" gracePeriod=30 Apr 02 13:42:56 crc kubenswrapper[4732]: I0402 13:42:56.975039 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5f47bfd98d-lmc96" Apr 02 13:42:56 crc kubenswrapper[4732]: I0402 13:42:56.980607 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-5f47bfd98d-lmc96" Apr 02 13:42:57 crc kubenswrapper[4732]: I0402 13:42:57.000692 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7456d5dd7f-jpxrc"] Apr 02 13:42:57 crc kubenswrapper[4732]: I0402 13:42:57.248730 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qlgxl" Apr 02 13:42:57 crc kubenswrapper[4732]: I0402 13:42:57.249060 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qlgxl" Apr 02 13:42:57 crc kubenswrapper[4732]: I0402 
13:42:57.737453 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qlgxl" Apr 02 13:42:57 crc kubenswrapper[4732]: I0402 13:42:57.859931 4732 generic.go:334] "Generic (PLEG): container finished" podID="28e25a15-5386-4a3e-a772-6aba6096471e" containerID="71318aec4005acc7bb0146248defa3792899a3c86b676740054654f0c884a5b3" exitCode=0 Apr 02 13:42:57 crc kubenswrapper[4732]: I0402 13:42:57.860034 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86c47b858b-ktqm8" event={"ID":"28e25a15-5386-4a3e-a772-6aba6096471e","Type":"ContainerDied","Data":"71318aec4005acc7bb0146248defa3792899a3c86b676740054654f0c884a5b3"} Apr 02 13:42:57 crc kubenswrapper[4732]: I0402 13:42:57.860354 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7456d5dd7f-jpxrc" podUID="cdbf4139-8114-4f0b-a66d-90f5c1812574" containerName="route-controller-manager" containerID="cri-o://39affb00167fa8867971495af6eafe8bb3b7936d14cf3fdc1bd4e185d0895f86" gracePeriod=30 Apr 02 13:42:57 crc kubenswrapper[4732]: I0402 13:42:57.909535 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qlgxl" Apr 02 13:42:58 crc kubenswrapper[4732]: I0402 13:42:58.198497 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-86c47b858b-ktqm8" Apr 02 13:42:58 crc kubenswrapper[4732]: I0402 13:42:58.232578 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5bc5979f8c-gz2dg"] Apr 02 13:42:58 crc kubenswrapper[4732]: E0402 13:42:58.233640 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28e25a15-5386-4a3e-a772-6aba6096471e" containerName="controller-manager" Apr 02 13:42:58 crc kubenswrapper[4732]: I0402 13:42:58.233715 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="28e25a15-5386-4a3e-a772-6aba6096471e" containerName="controller-manager" Apr 02 13:42:58 crc kubenswrapper[4732]: I0402 13:42:58.233864 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="28e25a15-5386-4a3e-a772-6aba6096471e" containerName="controller-manager" Apr 02 13:42:58 crc kubenswrapper[4732]: I0402 13:42:58.234304 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5bc5979f8c-gz2dg" Apr 02 13:42:58 crc kubenswrapper[4732]: I0402 13:42:58.243938 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5bc5979f8c-gz2dg"] Apr 02 13:42:58 crc kubenswrapper[4732]: I0402 13:42:58.245653 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28e25a15-5386-4a3e-a772-6aba6096471e-config\") pod \"28e25a15-5386-4a3e-a772-6aba6096471e\" (UID: \"28e25a15-5386-4a3e-a772-6aba6096471e\") " Apr 02 13:42:58 crc kubenswrapper[4732]: I0402 13:42:58.245826 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/28e25a15-5386-4a3e-a772-6aba6096471e-client-ca\") pod \"28e25a15-5386-4a3e-a772-6aba6096471e\" (UID: \"28e25a15-5386-4a3e-a772-6aba6096471e\") " Apr 02 13:42:58 crc kubenswrapper[4732]: I0402 13:42:58.245868 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/28e25a15-5386-4a3e-a772-6aba6096471e-proxy-ca-bundles\") pod \"28e25a15-5386-4a3e-a772-6aba6096471e\" (UID: \"28e25a15-5386-4a3e-a772-6aba6096471e\") " Apr 02 13:42:58 crc kubenswrapper[4732]: I0402 13:42:58.245928 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4959w\" (UniqueName: \"kubernetes.io/projected/28e25a15-5386-4a3e-a772-6aba6096471e-kube-api-access-4959w\") pod \"28e25a15-5386-4a3e-a772-6aba6096471e\" (UID: \"28e25a15-5386-4a3e-a772-6aba6096471e\") " Apr 02 13:42:58 crc kubenswrapper[4732]: I0402 13:42:58.246007 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28e25a15-5386-4a3e-a772-6aba6096471e-serving-cert\") pod \"28e25a15-5386-4a3e-a772-6aba6096471e\" (UID: 
\"28e25a15-5386-4a3e-a772-6aba6096471e\") " Apr 02 13:42:58 crc kubenswrapper[4732]: I0402 13:42:58.251087 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28e25a15-5386-4a3e-a772-6aba6096471e-config" (OuterVolumeSpecName: "config") pod "28e25a15-5386-4a3e-a772-6aba6096471e" (UID: "28e25a15-5386-4a3e-a772-6aba6096471e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:42:58 crc kubenswrapper[4732]: I0402 13:42:58.251368 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28e25a15-5386-4a3e-a772-6aba6096471e-client-ca" (OuterVolumeSpecName: "client-ca") pod "28e25a15-5386-4a3e-a772-6aba6096471e" (UID: "28e25a15-5386-4a3e-a772-6aba6096471e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:42:58 crc kubenswrapper[4732]: I0402 13:42:58.252379 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28e25a15-5386-4a3e-a772-6aba6096471e-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "28e25a15-5386-4a3e-a772-6aba6096471e" (UID: "28e25a15-5386-4a3e-a772-6aba6096471e"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:42:58 crc kubenswrapper[4732]: I0402 13:42:58.255064 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28e25a15-5386-4a3e-a772-6aba6096471e-kube-api-access-4959w" (OuterVolumeSpecName: "kube-api-access-4959w") pod "28e25a15-5386-4a3e-a772-6aba6096471e" (UID: "28e25a15-5386-4a3e-a772-6aba6096471e"). InnerVolumeSpecName "kube-api-access-4959w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:42:58 crc kubenswrapper[4732]: I0402 13:42:58.255272 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28e25a15-5386-4a3e-a772-6aba6096471e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "28e25a15-5386-4a3e-a772-6aba6096471e" (UID: "28e25a15-5386-4a3e-a772-6aba6096471e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 13:42:58 crc kubenswrapper[4732]: I0402 13:42:58.347093 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b48d9e0-5296-4cdf-bd91-205a09bcdaed-serving-cert\") pod \"controller-manager-5bc5979f8c-gz2dg\" (UID: \"2b48d9e0-5296-4cdf-bd91-205a09bcdaed\") " pod="openshift-controller-manager/controller-manager-5bc5979f8c-gz2dg" Apr 02 13:42:58 crc kubenswrapper[4732]: I0402 13:42:58.347148 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b48d9e0-5296-4cdf-bd91-205a09bcdaed-config\") pod \"controller-manager-5bc5979f8c-gz2dg\" (UID: \"2b48d9e0-5296-4cdf-bd91-205a09bcdaed\") " pod="openshift-controller-manager/controller-manager-5bc5979f8c-gz2dg" Apr 02 13:42:58 crc kubenswrapper[4732]: I0402 13:42:58.347221 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2b48d9e0-5296-4cdf-bd91-205a09bcdaed-proxy-ca-bundles\") pod \"controller-manager-5bc5979f8c-gz2dg\" (UID: \"2b48d9e0-5296-4cdf-bd91-205a09bcdaed\") " pod="openshift-controller-manager/controller-manager-5bc5979f8c-gz2dg" Apr 02 13:42:58 crc kubenswrapper[4732]: I0402 13:42:58.347245 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k52hl\" (UniqueName: 
\"kubernetes.io/projected/2b48d9e0-5296-4cdf-bd91-205a09bcdaed-kube-api-access-k52hl\") pod \"controller-manager-5bc5979f8c-gz2dg\" (UID: \"2b48d9e0-5296-4cdf-bd91-205a09bcdaed\") " pod="openshift-controller-manager/controller-manager-5bc5979f8c-gz2dg" Apr 02 13:42:58 crc kubenswrapper[4732]: I0402 13:42:58.347285 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2b48d9e0-5296-4cdf-bd91-205a09bcdaed-client-ca\") pod \"controller-manager-5bc5979f8c-gz2dg\" (UID: \"2b48d9e0-5296-4cdf-bd91-205a09bcdaed\") " pod="openshift-controller-manager/controller-manager-5bc5979f8c-gz2dg" Apr 02 13:42:58 crc kubenswrapper[4732]: I0402 13:42:58.347331 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28e25a15-5386-4a3e-a772-6aba6096471e-config\") on node \"crc\" DevicePath \"\"" Apr 02 13:42:58 crc kubenswrapper[4732]: I0402 13:42:58.347344 4732 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/28e25a15-5386-4a3e-a772-6aba6096471e-client-ca\") on node \"crc\" DevicePath \"\"" Apr 02 13:42:58 crc kubenswrapper[4732]: I0402 13:42:58.347355 4732 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/28e25a15-5386-4a3e-a772-6aba6096471e-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Apr 02 13:42:58 crc kubenswrapper[4732]: I0402 13:42:58.347366 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4959w\" (UniqueName: \"kubernetes.io/projected/28e25a15-5386-4a3e-a772-6aba6096471e-kube-api-access-4959w\") on node \"crc\" DevicePath \"\"" Apr 02 13:42:58 crc kubenswrapper[4732]: I0402 13:42:58.347374 4732 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28e25a15-5386-4a3e-a772-6aba6096471e-serving-cert\") on node \"crc\" 
DevicePath \"\"" Apr 02 13:42:58 crc kubenswrapper[4732]: I0402 13:42:58.448442 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b48d9e0-5296-4cdf-bd91-205a09bcdaed-serving-cert\") pod \"controller-manager-5bc5979f8c-gz2dg\" (UID: \"2b48d9e0-5296-4cdf-bd91-205a09bcdaed\") " pod="openshift-controller-manager/controller-manager-5bc5979f8c-gz2dg" Apr 02 13:42:58 crc kubenswrapper[4732]: I0402 13:42:58.448529 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b48d9e0-5296-4cdf-bd91-205a09bcdaed-config\") pod \"controller-manager-5bc5979f8c-gz2dg\" (UID: \"2b48d9e0-5296-4cdf-bd91-205a09bcdaed\") " pod="openshift-controller-manager/controller-manager-5bc5979f8c-gz2dg" Apr 02 13:42:58 crc kubenswrapper[4732]: I0402 13:42:58.448706 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2b48d9e0-5296-4cdf-bd91-205a09bcdaed-proxy-ca-bundles\") pod \"controller-manager-5bc5979f8c-gz2dg\" (UID: \"2b48d9e0-5296-4cdf-bd91-205a09bcdaed\") " pod="openshift-controller-manager/controller-manager-5bc5979f8c-gz2dg" Apr 02 13:42:58 crc kubenswrapper[4732]: I0402 13:42:58.448744 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k52hl\" (UniqueName: \"kubernetes.io/projected/2b48d9e0-5296-4cdf-bd91-205a09bcdaed-kube-api-access-k52hl\") pod \"controller-manager-5bc5979f8c-gz2dg\" (UID: \"2b48d9e0-5296-4cdf-bd91-205a09bcdaed\") " pod="openshift-controller-manager/controller-manager-5bc5979f8c-gz2dg" Apr 02 13:42:58 crc kubenswrapper[4732]: I0402 13:42:58.448788 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2b48d9e0-5296-4cdf-bd91-205a09bcdaed-client-ca\") pod \"controller-manager-5bc5979f8c-gz2dg\" (UID: 
\"2b48d9e0-5296-4cdf-bd91-205a09bcdaed\") " pod="openshift-controller-manager/controller-manager-5bc5979f8c-gz2dg" Apr 02 13:42:58 crc kubenswrapper[4732]: I0402 13:42:58.450346 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2b48d9e0-5296-4cdf-bd91-205a09bcdaed-proxy-ca-bundles\") pod \"controller-manager-5bc5979f8c-gz2dg\" (UID: \"2b48d9e0-5296-4cdf-bd91-205a09bcdaed\") " pod="openshift-controller-manager/controller-manager-5bc5979f8c-gz2dg" Apr 02 13:42:58 crc kubenswrapper[4732]: I0402 13:42:58.450419 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2b48d9e0-5296-4cdf-bd91-205a09bcdaed-client-ca\") pod \"controller-manager-5bc5979f8c-gz2dg\" (UID: \"2b48d9e0-5296-4cdf-bd91-205a09bcdaed\") " pod="openshift-controller-manager/controller-manager-5bc5979f8c-gz2dg" Apr 02 13:42:58 crc kubenswrapper[4732]: I0402 13:42:58.451759 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b48d9e0-5296-4cdf-bd91-205a09bcdaed-config\") pod \"controller-manager-5bc5979f8c-gz2dg\" (UID: \"2b48d9e0-5296-4cdf-bd91-205a09bcdaed\") " pod="openshift-controller-manager/controller-manager-5bc5979f8c-gz2dg" Apr 02 13:42:58 crc kubenswrapper[4732]: I0402 13:42:58.452937 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b48d9e0-5296-4cdf-bd91-205a09bcdaed-serving-cert\") pod \"controller-manager-5bc5979f8c-gz2dg\" (UID: \"2b48d9e0-5296-4cdf-bd91-205a09bcdaed\") " pod="openshift-controller-manager/controller-manager-5bc5979f8c-gz2dg" Apr 02 13:42:58 crc kubenswrapper[4732]: I0402 13:42:58.466825 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k52hl\" (UniqueName: 
\"kubernetes.io/projected/2b48d9e0-5296-4cdf-bd91-205a09bcdaed-kube-api-access-k52hl\") pod \"controller-manager-5bc5979f8c-gz2dg\" (UID: \"2b48d9e0-5296-4cdf-bd91-205a09bcdaed\") " pod="openshift-controller-manager/controller-manager-5bc5979f8c-gz2dg" Apr 02 13:42:58 crc kubenswrapper[4732]: I0402 13:42:58.555339 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5bc5979f8c-gz2dg" Apr 02 13:42:58 crc kubenswrapper[4732]: I0402 13:42:58.868964 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86c47b858b-ktqm8" event={"ID":"28e25a15-5386-4a3e-a772-6aba6096471e","Type":"ContainerDied","Data":"00f630d0a24fce94b9715039a314e494d99934dcc71ebd092b78054f5e78c689"} Apr 02 13:42:58 crc kubenswrapper[4732]: I0402 13:42:58.869289 4732 scope.go:117] "RemoveContainer" containerID="71318aec4005acc7bb0146248defa3792899a3c86b676740054654f0c884a5b3" Apr 02 13:42:58 crc kubenswrapper[4732]: I0402 13:42:58.869013 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-86c47b858b-ktqm8" Apr 02 13:42:58 crc kubenswrapper[4732]: I0402 13:42:58.871930 4732 generic.go:334] "Generic (PLEG): container finished" podID="cdbf4139-8114-4f0b-a66d-90f5c1812574" containerID="39affb00167fa8867971495af6eafe8bb3b7936d14cf3fdc1bd4e185d0895f86" exitCode=0 Apr 02 13:42:58 crc kubenswrapper[4732]: I0402 13:42:58.871993 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7456d5dd7f-jpxrc" event={"ID":"cdbf4139-8114-4f0b-a66d-90f5c1812574","Type":"ContainerDied","Data":"39affb00167fa8867971495af6eafe8bb3b7936d14cf3fdc1bd4e185d0895f86"} Apr 02 13:42:58 crc kubenswrapper[4732]: I0402 13:42:58.892000 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-86c47b858b-ktqm8"] Apr 02 13:42:58 crc kubenswrapper[4732]: I0402 13:42:58.894918 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-86c47b858b-ktqm8"] Apr 02 13:42:58 crc kubenswrapper[4732]: I0402 13:42:58.981896 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5bc5979f8c-gz2dg"] Apr 02 13:42:59 crc kubenswrapper[4732]: I0402 13:42:59.879509 4732 generic.go:334] "Generic (PLEG): container finished" podID="10c1127e-ac61-4432-b5a8-828e3b84d61e" containerID="9008b86327a829df7eb0928ebaae8f1272baf080e83af4a35d37bc520feb9f39" exitCode=0 Apr 02 13:42:59 crc kubenswrapper[4732]: I0402 13:42:59.879723 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ntdkq" event={"ID":"10c1127e-ac61-4432-b5a8-828e3b84d61e","Type":"ContainerDied","Data":"9008b86327a829df7eb0928ebaae8f1272baf080e83af4a35d37bc520feb9f39"} Apr 02 13:42:59 crc kubenswrapper[4732]: I0402 13:42:59.880173 4732 scope.go:117] "RemoveContainer" 
containerID="9008b86327a829df7eb0928ebaae8f1272baf080e83af4a35d37bc520feb9f39" Apr 02 13:43:00 crc kubenswrapper[4732]: I0402 13:43:00.687954 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28e25a15-5386-4a3e-a772-6aba6096471e" path="/var/lib/kubelet/pods/28e25a15-5386-4a3e-a772-6aba6096471e/volumes" Apr 02 13:43:00 crc kubenswrapper[4732]: I0402 13:43:00.886435 4732 generic.go:334] "Generic (PLEG): container finished" podID="cbd2d545-1f43-45dd-8f7e-23ea76dbc0a4" containerID="c7eedc6f02caca40d1448c1c6e2ce5b09f9d0cfca0886704d5fce80ccf8c6447" exitCode=0 Apr 02 13:43:00 crc kubenswrapper[4732]: I0402 13:43:00.886509 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-m4dzs" event={"ID":"cbd2d545-1f43-45dd-8f7e-23ea76dbc0a4","Type":"ContainerDied","Data":"c7eedc6f02caca40d1448c1c6e2ce5b09f9d0cfca0886704d5fce80ccf8c6447"} Apr 02 13:43:00 crc kubenswrapper[4732]: I0402 13:43:00.886952 4732 scope.go:117] "RemoveContainer" containerID="c7eedc6f02caca40d1448c1c6e2ce5b09f9d0cfca0886704d5fce80ccf8c6447" Apr 02 13:43:00 crc kubenswrapper[4732]: I0402 13:43:00.888359 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-756b6f6bc6-2gdhw_803d2a64-1416-46cb-ae46-5a8462b057f9/openshift-controller-manager-operator/0.log" Apr 02 13:43:00 crc kubenswrapper[4732]: I0402 13:43:00.888399 4732 generic.go:334] "Generic (PLEG): container finished" podID="803d2a64-1416-46cb-ae46-5a8462b057f9" containerID="c4d45e51907ae6ac3fd9f938f60ea1cf8ecef248bf3706c0f8b6d1fcee228218" exitCode=1 Apr 02 13:43:00 crc kubenswrapper[4732]: I0402 13:43:00.888420 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2gdhw" 
event={"ID":"803d2a64-1416-46cb-ae46-5a8462b057f9","Type":"ContainerDied","Data":"c4d45e51907ae6ac3fd9f938f60ea1cf8ecef248bf3706c0f8b6d1fcee228218"} Apr 02 13:43:00 crc kubenswrapper[4732]: I0402 13:43:00.888683 4732 scope.go:117] "RemoveContainer" containerID="c4d45e51907ae6ac3fd9f938f60ea1cf8ecef248bf3706c0f8b6d1fcee228218" Apr 02 13:43:01 crc kubenswrapper[4732]: I0402 13:43:01.281684 4732 patch_prober.go:28] interesting pod/console-f9d7485db-dq9x9 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.29:8443/health\": dial tcp 10.217.0.29:8443: connect: connection refused" start-of-body= Apr 02 13:43:01 crc kubenswrapper[4732]: I0402 13:43:01.281776 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-dq9x9" podUID="4d77c191-7d04-4381-838f-b7a355e7c2d4" containerName="console" probeResult="failure" output="Get \"https://10.217.0.29:8443/health\": dial tcp 10.217.0.29:8443: connect: connection refused" Apr 02 13:43:01 crc kubenswrapper[4732]: W0402 13:43:01.292581 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b48d9e0_5296_4cdf_bd91_205a09bcdaed.slice/crio-331fb3e655bc7401579b0d24c4e2d7ef04aba6df462175e00f3e0ac52b94ba04 WatchSource:0}: Error finding container 331fb3e655bc7401579b0d24c4e2d7ef04aba6df462175e00f3e0ac52b94ba04: Status 404 returned error can't find the container with id 331fb3e655bc7401579b0d24c4e2d7ef04aba6df462175e00f3e0ac52b94ba04 Apr 02 13:43:01 crc kubenswrapper[4732]: I0402 13:43:01.367833 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7456d5dd7f-jpxrc" Apr 02 13:43:01 crc kubenswrapper[4732]: I0402 13:43:01.399374 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d9f4656d6-phcgl"] Apr 02 13:43:01 crc kubenswrapper[4732]: E0402 13:43:01.399862 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdbf4139-8114-4f0b-a66d-90f5c1812574" containerName="route-controller-manager" Apr 02 13:43:01 crc kubenswrapper[4732]: I0402 13:43:01.399961 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdbf4139-8114-4f0b-a66d-90f5c1812574" containerName="route-controller-manager" Apr 02 13:43:01 crc kubenswrapper[4732]: I0402 13:43:01.400170 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdbf4139-8114-4f0b-a66d-90f5c1812574" containerName="route-controller-manager" Apr 02 13:43:01 crc kubenswrapper[4732]: I0402 13:43:01.400709 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d9f4656d6-phcgl" Apr 02 13:43:01 crc kubenswrapper[4732]: I0402 13:43:01.408169 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d9f4656d6-phcgl"] Apr 02 13:43:01 crc kubenswrapper[4732]: I0402 13:43:01.491812 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cdbf4139-8114-4f0b-a66d-90f5c1812574-client-ca\") pod \"cdbf4139-8114-4f0b-a66d-90f5c1812574\" (UID: \"cdbf4139-8114-4f0b-a66d-90f5c1812574\") " Apr 02 13:43:01 crc kubenswrapper[4732]: I0402 13:43:01.491868 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cdbf4139-8114-4f0b-a66d-90f5c1812574-config\") pod \"cdbf4139-8114-4f0b-a66d-90f5c1812574\" (UID: \"cdbf4139-8114-4f0b-a66d-90f5c1812574\") " Apr 02 13:43:01 crc kubenswrapper[4732]: I0402 13:43:01.491908 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68bcr\" (UniqueName: \"kubernetes.io/projected/cdbf4139-8114-4f0b-a66d-90f5c1812574-kube-api-access-68bcr\") pod \"cdbf4139-8114-4f0b-a66d-90f5c1812574\" (UID: \"cdbf4139-8114-4f0b-a66d-90f5c1812574\") " Apr 02 13:43:01 crc kubenswrapper[4732]: I0402 13:43:01.491952 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cdbf4139-8114-4f0b-a66d-90f5c1812574-serving-cert\") pod \"cdbf4139-8114-4f0b-a66d-90f5c1812574\" (UID: \"cdbf4139-8114-4f0b-a66d-90f5c1812574\") " Apr 02 13:43:01 crc kubenswrapper[4732]: I0402 13:43:01.492658 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cdbf4139-8114-4f0b-a66d-90f5c1812574-client-ca" (OuterVolumeSpecName: "client-ca") pod "cdbf4139-8114-4f0b-a66d-90f5c1812574" 
(UID: "cdbf4139-8114-4f0b-a66d-90f5c1812574"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:43:01 crc kubenswrapper[4732]: I0402 13:43:01.493123 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ccddd9d4-7db7-4425-9920-3dac8a779304-serving-cert\") pod \"route-controller-manager-5d9f4656d6-phcgl\" (UID: \"ccddd9d4-7db7-4425-9920-3dac8a779304\") " pod="openshift-route-controller-manager/route-controller-manager-5d9f4656d6-phcgl" Apr 02 13:43:01 crc kubenswrapper[4732]: I0402 13:43:01.493153 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ccddd9d4-7db7-4425-9920-3dac8a779304-client-ca\") pod \"route-controller-manager-5d9f4656d6-phcgl\" (UID: \"ccddd9d4-7db7-4425-9920-3dac8a779304\") " pod="openshift-route-controller-manager/route-controller-manager-5d9f4656d6-phcgl" Apr 02 13:43:01 crc kubenswrapper[4732]: I0402 13:43:01.493178 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccddd9d4-7db7-4425-9920-3dac8a779304-config\") pod \"route-controller-manager-5d9f4656d6-phcgl\" (UID: \"ccddd9d4-7db7-4425-9920-3dac8a779304\") " pod="openshift-route-controller-manager/route-controller-manager-5d9f4656d6-phcgl" Apr 02 13:43:01 crc kubenswrapper[4732]: I0402 13:43:01.493205 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4dwl\" (UniqueName: \"kubernetes.io/projected/ccddd9d4-7db7-4425-9920-3dac8a779304-kube-api-access-x4dwl\") pod \"route-controller-manager-5d9f4656d6-phcgl\" (UID: \"ccddd9d4-7db7-4425-9920-3dac8a779304\") " pod="openshift-route-controller-manager/route-controller-manager-5d9f4656d6-phcgl" Apr 02 13:43:01 crc kubenswrapper[4732]: I0402 
13:43:01.493244 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cdbf4139-8114-4f0b-a66d-90f5c1812574-config" (OuterVolumeSpecName: "config") pod "cdbf4139-8114-4f0b-a66d-90f5c1812574" (UID: "cdbf4139-8114-4f0b-a66d-90f5c1812574"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:43:01 crc kubenswrapper[4732]: I0402 13:43:01.493321 4732 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cdbf4139-8114-4f0b-a66d-90f5c1812574-client-ca\") on node \"crc\" DevicePath \"\"" Apr 02 13:43:01 crc kubenswrapper[4732]: I0402 13:43:01.496937 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdbf4139-8114-4f0b-a66d-90f5c1812574-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "cdbf4139-8114-4f0b-a66d-90f5c1812574" (UID: "cdbf4139-8114-4f0b-a66d-90f5c1812574"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 13:43:01 crc kubenswrapper[4732]: I0402 13:43:01.497095 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdbf4139-8114-4f0b-a66d-90f5c1812574-kube-api-access-68bcr" (OuterVolumeSpecName: "kube-api-access-68bcr") pod "cdbf4139-8114-4f0b-a66d-90f5c1812574" (UID: "cdbf4139-8114-4f0b-a66d-90f5c1812574"). InnerVolumeSpecName "kube-api-access-68bcr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:43:01 crc kubenswrapper[4732]: I0402 13:43:01.594500 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ccddd9d4-7db7-4425-9920-3dac8a779304-serving-cert\") pod \"route-controller-manager-5d9f4656d6-phcgl\" (UID: \"ccddd9d4-7db7-4425-9920-3dac8a779304\") " pod="openshift-route-controller-manager/route-controller-manager-5d9f4656d6-phcgl" Apr 02 13:43:01 crc kubenswrapper[4732]: I0402 13:43:01.594962 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ccddd9d4-7db7-4425-9920-3dac8a779304-client-ca\") pod \"route-controller-manager-5d9f4656d6-phcgl\" (UID: \"ccddd9d4-7db7-4425-9920-3dac8a779304\") " pod="openshift-route-controller-manager/route-controller-manager-5d9f4656d6-phcgl" Apr 02 13:43:01 crc kubenswrapper[4732]: I0402 13:43:01.595000 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccddd9d4-7db7-4425-9920-3dac8a779304-config\") pod \"route-controller-manager-5d9f4656d6-phcgl\" (UID: \"ccddd9d4-7db7-4425-9920-3dac8a779304\") " pod="openshift-route-controller-manager/route-controller-manager-5d9f4656d6-phcgl" Apr 02 13:43:01 crc kubenswrapper[4732]: I0402 13:43:01.595031 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4dwl\" (UniqueName: \"kubernetes.io/projected/ccddd9d4-7db7-4425-9920-3dac8a779304-kube-api-access-x4dwl\") pod \"route-controller-manager-5d9f4656d6-phcgl\" (UID: \"ccddd9d4-7db7-4425-9920-3dac8a779304\") " pod="openshift-route-controller-manager/route-controller-manager-5d9f4656d6-phcgl" Apr 02 13:43:01 crc kubenswrapper[4732]: I0402 13:43:01.595155 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68bcr\" (UniqueName: 
\"kubernetes.io/projected/cdbf4139-8114-4f0b-a66d-90f5c1812574-kube-api-access-68bcr\") on node \"crc\" DevicePath \"\"" Apr 02 13:43:01 crc kubenswrapper[4732]: I0402 13:43:01.595171 4732 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cdbf4139-8114-4f0b-a66d-90f5c1812574-serving-cert\") on node \"crc\" DevicePath \"\"" Apr 02 13:43:01 crc kubenswrapper[4732]: I0402 13:43:01.595184 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cdbf4139-8114-4f0b-a66d-90f5c1812574-config\") on node \"crc\" DevicePath \"\"" Apr 02 13:43:01 crc kubenswrapper[4732]: I0402 13:43:01.596852 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccddd9d4-7db7-4425-9920-3dac8a779304-config\") pod \"route-controller-manager-5d9f4656d6-phcgl\" (UID: \"ccddd9d4-7db7-4425-9920-3dac8a779304\") " pod="openshift-route-controller-manager/route-controller-manager-5d9f4656d6-phcgl" Apr 02 13:43:01 crc kubenswrapper[4732]: I0402 13:43:01.597187 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ccddd9d4-7db7-4425-9920-3dac8a779304-client-ca\") pod \"route-controller-manager-5d9f4656d6-phcgl\" (UID: \"ccddd9d4-7db7-4425-9920-3dac8a779304\") " pod="openshift-route-controller-manager/route-controller-manager-5d9f4656d6-phcgl" Apr 02 13:43:01 crc kubenswrapper[4732]: I0402 13:43:01.602594 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ccddd9d4-7db7-4425-9920-3dac8a779304-serving-cert\") pod \"route-controller-manager-5d9f4656d6-phcgl\" (UID: \"ccddd9d4-7db7-4425-9920-3dac8a779304\") " pod="openshift-route-controller-manager/route-controller-manager-5d9f4656d6-phcgl" Apr 02 13:43:01 crc kubenswrapper[4732]: I0402 13:43:01.617357 4732 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-x4dwl\" (UniqueName: \"kubernetes.io/projected/ccddd9d4-7db7-4425-9920-3dac8a779304-kube-api-access-x4dwl\") pod \"route-controller-manager-5d9f4656d6-phcgl\" (UID: \"ccddd9d4-7db7-4425-9920-3dac8a779304\") " pod="openshift-route-controller-manager/route-controller-manager-5d9f4656d6-phcgl" Apr 02 13:43:01 crc kubenswrapper[4732]: I0402 13:43:01.728547 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d9f4656d6-phcgl" Apr 02 13:43:01 crc kubenswrapper[4732]: I0402 13:43:01.901607 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5bc5979f8c-gz2dg" event={"ID":"2b48d9e0-5296-4cdf-bd91-205a09bcdaed","Type":"ContainerStarted","Data":"331fb3e655bc7401579b0d24c4e2d7ef04aba6df462175e00f3e0ac52b94ba04"} Apr 02 13:43:01 crc kubenswrapper[4732]: I0402 13:43:01.905768 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7456d5dd7f-jpxrc" event={"ID":"cdbf4139-8114-4f0b-a66d-90f5c1812574","Type":"ContainerDied","Data":"992b3a61c729f26815ae946dd2f55426ad1bb5745821d4b85d18253d220128d9"} Apr 02 13:43:01 crc kubenswrapper[4732]: I0402 13:43:01.905946 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7456d5dd7f-jpxrc" Apr 02 13:43:01 crc kubenswrapper[4732]: I0402 13:43:01.929530 4732 patch_prober.go:28] interesting pod/machine-config-daemon-6vtmw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 02 13:43:01 crc kubenswrapper[4732]: I0402 13:43:01.929709 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 02 13:43:01 crc kubenswrapper[4732]: I0402 13:43:01.929784 4732 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" Apr 02 13:43:01 crc kubenswrapper[4732]: I0402 13:43:01.934986 4732 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"09dd989931f88ecf396194a22057cfe8c07809db42ff1c2b861bf3aa5db1673f"} pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Apr 02 13:43:01 crc kubenswrapper[4732]: I0402 13:43:01.935173 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" containerName="machine-config-daemon" containerID="cri-o://09dd989931f88ecf396194a22057cfe8c07809db42ff1c2b861bf3aa5db1673f" gracePeriod=600 Apr 02 13:43:01 crc kubenswrapper[4732]: I0402 13:43:01.981060 4732 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-route-controller-manager/route-controller-manager-7456d5dd7f-jpxrc"] Apr 02 13:43:01 crc kubenswrapper[4732]: I0402 13:43:01.984395 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7456d5dd7f-jpxrc"] Apr 02 13:43:02 crc kubenswrapper[4732]: I0402 13:43:02.714585 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdbf4139-8114-4f0b-a66d-90f5c1812574" path="/var/lib/kubelet/pods/cdbf4139-8114-4f0b-a66d-90f5c1812574/volumes" Apr 02 13:43:02 crc kubenswrapper[4732]: I0402 13:43:02.914807 4732 generic.go:334] "Generic (PLEG): container finished" podID="38409e5e-4545-49da-8f6c-4bfb30582878" containerID="09dd989931f88ecf396194a22057cfe8c07809db42ff1c2b861bf3aa5db1673f" exitCode=0 Apr 02 13:43:02 crc kubenswrapper[4732]: I0402 13:43:02.914862 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" event={"ID":"38409e5e-4545-49da-8f6c-4bfb30582878","Type":"ContainerDied","Data":"09dd989931f88ecf396194a22057cfe8c07809db42ff1c2b861bf3aa5db1673f"} Apr 02 13:43:03 crc kubenswrapper[4732]: I0402 13:43:03.803361 4732 scope.go:117] "RemoveContainer" containerID="39affb00167fa8867971495af6eafe8bb3b7936d14cf3fdc1bd4e185d0895f86" Apr 02 13:43:03 crc kubenswrapper[4732]: I0402 13:43:03.924339 4732 generic.go:334] "Generic (PLEG): container finished" podID="49c54f44-4a94-4b19-b03d-8469355931d0" containerID="46f5da7593589bd05c00cedd6eb69a24b8824f34adf32da57810f78efd3d8a0c" exitCode=0 Apr 02 13:43:03 crc kubenswrapper[4732]: I0402 13:43:03.924395 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-qdvzn" event={"ID":"49c54f44-4a94-4b19-b03d-8469355931d0","Type":"ContainerDied","Data":"46f5da7593589bd05c00cedd6eb69a24b8824f34adf32da57810f78efd3d8a0c"} Apr 02 13:43:03 crc kubenswrapper[4732]: I0402 13:43:03.925780 4732 scope.go:117] 
"RemoveContainer" containerID="46f5da7593589bd05c00cedd6eb69a24b8824f34adf32da57810f78efd3d8a0c" Apr 02 13:43:04 crc kubenswrapper[4732]: I0402 13:43:04.669514 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d9f4656d6-phcgl"] Apr 02 13:43:04 crc kubenswrapper[4732]: W0402 13:43:04.674905 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podccddd9d4_7db7_4425_9920_3dac8a779304.slice/crio-ad6734aadeaefc316d1d7a7761b542046d2796c351039437126f2ab4873239f2 WatchSource:0}: Error finding container ad6734aadeaefc316d1d7a7761b542046d2796c351039437126f2ab4873239f2: Status 404 returned error can't find the container with id ad6734aadeaefc316d1d7a7761b542046d2796c351039437126f2ab4873239f2 Apr 02 13:43:04 crc kubenswrapper[4732]: I0402 13:43:04.935140 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-756b6f6bc6-2gdhw_803d2a64-1416-46cb-ae46-5a8462b057f9/openshift-controller-manager-operator/0.log" Apr 02 13:43:04 crc kubenswrapper[4732]: I0402 13:43:04.935459 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2gdhw" event={"ID":"803d2a64-1416-46cb-ae46-5a8462b057f9","Type":"ContainerStarted","Data":"119cbea276b2b59bfc11d7836499458a898b71fa06df7d5b3c92ea1f2d02ebfd"} Apr 02 13:43:04 crc kubenswrapper[4732]: I0402 13:43:04.937550 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-m4dzs" event={"ID":"cbd2d545-1f43-45dd-8f7e-23ea76dbc0a4","Type":"ContainerStarted","Data":"b20f39433e51aeaf01ade79b9f743e1eb69dc7af4dd8a24d10b843aae0341538"} Apr 02 13:43:04 crc kubenswrapper[4732]: I0402 13:43:04.939055 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-vgjz4" event={"ID":"50d43b2d-24ec-439f-a418-3673791eb1b1","Type":"ContainerStarted","Data":"90cf6284bc60f80e12333f177cf852a10f14c55a6bb7da1c6ad61c3b50212e39"} Apr 02 13:43:04 crc kubenswrapper[4732]: I0402 13:43:04.941673 4732 generic.go:334] "Generic (PLEG): container finished" podID="5425ff81-73c8-4fca-b208-9c9dbc6a949d" containerID="543cf36fcde29b722d5119bf8091e93d966b4f0d31fd814907f5760a3a6e6d45" exitCode=0 Apr 02 13:43:04 crc kubenswrapper[4732]: I0402 13:43:04.941755 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m7sxt" event={"ID":"5425ff81-73c8-4fca-b208-9c9dbc6a949d","Type":"ContainerDied","Data":"543cf36fcde29b722d5119bf8091e93d966b4f0d31fd814907f5760a3a6e6d45"} Apr 02 13:43:04 crc kubenswrapper[4732]: I0402 13:43:04.942337 4732 scope.go:117] "RemoveContainer" containerID="543cf36fcde29b722d5119bf8091e93d966b4f0d31fd814907f5760a3a6e6d45" Apr 02 13:43:04 crc kubenswrapper[4732]: I0402 13:43:04.943321 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4m6rj" event={"ID":"cf030ff0-459d-4453-975f-19ba4ff9641a","Type":"ContainerStarted","Data":"c0ae4d4789a9b315a1d1c0369d0032385d80233a8d22f3bbbcfa5e78dc247fcb"} Apr 02 13:43:04 crc kubenswrapper[4732]: I0402 13:43:04.944542 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5bc5979f8c-gz2dg" event={"ID":"2b48d9e0-5296-4cdf-bd91-205a09bcdaed","Type":"ContainerStarted","Data":"801f2e29d81a19d42d2a8c35affaa9eb310feee819d3b2b60189b99fbf025e34"} Apr 02 13:43:04 crc kubenswrapper[4732]: I0402 13:43:04.944763 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5bc5979f8c-gz2dg" Apr 02 13:43:04 crc kubenswrapper[4732]: I0402 13:43:04.946754 4732 patch_prober.go:28] interesting pod/controller-manager-5bc5979f8c-gz2dg 
container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.74:8443/healthz\": dial tcp 10.217.0.74:8443: connect: connection refused" start-of-body= Apr 02 13:43:04 crc kubenswrapper[4732]: I0402 13:43:04.946802 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5bc5979f8c-gz2dg" podUID="2b48d9e0-5296-4cdf-bd91-205a09bcdaed" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.74:8443/healthz\": dial tcp 10.217.0.74:8443: connect: connection refused" Apr 02 13:43:04 crc kubenswrapper[4732]: I0402 13:43:04.947562 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-58897d9998-76hwt_4a34a201-8137-4efe-a99a-1ebd89e40c68/console-operator/0.log" Apr 02 13:43:04 crc kubenswrapper[4732]: I0402 13:43:04.947602 4732 generic.go:334] "Generic (PLEG): container finished" podID="4a34a201-8137-4efe-a99a-1ebd89e40c68" containerID="90fe6d65b20f3fd0bf4d34fdfba2175c7c517e0dbb295feeadae6b483e3ce60d" exitCode=1 Apr 02 13:43:04 crc kubenswrapper[4732]: I0402 13:43:04.947680 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-76hwt" event={"ID":"4a34a201-8137-4efe-a99a-1ebd89e40c68","Type":"ContainerDied","Data":"90fe6d65b20f3fd0bf4d34fdfba2175c7c517e0dbb295feeadae6b483e3ce60d"} Apr 02 13:43:04 crc kubenswrapper[4732]: I0402 13:43:04.948257 4732 scope.go:117] "RemoveContainer" containerID="90fe6d65b20f3fd0bf4d34fdfba2175c7c517e0dbb295feeadae6b483e3ce60d" Apr 02 13:43:04 crc kubenswrapper[4732]: I0402 13:43:04.954829 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" event={"ID":"38409e5e-4545-49da-8f6c-4bfb30582878","Type":"ContainerStarted","Data":"6679fff77ada4a54a69b7189491d8feac3c5def6519c359d285b772063d2ad8d"} Apr 02 13:43:04 crc 
kubenswrapper[4732]: I0402 13:43:04.964169 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ntdkq" event={"ID":"10c1127e-ac61-4432-b5a8-828e3b84d61e","Type":"ContainerStarted","Data":"e0757ac2dcca1cda2885609cb0ad107167f44d584363da0a2f9c4e9839b33fed"} Apr 02 13:43:04 crc kubenswrapper[4732]: I0402 13:43:04.970048 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-qdvzn" event={"ID":"49c54f44-4a94-4b19-b03d-8469355931d0","Type":"ContainerStarted","Data":"39aeba4ad5c219210e38ca10f7957b93c9b86bafb7182f34187d113a00b0b79f"} Apr 02 13:43:04 crc kubenswrapper[4732]: I0402 13:43:04.972514 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d9f4656d6-phcgl" event={"ID":"ccddd9d4-7db7-4425-9920-3dac8a779304","Type":"ContainerStarted","Data":"aacadcb720b27cc871fb117adcce026e03ce7431a867e7ac8f770df7c58cf443"} Apr 02 13:43:04 crc kubenswrapper[4732]: I0402 13:43:04.972556 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d9f4656d6-phcgl" event={"ID":"ccddd9d4-7db7-4425-9920-3dac8a779304","Type":"ContainerStarted","Data":"ad6734aadeaefc316d1d7a7761b542046d2796c351039437126f2ab4873239f2"} Apr 02 13:43:04 crc kubenswrapper[4732]: I0402 13:43:04.973040 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5d9f4656d6-phcgl" Apr 02 13:43:05 crc kubenswrapper[4732]: I0402 13:43:05.076690 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5bc5979f8c-gz2dg" podStartSLOduration=9.076671471 podStartE2EDuration="9.076671471s" podCreationTimestamp="2026-04-02 13:42:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:43:05.053165465 +0000 UTC m=+341.957573038" watchObservedRunningTime="2026-04-02 13:43:05.076671471 +0000 UTC m=+341.981079024" Apr 02 13:43:05 crc kubenswrapper[4732]: I0402 13:43:05.257405 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5d9f4656d6-phcgl" podStartSLOduration=8.257372759999999 podStartE2EDuration="8.25737276s" podCreationTimestamp="2026-04-02 13:42:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:43:05.256912078 +0000 UTC m=+342.161319631" watchObservedRunningTime="2026-04-02 13:43:05.25737276 +0000 UTC m=+342.161780323" Apr 02 13:43:05 crc kubenswrapper[4732]: I0402 13:43:05.266657 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-d9c747dbb-jqfln" Apr 02 13:43:05 crc kubenswrapper[4732]: I0402 13:43:05.343561 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-85455b8986-kh7t7"] Apr 02 13:43:05 crc kubenswrapper[4732]: I0402 13:43:05.356048 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-85455b8986-kh7t7" Apr 02 13:43:05 crc kubenswrapper[4732]: I0402 13:43:05.424345 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5d9f4656d6-phcgl" Apr 02 13:43:05 crc kubenswrapper[4732]: I0402 13:43:05.980731 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lvckz" event={"ID":"33708fee-32a5-4418-81d0-226813150db7","Type":"ContainerStarted","Data":"a01d592b0ba86583dfa1fdecff24f378de8e39e237965b26e4cb0da5fcc7b523"} Apr 02 13:43:05 crc kubenswrapper[4732]: I0402 
13:43:05.984130 4732 generic.go:334] "Generic (PLEG): container finished" podID="50d43b2d-24ec-439f-a418-3673791eb1b1" containerID="90cf6284bc60f80e12333f177cf852a10f14c55a6bb7da1c6ad61c3b50212e39" exitCode=0 Apr 02 13:43:05 crc kubenswrapper[4732]: I0402 13:43:05.984243 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vgjz4" event={"ID":"50d43b2d-24ec-439f-a418-3673791eb1b1","Type":"ContainerDied","Data":"90cf6284bc60f80e12333f177cf852a10f14c55a6bb7da1c6ad61c3b50212e39"} Apr 02 13:43:05 crc kubenswrapper[4732]: I0402 13:43:05.986507 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-58897d9998-76hwt_4a34a201-8137-4efe-a99a-1ebd89e40c68/console-operator/0.log" Apr 02 13:43:05 crc kubenswrapper[4732]: I0402 13:43:05.986593 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-76hwt" event={"ID":"4a34a201-8137-4efe-a99a-1ebd89e40c68","Type":"ContainerStarted","Data":"8fb21f7772e52a7e1e32ce59510ba83ce8cab18cc026e0cf62722dfffae067db"} Apr 02 13:43:05 crc kubenswrapper[4732]: I0402 13:43:05.987090 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-76hwt" Apr 02 13:43:05 crc kubenswrapper[4732]: I0402 13:43:05.987826 4732 generic.go:334] "Generic (PLEG): container finished" podID="cf030ff0-459d-4453-975f-19ba4ff9641a" containerID="c0ae4d4789a9b315a1d1c0369d0032385d80233a8d22f3bbbcfa5e78dc247fcb" exitCode=0 Apr 02 13:43:05 crc kubenswrapper[4732]: I0402 13:43:05.987874 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4m6rj" event={"ID":"cf030ff0-459d-4453-975f-19ba4ff9641a","Type":"ContainerDied","Data":"c0ae4d4789a9b315a1d1c0369d0032385d80233a8d22f3bbbcfa5e78dc247fcb"} Apr 02 13:43:05 crc kubenswrapper[4732]: I0402 13:43:05.989589 4732 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m7sxt" event={"ID":"5425ff81-73c8-4fca-b208-9c9dbc6a949d","Type":"ContainerStarted","Data":"176d5ab2e195124cdd70e934b16d967e4dfe3298fae497dddf2494084cdbac0b"} Apr 02 13:43:05 crc kubenswrapper[4732]: I0402 13:43:05.994032 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5bc5979f8c-gz2dg" Apr 02 13:43:06 crc kubenswrapper[4732]: I0402 13:43:06.247601 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5bc5979f8c-gz2dg"] Apr 02 13:43:06 crc kubenswrapper[4732]: I0402 13:43:06.264562 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d9f4656d6-phcgl"] Apr 02 13:43:06 crc kubenswrapper[4732]: I0402 13:43:06.408111 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-68b6f48864-m96db" Apr 02 13:43:06 crc kubenswrapper[4732]: I0402 13:43:06.413788 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-68b6f48864-m96db" Apr 02 13:43:06 crc kubenswrapper[4732]: I0402 13:43:06.422183 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-76hwt" Apr 02 13:43:06 crc kubenswrapper[4732]: I0402 13:43:06.503639 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-dq9x9"] Apr 02 13:43:06 crc kubenswrapper[4732]: I0402 13:43:06.998287 4732 generic.go:334] "Generic (PLEG): container finished" podID="33708fee-32a5-4418-81d0-226813150db7" containerID="a01d592b0ba86583dfa1fdecff24f378de8e39e237965b26e4cb0da5fcc7b523" exitCode=0 Apr 02 13:43:06 crc kubenswrapper[4732]: I0402 13:43:06.998367 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-lvckz" event={"ID":"33708fee-32a5-4418-81d0-226813150db7","Type":"ContainerDied","Data":"a01d592b0ba86583dfa1fdecff24f378de8e39e237965b26e4cb0da5fcc7b523"} Apr 02 13:43:07 crc kubenswrapper[4732]: I0402 13:43:07.000555 4732 generic.go:334] "Generic (PLEG): container finished" podID="9058e533-24e2-44f1-8631-dd9bf6a37192" containerID="59f478c20e6d6939d2cf216ee8b5458617d57a9367164d367d059afc2a7adcd8" exitCode=0 Apr 02 13:43:07 crc kubenswrapper[4732]: I0402 13:43:07.000636 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8ps5w" event={"ID":"9058e533-24e2-44f1-8631-dd9bf6a37192","Type":"ContainerDied","Data":"59f478c20e6d6939d2cf216ee8b5458617d57a9367164d367d059afc2a7adcd8"} Apr 02 13:43:07 crc kubenswrapper[4732]: I0402 13:43:07.004520 4732 generic.go:334] "Generic (PLEG): container finished" podID="6ac64cdf-a607-481a-9907-e6e72fc8b083" containerID="a9b93dc09d7caa15cd7c426478c8dede2b7b339e7df716f78786c6cfc90be2fa" exitCode=0 Apr 02 13:43:07 crc kubenswrapper[4732]: I0402 13:43:07.004551 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fd4xj" event={"ID":"6ac64cdf-a607-481a-9907-e6e72fc8b083","Type":"ContainerDied","Data":"a9b93dc09d7caa15cd7c426478c8dede2b7b339e7df716f78786c6cfc90be2fa"} Apr 02 13:43:07 crc kubenswrapper[4732]: I0402 13:43:07.008458 4732 generic.go:334] "Generic (PLEG): container finished" podID="51a0e365-014c-40e8-8749-7512f2c00758" containerID="300bdd0e2ae6c21103f4f45e67123d7de8eaa3cbef3e2562926b7864e0bb52df" exitCode=0 Apr 02 13:43:07 crc kubenswrapper[4732]: I0402 13:43:07.008565 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9lrnf" event={"ID":"51a0e365-014c-40e8-8749-7512f2c00758","Type":"ContainerDied","Data":"300bdd0e2ae6c21103f4f45e67123d7de8eaa3cbef3e2562926b7864e0bb52df"} Apr 02 13:43:07 crc kubenswrapper[4732]: I0402 13:43:07.025234 4732 
generic.go:334] "Generic (PLEG): container finished" podID="72981c60-e9a1-4e25-9b64-7493d6fdaab6" containerID="16541e33e966d2f1de439d8a63aece7d1c528e83e6fe30547f444b7984f3d70e" exitCode=0
Apr 02 13:43:07 crc kubenswrapper[4732]: I0402 13:43:07.028127 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z7v99" event={"ID":"72981c60-e9a1-4e25-9b64-7493d6fdaab6","Type":"ContainerDied","Data":"16541e33e966d2f1de439d8a63aece7d1c528e83e6fe30547f444b7984f3d70e"}
Apr 02 13:43:08 crc kubenswrapper[4732]: I0402 13:43:08.035788 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4m6rj" event={"ID":"cf030ff0-459d-4453-975f-19ba4ff9641a","Type":"ContainerStarted","Data":"425cedb83d984da24336a079437685c8d001e2247a41453b3dda436bf0d02899"}
Apr 02 13:43:08 crc kubenswrapper[4732]: I0402 13:43:08.038073 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lvckz" event={"ID":"33708fee-32a5-4418-81d0-226813150db7","Type":"ContainerStarted","Data":"28818acdc7b448e9a253990a4d438b9341394b606d83467632fbb10396de5953"}
Apr 02 13:43:08 crc kubenswrapper[4732]: I0402 13:43:08.040396 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vgjz4" event={"ID":"50d43b2d-24ec-439f-a418-3673791eb1b1","Type":"ContainerStarted","Data":"ed364b40491f81b7c45fc61749d3ea7a1c9489cbbc60f255d7178b263131dda6"}
Apr 02 13:43:08 crc kubenswrapper[4732]: I0402 13:43:08.042953 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8ps5w" event={"ID":"9058e533-24e2-44f1-8631-dd9bf6a37192","Type":"ContainerStarted","Data":"71072a927cc5d72b38e51db2b3913ccae6840c19ca7a27befca43a2c1adbe86b"}
Apr 02 13:43:08 crc kubenswrapper[4732]: I0402 13:43:08.044932 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fd4xj" event={"ID":"6ac64cdf-a607-481a-9907-e6e72fc8b083","Type":"ContainerStarted","Data":"2761fef5975eef45a310fec86c715ca4d66cc77b41884e98ddaf26c5dfe89d20"}
Apr 02 13:43:08 crc kubenswrapper[4732]: I0402 13:43:08.047260 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9lrnf" event={"ID":"51a0e365-014c-40e8-8749-7512f2c00758","Type":"ContainerStarted","Data":"f0b0b4ffe3ae302d71148a33d0d55749459f5819b20e8bc8265f0c4145447656"}
Apr 02 13:43:08 crc kubenswrapper[4732]: I0402 13:43:08.049602 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z7v99" event={"ID":"72981c60-e9a1-4e25-9b64-7493d6fdaab6","Type":"ContainerStarted","Data":"71bc9d5e90fdd9566b9aac6dfb38870bf6d6ce71be3b7eb7354b04408fe86b37"}
Apr 02 13:43:08 crc kubenswrapper[4732]: I0402 13:43:08.049851 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5bc5979f8c-gz2dg" podUID="2b48d9e0-5296-4cdf-bd91-205a09bcdaed" containerName="controller-manager" containerID="cri-o://801f2e29d81a19d42d2a8c35affaa9eb310feee819d3b2b60189b99fbf025e34" gracePeriod=30
Apr 02 13:43:08 crc kubenswrapper[4732]: I0402 13:43:08.050007 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5d9f4656d6-phcgl" podUID="ccddd9d4-7db7-4425-9920-3dac8a779304" containerName="route-controller-manager" containerID="cri-o://aacadcb720b27cc871fb117adcce026e03ce7431a867e7ac8f770df7c58cf443" gracePeriod=30
Apr 02 13:43:08 crc kubenswrapper[4732]: I0402 13:43:08.096913 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4m6rj" podStartSLOduration=3.951429852 podStartE2EDuration="1m32.096895703s" podCreationTimestamp="2026-04-02 13:41:36 +0000 UTC" firstStartedPulling="2026-04-02 13:41:38.57066721 +0000 UTC m=+255.475074763" lastFinishedPulling="2026-04-02 13:43:06.716133061 +0000 UTC m=+343.620540614" observedRunningTime="2026-04-02 13:43:08.074089695 +0000 UTC m=+344.978497258" watchObservedRunningTime="2026-04-02 13:43:08.096895703 +0000 UTC m=+345.001303256"
Apr 02 13:43:08 crc kubenswrapper[4732]: I0402 13:43:08.118730 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8ps5w" podStartSLOduration=3.302336538 podStartE2EDuration="1m31.118709305s" podCreationTimestamp="2026-04-02 13:41:37 +0000 UTC" firstStartedPulling="2026-04-02 13:41:39.612148112 +0000 UTC m=+256.516555665" lastFinishedPulling="2026-04-02 13:43:07.428520849 +0000 UTC m=+344.332928432" observedRunningTime="2026-04-02 13:43:08.117882713 +0000 UTC m=+345.022290266" watchObservedRunningTime="2026-04-02 13:43:08.118709305 +0000 UTC m=+345.023116858"
Apr 02 13:43:08 crc kubenswrapper[4732]: I0402 13:43:08.120177 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lvckz" podStartSLOduration=3.462055676 podStartE2EDuration="1m29.120169234s" podCreationTimestamp="2026-04-02 13:41:39 +0000 UTC" firstStartedPulling="2026-04-02 13:41:41.872295048 +0000 UTC m=+258.776702601" lastFinishedPulling="2026-04-02 13:43:07.530408606 +0000 UTC m=+344.434816159" observedRunningTime="2026-04-02 13:43:08.097801487 +0000 UTC m=+345.002209050" watchObservedRunningTime="2026-04-02 13:43:08.120169234 +0000 UTC m=+345.024576787"
Apr 02 13:43:08 crc kubenswrapper[4732]: I0402 13:43:08.137741 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fd4xj" podStartSLOduration=3.236517258 podStartE2EDuration="1m31.137723432s" podCreationTimestamp="2026-04-02 13:41:37 +0000 UTC" firstStartedPulling="2026-04-02 13:41:39.6343825 +0000 UTC m=+256.538790053" lastFinishedPulling="2026-04-02 13:43:07.535588674 +0000 UTC m=+344.439996227" observedRunningTime="2026-04-02 13:43:08.137711931 +0000 UTC m=+345.042119504" watchObservedRunningTime="2026-04-02 13:43:08.137723432 +0000 UTC m=+345.042130975"
Apr 02 13:43:08 crc kubenswrapper[4732]: I0402 13:43:08.195519 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vgjz4" podStartSLOduration=3.254666828 podStartE2EDuration="1m28.195503233s" podCreationTimestamp="2026-04-02 13:41:40 +0000 UTC" firstStartedPulling="2026-04-02 13:41:41.851643612 +0000 UTC m=+258.756051165" lastFinishedPulling="2026-04-02 13:43:06.792480017 +0000 UTC m=+343.696887570" observedRunningTime="2026-04-02 13:43:08.194968298 +0000 UTC m=+345.099375851" watchObservedRunningTime="2026-04-02 13:43:08.195503233 +0000 UTC m=+345.099910786"
Apr 02 13:43:08 crc kubenswrapper[4732]: I0402 13:43:08.196927 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-z7v99" podStartSLOduration=2.199596587 podStartE2EDuration="1m29.19692117s" podCreationTimestamp="2026-04-02 13:41:39 +0000 UTC" firstStartedPulling="2026-04-02 13:41:40.66982044 +0000 UTC m=+257.574227993" lastFinishedPulling="2026-04-02 13:43:07.667145023 +0000 UTC m=+344.571552576" observedRunningTime="2026-04-02 13:43:08.179975748 +0000 UTC m=+345.084383331" watchObservedRunningTime="2026-04-02 13:43:08.19692117 +0000 UTC m=+345.101328723"
Apr 02 13:43:08 crc kubenswrapper[4732]: I0402 13:43:08.217478 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9lrnf" podStartSLOduration=3.414793299 podStartE2EDuration="1m30.217462318s" podCreationTimestamp="2026-04-02 13:41:38 +0000 UTC" firstStartedPulling="2026-04-02 13:41:40.664035938 +0000 UTC m=+257.568443491" lastFinishedPulling="2026-04-02 13:43:07.466704957 +0000 UTC m=+344.371112510" observedRunningTime="2026-04-02 13:43:08.216655157 +0000 UTC m=+345.121062710" watchObservedRunningTime="2026-04-02 13:43:08.217462318 +0000 UTC m=+345.121869861"
Apr 02 13:43:08 crc kubenswrapper[4732]: I0402 13:43:08.606526 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5bc5979f8c-gz2dg"
Apr 02 13:43:08 crc kubenswrapper[4732]: I0402 13:43:08.652452 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5799f7746f-qdhcp"]
Apr 02 13:43:08 crc kubenswrapper[4732]: E0402 13:43:08.652845 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b48d9e0-5296-4cdf-bd91-205a09bcdaed" containerName="controller-manager"
Apr 02 13:43:08 crc kubenswrapper[4732]: I0402 13:43:08.652863 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b48d9e0-5296-4cdf-bd91-205a09bcdaed" containerName="controller-manager"
Apr 02 13:43:08 crc kubenswrapper[4732]: I0402 13:43:08.652978 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b48d9e0-5296-4cdf-bd91-205a09bcdaed" containerName="controller-manager"
Apr 02 13:43:08 crc kubenswrapper[4732]: I0402 13:43:08.656022 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5799f7746f-qdhcp"
Apr 02 13:43:08 crc kubenswrapper[4732]: I0402 13:43:08.673923 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5799f7746f-qdhcp"]
Apr 02 13:43:08 crc kubenswrapper[4732]: I0402 13:43:08.682222 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d9f4656d6-phcgl"
Apr 02 13:43:08 crc kubenswrapper[4732]: I0402 13:43:08.800836 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b48d9e0-5296-4cdf-bd91-205a09bcdaed-serving-cert\") pod \"2b48d9e0-5296-4cdf-bd91-205a09bcdaed\" (UID: \"2b48d9e0-5296-4cdf-bd91-205a09bcdaed\") "
Apr 02 13:43:08 crc kubenswrapper[4732]: I0402 13:43:08.802095 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2b48d9e0-5296-4cdf-bd91-205a09bcdaed-client-ca\") pod \"2b48d9e0-5296-4cdf-bd91-205a09bcdaed\" (UID: \"2b48d9e0-5296-4cdf-bd91-205a09bcdaed\") "
Apr 02 13:43:08 crc kubenswrapper[4732]: I0402 13:43:08.802514 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b48d9e0-5296-4cdf-bd91-205a09bcdaed-client-ca" (OuterVolumeSpecName: "client-ca") pod "2b48d9e0-5296-4cdf-bd91-205a09bcdaed" (UID: "2b48d9e0-5296-4cdf-bd91-205a09bcdaed"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 13:43:08 crc kubenswrapper[4732]: I0402 13:43:08.802574 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ccddd9d4-7db7-4425-9920-3dac8a779304-client-ca\") pod \"ccddd9d4-7db7-4425-9920-3dac8a779304\" (UID: \"ccddd9d4-7db7-4425-9920-3dac8a779304\") "
Apr 02 13:43:08 crc kubenswrapper[4732]: I0402 13:43:08.802601 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccddd9d4-7db7-4425-9920-3dac8a779304-config\") pod \"ccddd9d4-7db7-4425-9920-3dac8a779304\" (UID: \"ccddd9d4-7db7-4425-9920-3dac8a779304\") "
Apr 02 13:43:08 crc kubenswrapper[4732]: I0402 13:43:08.803338 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ccddd9d4-7db7-4425-9920-3dac8a779304-serving-cert\") pod \"ccddd9d4-7db7-4425-9920-3dac8a779304\" (UID: \"ccddd9d4-7db7-4425-9920-3dac8a779304\") "
Apr 02 13:43:08 crc kubenswrapper[4732]: I0402 13:43:08.803368 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2b48d9e0-5296-4cdf-bd91-205a09bcdaed-proxy-ca-bundles\") pod \"2b48d9e0-5296-4cdf-bd91-205a09bcdaed\" (UID: \"2b48d9e0-5296-4cdf-bd91-205a09bcdaed\") "
Apr 02 13:43:08 crc kubenswrapper[4732]: I0402 13:43:08.803393 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4dwl\" (UniqueName: \"kubernetes.io/projected/ccddd9d4-7db7-4425-9920-3dac8a779304-kube-api-access-x4dwl\") pod \"ccddd9d4-7db7-4425-9920-3dac8a779304\" (UID: \"ccddd9d4-7db7-4425-9920-3dac8a779304\") "
Apr 02 13:43:08 crc kubenswrapper[4732]: I0402 13:43:08.803419 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k52hl\" (UniqueName: \"kubernetes.io/projected/2b48d9e0-5296-4cdf-bd91-205a09bcdaed-kube-api-access-k52hl\") pod \"2b48d9e0-5296-4cdf-bd91-205a09bcdaed\" (UID: \"2b48d9e0-5296-4cdf-bd91-205a09bcdaed\") "
Apr 02 13:43:08 crc kubenswrapper[4732]: I0402 13:43:08.803461 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b48d9e0-5296-4cdf-bd91-205a09bcdaed-config\") pod \"2b48d9e0-5296-4cdf-bd91-205a09bcdaed\" (UID: \"2b48d9e0-5296-4cdf-bd91-205a09bcdaed\") "
Apr 02 13:43:08 crc kubenswrapper[4732]: I0402 13:43:08.803627 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/84e51ffd-8b0d-4ea0-a082-652ed5bf0b9e-client-ca\") pod \"controller-manager-5799f7746f-qdhcp\" (UID: \"84e51ffd-8b0d-4ea0-a082-652ed5bf0b9e\") " pod="openshift-controller-manager/controller-manager-5799f7746f-qdhcp"
Apr 02 13:43:08 crc kubenswrapper[4732]: I0402 13:43:08.802941 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ccddd9d4-7db7-4425-9920-3dac8a779304-client-ca" (OuterVolumeSpecName: "client-ca") pod "ccddd9d4-7db7-4425-9920-3dac8a779304" (UID: "ccddd9d4-7db7-4425-9920-3dac8a779304"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 13:43:08 crc kubenswrapper[4732]: I0402 13:43:08.803232 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ccddd9d4-7db7-4425-9920-3dac8a779304-config" (OuterVolumeSpecName: "config") pod "ccddd9d4-7db7-4425-9920-3dac8a779304" (UID: "ccddd9d4-7db7-4425-9920-3dac8a779304"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 13:43:08 crc kubenswrapper[4732]: I0402 13:43:08.804177 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b48d9e0-5296-4cdf-bd91-205a09bcdaed-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "2b48d9e0-5296-4cdf-bd91-205a09bcdaed" (UID: "2b48d9e0-5296-4cdf-bd91-205a09bcdaed"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 13:43:08 crc kubenswrapper[4732]: I0402 13:43:08.804788 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b48d9e0-5296-4cdf-bd91-205a09bcdaed-config" (OuterVolumeSpecName: "config") pod "2b48d9e0-5296-4cdf-bd91-205a09bcdaed" (UID: "2b48d9e0-5296-4cdf-bd91-205a09bcdaed"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 13:43:08 crc kubenswrapper[4732]: I0402 13:43:08.804861 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84e51ffd-8b0d-4ea0-a082-652ed5bf0b9e-config\") pod \"controller-manager-5799f7746f-qdhcp\" (UID: \"84e51ffd-8b0d-4ea0-a082-652ed5bf0b9e\") " pod="openshift-controller-manager/controller-manager-5799f7746f-qdhcp"
Apr 02 13:43:08 crc kubenswrapper[4732]: I0402 13:43:08.804884 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/84e51ffd-8b0d-4ea0-a082-652ed5bf0b9e-proxy-ca-bundles\") pod \"controller-manager-5799f7746f-qdhcp\" (UID: \"84e51ffd-8b0d-4ea0-a082-652ed5bf0b9e\") " pod="openshift-controller-manager/controller-manager-5799f7746f-qdhcp"
Apr 02 13:43:08 crc kubenswrapper[4732]: I0402 13:43:08.804930 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84e51ffd-8b0d-4ea0-a082-652ed5bf0b9e-serving-cert\") pod \"controller-manager-5799f7746f-qdhcp\" (UID: \"84e51ffd-8b0d-4ea0-a082-652ed5bf0b9e\") " pod="openshift-controller-manager/controller-manager-5799f7746f-qdhcp"
Apr 02 13:43:08 crc kubenswrapper[4732]: I0402 13:43:08.804948 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72s8h\" (UniqueName: \"kubernetes.io/projected/84e51ffd-8b0d-4ea0-a082-652ed5bf0b9e-kube-api-access-72s8h\") pod \"controller-manager-5799f7746f-qdhcp\" (UID: \"84e51ffd-8b0d-4ea0-a082-652ed5bf0b9e\") " pod="openshift-controller-manager/controller-manager-5799f7746f-qdhcp"
Apr 02 13:43:08 crc kubenswrapper[4732]: I0402 13:43:08.805073 4732 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2b48d9e0-5296-4cdf-bd91-205a09bcdaed-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Apr 02 13:43:08 crc kubenswrapper[4732]: I0402 13:43:08.805092 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b48d9e0-5296-4cdf-bd91-205a09bcdaed-config\") on node \"crc\" DevicePath \"\""
Apr 02 13:43:08 crc kubenswrapper[4732]: I0402 13:43:08.805104 4732 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2b48d9e0-5296-4cdf-bd91-205a09bcdaed-client-ca\") on node \"crc\" DevicePath \"\""
Apr 02 13:43:08 crc kubenswrapper[4732]: I0402 13:43:08.805115 4732 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ccddd9d4-7db7-4425-9920-3dac8a779304-client-ca\") on node \"crc\" DevicePath \"\""
Apr 02 13:43:08 crc kubenswrapper[4732]: I0402 13:43:08.805127 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccddd9d4-7db7-4425-9920-3dac8a779304-config\") on node \"crc\" DevicePath \"\""
Apr 02 13:43:08 crc kubenswrapper[4732]: I0402 13:43:08.806226 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b48d9e0-5296-4cdf-bd91-205a09bcdaed-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2b48d9e0-5296-4cdf-bd91-205a09bcdaed" (UID: "2b48d9e0-5296-4cdf-bd91-205a09bcdaed"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 13:43:08 crc kubenswrapper[4732]: I0402 13:43:08.806576 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccddd9d4-7db7-4425-9920-3dac8a779304-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ccddd9d4-7db7-4425-9920-3dac8a779304" (UID: "ccddd9d4-7db7-4425-9920-3dac8a779304"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 13:43:08 crc kubenswrapper[4732]: I0402 13:43:08.806602 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b48d9e0-5296-4cdf-bd91-205a09bcdaed-kube-api-access-k52hl" (OuterVolumeSpecName: "kube-api-access-k52hl") pod "2b48d9e0-5296-4cdf-bd91-205a09bcdaed" (UID: "2b48d9e0-5296-4cdf-bd91-205a09bcdaed"). InnerVolumeSpecName "kube-api-access-k52hl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 13:43:08 crc kubenswrapper[4732]: I0402 13:43:08.807907 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccddd9d4-7db7-4425-9920-3dac8a779304-kube-api-access-x4dwl" (OuterVolumeSpecName: "kube-api-access-x4dwl") pod "ccddd9d4-7db7-4425-9920-3dac8a779304" (UID: "ccddd9d4-7db7-4425-9920-3dac8a779304"). InnerVolumeSpecName "kube-api-access-x4dwl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 13:43:08 crc kubenswrapper[4732]: I0402 13:43:08.906408 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84e51ffd-8b0d-4ea0-a082-652ed5bf0b9e-serving-cert\") pod \"controller-manager-5799f7746f-qdhcp\" (UID: \"84e51ffd-8b0d-4ea0-a082-652ed5bf0b9e\") " pod="openshift-controller-manager/controller-manager-5799f7746f-qdhcp"
Apr 02 13:43:08 crc kubenswrapper[4732]: I0402 13:43:08.906485 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72s8h\" (UniqueName: \"kubernetes.io/projected/84e51ffd-8b0d-4ea0-a082-652ed5bf0b9e-kube-api-access-72s8h\") pod \"controller-manager-5799f7746f-qdhcp\" (UID: \"84e51ffd-8b0d-4ea0-a082-652ed5bf0b9e\") " pod="openshift-controller-manager/controller-manager-5799f7746f-qdhcp"
Apr 02 13:43:08 crc kubenswrapper[4732]: I0402 13:43:08.906586 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/84e51ffd-8b0d-4ea0-a082-652ed5bf0b9e-client-ca\") pod \"controller-manager-5799f7746f-qdhcp\" (UID: \"84e51ffd-8b0d-4ea0-a082-652ed5bf0b9e\") " pod="openshift-controller-manager/controller-manager-5799f7746f-qdhcp"
Apr 02 13:43:08 crc kubenswrapper[4732]: I0402 13:43:08.906654 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84e51ffd-8b0d-4ea0-a082-652ed5bf0b9e-config\") pod \"controller-manager-5799f7746f-qdhcp\" (UID: \"84e51ffd-8b0d-4ea0-a082-652ed5bf0b9e\") " pod="openshift-controller-manager/controller-manager-5799f7746f-qdhcp"
Apr 02 13:43:08 crc kubenswrapper[4732]: I0402 13:43:08.906682 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/84e51ffd-8b0d-4ea0-a082-652ed5bf0b9e-proxy-ca-bundles\") pod \"controller-manager-5799f7746f-qdhcp\" (UID: \"84e51ffd-8b0d-4ea0-a082-652ed5bf0b9e\") " pod="openshift-controller-manager/controller-manager-5799f7746f-qdhcp"
Apr 02 13:43:08 crc kubenswrapper[4732]: I0402 13:43:08.906731 4732 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b48d9e0-5296-4cdf-bd91-205a09bcdaed-serving-cert\") on node \"crc\" DevicePath \"\""
Apr 02 13:43:08 crc kubenswrapper[4732]: I0402 13:43:08.906745 4732 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ccddd9d4-7db7-4425-9920-3dac8a779304-serving-cert\") on node \"crc\" DevicePath \"\""
Apr 02 13:43:08 crc kubenswrapper[4732]: I0402 13:43:08.906825 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4dwl\" (UniqueName: \"kubernetes.io/projected/ccddd9d4-7db7-4425-9920-3dac8a779304-kube-api-access-x4dwl\") on node \"crc\" DevicePath \"\""
Apr 02 13:43:08 crc kubenswrapper[4732]: I0402 13:43:08.906846 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k52hl\" (UniqueName: \"kubernetes.io/projected/2b48d9e0-5296-4cdf-bd91-205a09bcdaed-kube-api-access-k52hl\") on node \"crc\" DevicePath \"\""
Apr 02 13:43:08 crc kubenswrapper[4732]: I0402 13:43:08.908099 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/84e51ffd-8b0d-4ea0-a082-652ed5bf0b9e-proxy-ca-bundles\") pod \"controller-manager-5799f7746f-qdhcp\" (UID: \"84e51ffd-8b0d-4ea0-a082-652ed5bf0b9e\") " pod="openshift-controller-manager/controller-manager-5799f7746f-qdhcp"
Apr 02 13:43:08 crc kubenswrapper[4732]: I0402 13:43:08.909034 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/84e51ffd-8b0d-4ea0-a082-652ed5bf0b9e-client-ca\") pod \"controller-manager-5799f7746f-qdhcp\" (UID: \"84e51ffd-8b0d-4ea0-a082-652ed5bf0b9e\") " pod="openshift-controller-manager/controller-manager-5799f7746f-qdhcp"
Apr 02 13:43:08 crc kubenswrapper[4732]: I0402 13:43:08.909994 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84e51ffd-8b0d-4ea0-a082-652ed5bf0b9e-config\") pod \"controller-manager-5799f7746f-qdhcp\" (UID: \"84e51ffd-8b0d-4ea0-a082-652ed5bf0b9e\") " pod="openshift-controller-manager/controller-manager-5799f7746f-qdhcp"
Apr 02 13:43:08 crc kubenswrapper[4732]: I0402 13:43:08.911129 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84e51ffd-8b0d-4ea0-a082-652ed5bf0b9e-serving-cert\") pod \"controller-manager-5799f7746f-qdhcp\" (UID: \"84e51ffd-8b0d-4ea0-a082-652ed5bf0b9e\") " pod="openshift-controller-manager/controller-manager-5799f7746f-qdhcp"
Apr 02 13:43:08 crc kubenswrapper[4732]: I0402 13:43:08.943823 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72s8h\" (UniqueName: \"kubernetes.io/projected/84e51ffd-8b0d-4ea0-a082-652ed5bf0b9e-kube-api-access-72s8h\") pod \"controller-manager-5799f7746f-qdhcp\" (UID: \"84e51ffd-8b0d-4ea0-a082-652ed5bf0b9e\") " pod="openshift-controller-manager/controller-manager-5799f7746f-qdhcp"
Apr 02 13:43:08 crc kubenswrapper[4732]: I0402 13:43:08.988600 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5799f7746f-qdhcp"
Apr 02 13:43:09 crc kubenswrapper[4732]: I0402 13:43:09.055906 4732 generic.go:334] "Generic (PLEG): container finished" podID="ccddd9d4-7db7-4425-9920-3dac8a779304" containerID="aacadcb720b27cc871fb117adcce026e03ce7431a867e7ac8f770df7c58cf443" exitCode=0
Apr 02 13:43:09 crc kubenswrapper[4732]: I0402 13:43:09.055950 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d9f4656d6-phcgl"
Apr 02 13:43:09 crc kubenswrapper[4732]: I0402 13:43:09.055986 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d9f4656d6-phcgl" event={"ID":"ccddd9d4-7db7-4425-9920-3dac8a779304","Type":"ContainerDied","Data":"aacadcb720b27cc871fb117adcce026e03ce7431a867e7ac8f770df7c58cf443"}
Apr 02 13:43:09 crc kubenswrapper[4732]: I0402 13:43:09.056017 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d9f4656d6-phcgl" event={"ID":"ccddd9d4-7db7-4425-9920-3dac8a779304","Type":"ContainerDied","Data":"ad6734aadeaefc316d1d7a7761b542046d2796c351039437126f2ab4873239f2"}
Apr 02 13:43:09 crc kubenswrapper[4732]: I0402 13:43:09.056034 4732 scope.go:117] "RemoveContainer" containerID="aacadcb720b27cc871fb117adcce026e03ce7431a867e7ac8f770df7c58cf443"
Apr 02 13:43:09 crc kubenswrapper[4732]: I0402 13:43:09.058559 4732 generic.go:334] "Generic (PLEG): container finished" podID="2b48d9e0-5296-4cdf-bd91-205a09bcdaed" containerID="801f2e29d81a19d42d2a8c35affaa9eb310feee819d3b2b60189b99fbf025e34" exitCode=0
Apr 02 13:43:09 crc kubenswrapper[4732]: I0402 13:43:09.058584 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5bc5979f8c-gz2dg" event={"ID":"2b48d9e0-5296-4cdf-bd91-205a09bcdaed","Type":"ContainerDied","Data":"801f2e29d81a19d42d2a8c35affaa9eb310feee819d3b2b60189b99fbf025e34"}
Apr 02 13:43:09 crc kubenswrapper[4732]: I0402 13:43:09.058600 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5bc5979f8c-gz2dg" event={"ID":"2b48d9e0-5296-4cdf-bd91-205a09bcdaed","Type":"ContainerDied","Data":"331fb3e655bc7401579b0d24c4e2d7ef04aba6df462175e00f3e0ac52b94ba04"}
Apr 02 13:43:09 crc kubenswrapper[4732]: I0402 13:43:09.058642 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5bc5979f8c-gz2dg"
Apr 02 13:43:09 crc kubenswrapper[4732]: I0402 13:43:09.077515 4732 scope.go:117] "RemoveContainer" containerID="aacadcb720b27cc871fb117adcce026e03ce7431a867e7ac8f770df7c58cf443"
Apr 02 13:43:09 crc kubenswrapper[4732]: E0402 13:43:09.078199 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aacadcb720b27cc871fb117adcce026e03ce7431a867e7ac8f770df7c58cf443\": container with ID starting with aacadcb720b27cc871fb117adcce026e03ce7431a867e7ac8f770df7c58cf443 not found: ID does not exist" containerID="aacadcb720b27cc871fb117adcce026e03ce7431a867e7ac8f770df7c58cf443"
Apr 02 13:43:09 crc kubenswrapper[4732]: I0402 13:43:09.078228 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aacadcb720b27cc871fb117adcce026e03ce7431a867e7ac8f770df7c58cf443"} err="failed to get container status \"aacadcb720b27cc871fb117adcce026e03ce7431a867e7ac8f770df7c58cf443\": rpc error: code = NotFound desc = could not find container \"aacadcb720b27cc871fb117adcce026e03ce7431a867e7ac8f770df7c58cf443\": container with ID starting with aacadcb720b27cc871fb117adcce026e03ce7431a867e7ac8f770df7c58cf443 not found: ID does not exist"
Apr 02 13:43:09 crc kubenswrapper[4732]: I0402 13:43:09.078246 4732 scope.go:117] "RemoveContainer" containerID="801f2e29d81a19d42d2a8c35affaa9eb310feee819d3b2b60189b99fbf025e34"
Apr 02 13:43:09 crc kubenswrapper[4732]: I0402 13:43:09.091278 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d9f4656d6-phcgl"]
Apr 02 13:43:09 crc kubenswrapper[4732]: I0402 13:43:09.094460 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d9f4656d6-phcgl"]
Apr 02 13:43:09 crc kubenswrapper[4732]: I0402 13:43:09.105416 4732 scope.go:117] "RemoveContainer" containerID="801f2e29d81a19d42d2a8c35affaa9eb310feee819d3b2b60189b99fbf025e34"
Apr 02 13:43:09 crc kubenswrapper[4732]: E0402 13:43:09.105828 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"801f2e29d81a19d42d2a8c35affaa9eb310feee819d3b2b60189b99fbf025e34\": container with ID starting with 801f2e29d81a19d42d2a8c35affaa9eb310feee819d3b2b60189b99fbf025e34 not found: ID does not exist" containerID="801f2e29d81a19d42d2a8c35affaa9eb310feee819d3b2b60189b99fbf025e34"
Apr 02 13:43:09 crc kubenswrapper[4732]: I0402 13:43:09.105855 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"801f2e29d81a19d42d2a8c35affaa9eb310feee819d3b2b60189b99fbf025e34"} err="failed to get container status \"801f2e29d81a19d42d2a8c35affaa9eb310feee819d3b2b60189b99fbf025e34\": rpc error: code = NotFound desc = could not find container \"801f2e29d81a19d42d2a8c35affaa9eb310feee819d3b2b60189b99fbf025e34\": container with ID starting with 801f2e29d81a19d42d2a8c35affaa9eb310feee819d3b2b60189b99fbf025e34 not found: ID does not exist"
Apr 02 13:43:09 crc kubenswrapper[4732]: I0402 13:43:09.113595 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5bc5979f8c-gz2dg"]
Apr 02 13:43:09 crc kubenswrapper[4732]: I0402 13:43:09.117594 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5bc5979f8c-gz2dg"]
Apr 02 13:43:09 crc kubenswrapper[4732]: I0402 13:43:09.357363 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9lrnf"
Apr 02 13:43:09 crc kubenswrapper[4732]: I0402 13:43:09.358551 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9lrnf"
Apr 02 13:43:09 crc kubenswrapper[4732]: I0402 13:43:09.541799 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5799f7746f-qdhcp"]
Apr 02 13:43:09 crc kubenswrapper[4732]: I0402 13:43:09.556538 4732 patch_prober.go:28] interesting pod/controller-manager-5bc5979f8c-gz2dg container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.74:8443/healthz\": dial tcp 10.217.0.74:8443: i/o timeout" start-of-body=
Apr 02 13:43:09 crc kubenswrapper[4732]: I0402 13:43:09.556581 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5bc5979f8c-gz2dg" podUID="2b48d9e0-5296-4cdf-bd91-205a09bcdaed" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.74:8443/healthz\": dial tcp 10.217.0.74:8443: i/o timeout"
Apr 02 13:43:09 crc kubenswrapper[4732]: I0402 13:43:09.618288 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-z7v99"
Apr 02 13:43:09 crc kubenswrapper[4732]: I0402 13:43:09.618332 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-z7v99"
Apr 02 13:43:10 crc kubenswrapper[4732]: I0402 13:43:10.066272 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5799f7746f-qdhcp" event={"ID":"84e51ffd-8b0d-4ea0-a082-652ed5bf0b9e","Type":"ContainerStarted","Data":"3c9043dc8a7177d9f6bd4216cce4147d5c14e71a7ac18348c390e9e4cbd43fa2"}
Apr 02 13:43:10 crc kubenswrapper[4732]: I0402 13:43:10.066650 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5799f7746f-qdhcp" event={"ID":"84e51ffd-8b0d-4ea0-a082-652ed5bf0b9e","Type":"ContainerStarted","Data":"adf71a7a071aa4a3165c75a2bc38dfc70be1a05567908a5f5fa9431f0520ba0a"}
Apr 02 13:43:10 crc kubenswrapper[4732]: I0402 13:43:10.068285 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5799f7746f-qdhcp"
Apr 02 13:43:10 crc kubenswrapper[4732]: I0402 13:43:10.072649 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5799f7746f-qdhcp"
Apr 02 13:43:10 crc kubenswrapper[4732]: I0402 13:43:10.091901 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5799f7746f-qdhcp" podStartSLOduration=4.091882714 podStartE2EDuration="4.091882714s" podCreationTimestamp="2026-04-02 13:43:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:43:10.090836466 +0000 UTC m=+346.995244039" watchObservedRunningTime="2026-04-02 13:43:10.091882714 +0000 UTC m=+346.996290267"
Apr 02 13:43:10 crc kubenswrapper[4732]: I0402 13:43:10.239336 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lvckz"
Apr 02 13:43:10 crc kubenswrapper[4732]: I0402 13:43:10.239870 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lvckz"
Apr 02 13:43:10 crc kubenswrapper[4732]: I0402 13:43:10.410974 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-9lrnf" podUID="51a0e365-014c-40e8-8749-7512f2c00758" containerName="registry-server" probeResult="failure" output=<
Apr 02 13:43:10 crc kubenswrapper[4732]: timeout: failed to connect service ":50051" within 1s
Apr 02 13:43:10 crc kubenswrapper[4732]: >
Apr 02 13:43:10 crc kubenswrapper[4732]: I0402 13:43:10.671762 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-z7v99" podUID="72981c60-e9a1-4e25-9b64-7493d6fdaab6" containerName="registry-server" probeResult="failure" output=<
Apr 02 13:43:10 crc kubenswrapper[4732]: timeout: failed to connect service ":50051" within 1s
Apr 02 13:43:10 crc kubenswrapper[4732]: >
Apr 02 13:43:10 crc kubenswrapper[4732]: I0402 13:43:10.689088 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b48d9e0-5296-4cdf-bd91-205a09bcdaed" path="/var/lib/kubelet/pods/2b48d9e0-5296-4cdf-bd91-205a09bcdaed/volumes"
Apr 02 13:43:10 crc kubenswrapper[4732]: I0402 13:43:10.689765 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccddd9d4-7db7-4425-9920-3dac8a779304" path="/var/lib/kubelet/pods/ccddd9d4-7db7-4425-9920-3dac8a779304/volumes"
Apr 02 13:43:10 crc kubenswrapper[4732]: I0402 13:43:10.726176 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vgjz4"
Apr 02 13:43:10 crc kubenswrapper[4732]: I0402 13:43:10.726246 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vgjz4"
Apr 02 13:43:11 crc kubenswrapper[4732]: I0402 13:43:11.290470 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lvckz" podUID="33708fee-32a5-4418-81d0-226813150db7" containerName="registry-server" probeResult="failure" output=<
Apr 02 13:43:11 crc kubenswrapper[4732]: timeout: failed to connect service ":50051" within 1s
Apr 02 13:43:11 crc kubenswrapper[4732]: >
Apr 02 13:43:11 crc kubenswrapper[4732]: I0402 13:43:11.575731 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f679f984c-vlkgl"]
Apr 02 13:43:11 crc kubenswrapper[4732]: E0402 13:43:11.575937 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccddd9d4-7db7-4425-9920-3dac8a779304" containerName="route-controller-manager"
Apr 02 13:43:11 crc kubenswrapper[4732]: I0402 13:43:11.575949 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccddd9d4-7db7-4425-9920-3dac8a779304" containerName="route-controller-manager"
Apr 02 13:43:11 crc kubenswrapper[4732]: I0402 13:43:11.576054 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccddd9d4-7db7-4425-9920-3dac8a779304" containerName="route-controller-manager"
Apr 02 13:43:11 crc kubenswrapper[4732]: I0402 13:43:11.576388 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f679f984c-vlkgl"
Apr 02 13:43:11 crc kubenswrapper[4732]: I0402 13:43:11.579936 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Apr 02 13:43:11 crc kubenswrapper[4732]: I0402 13:43:11.580217 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Apr 02 13:43:11 crc kubenswrapper[4732]: I0402 13:43:11.580534 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Apr 02 13:43:11 crc kubenswrapper[4732]: I0402 13:43:11.581100 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Apr 02 13:43:11 crc kubenswrapper[4732]: I0402 13:43:11.581929 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Apr 02 13:43:11 crc kubenswrapper[4732]: I0402 13:43:11.583050 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Apr 02 13:43:11 crc kubenswrapper[4732]: I0402 13:43:11.591386 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f679f984c-vlkgl"]
Apr 02 13:43:11 crc kubenswrapper[4732]: I0402 13:43:11.747802 4732 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1cbf30e1-2945-47b8-8f6d-e8d1e46d87c0-serving-cert\") pod \"route-controller-manager-f679f984c-vlkgl\" (UID: \"1cbf30e1-2945-47b8-8f6d-e8d1e46d87c0\") " pod="openshift-route-controller-manager/route-controller-manager-f679f984c-vlkgl" Apr 02 13:43:11 crc kubenswrapper[4732]: I0402 13:43:11.748494 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgkj9\" (UniqueName: \"kubernetes.io/projected/1cbf30e1-2945-47b8-8f6d-e8d1e46d87c0-kube-api-access-rgkj9\") pod \"route-controller-manager-f679f984c-vlkgl\" (UID: \"1cbf30e1-2945-47b8-8f6d-e8d1e46d87c0\") " pod="openshift-route-controller-manager/route-controller-manager-f679f984c-vlkgl" Apr 02 13:43:11 crc kubenswrapper[4732]: I0402 13:43:11.748892 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1cbf30e1-2945-47b8-8f6d-e8d1e46d87c0-client-ca\") pod \"route-controller-manager-f679f984c-vlkgl\" (UID: \"1cbf30e1-2945-47b8-8f6d-e8d1e46d87c0\") " pod="openshift-route-controller-manager/route-controller-manager-f679f984c-vlkgl" Apr 02 13:43:11 crc kubenswrapper[4732]: I0402 13:43:11.749056 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cbf30e1-2945-47b8-8f6d-e8d1e46d87c0-config\") pod \"route-controller-manager-f679f984c-vlkgl\" (UID: \"1cbf30e1-2945-47b8-8f6d-e8d1e46d87c0\") " pod="openshift-route-controller-manager/route-controller-manager-f679f984c-vlkgl" Apr 02 13:43:11 crc kubenswrapper[4732]: I0402 13:43:11.763668 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vgjz4" podUID="50d43b2d-24ec-439f-a418-3673791eb1b1" containerName="registry-server" probeResult="failure" output=< Apr 02 
13:43:11 crc kubenswrapper[4732]: timeout: failed to connect service ":50051" within 1s Apr 02 13:43:11 crc kubenswrapper[4732]: > Apr 02 13:43:11 crc kubenswrapper[4732]: I0402 13:43:11.851967 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgkj9\" (UniqueName: \"kubernetes.io/projected/1cbf30e1-2945-47b8-8f6d-e8d1e46d87c0-kube-api-access-rgkj9\") pod \"route-controller-manager-f679f984c-vlkgl\" (UID: \"1cbf30e1-2945-47b8-8f6d-e8d1e46d87c0\") " pod="openshift-route-controller-manager/route-controller-manager-f679f984c-vlkgl" Apr 02 13:43:11 crc kubenswrapper[4732]: I0402 13:43:11.852285 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1cbf30e1-2945-47b8-8f6d-e8d1e46d87c0-client-ca\") pod \"route-controller-manager-f679f984c-vlkgl\" (UID: \"1cbf30e1-2945-47b8-8f6d-e8d1e46d87c0\") " pod="openshift-route-controller-manager/route-controller-manager-f679f984c-vlkgl" Apr 02 13:43:11 crc kubenswrapper[4732]: I0402 13:43:11.852308 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cbf30e1-2945-47b8-8f6d-e8d1e46d87c0-config\") pod \"route-controller-manager-f679f984c-vlkgl\" (UID: \"1cbf30e1-2945-47b8-8f6d-e8d1e46d87c0\") " pod="openshift-route-controller-manager/route-controller-manager-f679f984c-vlkgl" Apr 02 13:43:11 crc kubenswrapper[4732]: I0402 13:43:11.853049 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1cbf30e1-2945-47b8-8f6d-e8d1e46d87c0-client-ca\") pod \"route-controller-manager-f679f984c-vlkgl\" (UID: \"1cbf30e1-2945-47b8-8f6d-e8d1e46d87c0\") " pod="openshift-route-controller-manager/route-controller-manager-f679f984c-vlkgl" Apr 02 13:43:11 crc kubenswrapper[4732]: I0402 13:43:11.853086 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/1cbf30e1-2945-47b8-8f6d-e8d1e46d87c0-serving-cert\") pod \"route-controller-manager-f679f984c-vlkgl\" (UID: \"1cbf30e1-2945-47b8-8f6d-e8d1e46d87c0\") " pod="openshift-route-controller-manager/route-controller-manager-f679f984c-vlkgl" Apr 02 13:43:11 crc kubenswrapper[4732]: I0402 13:43:11.853366 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cbf30e1-2945-47b8-8f6d-e8d1e46d87c0-config\") pod \"route-controller-manager-f679f984c-vlkgl\" (UID: \"1cbf30e1-2945-47b8-8f6d-e8d1e46d87c0\") " pod="openshift-route-controller-manager/route-controller-manager-f679f984c-vlkgl" Apr 02 13:43:11 crc kubenswrapper[4732]: I0402 13:43:11.861645 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1cbf30e1-2945-47b8-8f6d-e8d1e46d87c0-serving-cert\") pod \"route-controller-manager-f679f984c-vlkgl\" (UID: \"1cbf30e1-2945-47b8-8f6d-e8d1e46d87c0\") " pod="openshift-route-controller-manager/route-controller-manager-f679f984c-vlkgl" Apr 02 13:43:11 crc kubenswrapper[4732]: I0402 13:43:11.870967 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgkj9\" (UniqueName: \"kubernetes.io/projected/1cbf30e1-2945-47b8-8f6d-e8d1e46d87c0-kube-api-access-rgkj9\") pod \"route-controller-manager-f679f984c-vlkgl\" (UID: \"1cbf30e1-2945-47b8-8f6d-e8d1e46d87c0\") " pod="openshift-route-controller-manager/route-controller-manager-f679f984c-vlkgl" Apr 02 13:43:11 crc kubenswrapper[4732]: I0402 13:43:11.927596 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f679f984c-vlkgl" Apr 02 13:43:12 crc kubenswrapper[4732]: I0402 13:43:12.328236 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f679f984c-vlkgl"] Apr 02 13:43:13 crc kubenswrapper[4732]: I0402 13:43:13.086754 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f679f984c-vlkgl" event={"ID":"1cbf30e1-2945-47b8-8f6d-e8d1e46d87c0","Type":"ContainerStarted","Data":"02f885143982b400944debc8522fe772fcf9bb57a44934e4202687eee8ed0e7e"} Apr 02 13:43:14 crc kubenswrapper[4732]: I0402 13:43:14.095655 4732 generic.go:334] "Generic (PLEG): container finished" podID="37a5e44f-9a88-4405-be8a-b645485e7312" containerID="960aab148ced3ecc2648e0b866cb9330dc955e8ab123f3afb95d377e3efa5a9e" exitCode=0 Apr 02 13:43:14 crc kubenswrapper[4732]: I0402 13:43:14.095989 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerDied","Data":"960aab148ced3ecc2648e0b866cb9330dc955e8ab123f3afb95d377e3efa5a9e"} Apr 02 13:43:14 crc kubenswrapper[4732]: I0402 13:43:14.096495 4732 scope.go:117] "RemoveContainer" containerID="960aab148ced3ecc2648e0b866cb9330dc955e8ab123f3afb95d377e3efa5a9e" Apr 02 13:43:14 crc kubenswrapper[4732]: I0402 13:43:14.097684 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f679f984c-vlkgl" event={"ID":"1cbf30e1-2945-47b8-8f6d-e8d1e46d87c0","Type":"ContainerStarted","Data":"79b3feeec4d7a1993e73846cff94ea095346d434e1a61937baf551db261afb15"} Apr 02 13:43:14 crc kubenswrapper[4732]: I0402 13:43:14.097951 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-f679f984c-vlkgl" Apr 
02 13:43:14 crc kubenswrapper[4732]: I0402 13:43:14.103550 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-f679f984c-vlkgl" Apr 02 13:43:14 crc kubenswrapper[4732]: I0402 13:43:14.142192 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-f679f984c-vlkgl" podStartSLOduration=8.142169955 podStartE2EDuration="8.142169955s" podCreationTimestamp="2026-04-02 13:43:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:43:14.14010349 +0000 UTC m=+351.044511053" watchObservedRunningTime="2026-04-02 13:43:14.142169955 +0000 UTC m=+351.046577518" Apr 02 13:43:15 crc kubenswrapper[4732]: I0402 13:43:15.106469 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"158ff7872549dfa4b3e82b611395e2802243bbfd9d81fc6905e35aaf4b32cd49"} Apr 02 13:43:15 crc kubenswrapper[4732]: I0402 13:43:15.462534 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-q7cfc"] Apr 02 13:43:15 crc kubenswrapper[4732]: I0402 13:43:15.464052 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-q7cfc" Apr 02 13:43:15 crc kubenswrapper[4732]: I0402 13:43:15.465994 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist" Apr 02 13:43:15 crc kubenswrapper[4732]: I0402 13:43:15.615026 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bd27c473-5209-45b9-a8da-b285f03920f8-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-q7cfc\" (UID: \"bd27c473-5209-45b9-a8da-b285f03920f8\") " pod="openshift-multus/cni-sysctl-allowlist-ds-q7cfc" Apr 02 13:43:15 crc kubenswrapper[4732]: I0402 13:43:15.615103 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdbzf\" (UniqueName: \"kubernetes.io/projected/bd27c473-5209-45b9-a8da-b285f03920f8-kube-api-access-mdbzf\") pod \"cni-sysctl-allowlist-ds-q7cfc\" (UID: \"bd27c473-5209-45b9-a8da-b285f03920f8\") " pod="openshift-multus/cni-sysctl-allowlist-ds-q7cfc" Apr 02 13:43:15 crc kubenswrapper[4732]: I0402 13:43:15.615169 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bd27c473-5209-45b9-a8da-b285f03920f8-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-q7cfc\" (UID: \"bd27c473-5209-45b9-a8da-b285f03920f8\") " pod="openshift-multus/cni-sysctl-allowlist-ds-q7cfc" Apr 02 13:43:15 crc kubenswrapper[4732]: I0402 13:43:15.615202 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/bd27c473-5209-45b9-a8da-b285f03920f8-ready\") pod \"cni-sysctl-allowlist-ds-q7cfc\" (UID: \"bd27c473-5209-45b9-a8da-b285f03920f8\") " pod="openshift-multus/cni-sysctl-allowlist-ds-q7cfc" Apr 02 13:43:15 crc kubenswrapper[4732]: I0402 13:43:15.715950 4732 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdbzf\" (UniqueName: \"kubernetes.io/projected/bd27c473-5209-45b9-a8da-b285f03920f8-kube-api-access-mdbzf\") pod \"cni-sysctl-allowlist-ds-q7cfc\" (UID: \"bd27c473-5209-45b9-a8da-b285f03920f8\") " pod="openshift-multus/cni-sysctl-allowlist-ds-q7cfc" Apr 02 13:43:15 crc kubenswrapper[4732]: I0402 13:43:15.716041 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bd27c473-5209-45b9-a8da-b285f03920f8-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-q7cfc\" (UID: \"bd27c473-5209-45b9-a8da-b285f03920f8\") " pod="openshift-multus/cni-sysctl-allowlist-ds-q7cfc" Apr 02 13:43:15 crc kubenswrapper[4732]: I0402 13:43:15.716068 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/bd27c473-5209-45b9-a8da-b285f03920f8-ready\") pod \"cni-sysctl-allowlist-ds-q7cfc\" (UID: \"bd27c473-5209-45b9-a8da-b285f03920f8\") " pod="openshift-multus/cni-sysctl-allowlist-ds-q7cfc" Apr 02 13:43:15 crc kubenswrapper[4732]: I0402 13:43:15.716113 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bd27c473-5209-45b9-a8da-b285f03920f8-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-q7cfc\" (UID: \"bd27c473-5209-45b9-a8da-b285f03920f8\") " pod="openshift-multus/cni-sysctl-allowlist-ds-q7cfc" Apr 02 13:43:15 crc kubenswrapper[4732]: I0402 13:43:15.716273 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bd27c473-5209-45b9-a8da-b285f03920f8-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-q7cfc\" (UID: \"bd27c473-5209-45b9-a8da-b285f03920f8\") " pod="openshift-multus/cni-sysctl-allowlist-ds-q7cfc" Apr 02 13:43:15 crc kubenswrapper[4732]: I0402 13:43:15.716780 4732 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/bd27c473-5209-45b9-a8da-b285f03920f8-ready\") pod \"cni-sysctl-allowlist-ds-q7cfc\" (UID: \"bd27c473-5209-45b9-a8da-b285f03920f8\") " pod="openshift-multus/cni-sysctl-allowlist-ds-q7cfc" Apr 02 13:43:15 crc kubenswrapper[4732]: I0402 13:43:15.716881 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bd27c473-5209-45b9-a8da-b285f03920f8-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-q7cfc\" (UID: \"bd27c473-5209-45b9-a8da-b285f03920f8\") " pod="openshift-multus/cni-sysctl-allowlist-ds-q7cfc" Apr 02 13:43:15 crc kubenswrapper[4732]: I0402 13:43:15.740246 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdbzf\" (UniqueName: \"kubernetes.io/projected/bd27c473-5209-45b9-a8da-b285f03920f8-kube-api-access-mdbzf\") pod \"cni-sysctl-allowlist-ds-q7cfc\" (UID: \"bd27c473-5209-45b9-a8da-b285f03920f8\") " pod="openshift-multus/cni-sysctl-allowlist-ds-q7cfc" Apr 02 13:43:15 crc kubenswrapper[4732]: I0402 13:43:15.777868 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-q7cfc" Apr 02 13:43:16 crc kubenswrapper[4732]: I0402 13:43:16.115797 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-q7cfc" event={"ID":"bd27c473-5209-45b9-a8da-b285f03920f8","Type":"ContainerStarted","Data":"5e71134fa6603ed5e6cabcf01ae0a45a55bfeb58e31e8c3441ad80aee6ea2ca8"} Apr 02 13:43:17 crc kubenswrapper[4732]: I0402 13:43:17.065962 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4m6rj" Apr 02 13:43:17 crc kubenswrapper[4732]: I0402 13:43:17.066020 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4m6rj" Apr 02 13:43:17 crc kubenswrapper[4732]: I0402 13:43:17.113327 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4m6rj" Apr 02 13:43:17 crc kubenswrapper[4732]: I0402 13:43:17.128187 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-q7cfc" event={"ID":"bd27c473-5209-45b9-a8da-b285f03920f8","Type":"ContainerStarted","Data":"e68473d7cc0d08293a0f48d1224a39e782dd7b899994ad1d070d72a3c06dfc01"} Apr 02 13:43:17 crc kubenswrapper[4732]: I0402 13:43:17.128245 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-q7cfc" Apr 02 13:43:17 crc kubenswrapper[4732]: I0402 13:43:17.156860 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-q7cfc" podStartSLOduration=2.156844949 podStartE2EDuration="2.156844949s" podCreationTimestamp="2026-04-02 13:43:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:43:17.154469596 +0000 UTC m=+354.058877159" 
watchObservedRunningTime="2026-04-02 13:43:17.156844949 +0000 UTC m=+354.061252502" Apr 02 13:43:17 crc kubenswrapper[4732]: I0402 13:43:17.160316 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-q7cfc" Apr 02 13:43:17 crc kubenswrapper[4732]: I0402 13:43:17.178896 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4m6rj" Apr 02 13:43:17 crc kubenswrapper[4732]: I0402 13:43:17.462700 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-q7cfc"] Apr 02 13:43:17 crc kubenswrapper[4732]: I0402 13:43:17.639857 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fd4xj" Apr 02 13:43:17 crc kubenswrapper[4732]: I0402 13:43:17.639895 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fd4xj" Apr 02 13:43:17 crc kubenswrapper[4732]: I0402 13:43:17.649490 4732 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Apr 02 13:43:17 crc kubenswrapper[4732]: I0402 13:43:17.650539 4732 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Apr 02 13:43:17 crc kubenswrapper[4732]: E0402 13:43:17.650728 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler-recovery-controller" Apr 02 13:43:17 crc kubenswrapper[4732]: I0402 13:43:17.650745 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler-recovery-controller" Apr 02 13:43:17 crc kubenswrapper[4732]: E0402 13:43:17.650761 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" 
containerName="kube-scheduler-cert-syncer" Apr 02 13:43:17 crc kubenswrapper[4732]: I0402 13:43:17.650767 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler-cert-syncer" Apr 02 13:43:17 crc kubenswrapper[4732]: E0402 13:43:17.650775 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="wait-for-host-port" Apr 02 13:43:17 crc kubenswrapper[4732]: I0402 13:43:17.650782 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="wait-for-host-port" Apr 02 13:43:17 crc kubenswrapper[4732]: E0402 13:43:17.650797 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" Apr 02 13:43:17 crc kubenswrapper[4732]: I0402 13:43:17.650802 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" Apr 02 13:43:17 crc kubenswrapper[4732]: I0402 13:43:17.650899 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler-cert-syncer" Apr 02 13:43:17 crc kubenswrapper[4732]: I0402 13:43:17.650913 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" Apr 02 13:43:17 crc kubenswrapper[4732]: I0402 13:43:17.650925 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler-recovery-controller" Apr 02 13:43:17 crc kubenswrapper[4732]: I0402 13:43:17.651850 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" 
containerID="cri-o://e6f4f8c5575530ca3fe7ef2e6cd25a4f63888841ae0564192385b7371093305d" gracePeriod=30 Apr 02 13:43:17 crc kubenswrapper[4732]: I0402 13:43:17.651966 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler-recovery-controller" containerID="cri-o://dbb15ce78ba79858cf5ae05773b302a03fd75dd3a3cdd53d8ca5ff2bdbc6d48d" gracePeriod=30 Apr 02 13:43:17 crc kubenswrapper[4732]: I0402 13:43:17.652000 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler-cert-syncer" containerID="cri-o://e03db4bfdbd69007c0b4a4d3f465fd21957c5a829c84cac0dd39914617e44cf7" gracePeriod=30 Apr 02 13:43:17 crc kubenswrapper[4732]: I0402 13:43:17.704005 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fd4xj" Apr 02 13:43:17 crc kubenswrapper[4732]: I0402 13:43:17.733148 4732 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" oldPodUID="3dcd261975c3d6b9a6ad6367fd4facd3" podUID="815516d0756bb9282f4d0a28cef72670" Apr 02 13:43:17 crc kubenswrapper[4732]: I0402 13:43:17.756946 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8ps5w" Apr 02 13:43:17 crc kubenswrapper[4732]: I0402 13:43:17.756982 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8ps5w" Apr 02 13:43:17 crc kubenswrapper[4732]: I0402 13:43:17.799824 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8ps5w" Apr 02 13:43:17 crc kubenswrapper[4732]: I0402 
13:43:17.851564 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/815516d0756bb9282f4d0a28cef72670-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"815516d0756bb9282f4d0a28cef72670\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Apr 02 13:43:17 crc kubenswrapper[4732]: I0402 13:43:17.851666 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/815516d0756bb9282f4d0a28cef72670-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"815516d0756bb9282f4d0a28cef72670\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Apr 02 13:43:17 crc kubenswrapper[4732]: I0402 13:43:17.953120 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/815516d0756bb9282f4d0a28cef72670-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"815516d0756bb9282f4d0a28cef72670\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Apr 02 13:43:17 crc kubenswrapper[4732]: I0402 13:43:17.953255 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/815516d0756bb9282f4d0a28cef72670-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"815516d0756bb9282f4d0a28cef72670\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Apr 02 13:43:17 crc kubenswrapper[4732]: I0402 13:43:17.953258 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/815516d0756bb9282f4d0a28cef72670-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"815516d0756bb9282f4d0a28cef72670\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Apr 02 13:43:17 crc kubenswrapper[4732]: I0402 13:43:17.953307 4732 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/815516d0756bb9282f4d0a28cef72670-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"815516d0756bb9282f4d0a28cef72670\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Apr 02 13:43:18 crc kubenswrapper[4732]: I0402 13:43:18.140669 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-crc_3dcd261975c3d6b9a6ad6367fd4facd3/kube-scheduler-cert-syncer/0.log"
Apr 02 13:43:18 crc kubenswrapper[4732]: I0402 13:43:18.141852 4732 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="e03db4bfdbd69007c0b4a4d3f465fd21957c5a829c84cac0dd39914617e44cf7" exitCode=2
Apr 02 13:43:18 crc kubenswrapper[4732]: I0402 13:43:18.214810 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fd4xj"
Apr 02 13:43:18 crc kubenswrapper[4732]: I0402 13:43:18.218408 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8ps5w"
Apr 02 13:43:18 crc kubenswrapper[4732]: I0402 13:43:18.759312 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fd4xj"]
Apr 02 13:43:19 crc kubenswrapper[4732]: I0402 13:43:19.160516 4732 generic.go:334] "Generic (PLEG): container finished" podID="c78649c4-5d4a-4dac-a180-6ee450fd150f" containerID="8b5be373b9699d8e3dd7d6b2036f18d8b0bd699e3262f684d778f605e1141f6b" exitCode=0
Apr 02 13:43:19 crc kubenswrapper[4732]: I0402 13:43:19.160736 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-7-crc" event={"ID":"c78649c4-5d4a-4dac-a180-6ee450fd150f","Type":"ContainerDied","Data":"8b5be373b9699d8e3dd7d6b2036f18d8b0bd699e3262f684d778f605e1141f6b"}
Apr 02 13:43:19 crc kubenswrapper[4732]: I0402 13:43:19.163988 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-crc_3dcd261975c3d6b9a6ad6367fd4facd3/kube-scheduler-cert-syncer/0.log"
Apr 02 13:43:19 crc kubenswrapper[4732]: I0402 13:43:19.165641 4732 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="dbb15ce78ba79858cf5ae05773b302a03fd75dd3a3cdd53d8ca5ff2bdbc6d48d" exitCode=0
Apr 02 13:43:19 crc kubenswrapper[4732]: I0402 13:43:19.165690 4732 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="e6f4f8c5575530ca3fe7ef2e6cd25a4f63888841ae0564192385b7371093305d" exitCode=0
Apr 02 13:43:19 crc kubenswrapper[4732]: I0402 13:43:19.166457 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-q7cfc" podUID="bd27c473-5209-45b9-a8da-b285f03920f8" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://e68473d7cc0d08293a0f48d1224a39e782dd7b899994ad1d070d72a3c06dfc01" gracePeriod=30
Apr 02 13:43:19 crc kubenswrapper[4732]: I0402 13:43:19.403940 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9lrnf"
Apr 02 13:43:19 crc kubenswrapper[4732]: I0402 13:43:19.474603 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9lrnf"
Apr 02 13:43:19 crc kubenswrapper[4732]: I0402 13:43:19.665914 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-z7v99"
Apr 02 13:43:19 crc kubenswrapper[4732]: I0402 13:43:19.720073 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-z7v99"
Apr 02 13:43:19 crc kubenswrapper[4732]: I0402 13:43:19.722469 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-crc_3dcd261975c3d6b9a6ad6367fd4facd3/kube-scheduler-cert-syncer/0.log"
Apr 02 13:43:19 crc kubenswrapper[4732]: I0402 13:43:19.723730 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Apr 02 13:43:19 crc kubenswrapper[4732]: I0402 13:43:19.739827 4732 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" oldPodUID="3dcd261975c3d6b9a6ad6367fd4facd3" podUID="815516d0756bb9282f4d0a28cef72670"
Apr 02 13:43:19 crc kubenswrapper[4732]: I0402 13:43:19.789251 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"3dcd261975c3d6b9a6ad6367fd4facd3\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") "
Apr 02 13:43:19 crc kubenswrapper[4732]: I0402 13:43:19.789711 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"3dcd261975c3d6b9a6ad6367fd4facd3\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") "
Apr 02 13:43:19 crc kubenswrapper[4732]: I0402 13:43:19.789366 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "3dcd261975c3d6b9a6ad6367fd4facd3" (UID: "3dcd261975c3d6b9a6ad6367fd4facd3"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Apr 02 13:43:19 crc kubenswrapper[4732]: I0402 13:43:19.789747 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "3dcd261975c3d6b9a6ad6367fd4facd3" (UID: "3dcd261975c3d6b9a6ad6367fd4facd3"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Apr 02 13:43:19 crc kubenswrapper[4732]: I0402 13:43:19.790879 4732 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") on node \"crc\" DevicePath \"\""
Apr 02 13:43:19 crc kubenswrapper[4732]: I0402 13:43:19.791061 4732 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") on node \"crc\" DevicePath \"\""
Apr 02 13:43:20 crc kubenswrapper[4732]: I0402 13:43:20.156668 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8ps5w"]
Apr 02 13:43:20 crc kubenswrapper[4732]: I0402 13:43:20.176246 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-crc_3dcd261975c3d6b9a6ad6367fd4facd3/kube-scheduler-cert-syncer/0.log"
Apr 02 13:43:20 crc kubenswrapper[4732]: I0402 13:43:20.177362 4732 scope.go:117] "RemoveContainer" containerID="dbb15ce78ba79858cf5ae05773b302a03fd75dd3a3cdd53d8ca5ff2bdbc6d48d"
Apr 02 13:43:20 crc kubenswrapper[4732]: I0402 13:43:20.177534 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Apr 02 13:43:20 crc kubenswrapper[4732]: I0402 13:43:20.177576 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fd4xj" podUID="6ac64cdf-a607-481a-9907-e6e72fc8b083" containerName="registry-server" containerID="cri-o://2761fef5975eef45a310fec86c715ca4d66cc77b41884e98ddaf26c5dfe89d20" gracePeriod=2
Apr 02 13:43:20 crc kubenswrapper[4732]: I0402 13:43:20.177695 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8ps5w" podUID="9058e533-24e2-44f1-8631-dd9bf6a37192" containerName="registry-server" containerID="cri-o://71072a927cc5d72b38e51db2b3913ccae6840c19ca7a27befca43a2c1adbe86b" gracePeriod=2
Apr 02 13:43:20 crc kubenswrapper[4732]: I0402 13:43:20.189521 4732 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" oldPodUID="3dcd261975c3d6b9a6ad6367fd4facd3" podUID="815516d0756bb9282f4d0a28cef72670"
Apr 02 13:43:20 crc kubenswrapper[4732]: I0402 13:43:20.200450 4732 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" oldPodUID="3dcd261975c3d6b9a6ad6367fd4facd3" podUID="815516d0756bb9282f4d0a28cef72670"
Apr 02 13:43:20 crc kubenswrapper[4732]: I0402 13:43:20.205529 4732 scope.go:117] "RemoveContainer" containerID="e03db4bfdbd69007c0b4a4d3f465fd21957c5a829c84cac0dd39914617e44cf7"
Apr 02 13:43:20 crc kubenswrapper[4732]: I0402 13:43:20.235115 4732 scope.go:117] "RemoveContainer" containerID="e6f4f8c5575530ca3fe7ef2e6cd25a4f63888841ae0564192385b7371093305d"
Apr 02 13:43:20 crc kubenswrapper[4732]: I0402 13:43:20.251266 4732 scope.go:117] "RemoveContainer" containerID="18c38540c1e54421b3069733149cadd0da6b74c4f6aa0160090c8a14429797dd"
Apr 02 13:43:20 crc kubenswrapper[4732]: I0402 13:43:20.301595 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lvckz"
Apr 02 13:43:20 crc kubenswrapper[4732]: I0402 13:43:20.356124 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lvckz"
Apr 02 13:43:20 crc kubenswrapper[4732]: I0402 13:43:20.512385 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-7-crc"
Apr 02 13:43:20 crc kubenswrapper[4732]: I0402 13:43:20.693139 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" path="/var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/volumes"
Apr 02 13:43:20 crc kubenswrapper[4732]: I0402 13:43:20.705246 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c78649c4-5d4a-4dac-a180-6ee450fd150f-kube-api-access\") pod \"c78649c4-5d4a-4dac-a180-6ee450fd150f\" (UID: \"c78649c4-5d4a-4dac-a180-6ee450fd150f\") "
Apr 02 13:43:20 crc kubenswrapper[4732]: I0402 13:43:20.705318 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c78649c4-5d4a-4dac-a180-6ee450fd150f-var-lock\") pod \"c78649c4-5d4a-4dac-a180-6ee450fd150f\" (UID: \"c78649c4-5d4a-4dac-a180-6ee450fd150f\") "
Apr 02 13:43:20 crc kubenswrapper[4732]: I0402 13:43:20.705495 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c78649c4-5d4a-4dac-a180-6ee450fd150f-kubelet-dir\") pod \"c78649c4-5d4a-4dac-a180-6ee450fd150f\" (UID: \"c78649c4-5d4a-4dac-a180-6ee450fd150f\") "
Apr 02 13:43:20 crc kubenswrapper[4732]: I0402 13:43:20.705558 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c78649c4-5d4a-4dac-a180-6ee450fd150f-var-lock" (OuterVolumeSpecName: "var-lock") pod "c78649c4-5d4a-4dac-a180-6ee450fd150f" (UID: "c78649c4-5d4a-4dac-a180-6ee450fd150f"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Apr 02 13:43:20 crc kubenswrapper[4732]: I0402 13:43:20.705762 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c78649c4-5d4a-4dac-a180-6ee450fd150f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "c78649c4-5d4a-4dac-a180-6ee450fd150f" (UID: "c78649c4-5d4a-4dac-a180-6ee450fd150f"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Apr 02 13:43:20 crc kubenswrapper[4732]: I0402 13:43:20.715008 4732 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c78649c4-5d4a-4dac-a180-6ee450fd150f-kubelet-dir\") on node \"crc\" DevicePath \"\""
Apr 02 13:43:20 crc kubenswrapper[4732]: I0402 13:43:20.715069 4732 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c78649c4-5d4a-4dac-a180-6ee450fd150f-var-lock\") on node \"crc\" DevicePath \"\""
Apr 02 13:43:20 crc kubenswrapper[4732]: I0402 13:43:20.715556 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c78649c4-5d4a-4dac-a180-6ee450fd150f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c78649c4-5d4a-4dac-a180-6ee450fd150f" (UID: "c78649c4-5d4a-4dac-a180-6ee450fd150f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 13:43:20 crc kubenswrapper[4732]: I0402 13:43:20.816545 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c78649c4-5d4a-4dac-a180-6ee450fd150f-kube-api-access\") on node \"crc\" DevicePath \"\""
Apr 02 13:43:20 crc kubenswrapper[4732]: I0402 13:43:20.834153 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vgjz4"
Apr 02 13:43:20 crc kubenswrapper[4732]: I0402 13:43:20.891857 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vgjz4"
Apr 02 13:43:21 crc kubenswrapper[4732]: I0402 13:43:21.185356 4732 generic.go:334] "Generic (PLEG): container finished" podID="9058e533-24e2-44f1-8631-dd9bf6a37192" containerID="71072a927cc5d72b38e51db2b3913ccae6840c19ca7a27befca43a2c1adbe86b" exitCode=0
Apr 02 13:43:21 crc kubenswrapper[4732]: I0402 13:43:21.185423 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8ps5w" event={"ID":"9058e533-24e2-44f1-8631-dd9bf6a37192","Type":"ContainerDied","Data":"71072a927cc5d72b38e51db2b3913ccae6840c19ca7a27befca43a2c1adbe86b"}
Apr 02 13:43:21 crc kubenswrapper[4732]: I0402 13:43:21.187328 4732 generic.go:334] "Generic (PLEG): container finished" podID="6ac64cdf-a607-481a-9907-e6e72fc8b083" containerID="2761fef5975eef45a310fec86c715ca4d66cc77b41884e98ddaf26c5dfe89d20" exitCode=0
Apr 02 13:43:21 crc kubenswrapper[4732]: I0402 13:43:21.187374 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fd4xj" event={"ID":"6ac64cdf-a607-481a-9907-e6e72fc8b083","Type":"ContainerDied","Data":"2761fef5975eef45a310fec86c715ca4d66cc77b41884e98ddaf26c5dfe89d20"}
Apr 02 13:43:21 crc kubenswrapper[4732]: I0402 13:43:21.188950 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-7-crc" event={"ID":"c78649c4-5d4a-4dac-a180-6ee450fd150f","Type":"ContainerDied","Data":"dee30ab317976ac46c6ccd04c66dd3ae5297afb2c97933449128c811ffb49903"}
Apr 02 13:43:21 crc kubenswrapper[4732]: I0402 13:43:21.188987 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dee30ab317976ac46c6ccd04c66dd3ae5297afb2c97933449128c811ffb49903"
Apr 02 13:43:21 crc kubenswrapper[4732]: I0402 13:43:21.188986 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-7-crc"
Apr 02 13:43:21 crc kubenswrapper[4732]: I0402 13:43:21.558176 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fd4xj"
Apr 02 13:43:21 crc kubenswrapper[4732]: I0402 13:43:21.626602 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ac64cdf-a607-481a-9907-e6e72fc8b083-utilities\") pod \"6ac64cdf-a607-481a-9907-e6e72fc8b083\" (UID: \"6ac64cdf-a607-481a-9907-e6e72fc8b083\") "
Apr 02 13:43:21 crc kubenswrapper[4732]: I0402 13:43:21.626925 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ac64cdf-a607-481a-9907-e6e72fc8b083-catalog-content\") pod \"6ac64cdf-a607-481a-9907-e6e72fc8b083\" (UID: \"6ac64cdf-a607-481a-9907-e6e72fc8b083\") "
Apr 02 13:43:21 crc kubenswrapper[4732]: I0402 13:43:21.627010 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnp2h\" (UniqueName: \"kubernetes.io/projected/6ac64cdf-a607-481a-9907-e6e72fc8b083-kube-api-access-pnp2h\") pod \"6ac64cdf-a607-481a-9907-e6e72fc8b083\" (UID: \"6ac64cdf-a607-481a-9907-e6e72fc8b083\") "
Apr 02 13:43:21 crc kubenswrapper[4732]: I0402 13:43:21.627668 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ac64cdf-a607-481a-9907-e6e72fc8b083-utilities" (OuterVolumeSpecName: "utilities") pod "6ac64cdf-a607-481a-9907-e6e72fc8b083" (UID: "6ac64cdf-a607-481a-9907-e6e72fc8b083"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 02 13:43:21 crc kubenswrapper[4732]: I0402 13:43:21.636370 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ac64cdf-a607-481a-9907-e6e72fc8b083-kube-api-access-pnp2h" (OuterVolumeSpecName: "kube-api-access-pnp2h") pod "6ac64cdf-a607-481a-9907-e6e72fc8b083" (UID: "6ac64cdf-a607-481a-9907-e6e72fc8b083"). InnerVolumeSpecName "kube-api-access-pnp2h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 13:43:21 crc kubenswrapper[4732]: I0402 13:43:21.684739 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ac64cdf-a607-481a-9907-e6e72fc8b083-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6ac64cdf-a607-481a-9907-e6e72fc8b083" (UID: "6ac64cdf-a607-481a-9907-e6e72fc8b083"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 02 13:43:21 crc kubenswrapper[4732]: I0402 13:43:21.728033 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ac64cdf-a607-481a-9907-e6e72fc8b083-utilities\") on node \"crc\" DevicePath \"\""
Apr 02 13:43:21 crc kubenswrapper[4732]: I0402 13:43:21.728075 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ac64cdf-a607-481a-9907-e6e72fc8b083-catalog-content\") on node \"crc\" DevicePath \"\""
Apr 02 13:43:21 crc kubenswrapper[4732]: I0402 13:43:21.728091 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pnp2h\" (UniqueName: \"kubernetes.io/projected/6ac64cdf-a607-481a-9907-e6e72fc8b083-kube-api-access-pnp2h\") on node \"crc\" DevicePath \"\""
Apr 02 13:43:22 crc kubenswrapper[4732]: I0402 13:43:22.006240 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8ps5w"
Apr 02 13:43:22 crc kubenswrapper[4732]: I0402 13:43:22.033076 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-729nf\" (UniqueName: \"kubernetes.io/projected/9058e533-24e2-44f1-8631-dd9bf6a37192-kube-api-access-729nf\") pod \"9058e533-24e2-44f1-8631-dd9bf6a37192\" (UID: \"9058e533-24e2-44f1-8631-dd9bf6a37192\") "
Apr 02 13:43:22 crc kubenswrapper[4732]: I0402 13:43:22.033245 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9058e533-24e2-44f1-8631-dd9bf6a37192-catalog-content\") pod \"9058e533-24e2-44f1-8631-dd9bf6a37192\" (UID: \"9058e533-24e2-44f1-8631-dd9bf6a37192\") "
Apr 02 13:43:22 crc kubenswrapper[4732]: I0402 13:43:22.033265 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9058e533-24e2-44f1-8631-dd9bf6a37192-utilities\") pod \"9058e533-24e2-44f1-8631-dd9bf6a37192\" (UID: \"9058e533-24e2-44f1-8631-dd9bf6a37192\") "
Apr 02 13:43:22 crc kubenswrapper[4732]: I0402 13:43:22.034170 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9058e533-24e2-44f1-8631-dd9bf6a37192-utilities" (OuterVolumeSpecName: "utilities") pod "9058e533-24e2-44f1-8631-dd9bf6a37192" (UID: "9058e533-24e2-44f1-8631-dd9bf6a37192"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 02 13:43:22 crc kubenswrapper[4732]: I0402 13:43:22.036863 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9058e533-24e2-44f1-8631-dd9bf6a37192-kube-api-access-729nf" (OuterVolumeSpecName: "kube-api-access-729nf") pod "9058e533-24e2-44f1-8631-dd9bf6a37192" (UID: "9058e533-24e2-44f1-8631-dd9bf6a37192"). InnerVolumeSpecName "kube-api-access-729nf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 13:43:22 crc kubenswrapper[4732]: I0402 13:43:22.101032 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9058e533-24e2-44f1-8631-dd9bf6a37192-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9058e533-24e2-44f1-8631-dd9bf6a37192" (UID: "9058e533-24e2-44f1-8631-dd9bf6a37192"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 02 13:43:22 crc kubenswrapper[4732]: I0402 13:43:22.137654 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9058e533-24e2-44f1-8631-dd9bf6a37192-catalog-content\") on node \"crc\" DevicePath \"\""
Apr 02 13:43:22 crc kubenswrapper[4732]: I0402 13:43:22.137705 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9058e533-24e2-44f1-8631-dd9bf6a37192-utilities\") on node \"crc\" DevicePath \"\""
Apr 02 13:43:22 crc kubenswrapper[4732]: I0402 13:43:22.137718 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-729nf\" (UniqueName: \"kubernetes.io/projected/9058e533-24e2-44f1-8631-dd9bf6a37192-kube-api-access-729nf\") on node \"crc\" DevicePath \"\""
Apr 02 13:43:22 crc kubenswrapper[4732]: I0402 13:43:22.198290 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8ps5w" event={"ID":"9058e533-24e2-44f1-8631-dd9bf6a37192","Type":"ContainerDied","Data":"78737cb551255129fbb3203f7982eb1b28e6b2d57c02a9c97fd94c120fc1cf4f"}
Apr 02 13:43:22 crc kubenswrapper[4732]: I0402 13:43:22.198325 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8ps5w"
Apr 02 13:43:22 crc kubenswrapper[4732]: I0402 13:43:22.198361 4732 scope.go:117] "RemoveContainer" containerID="71072a927cc5d72b38e51db2b3913ccae6840c19ca7a27befca43a2c1adbe86b"
Apr 02 13:43:22 crc kubenswrapper[4732]: I0402 13:43:22.201142 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fd4xj" event={"ID":"6ac64cdf-a607-481a-9907-e6e72fc8b083","Type":"ContainerDied","Data":"69eab28f470ca7f2b836a4317fe17162e04269abf8b52aec33aaa3bef380b973"}
Apr 02 13:43:22 crc kubenswrapper[4732]: I0402 13:43:22.201232 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fd4xj"
Apr 02 13:43:22 crc kubenswrapper[4732]: I0402 13:43:22.215352 4732 scope.go:117] "RemoveContainer" containerID="59f478c20e6d6939d2cf216ee8b5458617d57a9367164d367d059afc2a7adcd8"
Apr 02 13:43:22 crc kubenswrapper[4732]: I0402 13:43:22.235793 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8ps5w"]
Apr 02 13:43:22 crc kubenswrapper[4732]: I0402 13:43:22.239846 4732 scope.go:117] "RemoveContainer" containerID="ecbe9a47fc07081a457ade95202e51e947d903d2cac58f72e34bc96e96791f2c"
Apr 02 13:43:22 crc kubenswrapper[4732]: I0402 13:43:22.243805 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8ps5w"]
Apr 02 13:43:22 crc kubenswrapper[4732]: I0402 13:43:22.261192 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fd4xj"]
Apr 02 13:43:22 crc kubenswrapper[4732]: I0402 13:43:22.272792 4732 scope.go:117] "RemoveContainer" containerID="2761fef5975eef45a310fec86c715ca4d66cc77b41884e98ddaf26c5dfe89d20"
Apr 02 13:43:22 crc kubenswrapper[4732]: I0402 13:43:22.272940 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fd4xj"]
Apr 02 13:43:22 crc kubenswrapper[4732]: I0402 13:43:22.297878 4732 scope.go:117] "RemoveContainer" containerID="a9b93dc09d7caa15cd7c426478c8dede2b7b339e7df716f78786c6cfc90be2fa"
Apr 02 13:43:22 crc kubenswrapper[4732]: I0402 13:43:22.320255 4732 scope.go:117] "RemoveContainer" containerID="946618f3b3e5eb08548d122e42d9d04ac4bbc3030f5ff340eb464f99d279f52e"
Apr 02 13:43:22 crc kubenswrapper[4732]: I0402 13:43:22.554172 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z7v99"]
Apr 02 13:43:22 crc kubenswrapper[4732]: I0402 13:43:22.555008 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-z7v99" podUID="72981c60-e9a1-4e25-9b64-7493d6fdaab6" containerName="registry-server" containerID="cri-o://71bc9d5e90fdd9566b9aac6dfb38870bf6d6ce71be3b7eb7354b04408fe86b37" gracePeriod=2
Apr 02 13:43:22 crc kubenswrapper[4732]: I0402 13:43:22.690208 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ac64cdf-a607-481a-9907-e6e72fc8b083" path="/var/lib/kubelet/pods/6ac64cdf-a607-481a-9907-e6e72fc8b083/volumes"
Apr 02 13:43:22 crc kubenswrapper[4732]: I0402 13:43:22.691580 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9058e533-24e2-44f1-8631-dd9bf6a37192" path="/var/lib/kubelet/pods/9058e533-24e2-44f1-8631-dd9bf6a37192/volumes"
Apr 02 13:43:22 crc kubenswrapper[4732]: I0402 13:43:22.733839 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-10-crc"]
Apr 02 13:43:22 crc kubenswrapper[4732]: E0402 13:43:22.734476 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ac64cdf-a607-481a-9907-e6e72fc8b083" containerName="registry-server"
Apr 02 13:43:22 crc kubenswrapper[4732]: I0402 13:43:22.734658 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ac64cdf-a607-481a-9907-e6e72fc8b083" containerName="registry-server"
Apr 02 13:43:22 crc kubenswrapper[4732]: E0402 13:43:22.734794 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c78649c4-5d4a-4dac-a180-6ee450fd150f" containerName="installer"
Apr 02 13:43:22 crc kubenswrapper[4732]: I0402 13:43:22.734903 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="c78649c4-5d4a-4dac-a180-6ee450fd150f" containerName="installer"
Apr 02 13:43:22 crc kubenswrapper[4732]: E0402 13:43:22.735015 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9058e533-24e2-44f1-8631-dd9bf6a37192" containerName="extract-content"
Apr 02 13:43:22 crc kubenswrapper[4732]: I0402 13:43:22.735119 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="9058e533-24e2-44f1-8631-dd9bf6a37192" containerName="extract-content"
Apr 02 13:43:22 crc kubenswrapper[4732]: E0402 13:43:22.735241 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9058e533-24e2-44f1-8631-dd9bf6a37192" containerName="extract-utilities"
Apr 02 13:43:22 crc kubenswrapper[4732]: I0402 13:43:22.735348 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="9058e533-24e2-44f1-8631-dd9bf6a37192" containerName="extract-utilities"
Apr 02 13:43:22 crc kubenswrapper[4732]: E0402 13:43:22.735479 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ac64cdf-a607-481a-9907-e6e72fc8b083" containerName="extract-utilities"
Apr 02 13:43:22 crc kubenswrapper[4732]: I0402 13:43:22.735590 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ac64cdf-a607-481a-9907-e6e72fc8b083" containerName="extract-utilities"
Apr 02 13:43:22 crc kubenswrapper[4732]: E0402 13:43:22.735741 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9058e533-24e2-44f1-8631-dd9bf6a37192" containerName="registry-server"
Apr 02 13:43:22 crc kubenswrapper[4732]: I0402 13:43:22.735873 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="9058e533-24e2-44f1-8631-dd9bf6a37192" containerName="registry-server"
Apr 02 13:43:22 crc kubenswrapper[4732]: E0402 13:43:22.736000 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ac64cdf-a607-481a-9907-e6e72fc8b083" containerName="extract-content"
Apr 02 13:43:22 crc kubenswrapper[4732]: I0402 13:43:22.736128 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ac64cdf-a607-481a-9907-e6e72fc8b083" containerName="extract-content"
Apr 02 13:43:22 crc kubenswrapper[4732]: I0402 13:43:22.736446 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="9058e533-24e2-44f1-8631-dd9bf6a37192" containerName="registry-server"
Apr 02 13:43:22 crc kubenswrapper[4732]: I0402 13:43:22.736592 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="c78649c4-5d4a-4dac-a180-6ee450fd150f" containerName="installer"
Apr 02 13:43:22 crc kubenswrapper[4732]: I0402 13:43:22.736796 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ac64cdf-a607-481a-9907-e6e72fc8b083" containerName="registry-server"
Apr 02 13:43:22 crc kubenswrapper[4732]: I0402 13:43:22.737941 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-10-crc"
Apr 02 13:43:22 crc kubenswrapper[4732]: I0402 13:43:22.751554 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-10-crc"]
Apr 02 13:43:22 crc kubenswrapper[4732]: I0402 13:43:22.760937 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dfc68032-3ed5-47ad-adde-2d764f4b8868-kube-api-access\") pod \"revision-pruner-10-crc\" (UID: \"dfc68032-3ed5-47ad-adde-2d764f4b8868\") " pod="openshift-kube-apiserver/revision-pruner-10-crc"
Apr 02 13:43:22 crc kubenswrapper[4732]: I0402 13:43:22.761033 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dfc68032-3ed5-47ad-adde-2d764f4b8868-kubelet-dir\") pod \"revision-pruner-10-crc\" (UID: \"dfc68032-3ed5-47ad-adde-2d764f4b8868\") " pod="openshift-kube-apiserver/revision-pruner-10-crc"
Apr 02 13:43:22 crc kubenswrapper[4732]: I0402 13:43:22.862483 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dfc68032-3ed5-47ad-adde-2d764f4b8868-kubelet-dir\") pod \"revision-pruner-10-crc\" (UID: \"dfc68032-3ed5-47ad-adde-2d764f4b8868\") " pod="openshift-kube-apiserver/revision-pruner-10-crc"
Apr 02 13:43:22 crc kubenswrapper[4732]: I0402 13:43:22.862637 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dfc68032-3ed5-47ad-adde-2d764f4b8868-kube-api-access\") pod \"revision-pruner-10-crc\" (UID: \"dfc68032-3ed5-47ad-adde-2d764f4b8868\") " pod="openshift-kube-apiserver/revision-pruner-10-crc"
Apr 02 13:43:22 crc kubenswrapper[4732]: I0402 13:43:22.863000 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dfc68032-3ed5-47ad-adde-2d764f4b8868-kubelet-dir\") pod \"revision-pruner-10-crc\" (UID: \"dfc68032-3ed5-47ad-adde-2d764f4b8868\") " pod="openshift-kube-apiserver/revision-pruner-10-crc"
Apr 02 13:43:22 crc kubenswrapper[4732]: I0402 13:43:22.889463 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dfc68032-3ed5-47ad-adde-2d764f4b8868-kube-api-access\") pod \"revision-pruner-10-crc\" (UID: \"dfc68032-3ed5-47ad-adde-2d764f4b8868\") " pod="openshift-kube-apiserver/revision-pruner-10-crc"
Apr 02 13:43:23 crc kubenswrapper[4732]: I0402 13:43:23.051920 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z7v99"
Apr 02 13:43:23 crc kubenswrapper[4732]: I0402 13:43:23.085804 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-10-crc"
Apr 02 13:43:23 crc kubenswrapper[4732]: I0402 13:43:23.165222 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72981c60-e9a1-4e25-9b64-7493d6fdaab6-utilities\") pod \"72981c60-e9a1-4e25-9b64-7493d6fdaab6\" (UID: \"72981c60-e9a1-4e25-9b64-7493d6fdaab6\") "
Apr 02 13:43:23 crc kubenswrapper[4732]: I0402 13:43:23.165666 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhqpl\" (UniqueName: \"kubernetes.io/projected/72981c60-e9a1-4e25-9b64-7493d6fdaab6-kube-api-access-nhqpl\") pod \"72981c60-e9a1-4e25-9b64-7493d6fdaab6\" (UID: \"72981c60-e9a1-4e25-9b64-7493d6fdaab6\") "
Apr 02 13:43:23 crc kubenswrapper[4732]: I0402 13:43:23.165800 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72981c60-e9a1-4e25-9b64-7493d6fdaab6-catalog-content\") pod \"72981c60-e9a1-4e25-9b64-7493d6fdaab6\" (UID: \"72981c60-e9a1-4e25-9b64-7493d6fdaab6\") "
Apr 02 13:43:23 crc kubenswrapper[4732]: I0402 13:43:23.166389 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72981c60-e9a1-4e25-9b64-7493d6fdaab6-utilities" (OuterVolumeSpecName: "utilities") pod "72981c60-e9a1-4e25-9b64-7493d6fdaab6" (UID: "72981c60-e9a1-4e25-9b64-7493d6fdaab6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 02 13:43:23 crc kubenswrapper[4732]: I0402 13:43:23.170533 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72981c60-e9a1-4e25-9b64-7493d6fdaab6-kube-api-access-nhqpl" (OuterVolumeSpecName: "kube-api-access-nhqpl") pod "72981c60-e9a1-4e25-9b64-7493d6fdaab6" (UID: "72981c60-e9a1-4e25-9b64-7493d6fdaab6"). InnerVolumeSpecName "kube-api-access-nhqpl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 13:43:23 crc kubenswrapper[4732]: I0402 13:43:23.204273 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72981c60-e9a1-4e25-9b64-7493d6fdaab6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "72981c60-e9a1-4e25-9b64-7493d6fdaab6" (UID: "72981c60-e9a1-4e25-9b64-7493d6fdaab6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 02 13:43:23 crc kubenswrapper[4732]: I0402 13:43:23.210353 4732 generic.go:334] "Generic (PLEG): container finished" podID="72981c60-e9a1-4e25-9b64-7493d6fdaab6" containerID="71bc9d5e90fdd9566b9aac6dfb38870bf6d6ce71be3b7eb7354b04408fe86b37" exitCode=0
Apr 02 13:43:23 crc kubenswrapper[4732]: I0402 13:43:23.210424 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z7v99" event={"ID":"72981c60-e9a1-4e25-9b64-7493d6fdaab6","Type":"ContainerDied","Data":"71bc9d5e90fdd9566b9aac6dfb38870bf6d6ce71be3b7eb7354b04408fe86b37"}
Apr 02 13:43:23 crc kubenswrapper[4732]: I0402 13:43:23.210603 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z7v99" event={"ID":"72981c60-e9a1-4e25-9b64-7493d6fdaab6","Type":"ContainerDied","Data":"d4d772450856dfbebd55c6fec998caee6c1d412523c56bc65e0766342de36120"}
Apr 02 13:43:23 crc kubenswrapper[4732]: I0402 13:43:23.210729 4732 scope.go:117] "RemoveContainer" containerID="71bc9d5e90fdd9566b9aac6dfb38870bf6d6ce71be3b7eb7354b04408fe86b37"
Apr 02 13:43:23 crc kubenswrapper[4732]: I0402 13:43:23.211094 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z7v99"
Apr 02 13:43:23 crc kubenswrapper[4732]: I0402 13:43:23.235349 4732 scope.go:117] "RemoveContainer" containerID="16541e33e966d2f1de439d8a63aece7d1c528e83e6fe30547f444b7984f3d70e"
Apr 02 13:43:23 crc kubenswrapper[4732]: I0402 13:43:23.251241 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z7v99"]
Apr 02 13:43:23 crc kubenswrapper[4732]: I0402 13:43:23.254988 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-z7v99"]
Apr 02 13:43:23 crc kubenswrapper[4732]: I0402 13:43:23.269093 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72981c60-e9a1-4e25-9b64-7493d6fdaab6-catalog-content\") on node \"crc\" DevicePath \"\""
Apr 02 13:43:23 crc kubenswrapper[4732]: I0402 13:43:23.269130 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72981c60-e9a1-4e25-9b64-7493d6fdaab6-utilities\") on node \"crc\" DevicePath \"\""
Apr 02 13:43:23 crc kubenswrapper[4732]: I0402 13:43:23.269144 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhqpl\" (UniqueName: \"kubernetes.io/projected/72981c60-e9a1-4e25-9b64-7493d6fdaab6-kube-api-access-nhqpl\") on node \"crc\" DevicePath \"\""
Apr 02 13:43:23 crc kubenswrapper[4732]: I0402 13:43:23.292318 4732 scope.go:117] "RemoveContainer" containerID="193dd2b4bb23f9d733807fda8e1d414d8610bf2650f00c987038316f39b50247"
Apr 02 13:43:23 crc kubenswrapper[4732]: I0402 13:43:23.313727 4732 scope.go:117] "RemoveContainer" containerID="71bc9d5e90fdd9566b9aac6dfb38870bf6d6ce71be3b7eb7354b04408fe86b37"
Apr 02 13:43:23 crc kubenswrapper[4732]: E0402 13:43:23.314298 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71bc9d5e90fdd9566b9aac6dfb38870bf6d6ce71be3b7eb7354b04408fe86b37\": container with ID starting with 71bc9d5e90fdd9566b9aac6dfb38870bf6d6ce71be3b7eb7354b04408fe86b37 not found: ID does not exist" containerID="71bc9d5e90fdd9566b9aac6dfb38870bf6d6ce71be3b7eb7354b04408fe86b37"
Apr 02 13:43:23 crc kubenswrapper[4732]: I0402 13:43:23.314340 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71bc9d5e90fdd9566b9aac6dfb38870bf6d6ce71be3b7eb7354b04408fe86b37"} err="failed to get container status \"71bc9d5e90fdd9566b9aac6dfb38870bf6d6ce71be3b7eb7354b04408fe86b37\": rpc error: code = NotFound desc = could not find container \"71bc9d5e90fdd9566b9aac6dfb38870bf6d6ce71be3b7eb7354b04408fe86b37\": container with ID starting with 71bc9d5e90fdd9566b9aac6dfb38870bf6d6ce71be3b7eb7354b04408fe86b37 not found: ID does not exist"
Apr 02 13:43:23 crc kubenswrapper[4732]: I0402 13:43:23.314369 4732 scope.go:117] "RemoveContainer" containerID="16541e33e966d2f1de439d8a63aece7d1c528e83e6fe30547f444b7984f3d70e"
Apr 02 13:43:23 crc kubenswrapper[4732]: E0402 13:43:23.314806 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16541e33e966d2f1de439d8a63aece7d1c528e83e6fe30547f444b7984f3d70e\": container with ID starting with 16541e33e966d2f1de439d8a63aece7d1c528e83e6fe30547f444b7984f3d70e not found: ID does not exist" containerID="16541e33e966d2f1de439d8a63aece7d1c528e83e6fe30547f444b7984f3d70e"
Apr 02 13:43:23 crc kubenswrapper[4732]: I0402 13:43:23.314848 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16541e33e966d2f1de439d8a63aece7d1c528e83e6fe30547f444b7984f3d70e"} err="failed to get container status \"16541e33e966d2f1de439d8a63aece7d1c528e83e6fe30547f444b7984f3d70e\": rpc error: code = NotFound desc = could not find container \"16541e33e966d2f1de439d8a63aece7d1c528e83e6fe30547f444b7984f3d70e\": container with ID
starting with 16541e33e966d2f1de439d8a63aece7d1c528e83e6fe30547f444b7984f3d70e not found: ID does not exist" Apr 02 13:43:23 crc kubenswrapper[4732]: I0402 13:43:23.314872 4732 scope.go:117] "RemoveContainer" containerID="193dd2b4bb23f9d733807fda8e1d414d8610bf2650f00c987038316f39b50247" Apr 02 13:43:23 crc kubenswrapper[4732]: E0402 13:43:23.315090 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"193dd2b4bb23f9d733807fda8e1d414d8610bf2650f00c987038316f39b50247\": container with ID starting with 193dd2b4bb23f9d733807fda8e1d414d8610bf2650f00c987038316f39b50247 not found: ID does not exist" containerID="193dd2b4bb23f9d733807fda8e1d414d8610bf2650f00c987038316f39b50247" Apr 02 13:43:23 crc kubenswrapper[4732]: I0402 13:43:23.315116 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"193dd2b4bb23f9d733807fda8e1d414d8610bf2650f00c987038316f39b50247"} err="failed to get container status \"193dd2b4bb23f9d733807fda8e1d414d8610bf2650f00c987038316f39b50247\": rpc error: code = NotFound desc = could not find container \"193dd2b4bb23f9d733807fda8e1d414d8610bf2650f00c987038316f39b50247\": container with ID starting with 193dd2b4bb23f9d733807fda8e1d414d8610bf2650f00c987038316f39b50247 not found: ID does not exist" Apr 02 13:43:23 crc kubenswrapper[4732]: I0402 13:43:23.486887 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-10-crc"] Apr 02 13:43:23 crc kubenswrapper[4732]: W0402 13:43:23.488263 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poddfc68032_3ed5_47ad_adde_2d764f4b8868.slice/crio-cf3d0f07e31bdab7797c39f77961e62108c77607d6c752e146821dce0a4026a1 WatchSource:0}: Error finding container cf3d0f07e31bdab7797c39f77961e62108c77607d6c752e146821dce0a4026a1: Status 404 returned error can't find the container with id 
cf3d0f07e31bdab7797c39f77961e62108c77607d6c752e146821dce0a4026a1 Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.055426 4732 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Apr 02 13:43:24 crc kubenswrapper[4732]: E0402 13:43:24.055886 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72981c60-e9a1-4e25-9b64-7493d6fdaab6" containerName="extract-utilities" Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.055898 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="72981c60-e9a1-4e25-9b64-7493d6fdaab6" containerName="extract-utilities" Apr 02 13:43:24 crc kubenswrapper[4732]: E0402 13:43:24.055907 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72981c60-e9a1-4e25-9b64-7493d6fdaab6" containerName="extract-content" Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.055914 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="72981c60-e9a1-4e25-9b64-7493d6fdaab6" containerName="extract-content" Apr 02 13:43:24 crc kubenswrapper[4732]: E0402 13:43:24.055927 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72981c60-e9a1-4e25-9b64-7493d6fdaab6" containerName="registry-server" Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.055932 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="72981c60-e9a1-4e25-9b64-7493d6fdaab6" containerName="registry-server" Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.056030 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="72981c60-e9a1-4e25-9b64-7493d6fdaab6" containerName="registry-server" Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.056346 4732 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.056415 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.056566 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://bf50344e79adec991160632c244c40543c468db050f176949b5840a16d1777a6" gracePeriod=15 Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.056687 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://19f830282d549fbab00b4a19b6698b2abc656f62287b73347cd4d0698fd5de90" gracePeriod=15 Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.056690 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://e000359211728ef1d437757b0246d85c224dae170131ba8bd221fa84dafa026c" gracePeriod=15 Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.056747 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://bf6191436462297532ffdc93e43468bbe37e850115ddcbcd0265c07359ac845a" gracePeriod=15 Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.056868 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://7b46da8b2cda4252f897adfca92275c92924a998d5b1ad57e6897ddea29b4571" gracePeriod=15 Apr 02 13:43:24 crc 
kubenswrapper[4732]: I0402 13:43:24.057550 4732 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Apr 02 13:43:24 crc kubenswrapper[4732]: E0402 13:43:24.057825 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.057877 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Apr 02 13:43:24 crc kubenswrapper[4732]: E0402 13:43:24.057891 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.057899 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Apr 02 13:43:24 crc kubenswrapper[4732]: E0402 13:43:24.057907 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.057915 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Apr 02 13:43:24 crc kubenswrapper[4732]: E0402 13:43:24.057932 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.057940 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Apr 02 13:43:24 crc kubenswrapper[4732]: E0402 13:43:24.057951 4732 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.057960 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Apr 02 13:43:24 crc kubenswrapper[4732]: E0402 13:43:24.057975 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.057982 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Apr 02 13:43:24 crc kubenswrapper[4732]: E0402 13:43:24.057991 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.057999 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Apr 02 13:43:24 crc kubenswrapper[4732]: E0402 13:43:24.058009 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.058017 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Apr 02 13:43:24 crc kubenswrapper[4732]: E0402 13:43:24.058026 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.058033 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.058150 4732 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.058162 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.058176 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.058183 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.058192 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.058203 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.058212 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.058222 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.058231 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Apr 02 13:43:24 crc kubenswrapper[4732]: E0402 13:43:24.058373 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.058384 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Apr 02 13:43:24 crc kubenswrapper[4732]: E0402 13:43:24.058393 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.058401 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.058518 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.078936 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.079435 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.079481 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.079508 4732 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.079536 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.079565 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.079582 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.079599 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.079636 4732 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.180944 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.181012 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.181037 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.181062 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.181092 4732 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.181123 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.181138 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.181157 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.181215 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.181257 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.181245 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.181231 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.181298 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.181304 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.181326 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.181375 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.227059 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/4.log" Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.228483 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.229488 4732 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="19f830282d549fbab00b4a19b6698b2abc656f62287b73347cd4d0698fd5de90" exitCode=0 Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.229529 4732 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7b46da8b2cda4252f897adfca92275c92924a998d5b1ad57e6897ddea29b4571" exitCode=0 Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.229546 4732 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e000359211728ef1d437757b0246d85c224dae170131ba8bd221fa84dafa026c" exitCode=0 Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.229564 4732 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="bf6191436462297532ffdc93e43468bbe37e850115ddcbcd0265c07359ac845a" exitCode=2 Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.229584 4732 scope.go:117] "RemoveContainer" containerID="5014ef20daa51a5633b1e4a5a9361ef76afd89e05243d89217cce9ac1fcc155a" Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.232594 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-10-crc" event={"ID":"dfc68032-3ed5-47ad-adde-2d764f4b8868","Type":"ContainerStarted","Data":"7cf71f04de5c1900fecdf2c19f88b6ea8e2ef8a049a8452f8d9267e1dad19bae"} Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.232697 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-10-crc" event={"ID":"dfc68032-3ed5-47ad-adde-2d764f4b8868","Type":"ContainerStarted","Data":"cf3d0f07e31bdab7797c39f77961e62108c77607d6c752e146821dce0a4026a1"} Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.233757 4732 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.234260 4732 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.234904 4732 status_manager.go:851] "Failed to get status for pod" podUID="dfc68032-3ed5-47ad-adde-2d764f4b8868" pod="openshift-kube-apiserver/revision-pruner-10-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/revision-pruner-10-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.377413 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.410198 4732 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:6443/readyz\": dial tcp 192.168.126.11:6443: connect: connection refused" start-of-body= Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.410309 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/readyz\": dial tcp 192.168.126.11:6443: connect: connection refused" Apr 02 13:43:24 crc kubenswrapper[4732]: W0402 13:43:24.475440 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-c40290bfa2a2b570d4432249caaf8dad7c43a759ab8f11b1aaf953309ca455a7 WatchSource:0}: Error finding container c40290bfa2a2b570d4432249caaf8dad7c43a759ab8f11b1aaf953309ca455a7: Status 404 returned error can't find the container with id c40290bfa2a2b570d4432249caaf8dad7c43a759ab8f11b1aaf953309ca455a7 Apr 02 13:43:24 crc kubenswrapper[4732]: E0402 13:43:24.478895 4732 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.138:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.18a28e107571bed5 
openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:43:24.478406357 +0000 UTC m=+361.382813900,LastTimestamp:2026-04-02 13:43:24.478406357 +0000 UTC m=+361.382813900,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.683433 4732 status_manager.go:851] "Failed to get status for pod" podUID="dfc68032-3ed5-47ad-adde-2d764f4b8868" pod="openshift-kube-apiserver/revision-pruner-10-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/revision-pruner-10-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.684231 4732 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.684509 4732 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 
38.102.83.138:6443: connect: connection refused" Apr 02 13:43:24 crc kubenswrapper[4732]: I0402 13:43:24.692901 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72981c60-e9a1-4e25-9b64-7493d6fdaab6" path="/var/lib/kubelet/pods/72981c60-e9a1-4e25-9b64-7493d6fdaab6/volumes" Apr 02 13:43:25 crc kubenswrapper[4732]: I0402 13:43:25.242089 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Apr 02 13:43:25 crc kubenswrapper[4732]: I0402 13:43:25.244003 4732 generic.go:334] "Generic (PLEG): container finished" podID="dfc68032-3ed5-47ad-adde-2d764f4b8868" containerID="7cf71f04de5c1900fecdf2c19f88b6ea8e2ef8a049a8452f8d9267e1dad19bae" exitCode=0 Apr 02 13:43:25 crc kubenswrapper[4732]: I0402 13:43:25.244045 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-10-crc" event={"ID":"dfc68032-3ed5-47ad-adde-2d764f4b8868","Type":"ContainerDied","Data":"7cf71f04de5c1900fecdf2c19f88b6ea8e2ef8a049a8452f8d9267e1dad19bae"} Apr 02 13:43:25 crc kubenswrapper[4732]: I0402 13:43:25.244520 4732 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:25 crc kubenswrapper[4732]: I0402 13:43:25.244815 4732 status_manager.go:851] "Failed to get status for pod" podUID="dfc68032-3ed5-47ad-adde-2d764f4b8868" pod="openshift-kube-apiserver/revision-pruner-10-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/revision-pruner-10-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:25 crc kubenswrapper[4732]: I0402 13:43:25.245823 4732 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"96840158fe7da1d384adaa2477702b2ebc8af51caba2dfb7559ac028fbfe2523"} Apr 02 13:43:25 crc kubenswrapper[4732]: I0402 13:43:25.245850 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"c40290bfa2a2b570d4432249caaf8dad7c43a759ab8f11b1aaf953309ca455a7"} Apr 02 13:43:25 crc kubenswrapper[4732]: I0402 13:43:25.246208 4732 status_manager.go:851] "Failed to get status for pod" podUID="dfc68032-3ed5-47ad-adde-2d764f4b8868" pod="openshift-kube-apiserver/revision-pruner-10-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/revision-pruner-10-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:25 crc kubenswrapper[4732]: I0402 13:43:25.246487 4732 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:25 crc kubenswrapper[4732]: I0402 13:43:25.247713 4732 generic.go:334] "Generic (PLEG): container finished" podID="85581866-f172-4f4e-8805-55c1f175201d" containerID="e6374bf349adff5413cc0d8aed38175d79c4c9b0979472cafc760c99b1204cf2" exitCode=0 Apr 02 13:43:25 crc kubenswrapper[4732]: I0402 13:43:25.247745 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"85581866-f172-4f4e-8805-55c1f175201d","Type":"ContainerDied","Data":"e6374bf349adff5413cc0d8aed38175d79c4c9b0979472cafc760c99b1204cf2"} Apr 02 13:43:25 
crc kubenswrapper[4732]: I0402 13:43:25.248198 4732 status_manager.go:851] "Failed to get status for pod" podUID="dfc68032-3ed5-47ad-adde-2d764f4b8868" pod="openshift-kube-apiserver/revision-pruner-10-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/revision-pruner-10-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:25 crc kubenswrapper[4732]: I0402 13:43:25.248472 4732 status_manager.go:851] "Failed to get status for pod" podUID="85581866-f172-4f4e-8805-55c1f175201d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:25 crc kubenswrapper[4732]: I0402 13:43:25.248911 4732 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:25 crc kubenswrapper[4732]: E0402 13:43:25.780947 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e68473d7cc0d08293a0f48d1224a39e782dd7b899994ad1d070d72a3c06dfc01" cmd=["/bin/bash","-c","test -f /ready/ready"] Apr 02 13:43:25 crc kubenswrapper[4732]: E0402 13:43:25.783738 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e68473d7cc0d08293a0f48d1224a39e782dd7b899994ad1d070d72a3c06dfc01" cmd=["/bin/bash","-c","test -f /ready/ready"] Apr 02 13:43:25 crc kubenswrapper[4732]: 
E0402 13:43:25.785568 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e68473d7cc0d08293a0f48d1224a39e782dd7b899994ad1d070d72a3c06dfc01" cmd=["/bin/bash","-c","test -f /ready/ready"] Apr 02 13:43:25 crc kubenswrapper[4732]: E0402 13:43:25.785603 4732 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-q7cfc" podUID="bd27c473-5209-45b9-a8da-b285f03920f8" containerName="kube-multus-additional-cni-plugins" Apr 02 13:43:26 crc kubenswrapper[4732]: I0402 13:43:26.467500 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Apr 02 13:43:26 crc kubenswrapper[4732]: I0402 13:43:26.469079 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 02 13:43:26 crc kubenswrapper[4732]: I0402 13:43:26.470548 4732 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:26 crc kubenswrapper[4732]: I0402 13:43:26.471046 4732 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:26 crc kubenswrapper[4732]: I0402 13:43:26.471404 4732 status_manager.go:851] "Failed to get status for pod" podUID="dfc68032-3ed5-47ad-adde-2d764f4b8868" pod="openshift-kube-apiserver/revision-pruner-10-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/revision-pruner-10-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:26 crc kubenswrapper[4732]: I0402 13:43:26.471612 4732 status_manager.go:851] "Failed to get status for pod" podUID="85581866-f172-4f4e-8805-55c1f175201d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:26 crc kubenswrapper[4732]: I0402 13:43:26.520275 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Apr 02 
13:43:26 crc kubenswrapper[4732]: I0402 13:43:26.520360 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Apr 02 13:43:26 crc kubenswrapper[4732]: I0402 13:43:26.520401 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Apr 02 13:43:26 crc kubenswrapper[4732]: I0402 13:43:26.520700 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 13:43:26 crc kubenswrapper[4732]: I0402 13:43:26.520729 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 13:43:26 crc kubenswrapper[4732]: I0402 13:43:26.520744 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 13:43:26 crc kubenswrapper[4732]: I0402 13:43:26.583800 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-10-crc" Apr 02 13:43:26 crc kubenswrapper[4732]: I0402 13:43:26.584348 4732 status_manager.go:851] "Failed to get status for pod" podUID="dfc68032-3ed5-47ad-adde-2d764f4b8868" pod="openshift-kube-apiserver/revision-pruner-10-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/revision-pruner-10-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:26 crc kubenswrapper[4732]: I0402 13:43:26.584693 4732 status_manager.go:851] "Failed to get status for pod" podUID="85581866-f172-4f4e-8805-55c1f175201d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:26 crc kubenswrapper[4732]: I0402 13:43:26.585101 4732 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:26 crc kubenswrapper[4732]: I0402 13:43:26.585331 4732 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:26 crc kubenswrapper[4732]: I0402 13:43:26.621001 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dfc68032-3ed5-47ad-adde-2d764f4b8868-kube-api-access\") pod \"dfc68032-3ed5-47ad-adde-2d764f4b8868\" (UID: \"dfc68032-3ed5-47ad-adde-2d764f4b8868\") " Apr 02 13:43:26 crc kubenswrapper[4732]: I0402 13:43:26.621391 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dfc68032-3ed5-47ad-adde-2d764f4b8868-kubelet-dir\") pod \"dfc68032-3ed5-47ad-adde-2d764f4b8868\" (UID: \"dfc68032-3ed5-47ad-adde-2d764f4b8868\") " Apr 02 13:43:26 crc kubenswrapper[4732]: I0402 13:43:26.621509 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dfc68032-3ed5-47ad-adde-2d764f4b8868-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "dfc68032-3ed5-47ad-adde-2d764f4b8868" (UID: "dfc68032-3ed5-47ad-adde-2d764f4b8868"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 13:43:26 crc kubenswrapper[4732]: I0402 13:43:26.621737 4732 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Apr 02 13:43:26 crc kubenswrapper[4732]: I0402 13:43:26.621807 4732 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dfc68032-3ed5-47ad-adde-2d764f4b8868-kubelet-dir\") on node \"crc\" DevicePath \"\"" Apr 02 13:43:26 crc kubenswrapper[4732]: I0402 13:43:26.621880 4732 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Apr 02 13:43:26 crc kubenswrapper[4732]: I0402 13:43:26.621946 4732 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node 
\"crc\" DevicePath \"\"" Apr 02 13:43:26 crc kubenswrapper[4732]: I0402 13:43:26.628872 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfc68032-3ed5-47ad-adde-2d764f4b8868-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "dfc68032-3ed5-47ad-adde-2d764f4b8868" (UID: "dfc68032-3ed5-47ad-adde-2d764f4b8868"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:43:26 crc kubenswrapper[4732]: I0402 13:43:26.638423 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Apr 02 13:43:26 crc kubenswrapper[4732]: I0402 13:43:26.639453 4732 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:26 crc kubenswrapper[4732]: I0402 13:43:26.639893 4732 status_manager.go:851] "Failed to get status for pod" podUID="dfc68032-3ed5-47ad-adde-2d764f4b8868" pod="openshift-kube-apiserver/revision-pruner-10-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/revision-pruner-10-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:26 crc kubenswrapper[4732]: I0402 13:43:26.640300 4732 status_manager.go:851] "Failed to get status for pod" podUID="85581866-f172-4f4e-8805-55c1f175201d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:26 crc kubenswrapper[4732]: I0402 13:43:26.640682 4732 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:26 crc kubenswrapper[4732]: I0402 13:43:26.687176 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Apr 02 13:43:26 crc kubenswrapper[4732]: I0402 13:43:26.722781 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/85581866-f172-4f4e-8805-55c1f175201d-kube-api-access\") pod \"85581866-f172-4f4e-8805-55c1f175201d\" (UID: \"85581866-f172-4f4e-8805-55c1f175201d\") " Apr 02 13:43:26 crc kubenswrapper[4732]: I0402 13:43:26.722816 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/85581866-f172-4f4e-8805-55c1f175201d-kubelet-dir\") pod \"85581866-f172-4f4e-8805-55c1f175201d\" (UID: \"85581866-f172-4f4e-8805-55c1f175201d\") " Apr 02 13:43:26 crc kubenswrapper[4732]: I0402 13:43:26.722856 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/85581866-f172-4f4e-8805-55c1f175201d-var-lock\") pod \"85581866-f172-4f4e-8805-55c1f175201d\" (UID: \"85581866-f172-4f4e-8805-55c1f175201d\") " Apr 02 13:43:26 crc kubenswrapper[4732]: I0402 13:43:26.723188 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dfc68032-3ed5-47ad-adde-2d764f4b8868-kube-api-access\") on node \"crc\" DevicePath \"\"" Apr 02 13:43:26 crc kubenswrapper[4732]: I0402 13:43:26.723219 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/85581866-f172-4f4e-8805-55c1f175201d-var-lock" (OuterVolumeSpecName: "var-lock") pod "85581866-f172-4f4e-8805-55c1f175201d" (UID: "85581866-f172-4f4e-8805-55c1f175201d"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 13:43:26 crc kubenswrapper[4732]: I0402 13:43:26.723246 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/85581866-f172-4f4e-8805-55c1f175201d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "85581866-f172-4f4e-8805-55c1f175201d" (UID: "85581866-f172-4f4e-8805-55c1f175201d"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 13:43:26 crc kubenswrapper[4732]: I0402 13:43:26.725326 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85581866-f172-4f4e-8805-55c1f175201d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "85581866-f172-4f4e-8805-55c1f175201d" (UID: "85581866-f172-4f4e-8805-55c1f175201d"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:43:26 crc kubenswrapper[4732]: I0402 13:43:26.824151 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/85581866-f172-4f4e-8805-55c1f175201d-kube-api-access\") on node \"crc\" DevicePath \"\"" Apr 02 13:43:26 crc kubenswrapper[4732]: I0402 13:43:26.824186 4732 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/85581866-f172-4f4e-8805-55c1f175201d-kubelet-dir\") on node \"crc\" DevicePath \"\"" Apr 02 13:43:26 crc kubenswrapper[4732]: I0402 13:43:26.824198 4732 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/85581866-f172-4f4e-8805-55c1f175201d-var-lock\") on node \"crc\" DevicePath \"\"" Apr 02 13:43:27 crc kubenswrapper[4732]: I0402 13:43:27.261511 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Apr 02 13:43:27 crc kubenswrapper[4732]: I0402 13:43:27.264847 4732 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="bf50344e79adec991160632c244c40543c468db050f176949b5840a16d1777a6" exitCode=0 Apr 02 13:43:27 crc kubenswrapper[4732]: I0402 13:43:27.264948 4732 scope.go:117] "RemoveContainer" containerID="19f830282d549fbab00b4a19b6698b2abc656f62287b73347cd4d0698fd5de90" Apr 02 13:43:27 crc kubenswrapper[4732]: I0402 13:43:27.264990 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 02 13:43:27 crc kubenswrapper[4732]: I0402 13:43:27.265770 4732 status_manager.go:851] "Failed to get status for pod" podUID="dfc68032-3ed5-47ad-adde-2d764f4b8868" pod="openshift-kube-apiserver/revision-pruner-10-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/revision-pruner-10-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:27 crc kubenswrapper[4732]: I0402 13:43:27.266072 4732 status_manager.go:851] "Failed to get status for pod" podUID="85581866-f172-4f4e-8805-55c1f175201d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:27 crc kubenswrapper[4732]: I0402 13:43:27.266339 4732 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:27 crc kubenswrapper[4732]: I0402 13:43:27.266647 4732 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:27 crc kubenswrapper[4732]: I0402 13:43:27.267802 4732 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:27 crc kubenswrapper[4732]: I0402 13:43:27.268046 4732 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:27 crc kubenswrapper[4732]: I0402 13:43:27.268213 4732 status_manager.go:851] "Failed to get status for pod" podUID="dfc68032-3ed5-47ad-adde-2d764f4b8868" pod="openshift-kube-apiserver/revision-pruner-10-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/revision-pruner-10-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:27 crc kubenswrapper[4732]: I0402 13:43:27.268229 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-10-crc" event={"ID":"dfc68032-3ed5-47ad-adde-2d764f4b8868","Type":"ContainerDied","Data":"cf3d0f07e31bdab7797c39f77961e62108c77607d6c752e146821dce0a4026a1"} Apr 02 13:43:27 crc kubenswrapper[4732]: I0402 13:43:27.268253 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf3d0f07e31bdab7797c39f77961e62108c77607d6c752e146821dce0a4026a1" Apr 02 13:43:27 crc kubenswrapper[4732]: I0402 13:43:27.268234 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-10-crc" Apr 02 13:43:27 crc kubenswrapper[4732]: I0402 13:43:27.268761 4732 status_manager.go:851] "Failed to get status for pod" podUID="85581866-f172-4f4e-8805-55c1f175201d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:27 crc kubenswrapper[4732]: I0402 13:43:27.270280 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-11-crc_536912b9-ad03-42a4-bce9-754227ecbf82/installer/0.log" Apr 02 13:43:27 crc kubenswrapper[4732]: I0402 13:43:27.270328 4732 generic.go:334] "Generic (PLEG): container finished" podID="536912b9-ad03-42a4-bce9-754227ecbf82" containerID="34982d3e5cd4a3198e55e7a07fba2bad10f53ca737a09a58e0bb8620786d7555" exitCode=1 Apr 02 13:43:27 crc kubenswrapper[4732]: I0402 13:43:27.270382 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-11-crc" event={"ID":"536912b9-ad03-42a4-bce9-754227ecbf82","Type":"ContainerDied","Data":"34982d3e5cd4a3198e55e7a07fba2bad10f53ca737a09a58e0bb8620786d7555"} Apr 02 13:43:27 crc kubenswrapper[4732]: I0402 13:43:27.270783 4732 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:27 crc kubenswrapper[4732]: I0402 13:43:27.271198 4732 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:27 crc kubenswrapper[4732]: I0402 13:43:27.271627 4732 status_manager.go:851] "Failed to get status for pod" podUID="536912b9-ad03-42a4-bce9-754227ecbf82" pod="openshift-kube-controller-manager/installer-11-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-11-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:27 crc kubenswrapper[4732]: I0402 13:43:27.271860 4732 status_manager.go:851] "Failed to get status for pod" podUID="dfc68032-3ed5-47ad-adde-2d764f4b8868" pod="openshift-kube-apiserver/revision-pruner-10-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/revision-pruner-10-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:27 crc kubenswrapper[4732]: I0402 13:43:27.271904 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"85581866-f172-4f4e-8805-55c1f175201d","Type":"ContainerDied","Data":"ff321f3dcfa693e047dd8803f74b4a338b473cfba76c14e20b65f450f95ac865"} Apr 02 13:43:27 crc kubenswrapper[4732]: I0402 13:43:27.271922 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff321f3dcfa693e047dd8803f74b4a338b473cfba76c14e20b65f450f95ac865" Apr 02 13:43:27 crc kubenswrapper[4732]: I0402 13:43:27.271959 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Apr 02 13:43:27 crc kubenswrapper[4732]: I0402 13:43:27.272095 4732 status_manager.go:851] "Failed to get status for pod" podUID="85581866-f172-4f4e-8805-55c1f175201d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.138:6443: connect: connection refused"
Apr 02 13:43:27 crc kubenswrapper[4732]: I0402 13:43:27.275522 4732 status_manager.go:851] "Failed to get status for pod" podUID="dfc68032-3ed5-47ad-adde-2d764f4b8868" pod="openshift-kube-apiserver/revision-pruner-10-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/revision-pruner-10-crc\": dial tcp 38.102.83.138:6443: connect: connection refused"
Apr 02 13:43:27 crc kubenswrapper[4732]: I0402 13:43:27.275742 4732 status_manager.go:851] "Failed to get status for pod" podUID="85581866-f172-4f4e-8805-55c1f175201d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.138:6443: connect: connection refused"
Apr 02 13:43:27 crc kubenswrapper[4732]: I0402 13:43:27.275982 4732 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.138:6443: connect: connection refused"
Apr 02 13:43:27 crc kubenswrapper[4732]: I0402 13:43:27.276189 4732 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.138:6443: connect: connection refused"
Apr 02 13:43:27 crc kubenswrapper[4732]: I0402 13:43:27.276360 4732 status_manager.go:851] "Failed to get status for pod" podUID="536912b9-ad03-42a4-bce9-754227ecbf82" pod="openshift-kube-controller-manager/installer-11-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-11-crc\": dial tcp 38.102.83.138:6443: connect: connection refused"
Apr 02 13:43:27 crc kubenswrapper[4732]: I0402 13:43:27.284875 4732 scope.go:117] "RemoveContainer" containerID="7b46da8b2cda4252f897adfca92275c92924a998d5b1ad57e6897ddea29b4571"
Apr 02 13:43:27 crc kubenswrapper[4732]: I0402 13:43:27.287129 4732 status_manager.go:851] "Failed to get status for pod" podUID="536912b9-ad03-42a4-bce9-754227ecbf82" pod="openshift-kube-controller-manager/installer-11-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-11-crc\": dial tcp 38.102.83.138:6443: connect: connection refused"
Apr 02 13:43:27 crc kubenswrapper[4732]: I0402 13:43:27.287353 4732 status_manager.go:851] "Failed to get status for pod" podUID="dfc68032-3ed5-47ad-adde-2d764f4b8868" pod="openshift-kube-apiserver/revision-pruner-10-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/revision-pruner-10-crc\": dial tcp 38.102.83.138:6443: connect: connection refused"
Apr 02 13:43:27 crc kubenswrapper[4732]: I0402 13:43:27.287794 4732 status_manager.go:851] "Failed to get status for pod" podUID="85581866-f172-4f4e-8805-55c1f175201d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.138:6443: connect: connection refused"
Apr 02 13:43:27 crc kubenswrapper[4732]: I0402 13:43:27.288201 4732 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.138:6443: connect: connection refused"
Apr 02 13:43:27 crc kubenswrapper[4732]: I0402 13:43:27.288565 4732 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.138:6443: connect: connection refused"
Apr 02 13:43:27 crc kubenswrapper[4732]: I0402 13:43:27.302777 4732 scope.go:117] "RemoveContainer" containerID="e000359211728ef1d437757b0246d85c224dae170131ba8bd221fa84dafa026c"
Apr 02 13:43:27 crc kubenswrapper[4732]: I0402 13:43:27.315670 4732 scope.go:117] "RemoveContainer" containerID="bf6191436462297532ffdc93e43468bbe37e850115ddcbcd0265c07359ac845a"
Apr 02 13:43:27 crc kubenswrapper[4732]: I0402 13:43:27.328184 4732 scope.go:117] "RemoveContainer" containerID="bf50344e79adec991160632c244c40543c468db050f176949b5840a16d1777a6"
Apr 02 13:43:27 crc kubenswrapper[4732]: I0402 13:43:27.351905 4732 scope.go:117] "RemoveContainer" containerID="3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02"
Apr 02 13:43:27 crc kubenswrapper[4732]: I0402 13:43:27.381563 4732 scope.go:117] "RemoveContainer" containerID="19f830282d549fbab00b4a19b6698b2abc656f62287b73347cd4d0698fd5de90"
Apr 02 13:43:27 crc kubenswrapper[4732]: E0402 13:43:27.382382 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19f830282d549fbab00b4a19b6698b2abc656f62287b73347cd4d0698fd5de90\": container with ID starting with 19f830282d549fbab00b4a19b6698b2abc656f62287b73347cd4d0698fd5de90 not found: ID does not exist" containerID="19f830282d549fbab00b4a19b6698b2abc656f62287b73347cd4d0698fd5de90"
Apr 02 13:43:27 crc kubenswrapper[4732]: I0402 13:43:27.382425 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19f830282d549fbab00b4a19b6698b2abc656f62287b73347cd4d0698fd5de90"} err="failed to get container status \"19f830282d549fbab00b4a19b6698b2abc656f62287b73347cd4d0698fd5de90\": rpc error: code = NotFound desc = could not find container \"19f830282d549fbab00b4a19b6698b2abc656f62287b73347cd4d0698fd5de90\": container with ID starting with 19f830282d549fbab00b4a19b6698b2abc656f62287b73347cd4d0698fd5de90 not found: ID does not exist"
Apr 02 13:43:27 crc kubenswrapper[4732]: I0402 13:43:27.382451 4732 scope.go:117] "RemoveContainer" containerID="7b46da8b2cda4252f897adfca92275c92924a998d5b1ad57e6897ddea29b4571"
Apr 02 13:43:27 crc kubenswrapper[4732]: E0402 13:43:27.382951 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b46da8b2cda4252f897adfca92275c92924a998d5b1ad57e6897ddea29b4571\": container with ID starting with 7b46da8b2cda4252f897adfca92275c92924a998d5b1ad57e6897ddea29b4571 not found: ID does not exist" containerID="7b46da8b2cda4252f897adfca92275c92924a998d5b1ad57e6897ddea29b4571"
Apr 02 13:43:27 crc kubenswrapper[4732]: I0402 13:43:27.382986 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b46da8b2cda4252f897adfca92275c92924a998d5b1ad57e6897ddea29b4571"} err="failed to get container status \"7b46da8b2cda4252f897adfca92275c92924a998d5b1ad57e6897ddea29b4571\": rpc error: code = NotFound desc = could not find container \"7b46da8b2cda4252f897adfca92275c92924a998d5b1ad57e6897ddea29b4571\": container with ID starting with 7b46da8b2cda4252f897adfca92275c92924a998d5b1ad57e6897ddea29b4571 not found: ID does not exist"
Apr 02 13:43:27 crc kubenswrapper[4732]: I0402 13:43:27.383010 4732 scope.go:117] "RemoveContainer" containerID="e000359211728ef1d437757b0246d85c224dae170131ba8bd221fa84dafa026c"
Apr 02 13:43:27 crc kubenswrapper[4732]: E0402 13:43:27.383253 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e000359211728ef1d437757b0246d85c224dae170131ba8bd221fa84dafa026c\": container with ID starting with e000359211728ef1d437757b0246d85c224dae170131ba8bd221fa84dafa026c not found: ID does not exist" containerID="e000359211728ef1d437757b0246d85c224dae170131ba8bd221fa84dafa026c"
Apr 02 13:43:27 crc kubenswrapper[4732]: I0402 13:43:27.383272 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e000359211728ef1d437757b0246d85c224dae170131ba8bd221fa84dafa026c"} err="failed to get container status \"e000359211728ef1d437757b0246d85c224dae170131ba8bd221fa84dafa026c\": rpc error: code = NotFound desc = could not find container \"e000359211728ef1d437757b0246d85c224dae170131ba8bd221fa84dafa026c\": container with ID starting with e000359211728ef1d437757b0246d85c224dae170131ba8bd221fa84dafa026c not found: ID does not exist"
Apr 02 13:43:27 crc kubenswrapper[4732]: I0402 13:43:27.383314 4732 scope.go:117] "RemoveContainer" containerID="bf6191436462297532ffdc93e43468bbe37e850115ddcbcd0265c07359ac845a"
Apr 02 13:43:27 crc kubenswrapper[4732]: E0402 13:43:27.383590 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf6191436462297532ffdc93e43468bbe37e850115ddcbcd0265c07359ac845a\": container with ID starting with bf6191436462297532ffdc93e43468bbe37e850115ddcbcd0265c07359ac845a not found: ID does not exist" containerID="bf6191436462297532ffdc93e43468bbe37e850115ddcbcd0265c07359ac845a"
Apr 02 13:43:27 crc kubenswrapper[4732]: I0402 13:43:27.383626 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf6191436462297532ffdc93e43468bbe37e850115ddcbcd0265c07359ac845a"} err="failed to get container status \"bf6191436462297532ffdc93e43468bbe37e850115ddcbcd0265c07359ac845a\": rpc error: code = NotFound desc = could not find container \"bf6191436462297532ffdc93e43468bbe37e850115ddcbcd0265c07359ac845a\": container with ID starting with bf6191436462297532ffdc93e43468bbe37e850115ddcbcd0265c07359ac845a not found: ID does not exist"
Apr 02 13:43:27 crc kubenswrapper[4732]: I0402 13:43:27.383666 4732 scope.go:117] "RemoveContainer" containerID="bf50344e79adec991160632c244c40543c468db050f176949b5840a16d1777a6"
Apr 02 13:43:27 crc kubenswrapper[4732]: E0402 13:43:27.384154 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf50344e79adec991160632c244c40543c468db050f176949b5840a16d1777a6\": container with ID starting with bf50344e79adec991160632c244c40543c468db050f176949b5840a16d1777a6 not found: ID does not exist" containerID="bf50344e79adec991160632c244c40543c468db050f176949b5840a16d1777a6"
Apr 02 13:43:27 crc kubenswrapper[4732]: I0402 13:43:27.384180 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf50344e79adec991160632c244c40543c468db050f176949b5840a16d1777a6"} err="failed to get container status \"bf50344e79adec991160632c244c40543c468db050f176949b5840a16d1777a6\": rpc error: code = NotFound desc = could not find container \"bf50344e79adec991160632c244c40543c468db050f176949b5840a16d1777a6\": container with ID starting with bf50344e79adec991160632c244c40543c468db050f176949b5840a16d1777a6 not found: ID does not exist"
Apr 02 13:43:27 crc kubenswrapper[4732]: I0402 13:43:27.384196 4732 scope.go:117] "RemoveContainer" containerID="3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02"
Apr 02 13:43:27 crc kubenswrapper[4732]: E0402 13:43:27.384749 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\": container with ID starting with 3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02 not found: ID does not exist" containerID="3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02"
Apr 02 13:43:27 crc kubenswrapper[4732]: I0402 13:43:27.384893 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02"} err="failed to get container status \"3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\": rpc error: code = NotFound desc = could not find container \"3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02\": container with ID starting with 3da324ee44a6b4198036432fce7100927cc4aa30f168f7aff1efafb3e46fbe02 not found: ID does not exist"
Apr 02 13:43:28 crc kubenswrapper[4732]: I0402 13:43:28.679928 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Apr 02 13:43:28 crc kubenswrapper[4732]: I0402 13:43:28.681493 4732 status_manager.go:851] "Failed to get status for pod" podUID="dfc68032-3ed5-47ad-adde-2d764f4b8868" pod="openshift-kube-apiserver/revision-pruner-10-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/revision-pruner-10-crc\": dial tcp 38.102.83.138:6443: connect: connection refused"
Apr 02 13:43:28 crc kubenswrapper[4732]: I0402 13:43:28.681991 4732 status_manager.go:851] "Failed to get status for pod" podUID="85581866-f172-4f4e-8805-55c1f175201d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.138:6443: connect: connection refused"
Apr 02 13:43:28 crc kubenswrapper[4732]: I0402 13:43:28.682520 4732 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.138:6443: connect: connection refused"
Apr 02 13:43:28 crc kubenswrapper[4732]: I0402 13:43:28.682987 4732 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.138:6443: connect: connection refused"
Apr 02 13:43:28 crc kubenswrapper[4732]: I0402 13:43:28.683302 4732 status_manager.go:851] "Failed to get status for pod" podUID="536912b9-ad03-42a4-bce9-754227ecbf82" pod="openshift-kube-controller-manager/installer-11-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-11-crc\": dial tcp 38.102.83.138:6443: connect: connection refused"
Apr 02 13:43:28 crc kubenswrapper[4732]: I0402 13:43:28.705191 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-11-crc_536912b9-ad03-42a4-bce9-754227ecbf82/installer/0.log"
Apr 02 13:43:28 crc kubenswrapper[4732]: I0402 13:43:28.705287 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-11-crc"
Apr 02 13:43:28 crc kubenswrapper[4732]: I0402 13:43:28.706049 4732 status_manager.go:851] "Failed to get status for pod" podUID="536912b9-ad03-42a4-bce9-754227ecbf82" pod="openshift-kube-controller-manager/installer-11-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-11-crc\": dial tcp 38.102.83.138:6443: connect: connection refused"
Apr 02 13:43:28 crc kubenswrapper[4732]: I0402 13:43:28.706537 4732 status_manager.go:851] "Failed to get status for pod" podUID="dfc68032-3ed5-47ad-adde-2d764f4b8868" pod="openshift-kube-apiserver/revision-pruner-10-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/revision-pruner-10-crc\": dial tcp 38.102.83.138:6443: connect: connection refused"
Apr 02 13:43:28 crc kubenswrapper[4732]: I0402 13:43:28.706986 4732 status_manager.go:851] "Failed to get status for pod" podUID="85581866-f172-4f4e-8805-55c1f175201d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.138:6443: connect: connection refused"
Apr 02 13:43:28 crc kubenswrapper[4732]: I0402 13:43:28.707467 4732 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.138:6443: connect: connection refused"
Apr 02 13:43:28 crc kubenswrapper[4732]: I0402 13:43:28.719278 4732 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="e0654c6f-b103-4146-b591-b2acee4900a9"
Apr 02 13:43:28 crc kubenswrapper[4732]: I0402 13:43:28.719311 4732 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="e0654c6f-b103-4146-b591-b2acee4900a9"
Apr 02 13:43:28 crc kubenswrapper[4732]: E0402 13:43:28.719795 4732 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Apr 02 13:43:28 crc kubenswrapper[4732]: I0402 13:43:28.720235 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Apr 02 13:43:28 crc kubenswrapper[4732]: W0402 13:43:28.739444 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod815516d0756bb9282f4d0a28cef72670.slice/crio-57fb8da83276f6728323ec4bbba281e746308e772d70069f59925d47630e91dc WatchSource:0}: Error finding container 57fb8da83276f6728323ec4bbba281e746308e772d70069f59925d47630e91dc: Status 404 returned error can't find the container with id 57fb8da83276f6728323ec4bbba281e746308e772d70069f59925d47630e91dc
Apr 02 13:43:28 crc kubenswrapper[4732]: I0402 13:43:28.754009 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/536912b9-ad03-42a4-bce9-754227ecbf82-var-lock\") pod \"536912b9-ad03-42a4-bce9-754227ecbf82\" (UID: \"536912b9-ad03-42a4-bce9-754227ecbf82\") "
Apr 02 13:43:28 crc kubenswrapper[4732]: I0402 13:43:28.754103 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/536912b9-ad03-42a4-bce9-754227ecbf82-kube-api-access\") pod \"536912b9-ad03-42a4-bce9-754227ecbf82\" (UID: \"536912b9-ad03-42a4-bce9-754227ecbf82\") "
Apr 02 13:43:28 crc kubenswrapper[4732]: I0402 13:43:28.754156 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/536912b9-ad03-42a4-bce9-754227ecbf82-var-lock" (OuterVolumeSpecName: "var-lock") pod "536912b9-ad03-42a4-bce9-754227ecbf82" (UID: "536912b9-ad03-42a4-bce9-754227ecbf82"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Apr 02 13:43:28 crc kubenswrapper[4732]: I0402 13:43:28.754210 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/536912b9-ad03-42a4-bce9-754227ecbf82-kubelet-dir\") pod \"536912b9-ad03-42a4-bce9-754227ecbf82\" (UID: \"536912b9-ad03-42a4-bce9-754227ecbf82\") "
Apr 02 13:43:28 crc kubenswrapper[4732]: I0402 13:43:28.754408 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/536912b9-ad03-42a4-bce9-754227ecbf82-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "536912b9-ad03-42a4-bce9-754227ecbf82" (UID: "536912b9-ad03-42a4-bce9-754227ecbf82"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Apr 02 13:43:28 crc kubenswrapper[4732]: I0402 13:43:28.754492 4732 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/536912b9-ad03-42a4-bce9-754227ecbf82-var-lock\") on node \"crc\" DevicePath \"\""
Apr 02 13:43:28 crc kubenswrapper[4732]: I0402 13:43:28.760894 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/536912b9-ad03-42a4-bce9-754227ecbf82-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "536912b9-ad03-42a4-bce9-754227ecbf82" (UID: "536912b9-ad03-42a4-bce9-754227ecbf82"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 13:43:28 crc kubenswrapper[4732]: I0402 13:43:28.856149 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/536912b9-ad03-42a4-bce9-754227ecbf82-kube-api-access\") on node \"crc\" DevicePath \"\""
Apr 02 13:43:28 crc kubenswrapper[4732]: I0402 13:43:28.856181 4732 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/536912b9-ad03-42a4-bce9-754227ecbf82-kubelet-dir\") on node \"crc\" DevicePath \"\""
Apr 02 13:43:29 crc kubenswrapper[4732]: I0402 13:43:29.289344 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-11-crc_536912b9-ad03-42a4-bce9-754227ecbf82/installer/0.log"
Apr 02 13:43:29 crc kubenswrapper[4732]: I0402 13:43:29.289448 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-11-crc" event={"ID":"536912b9-ad03-42a4-bce9-754227ecbf82","Type":"ContainerDied","Data":"a2743df34f20dc28efec2cd4b35b2691ac15e3a08e2667d8a67c1cfa8b626837"}
Apr 02 13:43:29 crc kubenswrapper[4732]: I0402 13:43:29.289483 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2743df34f20dc28efec2cd4b35b2691ac15e3a08e2667d8a67c1cfa8b626837"
Apr 02 13:43:29 crc kubenswrapper[4732]: I0402 13:43:29.289490 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-11-crc"
Apr 02 13:43:29 crc kubenswrapper[4732]: I0402 13:43:29.291617 4732 generic.go:334] "Generic (PLEG): container finished" podID="815516d0756bb9282f4d0a28cef72670" containerID="f32f0b52a1cae674c9138498583a3bcfbdf8a79fb8948fd230a9af6aded4f582" exitCode=0
Apr 02 13:43:29 crc kubenswrapper[4732]: I0402 13:43:29.291678 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"815516d0756bb9282f4d0a28cef72670","Type":"ContainerDied","Data":"f32f0b52a1cae674c9138498583a3bcfbdf8a79fb8948fd230a9af6aded4f582"}
Apr 02 13:43:29 crc kubenswrapper[4732]: I0402 13:43:29.291734 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"815516d0756bb9282f4d0a28cef72670","Type":"ContainerStarted","Data":"57fb8da83276f6728323ec4bbba281e746308e772d70069f59925d47630e91dc"}
Apr 02 13:43:29 crc kubenswrapper[4732]: I0402 13:43:29.292068 4732 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="e0654c6f-b103-4146-b591-b2acee4900a9"
Apr 02 13:43:29 crc kubenswrapper[4732]: I0402 13:43:29.292089 4732 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="e0654c6f-b103-4146-b591-b2acee4900a9"
Apr 02 13:43:29 crc kubenswrapper[4732]: E0402 13:43:29.292592 4732 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Apr 02 13:43:29 crc kubenswrapper[4732]: I0402 13:43:29.292653 4732 status_manager.go:851] "Failed to get status for pod" podUID="dfc68032-3ed5-47ad-adde-2d764f4b8868" pod="openshift-kube-apiserver/revision-pruner-10-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/revision-pruner-10-crc\": dial tcp 38.102.83.138:6443: connect: connection refused"
Apr 02 13:43:29 crc kubenswrapper[4732]: I0402 13:43:29.293120 4732 status_manager.go:851] "Failed to get status for pod" podUID="85581866-f172-4f4e-8805-55c1f175201d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.138:6443: connect: connection refused"
Apr 02 13:43:29 crc kubenswrapper[4732]: I0402 13:43:29.293550 4732 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.138:6443: connect: connection refused"
Apr 02 13:43:29 crc kubenswrapper[4732]: I0402 13:43:29.293950 4732 status_manager.go:851] "Failed to get status for pod" podUID="536912b9-ad03-42a4-bce9-754227ecbf82" pod="openshift-kube-controller-manager/installer-11-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-11-crc\": dial tcp 38.102.83.138:6443: connect: connection refused"
Apr 02 13:43:29 crc kubenswrapper[4732]: I0402 13:43:29.308798 4732 status_manager.go:851] "Failed to get status for pod" podUID="dfc68032-3ed5-47ad-adde-2d764f4b8868" pod="openshift-kube-apiserver/revision-pruner-10-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/revision-pruner-10-crc\": dial tcp 38.102.83.138:6443: connect: connection refused"
Apr 02 13:43:29 crc kubenswrapper[4732]: I0402 13:43:29.309492 4732 status_manager.go:851] "Failed to get status for pod" podUID="85581866-f172-4f4e-8805-55c1f175201d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.138:6443: connect: connection refused"
Apr 02 13:43:29 crc kubenswrapper[4732]: I0402 13:43:29.309720 4732 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.138:6443: connect: connection refused"
Apr 02 13:43:29 crc kubenswrapper[4732]: I0402 13:43:29.310071 4732 status_manager.go:851] "Failed to get status for pod" podUID="536912b9-ad03-42a4-bce9-754227ecbf82" pod="openshift-kube-controller-manager/installer-11-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-11-crc\": dial tcp 38.102.83.138:6443: connect: connection refused"
Apr 02 13:43:29 crc kubenswrapper[4732]: E0402 13:43:29.815657 4732 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused"
Apr 02 13:43:29 crc kubenswrapper[4732]: E0402 13:43:29.815983 4732 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused"
Apr 02 13:43:29 crc kubenswrapper[4732]: E0402 13:43:29.816295 4732 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused"
Apr 02 13:43:29 crc kubenswrapper[4732]: E0402 13:43:29.816549 4732 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused"
Apr 02 13:43:29 crc kubenswrapper[4732]: E0402 13:43:29.816808 4732 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused"
Apr 02 13:43:29 crc kubenswrapper[4732]: I0402 13:43:29.816835 4732 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease"
Apr 02 13:43:29 crc kubenswrapper[4732]: E0402 13:43:29.817054 4732 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" interval="200ms"
Apr 02 13:43:30 crc kubenswrapper[4732]: E0402 13:43:30.018001 4732 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" interval="400ms"
Apr 02 13:43:30 crc kubenswrapper[4732]: I0402 13:43:30.302470 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"815516d0756bb9282f4d0a28cef72670","Type":"ContainerStarted","Data":"1e4dabd56041117649788f504b9c27c19def6d0699d4cb6d55cd4d16aed89a13"}
Apr 02 13:43:30 crc kubenswrapper[4732]: I0402 13:43:30.302893 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"815516d0756bb9282f4d0a28cef72670","Type":"ContainerStarted","Data":"ca9c3b985666916a34338e9f991c3bea2b33029e002346a256d5f70d34c5a009"}
Apr 02 13:43:30 crc kubenswrapper[4732]: I0402 13:43:30.302907 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"815516d0756bb9282f4d0a28cef72670","Type":"ContainerStarted","Data":"a06c8fb9cbb6779ec889350af216d97764fbc5b1426de87ec0d07df751528119"}
Apr 02 13:43:30 crc kubenswrapper[4732]: I0402 13:43:30.303164 4732 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="e0654c6f-b103-4146-b591-b2acee4900a9"
Apr 02 13:43:30 crc kubenswrapper[4732]: I0402 13:43:30.303194 4732 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="e0654c6f-b103-4146-b591-b2acee4900a9"
Apr 02 13:43:30 crc kubenswrapper[4732]: I0402 13:43:30.303221 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Apr 02 13:43:30 crc kubenswrapper[4732]: E0402 13:43:30.303593 4732 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Apr 02 13:43:30 crc kubenswrapper[4732]: I0402 13:43:30.303610 4732 status_manager.go:851] "Failed to get status for pod" podUID="536912b9-ad03-42a4-bce9-754227ecbf82" pod="openshift-kube-controller-manager/installer-11-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-11-crc\": dial tcp 38.102.83.138:6443: connect: connection refused"
Apr 02 13:43:30 crc kubenswrapper[4732]: I0402 13:43:30.303993 4732 status_manager.go:851] "Failed to get status for pod" podUID="dfc68032-3ed5-47ad-adde-2d764f4b8868" pod="openshift-kube-apiserver/revision-pruner-10-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/revision-pruner-10-crc\": dial tcp 38.102.83.138:6443: connect: connection refused"
Apr 02 13:43:30 crc kubenswrapper[4732]: I0402 13:43:30.304285 4732 status_manager.go:851] "Failed to get status for pod" podUID="85581866-f172-4f4e-8805-55c1f175201d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.138:6443: connect: connection refused"
Apr 02 13:43:30 crc kubenswrapper[4732]: I0402 13:43:30.304546 4732 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.138:6443: connect: connection refused"
Apr 02 13:43:30 crc kubenswrapper[4732]: E0402 13:43:30.418467 4732 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" interval="800ms"
Apr 02 13:43:30 crc kubenswrapper[4732]: I0402 13:43:30.424192 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-85455b8986-kh7t7" podUID="0664f762-59d2-4c95-8e8f-698af0b15611" containerName="registry" containerID="cri-o://7e14aa9bb1ec7d5d53bd39ca3de09280350f61d5807a65f17ec8cff13a157b62" gracePeriod=30
Apr 02 13:43:30 crc kubenswrapper[4732]: I0402 13:43:30.923596 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-85455b8986-kh7t7"
Apr 02 13:43:30 crc kubenswrapper[4732]: I0402 13:43:30.924559 4732 status_manager.go:851] "Failed to get status for pod" podUID="0664f762-59d2-4c95-8e8f-698af0b15611" pod="openshift-image-registry/image-registry-85455b8986-kh7t7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-85455b8986-kh7t7\": dial tcp 38.102.83.138:6443: connect: connection refused"
Apr 02 13:43:30 crc kubenswrapper[4732]: I0402 13:43:30.925002 4732 status_manager.go:851] "Failed to get status for pod" podUID="dfc68032-3ed5-47ad-adde-2d764f4b8868" pod="openshift-kube-apiserver/revision-pruner-10-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/revision-pruner-10-crc\": dial tcp 38.102.83.138:6443: connect: connection refused"
Apr 02 13:43:30 crc kubenswrapper[4732]: I0402 13:43:30.925837 4732 status_manager.go:851] "Failed to get status for pod" podUID="85581866-f172-4f4e-8805-55c1f175201d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.138:6443: connect: connection refused"
Apr 02 13:43:30 crc kubenswrapper[4732]: I0402 13:43:30.926252 4732 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.138:6443: connect: connection refused"
Apr 02 13:43:30 crc kubenswrapper[4732]: I0402 13:43:30.926517 4732 status_manager.go:851] "Failed to get status for pod" podUID="536912b9-ad03-42a4-bce9-754227ecbf82" pod="openshift-kube-controller-manager/installer-11-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-11-crc\": dial tcp 38.102.83.138:6443: connect: connection refused"
Apr 02 13:43:30 crc kubenswrapper[4732]: I0402 13:43:30.985243 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0664f762-59d2-4c95-8e8f-698af0b15611-bound-sa-token\") pod \"0664f762-59d2-4c95-8e8f-698af0b15611\" (UID: \"0664f762-59d2-4c95-8e8f-698af0b15611\") "
Apr 02 13:43:30 crc kubenswrapper[4732]: I0402 13:43:30.985301 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0664f762-59d2-4c95-8e8f-698af0b15611-registry-tls\") pod \"0664f762-59d2-4c95-8e8f-698af0b15611\" (UID: \"0664f762-59d2-4c95-8e8f-698af0b15611\") "
Apr 02 13:43:30 crc kubenswrapper[4732]: I0402 13:43:30.985345 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0664f762-59d2-4c95-8e8f-698af0b15611-trusted-ca\") pod \"0664f762-59d2-4c95-8e8f-698af0b15611\" (UID: \"0664f762-59d2-4c95-8e8f-698af0b15611\") "
Apr 02 13:43:30 crc kubenswrapper[4732]: I0402 13:43:30.985426 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0664f762-59d2-4c95-8e8f-698af0b15611-ca-trust-extracted\") pod \"0664f762-59d2-4c95-8e8f-698af0b15611\" (UID: \"0664f762-59d2-4c95-8e8f-698af0b15611\") "
Apr 02 13:43:30 crc kubenswrapper[4732]: I0402 13:43:30.985484 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0664f762-59d2-4c95-8e8f-698af0b15611-installation-pull-secrets\") pod \"0664f762-59d2-4c95-8e8f-698af0b15611\" (UID: \"0664f762-59d2-4c95-8e8f-698af0b15611\") "
Apr 02 13:43:30 crc kubenswrapper[4732]: I0402 13:43:30.985589 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0664f762-59d2-4c95-8e8f-698af0b15611-registry-certificates\") pod \"0664f762-59d2-4c95-8e8f-698af0b15611\" (UID: \"0664f762-59d2-4c95-8e8f-698af0b15611\") "
Apr 02 13:43:30 crc kubenswrapper[4732]: I0402 13:43:30.985738 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lxzt\" (UniqueName: \"kubernetes.io/projected/0664f762-59d2-4c95-8e8f-698af0b15611-kube-api-access-8lxzt\") pod \"0664f762-59d2-4c95-8e8f-698af0b15611\" (UID: \"0664f762-59d2-4c95-8e8f-698af0b15611\") "
Apr 02 13:43:30 crc kubenswrapper[4732]: I0402 13:43:30.985957 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"0664f762-59d2-4c95-8e8f-698af0b15611\" (UID: \"0664f762-59d2-4c95-8e8f-698af0b15611\") "
Apr 02 13:43:30 crc kubenswrapper[4732]: I0402 13:43:30.987404 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0664f762-59d2-4c95-8e8f-698af0b15611-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "0664f762-59d2-4c95-8e8f-698af0b15611" (UID: "0664f762-59d2-4c95-8e8f-698af0b15611"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 13:43:30 crc kubenswrapper[4732]: I0402 13:43:30.987582 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0664f762-59d2-4c95-8e8f-698af0b15611-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "0664f762-59d2-4c95-8e8f-698af0b15611" (UID: "0664f762-59d2-4c95-8e8f-698af0b15611"). InnerVolumeSpecName "registry-certificates".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:43:30 crc kubenswrapper[4732]: I0402 13:43:30.990847 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0664f762-59d2-4c95-8e8f-698af0b15611-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "0664f762-59d2-4c95-8e8f-698af0b15611" (UID: "0664f762-59d2-4c95-8e8f-698af0b15611"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 13:43:30 crc kubenswrapper[4732]: I0402 13:43:30.991264 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0664f762-59d2-4c95-8e8f-698af0b15611-kube-api-access-8lxzt" (OuterVolumeSpecName: "kube-api-access-8lxzt") pod "0664f762-59d2-4c95-8e8f-698af0b15611" (UID: "0664f762-59d2-4c95-8e8f-698af0b15611"). InnerVolumeSpecName "kube-api-access-8lxzt". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:43:30 crc kubenswrapper[4732]: I0402 13:43:30.991651 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0664f762-59d2-4c95-8e8f-698af0b15611-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "0664f762-59d2-4c95-8e8f-698af0b15611" (UID: "0664f762-59d2-4c95-8e8f-698af0b15611"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:43:30 crc kubenswrapper[4732]: I0402 13:43:30.992114 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0664f762-59d2-4c95-8e8f-698af0b15611-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "0664f762-59d2-4c95-8e8f-698af0b15611" (UID: "0664f762-59d2-4c95-8e8f-698af0b15611"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:43:30 crc kubenswrapper[4732]: I0402 13:43:30.999435 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "0664f762-59d2-4c95-8e8f-698af0b15611" (UID: "0664f762-59d2-4c95-8e8f-698af0b15611"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Apr 02 13:43:31 crc kubenswrapper[4732]: I0402 13:43:31.008801 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0664f762-59d2-4c95-8e8f-698af0b15611-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "0664f762-59d2-4c95-8e8f-698af0b15611" (UID: "0664f762-59d2-4c95-8e8f-698af0b15611"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 13:43:31 crc kubenswrapper[4732]: I0402 13:43:31.088206 4732 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0664f762-59d2-4c95-8e8f-698af0b15611-bound-sa-token\") on node \"crc\" DevicePath \"\"" Apr 02 13:43:31 crc kubenswrapper[4732]: I0402 13:43:31.088274 4732 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0664f762-59d2-4c95-8e8f-698af0b15611-registry-tls\") on node \"crc\" DevicePath \"\"" Apr 02 13:43:31 crc kubenswrapper[4732]: I0402 13:43:31.088288 4732 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0664f762-59d2-4c95-8e8f-698af0b15611-trusted-ca\") on node \"crc\" DevicePath \"\"" Apr 02 13:43:31 crc kubenswrapper[4732]: I0402 13:43:31.088299 4732 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/0664f762-59d2-4c95-8e8f-698af0b15611-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Apr 02 13:43:31 crc kubenswrapper[4732]: I0402 13:43:31.088313 4732 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0664f762-59d2-4c95-8e8f-698af0b15611-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Apr 02 13:43:31 crc kubenswrapper[4732]: I0402 13:43:31.088329 4732 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0664f762-59d2-4c95-8e8f-698af0b15611-registry-certificates\") on node \"crc\" DevicePath \"\"" Apr 02 13:43:31 crc kubenswrapper[4732]: I0402 13:43:31.088361 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lxzt\" (UniqueName: \"kubernetes.io/projected/0664f762-59d2-4c95-8e8f-698af0b15611-kube-api-access-8lxzt\") on node \"crc\" DevicePath \"\"" Apr 02 13:43:31 crc kubenswrapper[4732]: E0402 13:43:31.219888 4732 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" interval="1.6s" Apr 02 13:43:31 crc kubenswrapper[4732]: I0402 13:43:31.308461 4732 generic.go:334] "Generic (PLEG): container finished" podID="0664f762-59d2-4c95-8e8f-698af0b15611" containerID="7e14aa9bb1ec7d5d53bd39ca3de09280350f61d5807a65f17ec8cff13a157b62" exitCode=0 Apr 02 13:43:31 crc kubenswrapper[4732]: I0402 13:43:31.308561 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-85455b8986-kh7t7" event={"ID":"0664f762-59d2-4c95-8e8f-698af0b15611","Type":"ContainerDied","Data":"7e14aa9bb1ec7d5d53bd39ca3de09280350f61d5807a65f17ec8cff13a157b62"} Apr 02 13:43:31 crc kubenswrapper[4732]: I0402 13:43:31.308637 4732 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-image-registry/image-registry-85455b8986-kh7t7" event={"ID":"0664f762-59d2-4c95-8e8f-698af0b15611","Type":"ContainerDied","Data":"e2727925452645522fe09d85f2fdfb444185bd45a32d35c8636aebabef125bc3"} Apr 02 13:43:31 crc kubenswrapper[4732]: I0402 13:43:31.308694 4732 scope.go:117] "RemoveContainer" containerID="7e14aa9bb1ec7d5d53bd39ca3de09280350f61d5807a65f17ec8cff13a157b62" Apr 02 13:43:31 crc kubenswrapper[4732]: I0402 13:43:31.308972 4732 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="e0654c6f-b103-4146-b591-b2acee4900a9" Apr 02 13:43:31 crc kubenswrapper[4732]: I0402 13:43:31.308994 4732 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="e0654c6f-b103-4146-b591-b2acee4900a9" Apr 02 13:43:31 crc kubenswrapper[4732]: E0402 13:43:31.309415 4732 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Apr 02 13:43:31 crc kubenswrapper[4732]: I0402 13:43:31.309560 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-85455b8986-kh7t7" Apr 02 13:43:31 crc kubenswrapper[4732]: I0402 13:43:31.310467 4732 status_manager.go:851] "Failed to get status for pod" podUID="0664f762-59d2-4c95-8e8f-698af0b15611" pod="openshift-image-registry/image-registry-85455b8986-kh7t7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-85455b8986-kh7t7\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:31 crc kubenswrapper[4732]: I0402 13:43:31.310863 4732 status_manager.go:851] "Failed to get status for pod" podUID="dfc68032-3ed5-47ad-adde-2d764f4b8868" pod="openshift-kube-apiserver/revision-pruner-10-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/revision-pruner-10-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:31 crc kubenswrapper[4732]: I0402 13:43:31.311288 4732 status_manager.go:851] "Failed to get status for pod" podUID="85581866-f172-4f4e-8805-55c1f175201d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:31 crc kubenswrapper[4732]: I0402 13:43:31.311604 4732 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:31 crc kubenswrapper[4732]: I0402 13:43:31.311974 4732 status_manager.go:851] "Failed to get status for pod" podUID="536912b9-ad03-42a4-bce9-754227ecbf82" pod="openshift-kube-controller-manager/installer-11-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-11-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:31 crc kubenswrapper[4732]: I0402 13:43:31.321754 4732 scope.go:117] "RemoveContainer" containerID="7e14aa9bb1ec7d5d53bd39ca3de09280350f61d5807a65f17ec8cff13a157b62" Apr 02 13:43:31 crc kubenswrapper[4732]: E0402 13:43:31.322553 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e14aa9bb1ec7d5d53bd39ca3de09280350f61d5807a65f17ec8cff13a157b62\": container with ID starting with 7e14aa9bb1ec7d5d53bd39ca3de09280350f61d5807a65f17ec8cff13a157b62 not found: ID does not exist" containerID="7e14aa9bb1ec7d5d53bd39ca3de09280350f61d5807a65f17ec8cff13a157b62" Apr 02 13:43:31 crc kubenswrapper[4732]: I0402 13:43:31.322612 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e14aa9bb1ec7d5d53bd39ca3de09280350f61d5807a65f17ec8cff13a157b62"} err="failed to get container status \"7e14aa9bb1ec7d5d53bd39ca3de09280350f61d5807a65f17ec8cff13a157b62\": rpc error: code = NotFound desc = could not find container \"7e14aa9bb1ec7d5d53bd39ca3de09280350f61d5807a65f17ec8cff13a157b62\": container with ID starting with 7e14aa9bb1ec7d5d53bd39ca3de09280350f61d5807a65f17ec8cff13a157b62 not found: ID does not exist" Apr 02 13:43:31 crc kubenswrapper[4732]: I0402 13:43:31.330220 4732 status_manager.go:851] "Failed to get status for pod" podUID="0664f762-59d2-4c95-8e8f-698af0b15611" pod="openshift-image-registry/image-registry-85455b8986-kh7t7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-85455b8986-kh7t7\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:31 crc kubenswrapper[4732]: I0402 13:43:31.330787 4732 status_manager.go:851] "Failed to get status for pod" podUID="dfc68032-3ed5-47ad-adde-2d764f4b8868" 
pod="openshift-kube-apiserver/revision-pruner-10-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/revision-pruner-10-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:31 crc kubenswrapper[4732]: I0402 13:43:31.331183 4732 status_manager.go:851] "Failed to get status for pod" podUID="85581866-f172-4f4e-8805-55c1f175201d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:31 crc kubenswrapper[4732]: I0402 13:43:31.331578 4732 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:31 crc kubenswrapper[4732]: I0402 13:43:31.332033 4732 status_manager.go:851] "Failed to get status for pod" podUID="536912b9-ad03-42a4-bce9-754227ecbf82" pod="openshift-kube-controller-manager/installer-11-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-11-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:31 crc kubenswrapper[4732]: I0402 13:43:31.582339 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-dq9x9" podUID="4d77c191-7d04-4381-838f-b7a355e7c2d4" containerName="console" containerID="cri-o://73e91406ab034efa71f59bdd70be88e46d545b72f509e189125711058eb982eb" gracePeriod=15 Apr 02 13:43:32 crc kubenswrapper[4732]: I0402 13:43:32.132139 4732 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-f9d7485db-dq9x9_4d77c191-7d04-4381-838f-b7a355e7c2d4/console/0.log" Apr 02 13:43:32 crc kubenswrapper[4732]: I0402 13:43:32.132456 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-dq9x9" Apr 02 13:43:32 crc kubenswrapper[4732]: I0402 13:43:32.133239 4732 status_manager.go:851] "Failed to get status for pod" podUID="dfc68032-3ed5-47ad-adde-2d764f4b8868" pod="openshift-kube-apiserver/revision-pruner-10-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/revision-pruner-10-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:32 crc kubenswrapper[4732]: I0402 13:43:32.133705 4732 status_manager.go:851] "Failed to get status for pod" podUID="85581866-f172-4f4e-8805-55c1f175201d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:32 crc kubenswrapper[4732]: I0402 13:43:32.134014 4732 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:32 crc kubenswrapper[4732]: I0402 13:43:32.134325 4732 status_manager.go:851] "Failed to get status for pod" podUID="4d77c191-7d04-4381-838f-b7a355e7c2d4" pod="openshift-console/console-f9d7485db-dq9x9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/pods/console-f9d7485db-dq9x9\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:32 crc kubenswrapper[4732]: I0402 13:43:32.134599 4732 status_manager.go:851] "Failed to get 
status for pod" podUID="536912b9-ad03-42a4-bce9-754227ecbf82" pod="openshift-kube-controller-manager/installer-11-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-11-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:32 crc kubenswrapper[4732]: I0402 13:43:32.134874 4732 status_manager.go:851] "Failed to get status for pod" podUID="0664f762-59d2-4c95-8e8f-698af0b15611" pod="openshift-image-registry/image-registry-85455b8986-kh7t7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-85455b8986-kh7t7\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:32 crc kubenswrapper[4732]: I0402 13:43:32.203233 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4d77c191-7d04-4381-838f-b7a355e7c2d4-console-serving-cert\") pod \"4d77c191-7d04-4381-838f-b7a355e7c2d4\" (UID: \"4d77c191-7d04-4381-838f-b7a355e7c2d4\") " Apr 02 13:43:32 crc kubenswrapper[4732]: I0402 13:43:32.203318 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4d77c191-7d04-4381-838f-b7a355e7c2d4-oauth-serving-cert\") pod \"4d77c191-7d04-4381-838f-b7a355e7c2d4\" (UID: \"4d77c191-7d04-4381-838f-b7a355e7c2d4\") " Apr 02 13:43:32 crc kubenswrapper[4732]: I0402 13:43:32.203359 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4d77c191-7d04-4381-838f-b7a355e7c2d4-service-ca\") pod \"4d77c191-7d04-4381-838f-b7a355e7c2d4\" (UID: \"4d77c191-7d04-4381-838f-b7a355e7c2d4\") " Apr 02 13:43:32 crc kubenswrapper[4732]: I0402 13:43:32.203428 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkkwm\" (UniqueName: 
\"kubernetes.io/projected/4d77c191-7d04-4381-838f-b7a355e7c2d4-kube-api-access-nkkwm\") pod \"4d77c191-7d04-4381-838f-b7a355e7c2d4\" (UID: \"4d77c191-7d04-4381-838f-b7a355e7c2d4\") " Apr 02 13:43:32 crc kubenswrapper[4732]: I0402 13:43:32.203501 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d77c191-7d04-4381-838f-b7a355e7c2d4-trusted-ca-bundle\") pod \"4d77c191-7d04-4381-838f-b7a355e7c2d4\" (UID: \"4d77c191-7d04-4381-838f-b7a355e7c2d4\") " Apr 02 13:43:32 crc kubenswrapper[4732]: I0402 13:43:32.203543 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4d77c191-7d04-4381-838f-b7a355e7c2d4-console-oauth-config\") pod \"4d77c191-7d04-4381-838f-b7a355e7c2d4\" (UID: \"4d77c191-7d04-4381-838f-b7a355e7c2d4\") " Apr 02 13:43:32 crc kubenswrapper[4732]: I0402 13:43:32.203574 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4d77c191-7d04-4381-838f-b7a355e7c2d4-console-config\") pod \"4d77c191-7d04-4381-838f-b7a355e7c2d4\" (UID: \"4d77c191-7d04-4381-838f-b7a355e7c2d4\") " Apr 02 13:43:32 crc kubenswrapper[4732]: I0402 13:43:32.204390 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d77c191-7d04-4381-838f-b7a355e7c2d4-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "4d77c191-7d04-4381-838f-b7a355e7c2d4" (UID: "4d77c191-7d04-4381-838f-b7a355e7c2d4"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:43:32 crc kubenswrapper[4732]: I0402 13:43:32.204551 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d77c191-7d04-4381-838f-b7a355e7c2d4-service-ca" (OuterVolumeSpecName: "service-ca") pod "4d77c191-7d04-4381-838f-b7a355e7c2d4" (UID: "4d77c191-7d04-4381-838f-b7a355e7c2d4"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:43:32 crc kubenswrapper[4732]: I0402 13:43:32.204773 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d77c191-7d04-4381-838f-b7a355e7c2d4-console-config" (OuterVolumeSpecName: "console-config") pod "4d77c191-7d04-4381-838f-b7a355e7c2d4" (UID: "4d77c191-7d04-4381-838f-b7a355e7c2d4"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:43:32 crc kubenswrapper[4732]: I0402 13:43:32.204986 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d77c191-7d04-4381-838f-b7a355e7c2d4-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "4d77c191-7d04-4381-838f-b7a355e7c2d4" (UID: "4d77c191-7d04-4381-838f-b7a355e7c2d4"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:43:32 crc kubenswrapper[4732]: I0402 13:43:32.209867 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d77c191-7d04-4381-838f-b7a355e7c2d4-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "4d77c191-7d04-4381-838f-b7a355e7c2d4" (UID: "4d77c191-7d04-4381-838f-b7a355e7c2d4"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 13:43:32 crc kubenswrapper[4732]: I0402 13:43:32.210025 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d77c191-7d04-4381-838f-b7a355e7c2d4-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "4d77c191-7d04-4381-838f-b7a355e7c2d4" (UID: "4d77c191-7d04-4381-838f-b7a355e7c2d4"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 13:43:32 crc kubenswrapper[4732]: I0402 13:43:32.212094 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d77c191-7d04-4381-838f-b7a355e7c2d4-kube-api-access-nkkwm" (OuterVolumeSpecName: "kube-api-access-nkkwm") pod "4d77c191-7d04-4381-838f-b7a355e7c2d4" (UID: "4d77c191-7d04-4381-838f-b7a355e7c2d4"). InnerVolumeSpecName "kube-api-access-nkkwm". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:43:32 crc kubenswrapper[4732]: I0402 13:43:32.305742 4732 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d77c191-7d04-4381-838f-b7a355e7c2d4-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 02 13:43:32 crc kubenswrapper[4732]: I0402 13:43:32.305785 4732 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4d77c191-7d04-4381-838f-b7a355e7c2d4-console-oauth-config\") on node \"crc\" DevicePath \"\"" Apr 02 13:43:32 crc kubenswrapper[4732]: I0402 13:43:32.305796 4732 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4d77c191-7d04-4381-838f-b7a355e7c2d4-console-config\") on node \"crc\" DevicePath \"\"" Apr 02 13:43:32 crc kubenswrapper[4732]: I0402 13:43:32.305805 4732 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/4d77c191-7d04-4381-838f-b7a355e7c2d4-console-serving-cert\") on node \"crc\" DevicePath \"\"" Apr 02 13:43:32 crc kubenswrapper[4732]: I0402 13:43:32.305814 4732 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4d77c191-7d04-4381-838f-b7a355e7c2d4-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Apr 02 13:43:32 crc kubenswrapper[4732]: I0402 13:43:32.305822 4732 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4d77c191-7d04-4381-838f-b7a355e7c2d4-service-ca\") on node \"crc\" DevicePath \"\"" Apr 02 13:43:32 crc kubenswrapper[4732]: I0402 13:43:32.305831 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nkkwm\" (UniqueName: \"kubernetes.io/projected/4d77c191-7d04-4381-838f-b7a355e7c2d4-kube-api-access-nkkwm\") on node \"crc\" DevicePath \"\"" Apr 02 13:43:32 crc kubenswrapper[4732]: I0402 13:43:32.316814 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-dq9x9_4d77c191-7d04-4381-838f-b7a355e7c2d4/console/0.log" Apr 02 13:43:32 crc kubenswrapper[4732]: I0402 13:43:32.316860 4732 generic.go:334] "Generic (PLEG): container finished" podID="4d77c191-7d04-4381-838f-b7a355e7c2d4" containerID="73e91406ab034efa71f59bdd70be88e46d545b72f509e189125711058eb982eb" exitCode=2 Apr 02 13:43:32 crc kubenswrapper[4732]: I0402 13:43:32.316888 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-dq9x9" event={"ID":"4d77c191-7d04-4381-838f-b7a355e7c2d4","Type":"ContainerDied","Data":"73e91406ab034efa71f59bdd70be88e46d545b72f509e189125711058eb982eb"} Apr 02 13:43:32 crc kubenswrapper[4732]: I0402 13:43:32.316918 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-dq9x9" 
event={"ID":"4d77c191-7d04-4381-838f-b7a355e7c2d4","Type":"ContainerDied","Data":"66fdd9ebf76077e088115e7e3707900621ed21f82c654455ea29e90ede4618f3"} Apr 02 13:43:32 crc kubenswrapper[4732]: I0402 13:43:32.316936 4732 scope.go:117] "RemoveContainer" containerID="73e91406ab034efa71f59bdd70be88e46d545b72f509e189125711058eb982eb" Apr 02 13:43:32 crc kubenswrapper[4732]: I0402 13:43:32.316944 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-dq9x9" Apr 02 13:43:32 crc kubenswrapper[4732]: I0402 13:43:32.318063 4732 status_manager.go:851] "Failed to get status for pod" podUID="0664f762-59d2-4c95-8e8f-698af0b15611" pod="openshift-image-registry/image-registry-85455b8986-kh7t7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-85455b8986-kh7t7\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:32 crc kubenswrapper[4732]: I0402 13:43:32.319011 4732 status_manager.go:851] "Failed to get status for pod" podUID="dfc68032-3ed5-47ad-adde-2d764f4b8868" pod="openshift-kube-apiserver/revision-pruner-10-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/revision-pruner-10-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:32 crc kubenswrapper[4732]: I0402 13:43:32.319340 4732 status_manager.go:851] "Failed to get status for pod" podUID="85581866-f172-4f4e-8805-55c1f175201d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:32 crc kubenswrapper[4732]: I0402 13:43:32.319592 4732 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:32 crc kubenswrapper[4732]: I0402 13:43:32.319900 4732 status_manager.go:851] "Failed to get status for pod" podUID="4d77c191-7d04-4381-838f-b7a355e7c2d4" pod="openshift-console/console-f9d7485db-dq9x9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/pods/console-f9d7485db-dq9x9\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:32 crc kubenswrapper[4732]: I0402 13:43:32.320307 4732 status_manager.go:851] "Failed to get status for pod" podUID="536912b9-ad03-42a4-bce9-754227ecbf82" pod="openshift-kube-controller-manager/installer-11-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-11-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:32 crc kubenswrapper[4732]: I0402 13:43:32.329134 4732 status_manager.go:851] "Failed to get status for pod" podUID="0664f762-59d2-4c95-8e8f-698af0b15611" pod="openshift-image-registry/image-registry-85455b8986-kh7t7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-85455b8986-kh7t7\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:32 crc kubenswrapper[4732]: I0402 13:43:32.329307 4732 status_manager.go:851] "Failed to get status for pod" podUID="dfc68032-3ed5-47ad-adde-2d764f4b8868" pod="openshift-kube-apiserver/revision-pruner-10-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/revision-pruner-10-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:32 crc kubenswrapper[4732]: I0402 13:43:32.329502 4732 status_manager.go:851] "Failed to get status for pod" podUID="85581866-f172-4f4e-8805-55c1f175201d" pod="openshift-kube-apiserver/installer-9-crc" 
err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:32 crc kubenswrapper[4732]: I0402 13:43:32.329696 4732 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:32 crc kubenswrapper[4732]: I0402 13:43:32.329858 4732 status_manager.go:851] "Failed to get status for pod" podUID="4d77c191-7d04-4381-838f-b7a355e7c2d4" pod="openshift-console/console-f9d7485db-dq9x9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/pods/console-f9d7485db-dq9x9\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:32 crc kubenswrapper[4732]: I0402 13:43:32.330030 4732 status_manager.go:851] "Failed to get status for pod" podUID="536912b9-ad03-42a4-bce9-754227ecbf82" pod="openshift-kube-controller-manager/installer-11-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-11-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:32 crc kubenswrapper[4732]: I0402 13:43:32.337249 4732 scope.go:117] "RemoveContainer" containerID="73e91406ab034efa71f59bdd70be88e46d545b72f509e189125711058eb982eb" Apr 02 13:43:32 crc kubenswrapper[4732]: E0402 13:43:32.338095 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73e91406ab034efa71f59bdd70be88e46d545b72f509e189125711058eb982eb\": container with ID starting with 73e91406ab034efa71f59bdd70be88e46d545b72f509e189125711058eb982eb not found: ID does not exist" 
containerID="73e91406ab034efa71f59bdd70be88e46d545b72f509e189125711058eb982eb" Apr 02 13:43:32 crc kubenswrapper[4732]: I0402 13:43:32.338140 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73e91406ab034efa71f59bdd70be88e46d545b72f509e189125711058eb982eb"} err="failed to get container status \"73e91406ab034efa71f59bdd70be88e46d545b72f509e189125711058eb982eb\": rpc error: code = NotFound desc = could not find container \"73e91406ab034efa71f59bdd70be88e46d545b72f509e189125711058eb982eb\": container with ID starting with 73e91406ab034efa71f59bdd70be88e46d545b72f509e189125711058eb982eb not found: ID does not exist" Apr 02 13:43:32 crc kubenswrapper[4732]: E0402 13:43:32.820794 4732 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" interval="3.2s" Apr 02 13:43:32 crc kubenswrapper[4732]: E0402 13:43:32.990787 4732 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.138:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.18a28e107571bed5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:43:24.478406357 +0000 UTC 
m=+361.382813900,LastTimestamp:2026-04-02 13:43:24.478406357 +0000 UTC m=+361.382813900,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:43:34 crc kubenswrapper[4732]: I0402 13:43:34.687059 4732 status_manager.go:851] "Failed to get status for pod" podUID="536912b9-ad03-42a4-bce9-754227ecbf82" pod="openshift-kube-controller-manager/installer-11-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-11-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:34 crc kubenswrapper[4732]: I0402 13:43:34.687270 4732 status_manager.go:851] "Failed to get status for pod" podUID="0664f762-59d2-4c95-8e8f-698af0b15611" pod="openshift-image-registry/image-registry-85455b8986-kh7t7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-85455b8986-kh7t7\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:34 crc kubenswrapper[4732]: I0402 13:43:34.687447 4732 status_manager.go:851] "Failed to get status for pod" podUID="dfc68032-3ed5-47ad-adde-2d764f4b8868" pod="openshift-kube-apiserver/revision-pruner-10-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/revision-pruner-10-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:34 crc kubenswrapper[4732]: I0402 13:43:34.687699 4732 status_manager.go:851] "Failed to get status for pod" podUID="815516d0756bb9282f4d0a28cef72670" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:34 crc kubenswrapper[4732]: I0402 13:43:34.687923 4732 status_manager.go:851] "Failed to get status for pod" 
podUID="85581866-f172-4f4e-8805-55c1f175201d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:34 crc kubenswrapper[4732]: I0402 13:43:34.688239 4732 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:34 crc kubenswrapper[4732]: I0402 13:43:34.688873 4732 status_manager.go:851] "Failed to get status for pod" podUID="4d77c191-7d04-4381-838f-b7a355e7c2d4" pod="openshift-console/console-f9d7485db-dq9x9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/pods/console-f9d7485db-dq9x9\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:35 crc kubenswrapper[4732]: E0402 13:43:35.781909 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e68473d7cc0d08293a0f48d1224a39e782dd7b899994ad1d070d72a3c06dfc01" cmd=["/bin/bash","-c","test -f /ready/ready"] Apr 02 13:43:35 crc kubenswrapper[4732]: E0402 13:43:35.783578 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e68473d7cc0d08293a0f48d1224a39e782dd7b899994ad1d070d72a3c06dfc01" cmd=["/bin/bash","-c","test -f /ready/ready"] Apr 02 13:43:35 crc kubenswrapper[4732]: E0402 13:43:35.785091 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = 
command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e68473d7cc0d08293a0f48d1224a39e782dd7b899994ad1d070d72a3c06dfc01" cmd=["/bin/bash","-c","test -f /ready/ready"] Apr 02 13:43:35 crc kubenswrapper[4732]: E0402 13:43:35.785164 4732 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-q7cfc" podUID="bd27c473-5209-45b9-a8da-b285f03920f8" containerName="kube-multus-additional-cni-plugins" Apr 02 13:43:36 crc kubenswrapper[4732]: E0402 13:43:36.022401 4732 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" interval="6.4s" Apr 02 13:43:37 crc kubenswrapper[4732]: E0402 13:43:37.611064 4732 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-ef96f17edb5e968b50bab42d6ddcc7c4ffc0fae1b3c8bcdebc6150e18977740e.scope\": RecentStats: unable to find data in memory cache]" Apr 02 13:43:38 crc kubenswrapper[4732]: I0402 13:43:38.359771 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Apr 02 13:43:38 crc kubenswrapper[4732]: I0402 13:43:38.361117 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Apr 02 13:43:38 crc kubenswrapper[4732]: I0402 13:43:38.361162 4732 generic.go:334] "Generic (PLEG): container 
finished" podID="f614b9022728cf315e60c057852e563e" containerID="ef96f17edb5e968b50bab42d6ddcc7c4ffc0fae1b3c8bcdebc6150e18977740e" exitCode=1 Apr 02 13:43:38 crc kubenswrapper[4732]: I0402 13:43:38.361190 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"ef96f17edb5e968b50bab42d6ddcc7c4ffc0fae1b3c8bcdebc6150e18977740e"} Apr 02 13:43:38 crc kubenswrapper[4732]: I0402 13:43:38.361658 4732 scope.go:117] "RemoveContainer" containerID="ef96f17edb5e968b50bab42d6ddcc7c4ffc0fae1b3c8bcdebc6150e18977740e" Apr 02 13:43:38 crc kubenswrapper[4732]: I0402 13:43:38.362237 4732 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:38 crc kubenswrapper[4732]: I0402 13:43:38.362658 4732 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:38 crc kubenswrapper[4732]: I0402 13:43:38.363022 4732 status_manager.go:851] "Failed to get status for pod" podUID="4d77c191-7d04-4381-838f-b7a355e7c2d4" pod="openshift-console/console-f9d7485db-dq9x9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/pods/console-f9d7485db-dq9x9\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:38 crc kubenswrapper[4732]: I0402 13:43:38.363444 4732 status_manager.go:851] "Failed to get status for pod" 
podUID="536912b9-ad03-42a4-bce9-754227ecbf82" pod="openshift-kube-controller-manager/installer-11-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-11-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:38 crc kubenswrapper[4732]: I0402 13:43:38.363835 4732 status_manager.go:851] "Failed to get status for pod" podUID="0664f762-59d2-4c95-8e8f-698af0b15611" pod="openshift-image-registry/image-registry-85455b8986-kh7t7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-85455b8986-kh7t7\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:38 crc kubenswrapper[4732]: I0402 13:43:38.364086 4732 status_manager.go:851] "Failed to get status for pod" podUID="dfc68032-3ed5-47ad-adde-2d764f4b8868" pod="openshift-kube-apiserver/revision-pruner-10-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/revision-pruner-10-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:38 crc kubenswrapper[4732]: I0402 13:43:38.364280 4732 status_manager.go:851] "Failed to get status for pod" podUID="815516d0756bb9282f4d0a28cef72670" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:38 crc kubenswrapper[4732]: I0402 13:43:38.364448 4732 status_manager.go:851] "Failed to get status for pod" podUID="85581866-f172-4f4e-8805-55c1f175201d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:38 crc kubenswrapper[4732]: I0402 13:43:38.679681 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 02 13:43:38 crc kubenswrapper[4732]: I0402 13:43:38.681132 4732 status_manager.go:851] "Failed to get status for pod" podUID="536912b9-ad03-42a4-bce9-754227ecbf82" pod="openshift-kube-controller-manager/installer-11-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-11-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:38 crc kubenswrapper[4732]: I0402 13:43:38.681600 4732 status_manager.go:851] "Failed to get status for pod" podUID="0664f762-59d2-4c95-8e8f-698af0b15611" pod="openshift-image-registry/image-registry-85455b8986-kh7t7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-85455b8986-kh7t7\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:38 crc kubenswrapper[4732]: I0402 13:43:38.682070 4732 status_manager.go:851] "Failed to get status for pod" podUID="dfc68032-3ed5-47ad-adde-2d764f4b8868" pod="openshift-kube-apiserver/revision-pruner-10-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/revision-pruner-10-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:38 crc kubenswrapper[4732]: I0402 13:43:38.682513 4732 status_manager.go:851] "Failed to get status for pod" podUID="815516d0756bb9282f4d0a28cef72670" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:38 crc kubenswrapper[4732]: I0402 13:43:38.682826 4732 status_manager.go:851] "Failed to get status for pod" podUID="85581866-f172-4f4e-8805-55c1f175201d" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:38 crc kubenswrapper[4732]: I0402 13:43:38.683139 4732 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:38 crc kubenswrapper[4732]: I0402 13:43:38.683587 4732 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:38 crc kubenswrapper[4732]: I0402 13:43:38.683912 4732 status_manager.go:851] "Failed to get status for pod" podUID="4d77c191-7d04-4381-838f-b7a355e7c2d4" pod="openshift-console/console-f9d7485db-dq9x9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/pods/console-f9d7485db-dq9x9\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:38 crc kubenswrapper[4732]: I0402 13:43:38.693958 4732 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f1efc9e3-71c0-47e0-bcf8-4ec41e8e25af" Apr 02 13:43:38 crc kubenswrapper[4732]: I0402 13:43:38.694000 4732 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f1efc9e3-71c0-47e0-bcf8-4ec41e8e25af" Apr 02 13:43:38 crc kubenswrapper[4732]: E0402 13:43:38.694490 4732 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 02 13:43:38 crc kubenswrapper[4732]: I0402 13:43:38.694963 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 02 13:43:38 crc kubenswrapper[4732]: W0402 13:43:38.721150 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-67b4d23348599ef77b915ecc4e30549379dc8a6fa5303c37b793e61bd00c4cda WatchSource:0}: Error finding container 67b4d23348599ef77b915ecc4e30549379dc8a6fa5303c37b793e61bd00c4cda: Status 404 returned error can't find the container with id 67b4d23348599ef77b915ecc4e30549379dc8a6fa5303c37b793e61bd00c4cda Apr 02 13:43:39 crc kubenswrapper[4732]: I0402 13:43:39.366821 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Apr 02 13:43:39 crc kubenswrapper[4732]: I0402 13:43:39.368003 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Apr 02 13:43:39 crc kubenswrapper[4732]: I0402 13:43:39.368094 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"fd58fa9b97b4f1b471d0e141c18378167e8054616db491b1e79d2530f9ecab7f"} Apr 02 13:43:39 crc kubenswrapper[4732]: I0402 13:43:39.368899 4732 status_manager.go:851] "Failed to get status for pod" podUID="536912b9-ad03-42a4-bce9-754227ecbf82" 
pod="openshift-kube-controller-manager/installer-11-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-11-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:39 crc kubenswrapper[4732]: I0402 13:43:39.369095 4732 status_manager.go:851] "Failed to get status for pod" podUID="0664f762-59d2-4c95-8e8f-698af0b15611" pod="openshift-image-registry/image-registry-85455b8986-kh7t7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-85455b8986-kh7t7\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:39 crc kubenswrapper[4732]: I0402 13:43:39.369361 4732 status_manager.go:851] "Failed to get status for pod" podUID="dfc68032-3ed5-47ad-adde-2d764f4b8868" pod="openshift-kube-apiserver/revision-pruner-10-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/revision-pruner-10-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:39 crc kubenswrapper[4732]: I0402 13:43:39.369551 4732 status_manager.go:851] "Failed to get status for pod" podUID="815516d0756bb9282f4d0a28cef72670" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:39 crc kubenswrapper[4732]: I0402 13:43:39.369710 4732 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="b30e48272a118d9bbed1d735935deeb2d907a6763127da4680506b145d65e715" exitCode=0 Apr 02 13:43:39 crc kubenswrapper[4732]: I0402 13:43:39.369752 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"b30e48272a118d9bbed1d735935deeb2d907a6763127da4680506b145d65e715"} Apr 02 13:43:39 crc kubenswrapper[4732]: I0402 13:43:39.369760 4732 status_manager.go:851] "Failed to get status for pod" podUID="85581866-f172-4f4e-8805-55c1f175201d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:39 crc kubenswrapper[4732]: I0402 13:43:39.369775 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"67b4d23348599ef77b915ecc4e30549379dc8a6fa5303c37b793e61bd00c4cda"} Apr 02 13:43:39 crc kubenswrapper[4732]: I0402 13:43:39.369939 4732 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:39 crc kubenswrapper[4732]: I0402 13:43:39.369988 4732 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f1efc9e3-71c0-47e0-bcf8-4ec41e8e25af" Apr 02 13:43:39 crc kubenswrapper[4732]: I0402 13:43:39.369999 4732 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f1efc9e3-71c0-47e0-bcf8-4ec41e8e25af" Apr 02 13:43:39 crc kubenswrapper[4732]: I0402 13:43:39.370123 4732 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:39 crc kubenswrapper[4732]: E0402 13:43:39.370181 4732 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 02 13:43:39 crc kubenswrapper[4732]: I0402 13:43:39.370313 4732 status_manager.go:851] "Failed to get status for pod" podUID="4d77c191-7d04-4381-838f-b7a355e7c2d4" pod="openshift-console/console-f9d7485db-dq9x9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/pods/console-f9d7485db-dq9x9\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:39 crc kubenswrapper[4732]: I0402 13:43:39.370730 4732 status_manager.go:851] "Failed to get status for pod" podUID="dfc68032-3ed5-47ad-adde-2d764f4b8868" pod="openshift-kube-apiserver/revision-pruner-10-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/revision-pruner-10-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:39 crc kubenswrapper[4732]: I0402 13:43:39.371339 4732 status_manager.go:851] "Failed to get status for pod" podUID="815516d0756bb9282f4d0a28cef72670" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:39 crc kubenswrapper[4732]: I0402 13:43:39.371566 4732 status_manager.go:851] "Failed to get status for pod" podUID="85581866-f172-4f4e-8805-55c1f175201d" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:39 crc kubenswrapper[4732]: I0402 13:43:39.371866 4732 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:39 crc kubenswrapper[4732]: I0402 13:43:39.372199 4732 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:39 crc kubenswrapper[4732]: I0402 13:43:39.372790 4732 status_manager.go:851] "Failed to get status for pod" podUID="4d77c191-7d04-4381-838f-b7a355e7c2d4" pod="openshift-console/console-f9d7485db-dq9x9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/pods/console-f9d7485db-dq9x9\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:39 crc kubenswrapper[4732]: I0402 13:43:39.373035 4732 status_manager.go:851] "Failed to get status for pod" podUID="536912b9-ad03-42a4-bce9-754227ecbf82" pod="openshift-kube-controller-manager/installer-11-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-11-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:39 crc kubenswrapper[4732]: I0402 13:43:39.373223 4732 status_manager.go:851] "Failed to get status for pod" podUID="0664f762-59d2-4c95-8e8f-698af0b15611" 
pod="openshift-image-registry/image-registry-85455b8986-kh7t7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-85455b8986-kh7t7\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:43:40 crc kubenswrapper[4732]: I0402 13:43:40.379050 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"31401ac1c6173f4a37b7273100552581a8fd8bacad3eb4a34da8770e5cce6608"} Apr 02 13:43:40 crc kubenswrapper[4732]: I0402 13:43:40.379382 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"7fea6f1b57fdeab906e3a369b1c58efb7eb869ca1bb7068f8935ed118dfb6643"} Apr 02 13:43:40 crc kubenswrapper[4732]: I0402 13:43:40.379399 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"95fd6b2b782db18e44fcad5d695b957206ea146911d9035bc8f7a4506bddb058"} Apr 02 13:43:40 crc kubenswrapper[4732]: I0402 13:43:40.379412 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"858e53e28905cbacd387c14192b4f89524e7f215e3974a7181de6ee15ad9a7cc"} Apr 02 13:43:41 crc kubenswrapper[4732]: I0402 13:43:41.089363 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 02 13:43:41 crc kubenswrapper[4732]: I0402 13:43:41.089784 4732 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get 
\"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Apr 02 13:43:41 crc kubenswrapper[4732]: I0402 13:43:41.089885 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Apr 02 13:43:41 crc kubenswrapper[4732]: I0402 13:43:41.389115 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"44e641937af018f5df25208fd44098b473a12448b3ae7d42ca143ee83364a972"} Apr 02 13:43:41 crc kubenswrapper[4732]: I0402 13:43:41.389421 4732 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f1efc9e3-71c0-47e0-bcf8-4ec41e8e25af" Apr 02 13:43:41 crc kubenswrapper[4732]: I0402 13:43:41.389437 4732 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f1efc9e3-71c0-47e0-bcf8-4ec41e8e25af" Apr 02 13:43:41 crc kubenswrapper[4732]: I0402 13:43:41.389701 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 02 13:43:41 crc kubenswrapper[4732]: I0402 13:43:41.720051 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 02 13:43:43 crc kubenswrapper[4732]: I0402 13:43:43.695669 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 02 13:43:43 crc kubenswrapper[4732]: I0402 13:43:43.696057 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-apiserver/kube-apiserver-crc"
Apr 02 13:43:43 crc kubenswrapper[4732]: I0402 13:43:43.702118 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Apr 02 13:43:45 crc kubenswrapper[4732]: E0402 13:43:45.782299 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e68473d7cc0d08293a0f48d1224a39e782dd7b899994ad1d070d72a3c06dfc01" cmd=["/bin/bash","-c","test -f /ready/ready"]
Apr 02 13:43:45 crc kubenswrapper[4732]: E0402 13:43:45.783783 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e68473d7cc0d08293a0f48d1224a39e782dd7b899994ad1d070d72a3c06dfc01" cmd=["/bin/bash","-c","test -f /ready/ready"]
Apr 02 13:43:45 crc kubenswrapper[4732]: E0402 13:43:45.784744 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e68473d7cc0d08293a0f48d1224a39e782dd7b899994ad1d070d72a3c06dfc01" cmd=["/bin/bash","-c","test -f /ready/ready"]
Apr 02 13:43:45 crc kubenswrapper[4732]: E0402 13:43:45.784777 4732 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-q7cfc" podUID="bd27c473-5209-45b9-a8da-b285f03920f8" containerName="kube-multus-additional-cni-plugins"
Apr 02 13:43:46 crc kubenswrapper[4732]: I0402 13:43:46.397194 4732 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc"
Apr 02 13:43:46 crc kubenswrapper[4732]: I0402 13:43:46.425846 4732 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f1efc9e3-71c0-47e0-bcf8-4ec41e8e25af"
Apr 02 13:43:46 crc kubenswrapper[4732]: I0402 13:43:46.425880 4732 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f1efc9e3-71c0-47e0-bcf8-4ec41e8e25af"
Apr 02 13:43:46 crc kubenswrapper[4732]: I0402 13:43:46.430585 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Apr 02 13:43:46 crc kubenswrapper[4732]: I0402 13:43:46.555576 4732 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="5b38af35-c358-4061-a119-3485eb32b774"
Apr 02 13:43:47 crc kubenswrapper[4732]: I0402 13:43:47.429973 4732 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f1efc9e3-71c0-47e0-bcf8-4ec41e8e25af"
Apr 02 13:43:47 crc kubenswrapper[4732]: I0402 13:43:47.430393 4732 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f1efc9e3-71c0-47e0-bcf8-4ec41e8e25af"
Apr 02 13:43:47 crc kubenswrapper[4732]: I0402 13:43:47.434685 4732 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="5b38af35-c358-4061-a119-3485eb32b774"
Apr 02 13:43:49 crc kubenswrapper[4732]: I0402 13:43:49.441462 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-q7cfc_bd27c473-5209-45b9-a8da-b285f03920f8/kube-multus-additional-cni-plugins/0.log"
Apr 02 13:43:49 crc kubenswrapper[4732]: I0402 13:43:49.441510 4732 generic.go:334] "Generic (PLEG): container finished" podID="bd27c473-5209-45b9-a8da-b285f03920f8" containerID="e68473d7cc0d08293a0f48d1224a39e782dd7b899994ad1d070d72a3c06dfc01" exitCode=137
Apr 02 13:43:49 crc kubenswrapper[4732]: I0402 13:43:49.441554 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-q7cfc" event={"ID":"bd27c473-5209-45b9-a8da-b285f03920f8","Type":"ContainerDied","Data":"e68473d7cc0d08293a0f48d1224a39e782dd7b899994ad1d070d72a3c06dfc01"}
Apr 02 13:43:49 crc kubenswrapper[4732]: I0402 13:43:49.781757 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-q7cfc_bd27c473-5209-45b9-a8da-b285f03920f8/kube-multus-additional-cni-plugins/0.log"
Apr 02 13:43:49 crc kubenswrapper[4732]: I0402 13:43:49.782100 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-q7cfc"
Apr 02 13:43:49 crc kubenswrapper[4732]: I0402 13:43:49.926455 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdbzf\" (UniqueName: \"kubernetes.io/projected/bd27c473-5209-45b9-a8da-b285f03920f8-kube-api-access-mdbzf\") pod \"bd27c473-5209-45b9-a8da-b285f03920f8\" (UID: \"bd27c473-5209-45b9-a8da-b285f03920f8\") "
Apr 02 13:43:49 crc kubenswrapper[4732]: I0402 13:43:49.926507 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bd27c473-5209-45b9-a8da-b285f03920f8-tuning-conf-dir\") pod \"bd27c473-5209-45b9-a8da-b285f03920f8\" (UID: \"bd27c473-5209-45b9-a8da-b285f03920f8\") "
Apr 02 13:43:49 crc kubenswrapper[4732]: I0402 13:43:49.926540 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bd27c473-5209-45b9-a8da-b285f03920f8-cni-sysctl-allowlist\") pod \"bd27c473-5209-45b9-a8da-b285f03920f8\" (UID: \"bd27c473-5209-45b9-a8da-b285f03920f8\") "
Apr 02 13:43:49 crc kubenswrapper[4732]: I0402 13:43:49.926565 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/bd27c473-5209-45b9-a8da-b285f03920f8-ready\") pod \"bd27c473-5209-45b9-a8da-b285f03920f8\" (UID: \"bd27c473-5209-45b9-a8da-b285f03920f8\") "
Apr 02 13:43:49 crc kubenswrapper[4732]: I0402 13:43:49.926602 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bd27c473-5209-45b9-a8da-b285f03920f8-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "bd27c473-5209-45b9-a8da-b285f03920f8" (UID: "bd27c473-5209-45b9-a8da-b285f03920f8"). InnerVolumeSpecName "tuning-conf-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Apr 02 13:43:49 crc kubenswrapper[4732]: I0402 13:43:49.926801 4732 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bd27c473-5209-45b9-a8da-b285f03920f8-tuning-conf-dir\") on node \"crc\" DevicePath \"\""
Apr 02 13:43:49 crc kubenswrapper[4732]: I0402 13:43:49.927108 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd27c473-5209-45b9-a8da-b285f03920f8-ready" (OuterVolumeSpecName: "ready") pod "bd27c473-5209-45b9-a8da-b285f03920f8" (UID: "bd27c473-5209-45b9-a8da-b285f03920f8"). InnerVolumeSpecName "ready". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 02 13:43:49 crc kubenswrapper[4732]: I0402 13:43:49.927161 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd27c473-5209-45b9-a8da-b285f03920f8-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "bd27c473-5209-45b9-a8da-b285f03920f8" (UID: "bd27c473-5209-45b9-a8da-b285f03920f8"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 13:43:49 crc kubenswrapper[4732]: I0402 13:43:49.931025 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd27c473-5209-45b9-a8da-b285f03920f8-kube-api-access-mdbzf" (OuterVolumeSpecName: "kube-api-access-mdbzf") pod "bd27c473-5209-45b9-a8da-b285f03920f8" (UID: "bd27c473-5209-45b9-a8da-b285f03920f8"). InnerVolumeSpecName "kube-api-access-mdbzf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 13:43:50 crc kubenswrapper[4732]: I0402 13:43:50.027984 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdbzf\" (UniqueName: \"kubernetes.io/projected/bd27c473-5209-45b9-a8da-b285f03920f8-kube-api-access-mdbzf\") on node \"crc\" DevicePath \"\""
Apr 02 13:43:50 crc kubenswrapper[4732]: I0402 13:43:50.028017 4732 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bd27c473-5209-45b9-a8da-b285f03920f8-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\""
Apr 02 13:43:50 crc kubenswrapper[4732]: I0402 13:43:50.028026 4732 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/bd27c473-5209-45b9-a8da-b285f03920f8-ready\") on node \"crc\" DevicePath \"\""
Apr 02 13:43:50 crc kubenswrapper[4732]: I0402 13:43:50.449921 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-q7cfc_bd27c473-5209-45b9-a8da-b285f03920f8/kube-multus-additional-cni-plugins/0.log"
Apr 02 13:43:50 crc kubenswrapper[4732]: I0402 13:43:50.449992 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-q7cfc" event={"ID":"bd27c473-5209-45b9-a8da-b285f03920f8","Type":"ContainerDied","Data":"5e71134fa6603ed5e6cabcf01ae0a45a55bfeb58e31e8c3441ad80aee6ea2ca8"}
Apr 02 13:43:50 crc kubenswrapper[4732]: I0402 13:43:50.450044 4732 scope.go:117] "RemoveContainer" containerID="e68473d7cc0d08293a0f48d1224a39e782dd7b899994ad1d070d72a3c06dfc01"
Apr 02 13:43:50 crc kubenswrapper[4732]: I0402 13:43:50.450055 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-q7cfc"
Apr 02 13:43:51 crc kubenswrapper[4732]: I0402 13:43:51.093682 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Apr 02 13:43:51 crc kubenswrapper[4732]: I0402 13:43:51.097228 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Apr 02 13:43:56 crc kubenswrapper[4732]: I0402 13:43:56.671119 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Apr 02 13:43:56 crc kubenswrapper[4732]: I0402 13:43:56.769010 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Apr 02 13:43:57 crc kubenswrapper[4732]: I0402 13:43:57.035600 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Apr 02 13:43:57 crc kubenswrapper[4732]: I0402 13:43:57.046106 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Apr 02 13:43:57 crc kubenswrapper[4732]: I0402 13:43:57.316449 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Apr 02 13:43:57 crc kubenswrapper[4732]: I0402 13:43:57.703297 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Apr 02 13:43:57 crc kubenswrapper[4732]: I0402 13:43:57.730503 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Apr 02 13:43:57 crc kubenswrapper[4732]: I0402 13:43:57.973700 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Apr 02 13:43:58 crc kubenswrapper[4732]: I0402 13:43:58.016838 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Apr 02 13:43:58 crc kubenswrapper[4732]: I0402 13:43:58.127564 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Apr 02 13:43:58 crc kubenswrapper[4732]: I0402 13:43:58.491958 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Apr 02 13:43:58 crc kubenswrapper[4732]: I0402 13:43:58.538963 4732 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Apr 02 13:43:58 crc kubenswrapper[4732]: I0402 13:43:58.577087 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Apr 02 13:43:58 crc kubenswrapper[4732]: I0402 13:43:58.688498 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Apr 02 13:43:58 crc kubenswrapper[4732]: I0402 13:43:58.803632 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Apr 02 13:43:58 crc kubenswrapper[4732]: I0402 13:43:58.815704 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Apr 02 13:43:58 crc kubenswrapper[4732]: I0402 13:43:58.860941 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Apr 02 13:43:58 crc kubenswrapper[4732]: I0402 13:43:58.871466 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Apr 02 13:43:59 crc kubenswrapper[4732]: I0402 13:43:59.028264 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Apr 02 13:43:59 crc kubenswrapper[4732]: I0402 13:43:59.038246 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Apr 02 13:43:59 crc kubenswrapper[4732]: I0402 13:43:59.070008 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Apr 02 13:43:59 crc kubenswrapper[4732]: I0402 13:43:59.109580 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Apr 02 13:43:59 crc kubenswrapper[4732]: I0402 13:43:59.191323 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Apr 02 13:43:59 crc kubenswrapper[4732]: I0402 13:43:59.312290 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Apr 02 13:43:59 crc kubenswrapper[4732]: I0402 13:43:59.323041 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Apr 02 13:43:59 crc kubenswrapper[4732]: I0402 13:43:59.371418 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Apr 02 13:43:59 crc kubenswrapper[4732]: I0402 13:43:59.407004 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Apr 02 13:43:59 crc kubenswrapper[4732]: I0402 13:43:59.593336 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Apr 02 13:43:59 crc kubenswrapper[4732]: I0402 13:43:59.631424 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Apr 02 13:43:59 crc kubenswrapper[4732]: I0402 13:43:59.639452 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Apr 02 13:43:59 crc kubenswrapper[4732]: I0402 13:43:59.643323 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Apr 02 13:43:59 crc kubenswrapper[4732]: I0402 13:43:59.874588 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Apr 02 13:43:59 crc kubenswrapper[4732]: I0402 13:43:59.930859 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Apr 02 13:43:59 crc kubenswrapper[4732]: I0402 13:43:59.937686 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Apr 02 13:44:00 crc kubenswrapper[4732]: I0402 13:44:00.140469 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Apr 02 13:44:00 crc kubenswrapper[4732]: I0402 13:44:00.152178 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Apr 02 13:44:00 crc kubenswrapper[4732]: I0402 13:44:00.187121 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Apr 02 13:44:00 crc kubenswrapper[4732]: I0402 13:44:00.236264 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Apr 02 13:44:00 crc kubenswrapper[4732]: I0402 13:44:00.254344 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Apr 02 13:44:00 crc kubenswrapper[4732]: I0402 13:44:00.279493 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Apr 02 13:44:00 crc kubenswrapper[4732]: I0402 13:44:00.295841 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Apr 02 13:44:00 crc kubenswrapper[4732]: I0402 13:44:00.394149 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Apr 02 13:44:00 crc kubenswrapper[4732]: I0402 13:44:00.394268 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Apr 02 13:44:00 crc kubenswrapper[4732]: I0402 13:44:00.394358 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Apr 02 13:44:00 crc kubenswrapper[4732]: I0402 13:44:00.421042 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Apr 02 13:44:00 crc kubenswrapper[4732]: I0402 13:44:00.425810 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Apr 02 13:44:00 crc kubenswrapper[4732]: I0402 13:44:00.427723 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Apr 02 13:44:00 crc kubenswrapper[4732]: I0402 13:44:00.434947 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Apr 02 13:44:00 crc kubenswrapper[4732]: I0402 13:44:00.448016 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Apr 02 13:44:00 crc kubenswrapper[4732]: I0402 13:44:00.451097 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Apr 02 13:44:00 crc kubenswrapper[4732]: I0402 13:44:00.624873 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Apr 02 13:44:00 crc kubenswrapper[4732]: I0402 13:44:00.699254 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Apr 02 13:44:00 crc kubenswrapper[4732]: I0402 13:44:00.703402 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Apr 02 13:44:00 crc kubenswrapper[4732]: I0402 13:44:00.730905 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Apr 02 13:44:00 crc kubenswrapper[4732]: I0402 13:44:00.813203 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Apr 02 13:44:00 crc kubenswrapper[4732]: I0402 13:44:00.813744 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Apr 02 13:44:00 crc kubenswrapper[4732]: I0402 13:44:00.847545 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Apr 02 13:44:00 crc kubenswrapper[4732]: I0402 13:44:00.849868 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Apr 02 13:44:00 crc kubenswrapper[4732]: I0402 13:44:00.901257 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Apr 02 13:44:01 crc kubenswrapper[4732]: I0402 13:44:01.062547 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Apr 02 13:44:01 crc kubenswrapper[4732]: I0402 13:44:01.094819 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Apr 02 13:44:01 crc kubenswrapper[4732]: I0402 13:44:01.099768 4732 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Apr 02 13:44:01 crc kubenswrapper[4732]: I0402 13:44:01.100546 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=37.1005192 podStartE2EDuration="37.1005192s" podCreationTimestamp="2026-04-02 13:43:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:43:46.440148869 +0000 UTC m=+383.344556422" watchObservedRunningTime="2026-04-02 13:44:01.1005192 +0000 UTC m=+398.004926743"
Apr 02 13:44:01 crc kubenswrapper[4732]: I0402 13:44:01.106273 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-85455b8986-kh7t7","openshift-kube-apiserver/kube-apiserver-crc","openshift-multus/cni-sysctl-allowlist-ds-q7cfc","openshift-console/console-f9d7485db-dq9x9"]
Apr 02 13:44:01 crc kubenswrapper[4732]: I0402 13:44:01.106466 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Apr 02 13:44:01 crc kubenswrapper[4732]: I0402 13:44:01.111774 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Apr 02 13:44:01 crc kubenswrapper[4732]: I0402 13:44:01.124372 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=15.124353803 podStartE2EDuration="15.124353803s" podCreationTimestamp="2026-04-02 13:43:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:44:01.123684465 +0000 UTC m=+398.028092038" watchObservedRunningTime="2026-04-02 13:44:01.124353803 +0000 UTC m=+398.028761356"
Apr 02 13:44:01 crc kubenswrapper[4732]: I0402 13:44:01.129559 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Apr 02 13:44:01 crc kubenswrapper[4732]: I0402 13:44:01.232439 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Apr 02 13:44:01 crc kubenswrapper[4732]: I0402 13:44:01.277483 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Apr 02 13:44:01 crc kubenswrapper[4732]: I0402 13:44:01.333771 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Apr 02 13:44:01 crc kubenswrapper[4732]: I0402 13:44:01.397845 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Apr 02 13:44:01 crc kubenswrapper[4732]: I0402 13:44:01.480510 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Apr 02 13:44:01 crc kubenswrapper[4732]: I0402 13:44:01.488251 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Apr 02 13:44:01 crc kubenswrapper[4732]: I0402 13:44:01.523520 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Apr 02 13:44:01 crc kubenswrapper[4732]: I0402 13:44:01.530951 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Apr 02 13:44:01 crc kubenswrapper[4732]: I0402 13:44:01.582209 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Apr 02 13:44:01 crc kubenswrapper[4732]: I0402 13:44:01.583253 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Apr 02 13:44:01 crc kubenswrapper[4732]: I0402 13:44:01.593375 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Apr 02 13:44:01 crc kubenswrapper[4732]: I0402 13:44:01.645244 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Apr 02 13:44:01 crc kubenswrapper[4732]: I0402 13:44:01.679586 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Apr 02 13:44:01 crc kubenswrapper[4732]: I0402 13:44:01.696085 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Apr 02 13:44:01 crc kubenswrapper[4732]: I0402 13:44:01.756580 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Apr 02 13:44:01 crc kubenswrapper[4732]: I0402 13:44:01.791094 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Apr 02 13:44:01 crc kubenswrapper[4732]: I0402 13:44:01.799594 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Apr 02 13:44:01 crc kubenswrapper[4732]: I0402 13:44:01.824467 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Apr 02 13:44:01 crc kubenswrapper[4732]: I0402 13:44:01.880491 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Apr 02 13:44:01 crc kubenswrapper[4732]: I0402 13:44:01.976044 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Apr 02 13:44:02 crc kubenswrapper[4732]: I0402 13:44:02.040855 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Apr 02 13:44:02 crc kubenswrapper[4732]: I0402 13:44:02.078187 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Apr 02 13:44:02 crc kubenswrapper[4732]: I0402 13:44:02.078875 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Apr 02 13:44:02 crc kubenswrapper[4732]: I0402 13:44:02.093250 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Apr 02 13:44:02 crc kubenswrapper[4732]: I0402 13:44:02.097344 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Apr 02 13:44:02 crc kubenswrapper[4732]: I0402 13:44:02.132565 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Apr 02 13:44:02 crc kubenswrapper[4732]: I0402 13:44:02.148819 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Apr 02 13:44:02 crc kubenswrapper[4732]: I0402 13:44:02.195674 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Apr 02 13:44:02 crc kubenswrapper[4732]: I0402 13:44:02.213891 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Apr 02 13:44:02 crc kubenswrapper[4732]: I0402 13:44:02.248558 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Apr 02 13:44:02 crc kubenswrapper[4732]: I0402 13:44:02.263713 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Apr 02 13:44:02 crc kubenswrapper[4732]: I0402 13:44:02.279015 4732 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Apr 02 13:44:02 crc kubenswrapper[4732]: I0402 13:44:02.291768 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Apr 02 13:44:02 crc kubenswrapper[4732]: I0402 13:44:02.299883 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Apr 02 13:44:02 crc kubenswrapper[4732]: I0402 13:44:02.372541 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Apr 02 13:44:02 crc kubenswrapper[4732]: I0402 13:44:02.405562 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Apr 02 13:44:02 crc kubenswrapper[4732]: I0402 13:44:02.407193 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Apr 02 13:44:02 crc kubenswrapper[4732]: I0402 13:44:02.411255 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Apr 02 13:44:02 crc kubenswrapper[4732]: I0402 13:44:02.512506 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Apr 02 13:44:02 crc kubenswrapper[4732]: I0402 13:44:02.598843 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Apr 02 13:44:02 crc kubenswrapper[4732]: I0402 13:44:02.687475 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0664f762-59d2-4c95-8e8f-698af0b15611" path="/var/lib/kubelet/pods/0664f762-59d2-4c95-8e8f-698af0b15611/volumes"
Apr 02 13:44:02 crc kubenswrapper[4732]: I0402 13:44:02.688379 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d77c191-7d04-4381-838f-b7a355e7c2d4" path="/var/lib/kubelet/pods/4d77c191-7d04-4381-838f-b7a355e7c2d4/volumes"
Apr 02 13:44:02 crc kubenswrapper[4732]: I0402 13:44:02.689076 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd27c473-5209-45b9-a8da-b285f03920f8" path="/var/lib/kubelet/pods/bd27c473-5209-45b9-a8da-b285f03920f8/volumes"
Apr 02 13:44:02 crc kubenswrapper[4732]: I0402 13:44:02.730951 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Apr 02 13:44:02 crc kubenswrapper[4732]: I0402 13:44:02.740717 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Apr 02 13:44:02 crc kubenswrapper[4732]: I0402 13:44:02.825764 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Apr 02 13:44:02 crc kubenswrapper[4732]: I0402 13:44:02.905020 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Apr 02 13:44:02 crc kubenswrapper[4732]: I0402 13:44:02.983452 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Apr 02 13:44:03 crc kubenswrapper[4732]: I0402 13:44:03.026077 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Apr 02 13:44:03 crc kubenswrapper[4732]: I0402 13:44:03.081807 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Apr 02 13:44:03 crc kubenswrapper[4732]: I0402 13:44:03.305943 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Apr 02 13:44:03 crc kubenswrapper[4732]: I0402 13:44:03.420788 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Apr 02 13:44:03 crc kubenswrapper[4732]: I0402 13:44:03.442079 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Apr 02 13:44:03 crc kubenswrapper[4732]: I0402 13:44:03.457965 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Apr 02 13:44:03 crc kubenswrapper[4732]: I0402 13:44:03.489502 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Apr 02 13:44:03 crc kubenswrapper[4732]: I0402 13:44:03.548653 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Apr 02 13:44:03 crc kubenswrapper[4732]: I0402 13:44:03.595416 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Apr 02 13:44:03 crc kubenswrapper[4732]: I0402 13:44:03.627235 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Apr 02 13:44:03 crc kubenswrapper[4732]: I0402 13:44:03.649002 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Apr 02 13:44:03 crc kubenswrapper[4732]: I0402 13:44:03.737061 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Apr 02 13:44:03 crc kubenswrapper[4732]: I0402 13:44:03.758589 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Apr 02 13:44:03 crc kubenswrapper[4732]: I0402 13:44:03.776473 4732 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Apr 02 13:44:03 crc kubenswrapper[4732]: I0402 13:44:03.784695 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Apr 02 13:44:03 crc kubenswrapper[4732]: I0402 13:44:03.808693 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Apr 02 13:44:03 crc kubenswrapper[4732]: I0402 13:44:03.842768 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Apr 02 13:44:03 crc kubenswrapper[4732]: I0402 13:44:03.850438 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Apr 02 13:44:03 crc kubenswrapper[4732]: I0402 13:44:03.917258 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Apr 02 13:44:04 crc kubenswrapper[4732]: I0402 13:44:04.004791 4732 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Apr 02 13:44:04 crc kubenswrapper[4732]: I0402 13:44:04.022445 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Apr 02 13:44:04 crc kubenswrapper[4732]: I0402 13:44:04.064174 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Apr 02 13:44:04 crc kubenswrapper[4732]: I0402 13:44:04.127506 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Apr 02 13:44:04 crc kubenswrapper[4732]: I0402 13:44:04.250380 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Apr 02 13:44:04 crc kubenswrapper[4732]: I0402 13:44:04.263387 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Apr 02 13:44:04 crc kubenswrapper[4732]: I0402 13:44:04.305227 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Apr 02 13:44:04 crc kubenswrapper[4732]: I0402 13:44:04.321141 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Apr 02 13:44:04 crc kubenswrapper[4732]: I0402 13:44:04.342104 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Apr 02 13:44:04 crc kubenswrapper[4732]: I0402 13:44:04.405038 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Apr 02 13:44:04 crc kubenswrapper[4732]: I0402 13:44:04.533414 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Apr 02 13:44:04 crc kubenswrapper[4732]: I0402 13:44:04.543535 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Apr 02 13:44:04 crc kubenswrapper[4732]: I0402 13:44:04.567709 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Apr 02 13:44:04 crc kubenswrapper[4732]: I0402 13:44:04.582077 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Apr 02
13:44:04.650098 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Apr 02 13:44:04 crc kubenswrapper[4732]: I0402 13:44:04.732085 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Apr 02 13:44:04 crc kubenswrapper[4732]: I0402 13:44:04.777405 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Apr 02 13:44:04 crc kubenswrapper[4732]: I0402 13:44:04.857320 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Apr 02 13:44:04 crc kubenswrapper[4732]: I0402 13:44:04.925080 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Apr 02 13:44:04 crc kubenswrapper[4732]: I0402 13:44:04.988558 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Apr 02 13:44:04 crc kubenswrapper[4732]: I0402 13:44:04.996376 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Apr 02 13:44:05 crc kubenswrapper[4732]: I0402 13:44:05.000012 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Apr 02 13:44:05 crc kubenswrapper[4732]: I0402 13:44:05.072024 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Apr 02 13:44:05 crc kubenswrapper[4732]: I0402 13:44:05.142738 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Apr 02 13:44:05 crc kubenswrapper[4732]: I0402 13:44:05.152448 4732 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"client-ca" Apr 02 13:44:05 crc kubenswrapper[4732]: I0402 13:44:05.155712 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Apr 02 13:44:05 crc kubenswrapper[4732]: I0402 13:44:05.259115 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Apr 02 13:44:05 crc kubenswrapper[4732]: I0402 13:44:05.314330 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Apr 02 13:44:05 crc kubenswrapper[4732]: I0402 13:44:05.546087 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Apr 02 13:44:05 crc kubenswrapper[4732]: I0402 13:44:05.610377 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Apr 02 13:44:05 crc kubenswrapper[4732]: I0402 13:44:05.634711 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Apr 02 13:44:05 crc kubenswrapper[4732]: I0402 13:44:05.716227 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Apr 02 13:44:05 crc kubenswrapper[4732]: I0402 13:44:05.770161 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Apr 02 13:44:05 crc kubenswrapper[4732]: I0402 13:44:05.930997 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Apr 02 13:44:05 crc kubenswrapper[4732]: I0402 13:44:05.951237 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Apr 02 13:44:05 crc kubenswrapper[4732]: I0402 13:44:05.975424 4732 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Apr 02 13:44:06 crc kubenswrapper[4732]: I0402 13:44:06.234282 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Apr 02 13:44:06 crc kubenswrapper[4732]: I0402 13:44:06.371684 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Apr 02 13:44:06 crc kubenswrapper[4732]: I0402 13:44:06.375711 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Apr 02 13:44:06 crc kubenswrapper[4732]: I0402 13:44:06.440708 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Apr 02 13:44:06 crc kubenswrapper[4732]: I0402 13:44:06.455972 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Apr 02 13:44:06 crc kubenswrapper[4732]: I0402 13:44:06.559292 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Apr 02 13:44:06 crc kubenswrapper[4732]: I0402 13:44:06.563651 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Apr 02 13:44:06 crc kubenswrapper[4732]: I0402 13:44:06.579986 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Apr 02 13:44:06 crc kubenswrapper[4732]: I0402 13:44:06.602778 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Apr 02 13:44:06 crc kubenswrapper[4732]: I0402 13:44:06.603905 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Apr 02 13:44:06 crc kubenswrapper[4732]: I0402 13:44:06.705272 4732 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Apr 02 13:44:06 crc kubenswrapper[4732]: I0402 13:44:06.710080 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Apr 02 13:44:06 crc kubenswrapper[4732]: I0402 13:44:06.767756 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Apr 02 13:44:06 crc kubenswrapper[4732]: I0402 13:44:06.895992 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Apr 02 13:44:06 crc kubenswrapper[4732]: I0402 13:44:06.910115 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Apr 02 13:44:06 crc kubenswrapper[4732]: I0402 13:44:06.948471 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Apr 02 13:44:07 crc kubenswrapper[4732]: I0402 13:44:07.073281 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Apr 02 13:44:07 crc kubenswrapper[4732]: I0402 13:44:07.128447 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Apr 02 13:44:07 crc kubenswrapper[4732]: I0402 13:44:07.154329 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Apr 02 13:44:07 crc kubenswrapper[4732]: I0402 13:44:07.161265 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Apr 02 13:44:07 crc kubenswrapper[4732]: I0402 13:44:07.217419 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Apr 02 13:44:07 crc kubenswrapper[4732]: I0402 13:44:07.221308 4732 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Apr 02 13:44:07 crc kubenswrapper[4732]: I0402 13:44:07.406120 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Apr 02 13:44:07 crc kubenswrapper[4732]: I0402 13:44:07.413828 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Apr 02 13:44:07 crc kubenswrapper[4732]: I0402 13:44:07.492973 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Apr 02 13:44:07 crc kubenswrapper[4732]: I0402 13:44:07.514353 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Apr 02 13:44:07 crc kubenswrapper[4732]: I0402 13:44:07.622415 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Apr 02 13:44:07 crc kubenswrapper[4732]: I0402 13:44:07.733329 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Apr 02 13:44:07 crc kubenswrapper[4732]: I0402 13:44:07.775594 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Apr 02 13:44:07 crc kubenswrapper[4732]: I0402 13:44:07.813041 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Apr 02 13:44:07 crc kubenswrapper[4732]: I0402 13:44:07.851077 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Apr 02 13:44:07 crc kubenswrapper[4732]: I0402 13:44:07.890950 4732 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Apr 02 13:44:07 crc kubenswrapper[4732]: I0402 13:44:07.948005 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Apr 02 13:44:08 crc kubenswrapper[4732]: I0402 13:44:08.087099 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Apr 02 13:44:08 crc kubenswrapper[4732]: I0402 13:44:08.118031 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Apr 02 13:44:08 crc kubenswrapper[4732]: I0402 13:44:08.253357 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Apr 02 13:44:08 crc kubenswrapper[4732]: I0402 13:44:08.267568 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Apr 02 13:44:08 crc kubenswrapper[4732]: I0402 13:44:08.300191 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Apr 02 13:44:08 crc kubenswrapper[4732]: I0402 13:44:08.321388 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Apr 02 13:44:08 crc kubenswrapper[4732]: I0402 13:44:08.395033 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Apr 02 13:44:08 crc kubenswrapper[4732]: I0402 13:44:08.471972 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Apr 02 13:44:08 crc kubenswrapper[4732]: I0402 13:44:08.492242 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Apr 02 13:44:08 crc kubenswrapper[4732]: I0402 13:44:08.556285 4732 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-api"/"kube-rbac-proxy" Apr 02 13:44:08 crc kubenswrapper[4732]: I0402 13:44:08.604919 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Apr 02 13:44:08 crc kubenswrapper[4732]: I0402 13:44:08.647867 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Apr 02 13:44:08 crc kubenswrapper[4732]: I0402 13:44:08.725148 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Apr 02 13:44:08 crc kubenswrapper[4732]: I0402 13:44:08.751262 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Apr 02 13:44:08 crc kubenswrapper[4732]: I0402 13:44:08.810201 4732 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Apr 02 13:44:08 crc kubenswrapper[4732]: I0402 13:44:08.810554 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://96840158fe7da1d384adaa2477702b2ebc8af51caba2dfb7559ac028fbfe2523" gracePeriod=5 Apr 02 13:44:08 crc kubenswrapper[4732]: I0402 13:44:08.817464 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Apr 02 13:44:08 crc kubenswrapper[4732]: I0402 13:44:08.866043 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Apr 02 13:44:08 crc kubenswrapper[4732]: I0402 13:44:08.903645 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Apr 02 13:44:08 crc kubenswrapper[4732]: I0402 13:44:08.906079 4732 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Apr 02 13:44:08 crc kubenswrapper[4732]: I0402 13:44:08.914211 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Apr 02 13:44:08 crc kubenswrapper[4732]: I0402 13:44:08.923963 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Apr 02 13:44:09 crc kubenswrapper[4732]: I0402 13:44:09.154326 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Apr 02 13:44:09 crc kubenswrapper[4732]: I0402 13:44:09.279575 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Apr 02 13:44:09 crc kubenswrapper[4732]: I0402 13:44:09.334146 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Apr 02 13:44:09 crc kubenswrapper[4732]: I0402 13:44:09.369898 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Apr 02 13:44:09 crc kubenswrapper[4732]: I0402 13:44:09.455968 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Apr 02 13:44:09 crc kubenswrapper[4732]: I0402 13:44:09.536688 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Apr 02 13:44:09 crc kubenswrapper[4732]: I0402 13:44:09.702429 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Apr 02 13:44:09 crc kubenswrapper[4732]: I0402 13:44:09.925408 4732 reflector.go:368] Caches populated for *v1.Secret from 
object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Apr 02 13:44:09 crc kubenswrapper[4732]: I0402 13:44:09.930871 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Apr 02 13:44:09 crc kubenswrapper[4732]: I0402 13:44:09.994913 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Apr 02 13:44:10 crc kubenswrapper[4732]: I0402 13:44:10.131001 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Apr 02 13:44:10 crc kubenswrapper[4732]: I0402 13:44:10.248995 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Apr 02 13:44:10 crc kubenswrapper[4732]: I0402 13:44:10.357540 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Apr 02 13:44:10 crc kubenswrapper[4732]: I0402 13:44:10.368044 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Apr 02 13:44:10 crc kubenswrapper[4732]: I0402 13:44:10.485380 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Apr 02 13:44:10 crc kubenswrapper[4732]: I0402 13:44:10.518837 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Apr 02 13:44:10 crc kubenswrapper[4732]: I0402 13:44:10.530731 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Apr 02 13:44:10 crc kubenswrapper[4732]: I0402 13:44:10.535816 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Apr 02 
13:44:10 crc kubenswrapper[4732]: I0402 13:44:10.812607 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Apr 02 13:44:10 crc kubenswrapper[4732]: I0402 13:44:10.858104 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Apr 02 13:44:10 crc kubenswrapper[4732]: I0402 13:44:10.942173 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29585624-8cm2s"] Apr 02 13:44:10 crc kubenswrapper[4732]: E0402 13:44:10.942409 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd27c473-5209-45b9-a8da-b285f03920f8" containerName="kube-multus-additional-cni-plugins" Apr 02 13:44:10 crc kubenswrapper[4732]: I0402 13:44:10.942423 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd27c473-5209-45b9-a8da-b285f03920f8" containerName="kube-multus-additional-cni-plugins" Apr 02 13:44:10 crc kubenswrapper[4732]: E0402 13:44:10.942437 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="536912b9-ad03-42a4-bce9-754227ecbf82" containerName="installer" Apr 02 13:44:10 crc kubenswrapper[4732]: I0402 13:44:10.942444 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="536912b9-ad03-42a4-bce9-754227ecbf82" containerName="installer" Apr 02 13:44:10 crc kubenswrapper[4732]: E0402 13:44:10.942456 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85581866-f172-4f4e-8805-55c1f175201d" containerName="installer" Apr 02 13:44:10 crc kubenswrapper[4732]: I0402 13:44:10.942463 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="85581866-f172-4f4e-8805-55c1f175201d" containerName="installer" Apr 02 13:44:10 crc kubenswrapper[4732]: E0402 13:44:10.942474 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0664f762-59d2-4c95-8e8f-698af0b15611" containerName="registry" Apr 02 13:44:10 crc kubenswrapper[4732]: I0402 
13:44:10.942481 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="0664f762-59d2-4c95-8e8f-698af0b15611" containerName="registry" Apr 02 13:44:10 crc kubenswrapper[4732]: E0402 13:44:10.942503 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfc68032-3ed5-47ad-adde-2d764f4b8868" containerName="pruner" Apr 02 13:44:10 crc kubenswrapper[4732]: I0402 13:44:10.942512 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfc68032-3ed5-47ad-adde-2d764f4b8868" containerName="pruner" Apr 02 13:44:10 crc kubenswrapper[4732]: E0402 13:44:10.942523 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d77c191-7d04-4381-838f-b7a355e7c2d4" containerName="console" Apr 02 13:44:10 crc kubenswrapper[4732]: I0402 13:44:10.942531 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d77c191-7d04-4381-838f-b7a355e7c2d4" containerName="console" Apr 02 13:44:10 crc kubenswrapper[4732]: E0402 13:44:10.942546 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Apr 02 13:44:10 crc kubenswrapper[4732]: I0402 13:44:10.942553 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Apr 02 13:44:10 crc kubenswrapper[4732]: I0402 13:44:10.942695 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="85581866-f172-4f4e-8805-55c1f175201d" containerName="installer" Apr 02 13:44:10 crc kubenswrapper[4732]: I0402 13:44:10.942709 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="536912b9-ad03-42a4-bce9-754227ecbf82" containerName="installer" Apr 02 13:44:10 crc kubenswrapper[4732]: I0402 13:44:10.942723 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="0664f762-59d2-4c95-8e8f-698af0b15611" containerName="registry" Apr 02 13:44:10 crc kubenswrapper[4732]: I0402 13:44:10.942731 4732 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="dfc68032-3ed5-47ad-adde-2d764f4b8868" containerName="pruner" Apr 02 13:44:10 crc kubenswrapper[4732]: I0402 13:44:10.942738 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd27c473-5209-45b9-a8da-b285f03920f8" containerName="kube-multus-additional-cni-plugins" Apr 02 13:44:10 crc kubenswrapper[4732]: I0402 13:44:10.942754 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d77c191-7d04-4381-838f-b7a355e7c2d4" containerName="console" Apr 02 13:44:10 crc kubenswrapper[4732]: I0402 13:44:10.942765 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Apr 02 13:44:10 crc kubenswrapper[4732]: I0402 13:44:10.943160 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585624-8cm2s" Apr 02 13:44:10 crc kubenswrapper[4732]: I0402 13:44:10.944917 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-69g42" Apr 02 13:44:10 crc kubenswrapper[4732]: I0402 13:44:10.945102 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 02 13:44:10 crc kubenswrapper[4732]: I0402 13:44:10.945470 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 02 13:44:10 crc kubenswrapper[4732]: I0402 13:44:10.954201 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585624-8cm2s"] Apr 02 13:44:10 crc kubenswrapper[4732]: I0402 13:44:10.997789 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r86kc\" (UniqueName: \"kubernetes.io/projected/18370e02-d95b-4be8-a495-ee908136bee4-kube-api-access-r86kc\") pod \"auto-csr-approver-29585624-8cm2s\" (UID: \"18370e02-d95b-4be8-a495-ee908136bee4\") " 
pod="openshift-infra/auto-csr-approver-29585624-8cm2s" Apr 02 13:44:10 crc kubenswrapper[4732]: I0402 13:44:10.997897 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Apr 02 13:44:11 crc kubenswrapper[4732]: I0402 13:44:11.098856 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r86kc\" (UniqueName: \"kubernetes.io/projected/18370e02-d95b-4be8-a495-ee908136bee4-kube-api-access-r86kc\") pod \"auto-csr-approver-29585624-8cm2s\" (UID: \"18370e02-d95b-4be8-a495-ee908136bee4\") " pod="openshift-infra/auto-csr-approver-29585624-8cm2s" Apr 02 13:44:11 crc kubenswrapper[4732]: I0402 13:44:11.117305 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Apr 02 13:44:11 crc kubenswrapper[4732]: I0402 13:44:11.124043 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r86kc\" (UniqueName: \"kubernetes.io/projected/18370e02-d95b-4be8-a495-ee908136bee4-kube-api-access-r86kc\") pod \"auto-csr-approver-29585624-8cm2s\" (UID: \"18370e02-d95b-4be8-a495-ee908136bee4\") " pod="openshift-infra/auto-csr-approver-29585624-8cm2s" Apr 02 13:44:11 crc kubenswrapper[4732]: I0402 13:44:11.245280 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Apr 02 13:44:11 crc kubenswrapper[4732]: I0402 13:44:11.309524 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29585624-8cm2s" Apr 02 13:44:11 crc kubenswrapper[4732]: I0402 13:44:11.447071 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Apr 02 13:44:11 crc kubenswrapper[4732]: I0402 13:44:11.495945 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Apr 02 13:44:11 crc kubenswrapper[4732]: I0402 13:44:11.571826 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Apr 02 13:44:11 crc kubenswrapper[4732]: I0402 13:44:11.733541 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585624-8cm2s"] Apr 02 13:44:11 crc kubenswrapper[4732]: I0402 13:44:11.761977 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Apr 02 13:44:11 crc kubenswrapper[4732]: I0402 13:44:11.950247 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Apr 02 13:44:11 crc kubenswrapper[4732]: I0402 13:44:11.957664 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Apr 02 13:44:12 crc kubenswrapper[4732]: I0402 13:44:12.032261 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Apr 02 13:44:12 crc kubenswrapper[4732]: I0402 13:44:12.552828 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Apr 02 13:44:12 crc kubenswrapper[4732]: I0402 13:44:12.576119 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585624-8cm2s" 
event={"ID":"18370e02-d95b-4be8-a495-ee908136bee4","Type":"ContainerStarted","Data":"497dda75c58ca9f4041a7ed3f58fcc286c52be01f9983b98e052dd6d34fa0be9"} Apr 02 13:44:13 crc kubenswrapper[4732]: I0402 13:44:13.513119 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Apr 02 13:44:13 crc kubenswrapper[4732]: I0402 13:44:13.583182 4732 generic.go:334] "Generic (PLEG): container finished" podID="18370e02-d95b-4be8-a495-ee908136bee4" containerID="146571f443aca9aede42e319eb68c0aad3e5dca2be8b8406fde6882547e09f0a" exitCode=0 Apr 02 13:44:13 crc kubenswrapper[4732]: I0402 13:44:13.583228 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585624-8cm2s" event={"ID":"18370e02-d95b-4be8-a495-ee908136bee4","Type":"ContainerDied","Data":"146571f443aca9aede42e319eb68c0aad3e5dca2be8b8406fde6882547e09f0a"} Apr 02 13:44:14 crc kubenswrapper[4732]: I0402 13:44:14.398827 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Apr 02 13:44:14 crc kubenswrapper[4732]: I0402 13:44:14.398914 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 02 13:44:14 crc kubenswrapper[4732]: I0402 13:44:14.542403 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Apr 02 13:44:14 crc kubenswrapper[4732]: I0402 13:44:14.543028 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Apr 02 13:44:14 crc kubenswrapper[4732]: I0402 13:44:14.543165 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Apr 02 13:44:14 crc kubenswrapper[4732]: I0402 13:44:14.543288 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Apr 02 13:44:14 crc kubenswrapper[4732]: I0402 13:44:14.543443 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Apr 02 13:44:14 crc kubenswrapper[4732]: I0402 13:44:14.542573 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: 
"var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 13:44:14 crc kubenswrapper[4732]: I0402 13:44:14.543264 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 13:44:14 crc kubenswrapper[4732]: I0402 13:44:14.543376 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 13:44:14 crc kubenswrapper[4732]: I0402 13:44:14.543483 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 13:44:14 crc kubenswrapper[4732]: I0402 13:44:14.544032 4732 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Apr 02 13:44:14 crc kubenswrapper[4732]: I0402 13:44:14.544124 4732 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Apr 02 13:44:14 crc kubenswrapper[4732]: I0402 13:44:14.544195 4732 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Apr 02 13:44:14 crc kubenswrapper[4732]: I0402 13:44:14.544260 4732 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Apr 02 13:44:14 crc kubenswrapper[4732]: I0402 13:44:14.551542 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 13:44:14 crc kubenswrapper[4732]: I0402 13:44:14.590228 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Apr 02 13:44:14 crc kubenswrapper[4732]: I0402 13:44:14.590481 4732 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="96840158fe7da1d384adaa2477702b2ebc8af51caba2dfb7559ac028fbfe2523" exitCode=137 Apr 02 13:44:14 crc kubenswrapper[4732]: I0402 13:44:14.590559 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 02 13:44:14 crc kubenswrapper[4732]: I0402 13:44:14.590571 4732 scope.go:117] "RemoveContainer" containerID="96840158fe7da1d384adaa2477702b2ebc8af51caba2dfb7559ac028fbfe2523" Apr 02 13:44:14 crc kubenswrapper[4732]: I0402 13:44:14.610974 4732 scope.go:117] "RemoveContainer" containerID="96840158fe7da1d384adaa2477702b2ebc8af51caba2dfb7559ac028fbfe2523" Apr 02 13:44:14 crc kubenswrapper[4732]: E0402 13:44:14.611479 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96840158fe7da1d384adaa2477702b2ebc8af51caba2dfb7559ac028fbfe2523\": container with ID starting with 96840158fe7da1d384adaa2477702b2ebc8af51caba2dfb7559ac028fbfe2523 not found: ID does not exist" containerID="96840158fe7da1d384adaa2477702b2ebc8af51caba2dfb7559ac028fbfe2523" Apr 02 13:44:14 crc kubenswrapper[4732]: I0402 13:44:14.611514 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96840158fe7da1d384adaa2477702b2ebc8af51caba2dfb7559ac028fbfe2523"} err="failed to get container status \"96840158fe7da1d384adaa2477702b2ebc8af51caba2dfb7559ac028fbfe2523\": rpc error: code = NotFound desc = could not find container 
\"96840158fe7da1d384adaa2477702b2ebc8af51caba2dfb7559ac028fbfe2523\": container with ID starting with 96840158fe7da1d384adaa2477702b2ebc8af51caba2dfb7559ac028fbfe2523 not found: ID does not exist" Apr 02 13:44:14 crc kubenswrapper[4732]: I0402 13:44:14.645342 4732 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Apr 02 13:44:14 crc kubenswrapper[4732]: I0402 13:44:14.697665 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Apr 02 13:44:14 crc kubenswrapper[4732]: I0402 13:44:14.697989 4732 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Apr 02 13:44:14 crc kubenswrapper[4732]: I0402 13:44:14.712227 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Apr 02 13:44:14 crc kubenswrapper[4732]: I0402 13:44:14.712267 4732 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="d38707a4-61e3-47f1-8302-b252e4603a1e" Apr 02 13:44:14 crc kubenswrapper[4732]: I0402 13:44:14.717497 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Apr 02 13:44:14 crc kubenswrapper[4732]: I0402 13:44:14.717539 4732 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="d38707a4-61e3-47f1-8302-b252e4603a1e" Apr 02 13:44:14 crc kubenswrapper[4732]: I0402 13:44:14.851079 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29585624-8cm2s" Apr 02 13:44:14 crc kubenswrapper[4732]: I0402 13:44:14.952436 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r86kc\" (UniqueName: \"kubernetes.io/projected/18370e02-d95b-4be8-a495-ee908136bee4-kube-api-access-r86kc\") pod \"18370e02-d95b-4be8-a495-ee908136bee4\" (UID: \"18370e02-d95b-4be8-a495-ee908136bee4\") " Apr 02 13:44:14 crc kubenswrapper[4732]: I0402 13:44:14.955707 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18370e02-d95b-4be8-a495-ee908136bee4-kube-api-access-r86kc" (OuterVolumeSpecName: "kube-api-access-r86kc") pod "18370e02-d95b-4be8-a495-ee908136bee4" (UID: "18370e02-d95b-4be8-a495-ee908136bee4"). InnerVolumeSpecName "kube-api-access-r86kc". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:44:15 crc kubenswrapper[4732]: I0402 13:44:15.053890 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r86kc\" (UniqueName: \"kubernetes.io/projected/18370e02-d95b-4be8-a495-ee908136bee4-kube-api-access-r86kc\") on node \"crc\" DevicePath \"\"" Apr 02 13:44:15 crc kubenswrapper[4732]: I0402 13:44:15.598640 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585624-8cm2s" event={"ID":"18370e02-d95b-4be8-a495-ee908136bee4","Type":"ContainerDied","Data":"497dda75c58ca9f4041a7ed3f58fcc286c52be01f9983b98e052dd6d34fa0be9"} Apr 02 13:44:15 crc kubenswrapper[4732]: I0402 13:44:15.598682 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="497dda75c58ca9f4041a7ed3f58fcc286c52be01f9983b98e052dd6d34fa0be9" Apr 02 13:44:15 crc kubenswrapper[4732]: I0402 13:44:15.598727 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29585624-8cm2s" Apr 02 13:44:18 crc kubenswrapper[4732]: I0402 13:44:18.725011 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Apr 02 13:44:18 crc kubenswrapper[4732]: I0402 13:44:18.727866 4732 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="e0654c6f-b103-4146-b591-b2acee4900a9" Apr 02 13:44:18 crc kubenswrapper[4732]: I0402 13:44:18.728013 4732 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="e0654c6f-b103-4146-b591-b2acee4900a9" Apr 02 13:44:18 crc kubenswrapper[4732]: I0402 13:44:18.737744 4732 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Apr 02 13:44:18 crc kubenswrapper[4732]: I0402 13:44:18.743103 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Apr 02 13:44:18 crc kubenswrapper[4732]: I0402 13:44:18.747971 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Apr 02 13:44:18 crc kubenswrapper[4732]: I0402 13:44:18.764584 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0654c6f-b103-4146-b591-b2acee4900a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:44:18Z\\\",\\\"message\\\":null,\\\"reason\\\":null,\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:44:18Z\\\",\\\"message\\\":null,\\\"reason\\\":null,\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a06c8fb9cbb6779ec889350af216d97764fbc5b1426de87ec0d07df751528119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca9c3b985666916a34338e9f991c3bea2b33029e002346a256d5f70d34c5a009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\"
:true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e4dabd56041117649788f504b9c27c19def6d0699d4cb6d55cd4d16aed89a13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}]}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Pod \"openshift-kube-scheduler-crc\" is invalid: metadata.uid: Invalid value: \"e0654c6f-b103-4146-b591-b2acee4900a9\": field is immutable" Apr 02 13:44:18 crc kubenswrapper[4732]: I0402 13:44:18.768369 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Apr 02 13:44:19 crc kubenswrapper[4732]: I0402 13:44:19.621847 4732 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="e0654c6f-b103-4146-b591-b2acee4900a9" Apr 02 13:44:19 crc kubenswrapper[4732]: I0402 13:44:19.621883 4732 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="e0654c6f-b103-4146-b591-b2acee4900a9" Apr 02 13:44:24 crc kubenswrapper[4732]: I0402 
13:44:24.701496 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=6.701476282 podStartE2EDuration="6.701476282s" podCreationTimestamp="2026-04-02 13:44:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:44:24.697363721 +0000 UTC m=+421.601771284" watchObservedRunningTime="2026-04-02 13:44:24.701476282 +0000 UTC m=+421.605883835" Apr 02 13:44:29 crc kubenswrapper[4732]: I0402 13:44:29.805601 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Apr 02 13:44:34 crc kubenswrapper[4732]: I0402 13:44:34.675890 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Apr 02 13:44:46 crc kubenswrapper[4732]: I0402 13:44:46.901696 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Apr 02 13:44:47 crc kubenswrapper[4732]: I0402 13:44:47.959286 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vgjz4"] Apr 02 13:44:47 crc kubenswrapper[4732]: I0402 13:44:47.959826 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vgjz4" podUID="50d43b2d-24ec-439f-a418-3673791eb1b1" containerName="registry-server" containerID="cri-o://ed364b40491f81b7c45fc61749d3ea7a1c9489cbbc60f255d7178b263131dda6" gracePeriod=2 Apr 02 13:44:48 crc kubenswrapper[4732]: I0402 13:44:48.385219 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vgjz4" Apr 02 13:44:48 crc kubenswrapper[4732]: I0402 13:44:48.447568 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50d43b2d-24ec-439f-a418-3673791eb1b1-utilities\") pod \"50d43b2d-24ec-439f-a418-3673791eb1b1\" (UID: \"50d43b2d-24ec-439f-a418-3673791eb1b1\") " Apr 02 13:44:48 crc kubenswrapper[4732]: I0402 13:44:48.447808 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxmng\" (UniqueName: \"kubernetes.io/projected/50d43b2d-24ec-439f-a418-3673791eb1b1-kube-api-access-wxmng\") pod \"50d43b2d-24ec-439f-a418-3673791eb1b1\" (UID: \"50d43b2d-24ec-439f-a418-3673791eb1b1\") " Apr 02 13:44:48 crc kubenswrapper[4732]: I0402 13:44:48.447859 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50d43b2d-24ec-439f-a418-3673791eb1b1-catalog-content\") pod \"50d43b2d-24ec-439f-a418-3673791eb1b1\" (UID: \"50d43b2d-24ec-439f-a418-3673791eb1b1\") " Apr 02 13:44:48 crc kubenswrapper[4732]: I0402 13:44:48.449679 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50d43b2d-24ec-439f-a418-3673791eb1b1-utilities" (OuterVolumeSpecName: "utilities") pod "50d43b2d-24ec-439f-a418-3673791eb1b1" (UID: "50d43b2d-24ec-439f-a418-3673791eb1b1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 13:44:48 crc kubenswrapper[4732]: I0402 13:44:48.453231 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50d43b2d-24ec-439f-a418-3673791eb1b1-kube-api-access-wxmng" (OuterVolumeSpecName: "kube-api-access-wxmng") pod "50d43b2d-24ec-439f-a418-3673791eb1b1" (UID: "50d43b2d-24ec-439f-a418-3673791eb1b1"). InnerVolumeSpecName "kube-api-access-wxmng". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:44:48 crc kubenswrapper[4732]: I0402 13:44:48.549826 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50d43b2d-24ec-439f-a418-3673791eb1b1-utilities\") on node \"crc\" DevicePath \"\"" Apr 02 13:44:48 crc kubenswrapper[4732]: I0402 13:44:48.549886 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxmng\" (UniqueName: \"kubernetes.io/projected/50d43b2d-24ec-439f-a418-3673791eb1b1-kube-api-access-wxmng\") on node \"crc\" DevicePath \"\"" Apr 02 13:44:48 crc kubenswrapper[4732]: I0402 13:44:48.586199 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50d43b2d-24ec-439f-a418-3673791eb1b1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "50d43b2d-24ec-439f-a418-3673791eb1b1" (UID: "50d43b2d-24ec-439f-a418-3673791eb1b1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 13:44:48 crc kubenswrapper[4732]: I0402 13:44:48.651019 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50d43b2d-24ec-439f-a418-3673791eb1b1-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 02 13:44:48 crc kubenswrapper[4732]: I0402 13:44:48.785523 4732 generic.go:334] "Generic (PLEG): container finished" podID="50d43b2d-24ec-439f-a418-3673791eb1b1" containerID="ed364b40491f81b7c45fc61749d3ea7a1c9489cbbc60f255d7178b263131dda6" exitCode=0 Apr 02 13:44:48 crc kubenswrapper[4732]: I0402 13:44:48.785582 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vgjz4" event={"ID":"50d43b2d-24ec-439f-a418-3673791eb1b1","Type":"ContainerDied","Data":"ed364b40491f81b7c45fc61749d3ea7a1c9489cbbc60f255d7178b263131dda6"} Apr 02 13:44:48 crc kubenswrapper[4732]: I0402 13:44:48.785644 4732 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-vgjz4" event={"ID":"50d43b2d-24ec-439f-a418-3673791eb1b1","Type":"ContainerDied","Data":"c72853c3883ddf064d1087c7a54a67bc5173e895763e7bbc0616ecb63dbd4ecc"} Apr 02 13:44:48 crc kubenswrapper[4732]: I0402 13:44:48.785680 4732 scope.go:117] "RemoveContainer" containerID="ed364b40491f81b7c45fc61749d3ea7a1c9489cbbc60f255d7178b263131dda6" Apr 02 13:44:48 crc kubenswrapper[4732]: I0402 13:44:48.785853 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vgjz4" Apr 02 13:44:48 crc kubenswrapper[4732]: I0402 13:44:48.808637 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vgjz4"] Apr 02 13:44:48 crc kubenswrapper[4732]: I0402 13:44:48.811175 4732 scope.go:117] "RemoveContainer" containerID="90cf6284bc60f80e12333f177cf852a10f14c55a6bb7da1c6ad61c3b50212e39" Apr 02 13:44:48 crc kubenswrapper[4732]: I0402 13:44:48.813355 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vgjz4"] Apr 02 13:44:48 crc kubenswrapper[4732]: I0402 13:44:48.828924 4732 scope.go:117] "RemoveContainer" containerID="8e4ab18fa6d5568cc39926566d16f55002ef34dcf5f6d4f7c14f444d167dcb69" Apr 02 13:44:48 crc kubenswrapper[4732]: I0402 13:44:48.848317 4732 scope.go:117] "RemoveContainer" containerID="ed364b40491f81b7c45fc61749d3ea7a1c9489cbbc60f255d7178b263131dda6" Apr 02 13:44:48 crc kubenswrapper[4732]: E0402 13:44:48.848664 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed364b40491f81b7c45fc61749d3ea7a1c9489cbbc60f255d7178b263131dda6\": container with ID starting with ed364b40491f81b7c45fc61749d3ea7a1c9489cbbc60f255d7178b263131dda6 not found: ID does not exist" containerID="ed364b40491f81b7c45fc61749d3ea7a1c9489cbbc60f255d7178b263131dda6" Apr 02 13:44:48 crc kubenswrapper[4732]: I0402 13:44:48.848700 4732 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed364b40491f81b7c45fc61749d3ea7a1c9489cbbc60f255d7178b263131dda6"} err="failed to get container status \"ed364b40491f81b7c45fc61749d3ea7a1c9489cbbc60f255d7178b263131dda6\": rpc error: code = NotFound desc = could not find container \"ed364b40491f81b7c45fc61749d3ea7a1c9489cbbc60f255d7178b263131dda6\": container with ID starting with ed364b40491f81b7c45fc61749d3ea7a1c9489cbbc60f255d7178b263131dda6 not found: ID does not exist" Apr 02 13:44:48 crc kubenswrapper[4732]: I0402 13:44:48.848727 4732 scope.go:117] "RemoveContainer" containerID="90cf6284bc60f80e12333f177cf852a10f14c55a6bb7da1c6ad61c3b50212e39" Apr 02 13:44:48 crc kubenswrapper[4732]: E0402 13:44:48.848945 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90cf6284bc60f80e12333f177cf852a10f14c55a6bb7da1c6ad61c3b50212e39\": container with ID starting with 90cf6284bc60f80e12333f177cf852a10f14c55a6bb7da1c6ad61c3b50212e39 not found: ID does not exist" containerID="90cf6284bc60f80e12333f177cf852a10f14c55a6bb7da1c6ad61c3b50212e39" Apr 02 13:44:48 crc kubenswrapper[4732]: I0402 13:44:48.848971 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90cf6284bc60f80e12333f177cf852a10f14c55a6bb7da1c6ad61c3b50212e39"} err="failed to get container status \"90cf6284bc60f80e12333f177cf852a10f14c55a6bb7da1c6ad61c3b50212e39\": rpc error: code = NotFound desc = could not find container \"90cf6284bc60f80e12333f177cf852a10f14c55a6bb7da1c6ad61c3b50212e39\": container with ID starting with 90cf6284bc60f80e12333f177cf852a10f14c55a6bb7da1c6ad61c3b50212e39 not found: ID does not exist" Apr 02 13:44:48 crc kubenswrapper[4732]: I0402 13:44:48.848983 4732 scope.go:117] "RemoveContainer" containerID="8e4ab18fa6d5568cc39926566d16f55002ef34dcf5f6d4f7c14f444d167dcb69" Apr 02 13:44:48 crc kubenswrapper[4732]: E0402 
13:44:48.849190 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e4ab18fa6d5568cc39926566d16f55002ef34dcf5f6d4f7c14f444d167dcb69\": container with ID starting with 8e4ab18fa6d5568cc39926566d16f55002ef34dcf5f6d4f7c14f444d167dcb69 not found: ID does not exist" containerID="8e4ab18fa6d5568cc39926566d16f55002ef34dcf5f6d4f7c14f444d167dcb69" Apr 02 13:44:48 crc kubenswrapper[4732]: I0402 13:44:48.849216 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e4ab18fa6d5568cc39926566d16f55002ef34dcf5f6d4f7c14f444d167dcb69"} err="failed to get container status \"8e4ab18fa6d5568cc39926566d16f55002ef34dcf5f6d4f7c14f444d167dcb69\": rpc error: code = NotFound desc = could not find container \"8e4ab18fa6d5568cc39926566d16f55002ef34dcf5f6d4f7c14f444d167dcb69\": container with ID starting with 8e4ab18fa6d5568cc39926566d16f55002ef34dcf5f6d4f7c14f444d167dcb69 not found: ID does not exist" Apr 02 13:44:50 crc kubenswrapper[4732]: I0402 13:44:50.686533 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50d43b2d-24ec-439f-a418-3673791eb1b1" path="/var/lib/kubelet/pods/50d43b2d-24ec-439f-a418-3673791eb1b1/volumes" Apr 02 13:44:51 crc kubenswrapper[4732]: I0402 13:44:51.435768 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Apr 02 13:45:00 crc kubenswrapper[4732]: I0402 13:45:00.128041 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29585625-5cdrx"] Apr 02 13:45:00 crc kubenswrapper[4732]: E0402 13:45:00.128827 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50d43b2d-24ec-439f-a418-3673791eb1b1" containerName="extract-utilities" Apr 02 13:45:00 crc kubenswrapper[4732]: I0402 13:45:00.128844 4732 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="50d43b2d-24ec-439f-a418-3673791eb1b1" containerName="extract-utilities" Apr 02 13:45:00 crc kubenswrapper[4732]: E0402 13:45:00.128861 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18370e02-d95b-4be8-a495-ee908136bee4" containerName="oc" Apr 02 13:45:00 crc kubenswrapper[4732]: I0402 13:45:00.128868 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="18370e02-d95b-4be8-a495-ee908136bee4" containerName="oc" Apr 02 13:45:00 crc kubenswrapper[4732]: E0402 13:45:00.128880 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50d43b2d-24ec-439f-a418-3673791eb1b1" containerName="registry-server" Apr 02 13:45:00 crc kubenswrapper[4732]: I0402 13:45:00.128888 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="50d43b2d-24ec-439f-a418-3673791eb1b1" containerName="registry-server" Apr 02 13:45:00 crc kubenswrapper[4732]: E0402 13:45:00.128904 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50d43b2d-24ec-439f-a418-3673791eb1b1" containerName="extract-content" Apr 02 13:45:00 crc kubenswrapper[4732]: I0402 13:45:00.128911 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="50d43b2d-24ec-439f-a418-3673791eb1b1" containerName="extract-content" Apr 02 13:45:00 crc kubenswrapper[4732]: I0402 13:45:00.129043 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="50d43b2d-24ec-439f-a418-3673791eb1b1" containerName="registry-server" Apr 02 13:45:00 crc kubenswrapper[4732]: I0402 13:45:00.129062 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="18370e02-d95b-4be8-a495-ee908136bee4" containerName="oc" Apr 02 13:45:00 crc kubenswrapper[4732]: I0402 13:45:00.129475 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29585625-5cdrx" Apr 02 13:45:00 crc kubenswrapper[4732]: I0402 13:45:00.132123 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Apr 02 13:45:00 crc kubenswrapper[4732]: I0402 13:45:00.135120 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Apr 02 13:45:00 crc kubenswrapper[4732]: I0402 13:45:00.150746 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29585625-5cdrx"] Apr 02 13:45:00 crc kubenswrapper[4732]: I0402 13:45:00.194672 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/48fbf37e-5802-4553-8b9a-ecb5a5a1a379-secret-volume\") pod \"collect-profiles-29585625-5cdrx\" (UID: \"48fbf37e-5802-4553-8b9a-ecb5a5a1a379\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29585625-5cdrx" Apr 02 13:45:00 crc kubenswrapper[4732]: I0402 13:45:00.194746 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/48fbf37e-5802-4553-8b9a-ecb5a5a1a379-config-volume\") pod \"collect-profiles-29585625-5cdrx\" (UID: \"48fbf37e-5802-4553-8b9a-ecb5a5a1a379\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29585625-5cdrx" Apr 02 13:45:00 crc kubenswrapper[4732]: I0402 13:45:00.194771 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhv6k\" (UniqueName: \"kubernetes.io/projected/48fbf37e-5802-4553-8b9a-ecb5a5a1a379-kube-api-access-fhv6k\") pod \"collect-profiles-29585625-5cdrx\" (UID: \"48fbf37e-5802-4553-8b9a-ecb5a5a1a379\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29585625-5cdrx" Apr 02 13:45:00 crc kubenswrapper[4732]: I0402 13:45:00.296190 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/48fbf37e-5802-4553-8b9a-ecb5a5a1a379-config-volume\") pod \"collect-profiles-29585625-5cdrx\" (UID: \"48fbf37e-5802-4553-8b9a-ecb5a5a1a379\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29585625-5cdrx" Apr 02 13:45:00 crc kubenswrapper[4732]: I0402 13:45:00.296256 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhv6k\" (UniqueName: \"kubernetes.io/projected/48fbf37e-5802-4553-8b9a-ecb5a5a1a379-kube-api-access-fhv6k\") pod \"collect-profiles-29585625-5cdrx\" (UID: \"48fbf37e-5802-4553-8b9a-ecb5a5a1a379\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29585625-5cdrx" Apr 02 13:45:00 crc kubenswrapper[4732]: I0402 13:45:00.296398 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/48fbf37e-5802-4553-8b9a-ecb5a5a1a379-secret-volume\") pod \"collect-profiles-29585625-5cdrx\" (UID: \"48fbf37e-5802-4553-8b9a-ecb5a5a1a379\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29585625-5cdrx" Apr 02 13:45:00 crc kubenswrapper[4732]: I0402 13:45:00.297305 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/48fbf37e-5802-4553-8b9a-ecb5a5a1a379-config-volume\") pod \"collect-profiles-29585625-5cdrx\" (UID: \"48fbf37e-5802-4553-8b9a-ecb5a5a1a379\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29585625-5cdrx" Apr 02 13:45:00 crc kubenswrapper[4732]: I0402 13:45:00.303325 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/48fbf37e-5802-4553-8b9a-ecb5a5a1a379-secret-volume\") pod \"collect-profiles-29585625-5cdrx\" (UID: \"48fbf37e-5802-4553-8b9a-ecb5a5a1a379\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29585625-5cdrx" Apr 02 13:45:00 crc kubenswrapper[4732]: I0402 13:45:00.319792 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhv6k\" (UniqueName: \"kubernetes.io/projected/48fbf37e-5802-4553-8b9a-ecb5a5a1a379-kube-api-access-fhv6k\") pod \"collect-profiles-29585625-5cdrx\" (UID: \"48fbf37e-5802-4553-8b9a-ecb5a5a1a379\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29585625-5cdrx" Apr 02 13:45:00 crc kubenswrapper[4732]: I0402 13:45:00.448919 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29585625-5cdrx" Apr 02 13:45:00 crc kubenswrapper[4732]: I0402 13:45:00.844641 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29585625-5cdrx"] Apr 02 13:45:00 crc kubenswrapper[4732]: I0402 13:45:00.882516 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29585625-5cdrx" event={"ID":"48fbf37e-5802-4553-8b9a-ecb5a5a1a379","Type":"ContainerStarted","Data":"9203cd10ef889bb77c14fa896f4627fc1d7e04d881f9c42220f36fb8c52699f2"} Apr 02 13:45:01 crc kubenswrapper[4732]: I0402 13:45:01.892540 4732 generic.go:334] "Generic (PLEG): container finished" podID="48fbf37e-5802-4553-8b9a-ecb5a5a1a379" containerID="d9e3c354334015dbdb65c55a464a9570a2373e628e8e687fe6fa57065771d204" exitCode=0 Apr 02 13:45:01 crc kubenswrapper[4732]: I0402 13:45:01.892665 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29585625-5cdrx" 
event={"ID":"48fbf37e-5802-4553-8b9a-ecb5a5a1a379","Type":"ContainerDied","Data":"d9e3c354334015dbdb65c55a464a9570a2373e628e8e687fe6fa57065771d204"} Apr 02 13:45:03 crc kubenswrapper[4732]: I0402 13:45:03.204461 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29585625-5cdrx" Apr 02 13:45:03 crc kubenswrapper[4732]: I0402 13:45:03.337446 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/48fbf37e-5802-4553-8b9a-ecb5a5a1a379-secret-volume\") pod \"48fbf37e-5802-4553-8b9a-ecb5a5a1a379\" (UID: \"48fbf37e-5802-4553-8b9a-ecb5a5a1a379\") " Apr 02 13:45:03 crc kubenswrapper[4732]: I0402 13:45:03.337509 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/48fbf37e-5802-4553-8b9a-ecb5a5a1a379-config-volume\") pod \"48fbf37e-5802-4553-8b9a-ecb5a5a1a379\" (UID: \"48fbf37e-5802-4553-8b9a-ecb5a5a1a379\") " Apr 02 13:45:03 crc kubenswrapper[4732]: I0402 13:45:03.337599 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhv6k\" (UniqueName: \"kubernetes.io/projected/48fbf37e-5802-4553-8b9a-ecb5a5a1a379-kube-api-access-fhv6k\") pod \"48fbf37e-5802-4553-8b9a-ecb5a5a1a379\" (UID: \"48fbf37e-5802-4553-8b9a-ecb5a5a1a379\") " Apr 02 13:45:03 crc kubenswrapper[4732]: I0402 13:45:03.338959 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48fbf37e-5802-4553-8b9a-ecb5a5a1a379-config-volume" (OuterVolumeSpecName: "config-volume") pod "48fbf37e-5802-4553-8b9a-ecb5a5a1a379" (UID: "48fbf37e-5802-4553-8b9a-ecb5a5a1a379"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:45:03 crc kubenswrapper[4732]: I0402 13:45:03.343883 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48fbf37e-5802-4553-8b9a-ecb5a5a1a379-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "48fbf37e-5802-4553-8b9a-ecb5a5a1a379" (UID: "48fbf37e-5802-4553-8b9a-ecb5a5a1a379"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 13:45:03 crc kubenswrapper[4732]: I0402 13:45:03.346741 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48fbf37e-5802-4553-8b9a-ecb5a5a1a379-kube-api-access-fhv6k" (OuterVolumeSpecName: "kube-api-access-fhv6k") pod "48fbf37e-5802-4553-8b9a-ecb5a5a1a379" (UID: "48fbf37e-5802-4553-8b9a-ecb5a5a1a379"). InnerVolumeSpecName "kube-api-access-fhv6k". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:45:03 crc kubenswrapper[4732]: I0402 13:45:03.438665 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhv6k\" (UniqueName: \"kubernetes.io/projected/48fbf37e-5802-4553-8b9a-ecb5a5a1a379-kube-api-access-fhv6k\") on node \"crc\" DevicePath \"\"" Apr 02 13:45:03 crc kubenswrapper[4732]: I0402 13:45:03.438695 4732 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/48fbf37e-5802-4553-8b9a-ecb5a5a1a379-secret-volume\") on node \"crc\" DevicePath \"\"" Apr 02 13:45:03 crc kubenswrapper[4732]: I0402 13:45:03.438705 4732 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/48fbf37e-5802-4553-8b9a-ecb5a5a1a379-config-volume\") on node \"crc\" DevicePath \"\"" Apr 02 13:45:03 crc kubenswrapper[4732]: I0402 13:45:03.906173 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29585625-5cdrx" 
event={"ID":"48fbf37e-5802-4553-8b9a-ecb5a5a1a379","Type":"ContainerDied","Data":"9203cd10ef889bb77c14fa896f4627fc1d7e04d881f9c42220f36fb8c52699f2"} Apr 02 13:45:03 crc kubenswrapper[4732]: I0402 13:45:03.906221 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9203cd10ef889bb77c14fa896f4627fc1d7e04d881f9c42220f36fb8c52699f2" Apr 02 13:45:03 crc kubenswrapper[4732]: I0402 13:45:03.906283 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29585625-5cdrx" Apr 02 13:45:06 crc kubenswrapper[4732]: I0402 13:45:06.456916 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-10-crc"] Apr 02 13:45:06 crc kubenswrapper[4732]: E0402 13:45:06.458329 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48fbf37e-5802-4553-8b9a-ecb5a5a1a379" containerName="collect-profiles" Apr 02 13:45:06 crc kubenswrapper[4732]: I0402 13:45:06.458463 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="48fbf37e-5802-4553-8b9a-ecb5a5a1a379" containerName="collect-profiles" Apr 02 13:45:06 crc kubenswrapper[4732]: I0402 13:45:06.458778 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="48fbf37e-5802-4553-8b9a-ecb5a5a1a379" containerName="collect-profiles" Apr 02 13:45:06 crc kubenswrapper[4732]: I0402 13:45:06.459379 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-10-crc" Apr 02 13:45:06 crc kubenswrapper[4732]: I0402 13:45:06.462344 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Apr 02 13:45:06 crc kubenswrapper[4732]: I0402 13:45:06.469284 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Apr 02 13:45:06 crc kubenswrapper[4732]: I0402 13:45:06.469586 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-10-crc"] Apr 02 13:45:06 crc kubenswrapper[4732]: I0402 13:45:06.609811 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6e580e30-624e-4cbc-9095-fa5659ce546e-kubelet-dir\") pod \"installer-10-crc\" (UID: \"6e580e30-624e-4cbc-9095-fa5659ce546e\") " pod="openshift-kube-apiserver/installer-10-crc" Apr 02 13:45:06 crc kubenswrapper[4732]: I0402 13:45:06.610110 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6e580e30-624e-4cbc-9095-fa5659ce546e-var-lock\") pod \"installer-10-crc\" (UID: \"6e580e30-624e-4cbc-9095-fa5659ce546e\") " pod="openshift-kube-apiserver/installer-10-crc" Apr 02 13:45:06 crc kubenswrapper[4732]: I0402 13:45:06.610241 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6e580e30-624e-4cbc-9095-fa5659ce546e-kube-api-access\") pod \"installer-10-crc\" (UID: \"6e580e30-624e-4cbc-9095-fa5659ce546e\") " pod="openshift-kube-apiserver/installer-10-crc" Apr 02 13:45:06 crc kubenswrapper[4732]: I0402 13:45:06.711110 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/6e580e30-624e-4cbc-9095-fa5659ce546e-kube-api-access\") pod \"installer-10-crc\" (UID: \"6e580e30-624e-4cbc-9095-fa5659ce546e\") " pod="openshift-kube-apiserver/installer-10-crc" Apr 02 13:45:06 crc kubenswrapper[4732]: I0402 13:45:06.711193 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6e580e30-624e-4cbc-9095-fa5659ce546e-kubelet-dir\") pod \"installer-10-crc\" (UID: \"6e580e30-624e-4cbc-9095-fa5659ce546e\") " pod="openshift-kube-apiserver/installer-10-crc" Apr 02 13:45:06 crc kubenswrapper[4732]: I0402 13:45:06.711228 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6e580e30-624e-4cbc-9095-fa5659ce546e-var-lock\") pod \"installer-10-crc\" (UID: \"6e580e30-624e-4cbc-9095-fa5659ce546e\") " pod="openshift-kube-apiserver/installer-10-crc" Apr 02 13:45:06 crc kubenswrapper[4732]: I0402 13:45:06.711316 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6e580e30-624e-4cbc-9095-fa5659ce546e-var-lock\") pod \"installer-10-crc\" (UID: \"6e580e30-624e-4cbc-9095-fa5659ce546e\") " pod="openshift-kube-apiserver/installer-10-crc" Apr 02 13:45:06 crc kubenswrapper[4732]: I0402 13:45:06.711358 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6e580e30-624e-4cbc-9095-fa5659ce546e-kubelet-dir\") pod \"installer-10-crc\" (UID: \"6e580e30-624e-4cbc-9095-fa5659ce546e\") " pod="openshift-kube-apiserver/installer-10-crc" Apr 02 13:45:06 crc kubenswrapper[4732]: I0402 13:45:06.727322 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6e580e30-624e-4cbc-9095-fa5659ce546e-kube-api-access\") pod \"installer-10-crc\" (UID: \"6e580e30-624e-4cbc-9095-fa5659ce546e\") " 
pod="openshift-kube-apiserver/installer-10-crc" Apr 02 13:45:06 crc kubenswrapper[4732]: I0402 13:45:06.776383 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-10-crc" Apr 02 13:45:07 crc kubenswrapper[4732]: I0402 13:45:07.050193 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-10-crc"] Apr 02 13:45:07 crc kubenswrapper[4732]: I0402 13:45:07.930513 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-10-crc" event={"ID":"6e580e30-624e-4cbc-9095-fa5659ce546e","Type":"ContainerStarted","Data":"08941b02b8d608abc69250b1e69bb1c13d73671b0fdcec89d785cbc9753ed8db"} Apr 02 13:45:08 crc kubenswrapper[4732]: I0402 13:45:08.937546 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-10-crc" event={"ID":"6e580e30-624e-4cbc-9095-fa5659ce546e","Type":"ContainerStarted","Data":"a52637663e3b9a50b75ad42e14b1445272e26bd62d94fe63f7188866c0b3fb95"} Apr 02 13:45:29 crc kubenswrapper[4732]: I0402 13:45:29.680490 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-10-crc" podStartSLOduration=23.680470898 podStartE2EDuration="23.680470898s" podCreationTimestamp="2026-04-02 13:45:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:45:08.952864747 +0000 UTC m=+465.857272380" watchObservedRunningTime="2026-04-02 13:45:29.680470898 +0000 UTC m=+486.584878451" Apr 02 13:45:29 crc kubenswrapper[4732]: I0402 13:45:29.686357 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-11-retry-1-crc"] Apr 02 13:45:29 crc kubenswrapper[4732]: I0402 13:45:29.687242 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-11-retry-1-crc" Apr 02 13:45:29 crc kubenswrapper[4732]: I0402 13:45:29.689820 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Apr 02 13:45:29 crc kubenswrapper[4732]: I0402 13:45:29.689927 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Apr 02 13:45:29 crc kubenswrapper[4732]: I0402 13:45:29.711709 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-11-retry-1-crc"] Apr 02 13:45:29 crc kubenswrapper[4732]: I0402 13:45:29.793367 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6e4f3964-0282-48aa-a0a5-cad803e3812b-kube-api-access\") pod \"installer-11-retry-1-crc\" (UID: \"6e4f3964-0282-48aa-a0a5-cad803e3812b\") " pod="openshift-kube-controller-manager/installer-11-retry-1-crc" Apr 02 13:45:29 crc kubenswrapper[4732]: I0402 13:45:29.793693 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6e4f3964-0282-48aa-a0a5-cad803e3812b-kubelet-dir\") pod \"installer-11-retry-1-crc\" (UID: \"6e4f3964-0282-48aa-a0a5-cad803e3812b\") " pod="openshift-kube-controller-manager/installer-11-retry-1-crc" Apr 02 13:45:29 crc kubenswrapper[4732]: I0402 13:45:29.793718 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6e4f3964-0282-48aa-a0a5-cad803e3812b-var-lock\") pod \"installer-11-retry-1-crc\" (UID: \"6e4f3964-0282-48aa-a0a5-cad803e3812b\") " pod="openshift-kube-controller-manager/installer-11-retry-1-crc" Apr 02 13:45:29 crc kubenswrapper[4732]: I0402 13:45:29.894832 4732 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6e4f3964-0282-48aa-a0a5-cad803e3812b-kubelet-dir\") pod \"installer-11-retry-1-crc\" (UID: \"6e4f3964-0282-48aa-a0a5-cad803e3812b\") " pod="openshift-kube-controller-manager/installer-11-retry-1-crc" Apr 02 13:45:29 crc kubenswrapper[4732]: I0402 13:45:29.894888 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6e4f3964-0282-48aa-a0a5-cad803e3812b-var-lock\") pod \"installer-11-retry-1-crc\" (UID: \"6e4f3964-0282-48aa-a0a5-cad803e3812b\") " pod="openshift-kube-controller-manager/installer-11-retry-1-crc" Apr 02 13:45:29 crc kubenswrapper[4732]: I0402 13:45:29.894935 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6e4f3964-0282-48aa-a0a5-cad803e3812b-kube-api-access\") pod \"installer-11-retry-1-crc\" (UID: \"6e4f3964-0282-48aa-a0a5-cad803e3812b\") " pod="openshift-kube-controller-manager/installer-11-retry-1-crc" Apr 02 13:45:29 crc kubenswrapper[4732]: I0402 13:45:29.895248 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6e4f3964-0282-48aa-a0a5-cad803e3812b-kubelet-dir\") pod \"installer-11-retry-1-crc\" (UID: \"6e4f3964-0282-48aa-a0a5-cad803e3812b\") " pod="openshift-kube-controller-manager/installer-11-retry-1-crc" Apr 02 13:45:29 crc kubenswrapper[4732]: I0402 13:45:29.895283 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6e4f3964-0282-48aa-a0a5-cad803e3812b-var-lock\") pod \"installer-11-retry-1-crc\" (UID: \"6e4f3964-0282-48aa-a0a5-cad803e3812b\") " pod="openshift-kube-controller-manager/installer-11-retry-1-crc" Apr 02 13:45:29 crc kubenswrapper[4732]: I0402 13:45:29.925511 4732 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6e4f3964-0282-48aa-a0a5-cad803e3812b-kube-api-access\") pod \"installer-11-retry-1-crc\" (UID: \"6e4f3964-0282-48aa-a0a5-cad803e3812b\") " pod="openshift-kube-controller-manager/installer-11-retry-1-crc" Apr 02 13:45:30 crc kubenswrapper[4732]: I0402 13:45:30.016856 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-11-retry-1-crc" Apr 02 13:45:30 crc kubenswrapper[4732]: I0402 13:45:30.407026 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-11-retry-1-crc"] Apr 02 13:45:31 crc kubenswrapper[4732]: I0402 13:45:31.090978 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-11-retry-1-crc" event={"ID":"6e4f3964-0282-48aa-a0a5-cad803e3812b","Type":"ContainerStarted","Data":"75d1c9855f320237c4f2255a2386b01f78bcb361951e5bff599684ff41ab29e1"} Apr 02 13:45:31 crc kubenswrapper[4732]: I0402 13:45:31.091298 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-11-retry-1-crc" event={"ID":"6e4f3964-0282-48aa-a0a5-cad803e3812b","Type":"ContainerStarted","Data":"4989c64fcdc939e6f0924f3d7381eb420ebc7ca6d039f1734249bdfd6da7a1eb"} Apr 02 13:45:31 crc kubenswrapper[4732]: I0402 13:45:31.109471 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-11-retry-1-crc" podStartSLOduration=2.109452089 podStartE2EDuration="2.109452089s" podCreationTimestamp="2026-04-02 13:45:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:45:31.105675175 +0000 UTC m=+488.010082728" watchObservedRunningTime="2026-04-02 13:45:31.109452089 +0000 UTC m=+488.013859642" Apr 02 13:45:31 crc 
kubenswrapper[4732]: I0402 13:45:31.924157 4732 patch_prober.go:28] interesting pod/machine-config-daemon-6vtmw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 02 13:45:31 crc kubenswrapper[4732]: I0402 13:45:31.924248 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 02 13:45:34 crc kubenswrapper[4732]: I0402 13:45:34.285886 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-79499bfbc-frt6n"] Apr 02 13:45:34 crc kubenswrapper[4732]: I0402 13:45:34.286805 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-79499bfbc-frt6n" Apr 02 13:45:34 crc kubenswrapper[4732]: I0402 13:45:34.342861 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-79499bfbc-frt6n"] Apr 02 13:45:34 crc kubenswrapper[4732]: I0402 13:45:34.354976 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c7b6f3ee-8ae2-432c-a6da-44a7b883205a-trusted-ca\") pod \"image-registry-79499bfbc-frt6n\" (UID: \"c7b6f3ee-8ae2-432c-a6da-44a7b883205a\") " pod="openshift-image-registry/image-registry-79499bfbc-frt6n" Apr 02 13:45:34 crc kubenswrapper[4732]: I0402 13:45:34.355041 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppjdv\" (UniqueName: \"kubernetes.io/projected/c7b6f3ee-8ae2-432c-a6da-44a7b883205a-kube-api-access-ppjdv\") pod \"image-registry-79499bfbc-frt6n\" (UID: \"c7b6f3ee-8ae2-432c-a6da-44a7b883205a\") " pod="openshift-image-registry/image-registry-79499bfbc-frt6n" Apr 02 13:45:34 crc kubenswrapper[4732]: I0402 13:45:34.355075 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c7b6f3ee-8ae2-432c-a6da-44a7b883205a-bound-sa-token\") pod \"image-registry-79499bfbc-frt6n\" (UID: \"c7b6f3ee-8ae2-432c-a6da-44a7b883205a\") " pod="openshift-image-registry/image-registry-79499bfbc-frt6n" Apr 02 13:45:34 crc kubenswrapper[4732]: I0402 13:45:34.355099 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c7b6f3ee-8ae2-432c-a6da-44a7b883205a-ca-trust-extracted\") pod \"image-registry-79499bfbc-frt6n\" (UID: \"c7b6f3ee-8ae2-432c-a6da-44a7b883205a\") " pod="openshift-image-registry/image-registry-79499bfbc-frt6n" Apr 02 
13:45:34 crc kubenswrapper[4732]: I0402 13:45:34.355127 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c7b6f3ee-8ae2-432c-a6da-44a7b883205a-registry-tls\") pod \"image-registry-79499bfbc-frt6n\" (UID: \"c7b6f3ee-8ae2-432c-a6da-44a7b883205a\") " pod="openshift-image-registry/image-registry-79499bfbc-frt6n" Apr 02 13:45:34 crc kubenswrapper[4732]: I0402 13:45:34.355162 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c7b6f3ee-8ae2-432c-a6da-44a7b883205a-registry-certificates\") pod \"image-registry-79499bfbc-frt6n\" (UID: \"c7b6f3ee-8ae2-432c-a6da-44a7b883205a\") " pod="openshift-image-registry/image-registry-79499bfbc-frt6n" Apr 02 13:45:34 crc kubenswrapper[4732]: I0402 13:45:34.355207 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-79499bfbc-frt6n\" (UID: \"c7b6f3ee-8ae2-432c-a6da-44a7b883205a\") " pod="openshift-image-registry/image-registry-79499bfbc-frt6n" Apr 02 13:45:34 crc kubenswrapper[4732]: I0402 13:45:34.355242 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c7b6f3ee-8ae2-432c-a6da-44a7b883205a-installation-pull-secrets\") pod \"image-registry-79499bfbc-frt6n\" (UID: \"c7b6f3ee-8ae2-432c-a6da-44a7b883205a\") " pod="openshift-image-registry/image-registry-79499bfbc-frt6n" Apr 02 13:45:34 crc kubenswrapper[4732]: I0402 13:45:34.380795 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-79499bfbc-frt6n\" (UID: \"c7b6f3ee-8ae2-432c-a6da-44a7b883205a\") " pod="openshift-image-registry/image-registry-79499bfbc-frt6n" Apr 02 13:45:34 crc kubenswrapper[4732]: I0402 13:45:34.456897 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c7b6f3ee-8ae2-432c-a6da-44a7b883205a-trusted-ca\") pod \"image-registry-79499bfbc-frt6n\" (UID: \"c7b6f3ee-8ae2-432c-a6da-44a7b883205a\") " pod="openshift-image-registry/image-registry-79499bfbc-frt6n" Apr 02 13:45:34 crc kubenswrapper[4732]: I0402 13:45:34.456941 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppjdv\" (UniqueName: \"kubernetes.io/projected/c7b6f3ee-8ae2-432c-a6da-44a7b883205a-kube-api-access-ppjdv\") pod \"image-registry-79499bfbc-frt6n\" (UID: \"c7b6f3ee-8ae2-432c-a6da-44a7b883205a\") " pod="openshift-image-registry/image-registry-79499bfbc-frt6n" Apr 02 13:45:34 crc kubenswrapper[4732]: I0402 13:45:34.456966 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c7b6f3ee-8ae2-432c-a6da-44a7b883205a-bound-sa-token\") pod \"image-registry-79499bfbc-frt6n\" (UID: \"c7b6f3ee-8ae2-432c-a6da-44a7b883205a\") " pod="openshift-image-registry/image-registry-79499bfbc-frt6n" Apr 02 13:45:34 crc kubenswrapper[4732]: I0402 13:45:34.456985 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c7b6f3ee-8ae2-432c-a6da-44a7b883205a-ca-trust-extracted\") pod \"image-registry-79499bfbc-frt6n\" (UID: \"c7b6f3ee-8ae2-432c-a6da-44a7b883205a\") " pod="openshift-image-registry/image-registry-79499bfbc-frt6n" Apr 02 13:45:34 crc kubenswrapper[4732]: I0402 13:45:34.457006 4732 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c7b6f3ee-8ae2-432c-a6da-44a7b883205a-registry-tls\") pod \"image-registry-79499bfbc-frt6n\" (UID: \"c7b6f3ee-8ae2-432c-a6da-44a7b883205a\") " pod="openshift-image-registry/image-registry-79499bfbc-frt6n" Apr 02 13:45:34 crc kubenswrapper[4732]: I0402 13:45:34.457034 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c7b6f3ee-8ae2-432c-a6da-44a7b883205a-registry-certificates\") pod \"image-registry-79499bfbc-frt6n\" (UID: \"c7b6f3ee-8ae2-432c-a6da-44a7b883205a\") " pod="openshift-image-registry/image-registry-79499bfbc-frt6n" Apr 02 13:45:34 crc kubenswrapper[4732]: I0402 13:45:34.457064 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c7b6f3ee-8ae2-432c-a6da-44a7b883205a-installation-pull-secrets\") pod \"image-registry-79499bfbc-frt6n\" (UID: \"c7b6f3ee-8ae2-432c-a6da-44a7b883205a\") " pod="openshift-image-registry/image-registry-79499bfbc-frt6n" Apr 02 13:45:34 crc kubenswrapper[4732]: I0402 13:45:34.457912 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c7b6f3ee-8ae2-432c-a6da-44a7b883205a-ca-trust-extracted\") pod \"image-registry-79499bfbc-frt6n\" (UID: \"c7b6f3ee-8ae2-432c-a6da-44a7b883205a\") " pod="openshift-image-registry/image-registry-79499bfbc-frt6n" Apr 02 13:45:34 crc kubenswrapper[4732]: I0402 13:45:34.458353 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c7b6f3ee-8ae2-432c-a6da-44a7b883205a-trusted-ca\") pod \"image-registry-79499bfbc-frt6n\" (UID: \"c7b6f3ee-8ae2-432c-a6da-44a7b883205a\") " pod="openshift-image-registry/image-registry-79499bfbc-frt6n" Apr 02 13:45:34 crc kubenswrapper[4732]: 
I0402 13:45:34.459056 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c7b6f3ee-8ae2-432c-a6da-44a7b883205a-registry-certificates\") pod \"image-registry-79499bfbc-frt6n\" (UID: \"c7b6f3ee-8ae2-432c-a6da-44a7b883205a\") " pod="openshift-image-registry/image-registry-79499bfbc-frt6n" Apr 02 13:45:34 crc kubenswrapper[4732]: I0402 13:45:34.464923 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c7b6f3ee-8ae2-432c-a6da-44a7b883205a-registry-tls\") pod \"image-registry-79499bfbc-frt6n\" (UID: \"c7b6f3ee-8ae2-432c-a6da-44a7b883205a\") " pod="openshift-image-registry/image-registry-79499bfbc-frt6n" Apr 02 13:45:34 crc kubenswrapper[4732]: I0402 13:45:34.465301 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c7b6f3ee-8ae2-432c-a6da-44a7b883205a-installation-pull-secrets\") pod \"image-registry-79499bfbc-frt6n\" (UID: \"c7b6f3ee-8ae2-432c-a6da-44a7b883205a\") " pod="openshift-image-registry/image-registry-79499bfbc-frt6n" Apr 02 13:45:34 crc kubenswrapper[4732]: I0402 13:45:34.485769 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppjdv\" (UniqueName: \"kubernetes.io/projected/c7b6f3ee-8ae2-432c-a6da-44a7b883205a-kube-api-access-ppjdv\") pod \"image-registry-79499bfbc-frt6n\" (UID: \"c7b6f3ee-8ae2-432c-a6da-44a7b883205a\") " pod="openshift-image-registry/image-registry-79499bfbc-frt6n" Apr 02 13:45:34 crc kubenswrapper[4732]: I0402 13:45:34.487146 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c7b6f3ee-8ae2-432c-a6da-44a7b883205a-bound-sa-token\") pod \"image-registry-79499bfbc-frt6n\" (UID: \"c7b6f3ee-8ae2-432c-a6da-44a7b883205a\") " 
pod="openshift-image-registry/image-registry-79499bfbc-frt6n" Apr 02 13:45:34 crc kubenswrapper[4732]: I0402 13:45:34.604349 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-79499bfbc-frt6n" Apr 02 13:45:34 crc kubenswrapper[4732]: I0402 13:45:34.801312 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-79499bfbc-frt6n"] Apr 02 13:45:34 crc kubenswrapper[4732]: W0402 13:45:34.805080 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7b6f3ee_8ae2_432c_a6da_44a7b883205a.slice/crio-8edd89b457410589eb7d5bfa0d1efd4f57a61b64fd946fe4f4f424b5cb1b11f7 WatchSource:0}: Error finding container 8edd89b457410589eb7d5bfa0d1efd4f57a61b64fd946fe4f4f424b5cb1b11f7: Status 404 returned error can't find the container with id 8edd89b457410589eb7d5bfa0d1efd4f57a61b64fd946fe4f4f424b5cb1b11f7 Apr 02 13:45:35 crc kubenswrapper[4732]: I0402 13:45:35.116106 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-79499bfbc-frt6n" event={"ID":"c7b6f3ee-8ae2-432c-a6da-44a7b883205a","Type":"ContainerStarted","Data":"ff8c89ba2b79952fca6cd0c782651b598e57f9380b343e2ad3817967907ef69f"} Apr 02 13:45:35 crc kubenswrapper[4732]: I0402 13:45:35.116423 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-79499bfbc-frt6n" event={"ID":"c7b6f3ee-8ae2-432c-a6da-44a7b883205a","Type":"ContainerStarted","Data":"8edd89b457410589eb7d5bfa0d1efd4f57a61b64fd946fe4f4f424b5cb1b11f7"} Apr 02 13:45:36 crc kubenswrapper[4732]: I0402 13:45:36.121311 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-79499bfbc-frt6n" Apr 02 13:45:36 crc kubenswrapper[4732]: I0402 13:45:36.151139 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-image-registry/image-registry-79499bfbc-frt6n" podStartSLOduration=2.151122157 podStartE2EDuration="2.151122157s" podCreationTimestamp="2026-04-02 13:45:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:45:36.148120215 +0000 UTC m=+493.052527788" watchObservedRunningTime="2026-04-02 13:45:36.151122157 +0000 UTC m=+493.055529730" Apr 02 13:45:45 crc kubenswrapper[4732]: I0402 13:45:45.811016 4732 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Apr 02 13:45:45 crc kubenswrapper[4732]: I0402 13:45:45.812401 4732 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Apr 02 13:45:45 crc kubenswrapper[4732]: I0402 13:45:45.812651 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 02 13:45:45 crc kubenswrapper[4732]: I0402 13:45:45.812766 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver" containerID="cri-o://858e53e28905cbacd387c14192b4f89524e7f215e3974a7181de6ee15ad9a7cc" gracePeriod=15 Apr 02 13:45:45 crc kubenswrapper[4732]: I0402 13:45:45.812821 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://31401ac1c6173f4a37b7273100552581a8fd8bacad3eb4a34da8770e5cce6608" gracePeriod=15 Apr 02 13:45:45 crc kubenswrapper[4732]: I0402 13:45:45.812851 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" 
containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://7fea6f1b57fdeab906e3a369b1c58efb7eb869ca1bb7068f8935ed118dfb6643" gracePeriod=15 Apr 02 13:45:45 crc kubenswrapper[4732]: I0402 13:45:45.812880 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver-cert-syncer" containerID="cri-o://95fd6b2b782db18e44fcad5d695b957206ea146911d9035bc8f7a4506bddb058" gracePeriod=15 Apr 02 13:45:45 crc kubenswrapper[4732]: I0402 13:45:45.812945 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver-check-endpoints" containerID="cri-o://44e641937af018f5df25208fd44098b473a12448b3ae7d42ca143ee83364a972" gracePeriod=15 Apr 02 13:45:45 crc kubenswrapper[4732]: I0402 13:45:45.813871 4732 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Apr 02 13:45:45 crc kubenswrapper[4732]: E0402 13:45:45.814028 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver-check-endpoints" Apr 02 13:45:45 crc kubenswrapper[4732]: I0402 13:45:45.814042 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver-check-endpoints" Apr 02 13:45:45 crc kubenswrapper[4732]: E0402 13:45:45.814055 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver-insecure-readyz" Apr 02 13:45:45 crc kubenswrapper[4732]: I0402 13:45:45.814062 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver-insecure-readyz" Apr 02 13:45:45 crc kubenswrapper[4732]: E0402 13:45:45.814074 4732 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="setup" Apr 02 13:45:45 crc kubenswrapper[4732]: I0402 13:45:45.814082 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="setup" Apr 02 13:45:45 crc kubenswrapper[4732]: E0402 13:45:45.814092 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver-cert-regeneration-controller" Apr 02 13:45:45 crc kubenswrapper[4732]: I0402 13:45:45.814101 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver-cert-regeneration-controller" Apr 02 13:45:45 crc kubenswrapper[4732]: E0402 13:45:45.814111 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver" Apr 02 13:45:45 crc kubenswrapper[4732]: I0402 13:45:45.814118 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver" Apr 02 13:45:45 crc kubenswrapper[4732]: E0402 13:45:45.814131 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver-cert-syncer" Apr 02 13:45:45 crc kubenswrapper[4732]: I0402 13:45:45.814138 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver-cert-syncer" Apr 02 13:45:45 crc kubenswrapper[4732]: I0402 13:45:45.814254 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver" Apr 02 13:45:45 crc kubenswrapper[4732]: I0402 13:45:45.814312 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver-cert-syncer" Apr 02 13:45:45 crc kubenswrapper[4732]: I0402 
13:45:45.814327 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver-insecure-readyz" Apr 02 13:45:45 crc kubenswrapper[4732]: I0402 13:45:45.814342 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver-cert-regeneration-controller" Apr 02 13:45:45 crc kubenswrapper[4732]: I0402 13:45:45.814355 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver-check-endpoints" Apr 02 13:45:45 crc kubenswrapper[4732]: E0402 13:45:45.893285 4732 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.138:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 02 13:45:45 crc kubenswrapper[4732]: I0402 13:45:45.909737 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4e6039c7a12c5a0c0ef5917dc7ee5582-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"4e6039c7a12c5a0c0ef5917dc7ee5582\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 02 13:45:45 crc kubenswrapper[4732]: I0402 13:45:45.909838 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/be484bf35d3aabad50f6e4a86d258a31-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"be484bf35d3aabad50f6e4a86d258a31\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 02 13:45:45 crc kubenswrapper[4732]: I0402 13:45:45.909859 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/4e6039c7a12c5a0c0ef5917dc7ee5582-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"4e6039c7a12c5a0c0ef5917dc7ee5582\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 02 13:45:45 crc kubenswrapper[4732]: I0402 13:45:45.909938 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/be484bf35d3aabad50f6e4a86d258a31-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"be484bf35d3aabad50f6e4a86d258a31\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 02 13:45:45 crc kubenswrapper[4732]: I0402 13:45:45.910022 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/4e6039c7a12c5a0c0ef5917dc7ee5582-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"4e6039c7a12c5a0c0ef5917dc7ee5582\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 02 13:45:45 crc kubenswrapper[4732]: I0402 13:45:45.910087 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/be484bf35d3aabad50f6e4a86d258a31-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"be484bf35d3aabad50f6e4a86d258a31\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 02 13:45:45 crc kubenswrapper[4732]: I0402 13:45:45.910131 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/be484bf35d3aabad50f6e4a86d258a31-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"be484bf35d3aabad50f6e4a86d258a31\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 02 13:45:45 crc kubenswrapper[4732]: I0402 13:45:45.910176 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/be484bf35d3aabad50f6e4a86d258a31-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"be484bf35d3aabad50f6e4a86d258a31\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 02 13:45:46 crc kubenswrapper[4732]: I0402 13:45:46.011053 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4e6039c7a12c5a0c0ef5917dc7ee5582-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"4e6039c7a12c5a0c0ef5917dc7ee5582\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 02 13:45:46 crc kubenswrapper[4732]: I0402 13:45:46.011105 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/be484bf35d3aabad50f6e4a86d258a31-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"be484bf35d3aabad50f6e4a86d258a31\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 02 13:45:46 crc kubenswrapper[4732]: I0402 13:45:46.011122 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/4e6039c7a12c5a0c0ef5917dc7ee5582-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"4e6039c7a12c5a0c0ef5917dc7ee5582\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 02 13:45:46 crc kubenswrapper[4732]: I0402 13:45:46.011149 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/be484bf35d3aabad50f6e4a86d258a31-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"be484bf35d3aabad50f6e4a86d258a31\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 02 13:45:46 crc kubenswrapper[4732]: I0402 13:45:46.011176 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/4e6039c7a12c5a0c0ef5917dc7ee5582-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"4e6039c7a12c5a0c0ef5917dc7ee5582\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 02 13:45:46 crc kubenswrapper[4732]: I0402 13:45:46.011182 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4e6039c7a12c5a0c0ef5917dc7ee5582-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"4e6039c7a12c5a0c0ef5917dc7ee5582\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 02 13:45:46 crc kubenswrapper[4732]: I0402 13:45:46.011208 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/be484bf35d3aabad50f6e4a86d258a31-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"be484bf35d3aabad50f6e4a86d258a31\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 02 13:45:46 crc kubenswrapper[4732]: I0402 13:45:46.011227 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/4e6039c7a12c5a0c0ef5917dc7ee5582-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"4e6039c7a12c5a0c0ef5917dc7ee5582\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 02 13:45:46 crc kubenswrapper[4732]: I0402 13:45:46.011198 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/be484bf35d3aabad50f6e4a86d258a31-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"be484bf35d3aabad50f6e4a86d258a31\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 02 13:45:46 crc kubenswrapper[4732]: I0402 13:45:46.011259 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/4e6039c7a12c5a0c0ef5917dc7ee5582-resource-dir\") pod \"kube-apiserver-crc\" (UID: 
\"4e6039c7a12c5a0c0ef5917dc7ee5582\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 02 13:45:46 crc kubenswrapper[4732]: I0402 13:45:46.011266 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/be484bf35d3aabad50f6e4a86d258a31-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"be484bf35d3aabad50f6e4a86d258a31\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 02 13:45:46 crc kubenswrapper[4732]: I0402 13:45:46.011292 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/be484bf35d3aabad50f6e4a86d258a31-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"be484bf35d3aabad50f6e4a86d258a31\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 02 13:45:46 crc kubenswrapper[4732]: I0402 13:45:46.011275 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/be484bf35d3aabad50f6e4a86d258a31-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"be484bf35d3aabad50f6e4a86d258a31\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 02 13:45:46 crc kubenswrapper[4732]: I0402 13:45:46.011232 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/be484bf35d3aabad50f6e4a86d258a31-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"be484bf35d3aabad50f6e4a86d258a31\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 02 13:45:46 crc kubenswrapper[4732]: I0402 13:45:46.011345 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/be484bf35d3aabad50f6e4a86d258a31-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"be484bf35d3aabad50f6e4a86d258a31\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 02 13:45:46 crc kubenswrapper[4732]: I0402 13:45:46.011405 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/be484bf35d3aabad50f6e4a86d258a31-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"be484bf35d3aabad50f6e4a86d258a31\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 02 13:45:46 crc kubenswrapper[4732]: I0402 13:45:46.192318 4732 generic.go:334] "Generic (PLEG): container finished" podID="6e580e30-624e-4cbc-9095-fa5659ce546e" containerID="a52637663e3b9a50b75ad42e14b1445272e26bd62d94fe63f7188866c0b3fb95" exitCode=0 Apr 02 13:45:46 crc kubenswrapper[4732]: I0402 13:45:46.192448 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-10-crc" event={"ID":"6e580e30-624e-4cbc-9095-fa5659ce546e","Type":"ContainerDied","Data":"a52637663e3b9a50b75ad42e14b1445272e26bd62d94fe63f7188866c0b3fb95"} Apr 02 13:45:46 crc kubenswrapper[4732]: I0402 13:45:46.193546 4732 status_manager.go:851] "Failed to get status for pod" podUID="6e580e30-624e-4cbc-9095-fa5659ce546e" pod="openshift-kube-apiserver/installer-10-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-10-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:45:46 crc kubenswrapper[4732]: I0402 13:45:46.193877 4732 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:45:46 crc kubenswrapper[4732]: I0402 13:45:46.194658 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 02 13:45:46 crc kubenswrapper[4732]: I0402 13:45:46.196756 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_71bb4a3aecc4ba5b26c4b7318770ce13/kube-apiserver-cert-syncer/0.log" Apr 02 13:45:46 crc kubenswrapper[4732]: I0402 13:45:46.197532 4732 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="44e641937af018f5df25208fd44098b473a12448b3ae7d42ca143ee83364a972" exitCode=0 Apr 02 13:45:46 crc kubenswrapper[4732]: I0402 13:45:46.197558 4732 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="31401ac1c6173f4a37b7273100552581a8fd8bacad3eb4a34da8770e5cce6608" exitCode=0 Apr 02 13:45:46 crc kubenswrapper[4732]: I0402 13:45:46.197568 4732 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="7fea6f1b57fdeab906e3a369b1c58efb7eb869ca1bb7068f8935ed118dfb6643" exitCode=0 Apr 02 13:45:46 crc kubenswrapper[4732]: I0402 13:45:46.197577 4732 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="95fd6b2b782db18e44fcad5d695b957206ea146911d9035bc8f7a4506bddb058" exitCode=2 Apr 02 13:45:46 crc kubenswrapper[4732]: E0402 13:45:46.218349 4732 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.138:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.18a28e3175be4c00 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:be484bf35d3aabad50f6e4a86d258a31,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:45:46.217344 +0000 UTC m=+503.121751573,LastTimestamp:2026-04-02 13:45:46.217344 +0000 UTC m=+503.121751573,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:45:46 crc kubenswrapper[4732]: E0402 13:45:46.720926 4732 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.138:6443: connect: connection refused" pod="openshift-image-registry/image-registry-d9c747dbb-jqfln" volumeName="registry-storage" Apr 02 13:45:47 crc kubenswrapper[4732]: I0402 13:45:47.213068 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"be484bf35d3aabad50f6e4a86d258a31","Type":"ContainerStarted","Data":"2dc186267b608fd3df3bd16c72511cb9c3cff44015cba0742315493871361361"} Apr 02 13:45:47 crc kubenswrapper[4732]: I0402 13:45:47.213132 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"be484bf35d3aabad50f6e4a86d258a31","Type":"ContainerStarted","Data":"18b3895170ea454fc4a4dc2338cb20d9c18f69994ad0ffa03c604dae05358d61"} Apr 02 13:45:47 crc 
kubenswrapper[4732]: E0402 13:45:47.213855 4732 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.138:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 02 13:45:47 crc kubenswrapper[4732]: I0402 13:45:47.214021 4732 status_manager.go:851] "Failed to get status for pod" podUID="6e580e30-624e-4cbc-9095-fa5659ce546e" pod="openshift-kube-apiserver/installer-10-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-10-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:45:47 crc kubenswrapper[4732]: I0402 13:45:47.433912 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-10-crc" Apr 02 13:45:47 crc kubenswrapper[4732]: I0402 13:45:47.435076 4732 status_manager.go:851] "Failed to get status for pod" podUID="6e580e30-624e-4cbc-9095-fa5659ce546e" pod="openshift-kube-apiserver/installer-10-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-10-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:45:47 crc kubenswrapper[4732]: I0402 13:45:47.533285 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6e580e30-624e-4cbc-9095-fa5659ce546e-kubelet-dir\") pod \"6e580e30-624e-4cbc-9095-fa5659ce546e\" (UID: \"6e580e30-624e-4cbc-9095-fa5659ce546e\") " Apr 02 13:45:47 crc kubenswrapper[4732]: I0402 13:45:47.533668 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6e580e30-624e-4cbc-9095-fa5659ce546e-kube-api-access\") pod \"6e580e30-624e-4cbc-9095-fa5659ce546e\" (UID: \"6e580e30-624e-4cbc-9095-fa5659ce546e\") " Apr 02 
13:45:47 crc kubenswrapper[4732]: I0402 13:45:47.533409 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6e580e30-624e-4cbc-9095-fa5659ce546e-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "6e580e30-624e-4cbc-9095-fa5659ce546e" (UID: "6e580e30-624e-4cbc-9095-fa5659ce546e"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 13:45:47 crc kubenswrapper[4732]: I0402 13:45:47.533737 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6e580e30-624e-4cbc-9095-fa5659ce546e-var-lock\") pod \"6e580e30-624e-4cbc-9095-fa5659ce546e\" (UID: \"6e580e30-624e-4cbc-9095-fa5659ce546e\") " Apr 02 13:45:47 crc kubenswrapper[4732]: I0402 13:45:47.533968 4732 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6e580e30-624e-4cbc-9095-fa5659ce546e-kubelet-dir\") on node \"crc\" DevicePath \"\"" Apr 02 13:45:47 crc kubenswrapper[4732]: I0402 13:45:47.534018 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6e580e30-624e-4cbc-9095-fa5659ce546e-var-lock" (OuterVolumeSpecName: "var-lock") pod "6e580e30-624e-4cbc-9095-fa5659ce546e" (UID: "6e580e30-624e-4cbc-9095-fa5659ce546e"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 13:45:47 crc kubenswrapper[4732]: I0402 13:45:47.538234 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e580e30-624e-4cbc-9095-fa5659ce546e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "6e580e30-624e-4cbc-9095-fa5659ce546e" (UID: "6e580e30-624e-4cbc-9095-fa5659ce546e"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:45:47 crc kubenswrapper[4732]: I0402 13:45:47.634464 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6e580e30-624e-4cbc-9095-fa5659ce546e-kube-api-access\") on node \"crc\" DevicePath \"\"" Apr 02 13:45:47 crc kubenswrapper[4732]: I0402 13:45:47.634508 4732 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6e580e30-624e-4cbc-9095-fa5659ce546e-var-lock\") on node \"crc\" DevicePath \"\"" Apr 02 13:45:48 crc kubenswrapper[4732]: I0402 13:45:48.125262 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_71bb4a3aecc4ba5b26c4b7318770ce13/kube-apiserver-cert-syncer/0.log" Apr 02 13:45:48 crc kubenswrapper[4732]: I0402 13:45:48.126274 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 02 13:45:48 crc kubenswrapper[4732]: I0402 13:45:48.126864 4732 status_manager.go:851] "Failed to get status for pod" podUID="6e580e30-624e-4cbc-9095-fa5659ce546e" pod="openshift-kube-apiserver/installer-10-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-10-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:45:48 crc kubenswrapper[4732]: I0402 13:45:48.127096 4732 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:45:48 crc kubenswrapper[4732]: I0402 13:45:48.221211 4732 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_71bb4a3aecc4ba5b26c4b7318770ce13/kube-apiserver-cert-syncer/0.log" Apr 02 13:45:48 crc kubenswrapper[4732]: I0402 13:45:48.222804 4732 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="858e53e28905cbacd387c14192b4f89524e7f215e3974a7181de6ee15ad9a7cc" exitCode=0 Apr 02 13:45:48 crc kubenswrapper[4732]: I0402 13:45:48.222885 4732 scope.go:117] "RemoveContainer" containerID="44e641937af018f5df25208fd44098b473a12448b3ae7d42ca143ee83364a972" Apr 02 13:45:48 crc kubenswrapper[4732]: I0402 13:45:48.223043 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 02 13:45:48 crc kubenswrapper[4732]: I0402 13:45:48.225659 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-10-crc" event={"ID":"6e580e30-624e-4cbc-9095-fa5659ce546e","Type":"ContainerDied","Data":"08941b02b8d608abc69250b1e69bb1c13d73671b0fdcec89d785cbc9753ed8db"} Apr 02 13:45:48 crc kubenswrapper[4732]: I0402 13:45:48.225749 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-10-crc" Apr 02 13:45:48 crc kubenswrapper[4732]: I0402 13:45:48.225766 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="08941b02b8d608abc69250b1e69bb1c13d73671b0fdcec89d785cbc9753ed8db" Apr 02 13:45:48 crc kubenswrapper[4732]: I0402 13:45:48.238979 4732 status_manager.go:851] "Failed to get status for pod" podUID="6e580e30-624e-4cbc-9095-fa5659ce546e" pod="openshift-kube-apiserver/installer-10-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-10-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:45:48 crc kubenswrapper[4732]: I0402 13:45:48.239351 4732 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:45:48 crc kubenswrapper[4732]: I0402 13:45:48.241443 4732 scope.go:117] "RemoveContainer" containerID="31401ac1c6173f4a37b7273100552581a8fd8bacad3eb4a34da8770e5cce6608" Apr 02 13:45:48 crc kubenswrapper[4732]: I0402 13:45:48.241725 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"71bb4a3aecc4ba5b26c4b7318770ce13\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " Apr 02 13:45:48 crc kubenswrapper[4732]: I0402 13:45:48.241891 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"71bb4a3aecc4ba5b26c4b7318770ce13\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " Apr 02 13:45:48 crc kubenswrapper[4732]: I0402 13:45:48.242002 4732 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"71bb4a3aecc4ba5b26c4b7318770ce13\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " Apr 02 13:45:48 crc kubenswrapper[4732]: I0402 13:45:48.241989 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "71bb4a3aecc4ba5b26c4b7318770ce13" (UID: "71bb4a3aecc4ba5b26c4b7318770ce13"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 13:45:48 crc kubenswrapper[4732]: I0402 13:45:48.242036 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "71bb4a3aecc4ba5b26c4b7318770ce13" (UID: "71bb4a3aecc4ba5b26c4b7318770ce13"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 13:45:48 crc kubenswrapper[4732]: I0402 13:45:48.242087 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "71bb4a3aecc4ba5b26c4b7318770ce13" (UID: "71bb4a3aecc4ba5b26c4b7318770ce13"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 13:45:48 crc kubenswrapper[4732]: I0402 13:45:48.242439 4732 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") on node \"crc\" DevicePath \"\"" Apr 02 13:45:48 crc kubenswrapper[4732]: I0402 13:45:48.242543 4732 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") on node \"crc\" DevicePath \"\"" Apr 02 13:45:48 crc kubenswrapper[4732]: I0402 13:45:48.242856 4732 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") on node \"crc\" DevicePath \"\"" Apr 02 13:45:48 crc kubenswrapper[4732]: I0402 13:45:48.258405 4732 scope.go:117] "RemoveContainer" containerID="7fea6f1b57fdeab906e3a369b1c58efb7eb869ca1bb7068f8935ed118dfb6643" Apr 02 13:45:48 crc kubenswrapper[4732]: I0402 13:45:48.270521 4732 scope.go:117] "RemoveContainer" containerID="95fd6b2b782db18e44fcad5d695b957206ea146911d9035bc8f7a4506bddb058" Apr 02 13:45:48 crc kubenswrapper[4732]: I0402 13:45:48.282908 4732 scope.go:117] "RemoveContainer" containerID="858e53e28905cbacd387c14192b4f89524e7f215e3974a7181de6ee15ad9a7cc" Apr 02 13:45:48 crc kubenswrapper[4732]: I0402 13:45:48.299497 4732 scope.go:117] "RemoveContainer" containerID="b30e48272a118d9bbed1d735935deeb2d907a6763127da4680506b145d65e715" Apr 02 13:45:48 crc kubenswrapper[4732]: I0402 13:45:48.315966 4732 scope.go:117] "RemoveContainer" containerID="44e641937af018f5df25208fd44098b473a12448b3ae7d42ca143ee83364a972" Apr 02 13:45:48 crc kubenswrapper[4732]: E0402 13:45:48.316329 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44e641937af018f5df25208fd44098b473a12448b3ae7d42ca143ee83364a972\": container with ID 
starting with 44e641937af018f5df25208fd44098b473a12448b3ae7d42ca143ee83364a972 not found: ID does not exist" containerID="44e641937af018f5df25208fd44098b473a12448b3ae7d42ca143ee83364a972" Apr 02 13:45:48 crc kubenswrapper[4732]: I0402 13:45:48.316382 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44e641937af018f5df25208fd44098b473a12448b3ae7d42ca143ee83364a972"} err="failed to get container status \"44e641937af018f5df25208fd44098b473a12448b3ae7d42ca143ee83364a972\": rpc error: code = NotFound desc = could not find container \"44e641937af018f5df25208fd44098b473a12448b3ae7d42ca143ee83364a972\": container with ID starting with 44e641937af018f5df25208fd44098b473a12448b3ae7d42ca143ee83364a972 not found: ID does not exist" Apr 02 13:45:48 crc kubenswrapper[4732]: I0402 13:45:48.316413 4732 scope.go:117] "RemoveContainer" containerID="31401ac1c6173f4a37b7273100552581a8fd8bacad3eb4a34da8770e5cce6608" Apr 02 13:45:48 crc kubenswrapper[4732]: E0402 13:45:48.316803 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31401ac1c6173f4a37b7273100552581a8fd8bacad3eb4a34da8770e5cce6608\": container with ID starting with 31401ac1c6173f4a37b7273100552581a8fd8bacad3eb4a34da8770e5cce6608 not found: ID does not exist" containerID="31401ac1c6173f4a37b7273100552581a8fd8bacad3eb4a34da8770e5cce6608" Apr 02 13:45:48 crc kubenswrapper[4732]: I0402 13:45:48.316853 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31401ac1c6173f4a37b7273100552581a8fd8bacad3eb4a34da8770e5cce6608"} err="failed to get container status \"31401ac1c6173f4a37b7273100552581a8fd8bacad3eb4a34da8770e5cce6608\": rpc error: code = NotFound desc = could not find container \"31401ac1c6173f4a37b7273100552581a8fd8bacad3eb4a34da8770e5cce6608\": container with ID starting with 31401ac1c6173f4a37b7273100552581a8fd8bacad3eb4a34da8770e5cce6608 not found: 
ID does not exist" Apr 02 13:45:48 crc kubenswrapper[4732]: I0402 13:45:48.316889 4732 scope.go:117] "RemoveContainer" containerID="7fea6f1b57fdeab906e3a369b1c58efb7eb869ca1bb7068f8935ed118dfb6643" Apr 02 13:45:48 crc kubenswrapper[4732]: E0402 13:45:48.317157 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fea6f1b57fdeab906e3a369b1c58efb7eb869ca1bb7068f8935ed118dfb6643\": container with ID starting with 7fea6f1b57fdeab906e3a369b1c58efb7eb869ca1bb7068f8935ed118dfb6643 not found: ID does not exist" containerID="7fea6f1b57fdeab906e3a369b1c58efb7eb869ca1bb7068f8935ed118dfb6643" Apr 02 13:45:48 crc kubenswrapper[4732]: I0402 13:45:48.317188 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fea6f1b57fdeab906e3a369b1c58efb7eb869ca1bb7068f8935ed118dfb6643"} err="failed to get container status \"7fea6f1b57fdeab906e3a369b1c58efb7eb869ca1bb7068f8935ed118dfb6643\": rpc error: code = NotFound desc = could not find container \"7fea6f1b57fdeab906e3a369b1c58efb7eb869ca1bb7068f8935ed118dfb6643\": container with ID starting with 7fea6f1b57fdeab906e3a369b1c58efb7eb869ca1bb7068f8935ed118dfb6643 not found: ID does not exist" Apr 02 13:45:48 crc kubenswrapper[4732]: I0402 13:45:48.317206 4732 scope.go:117] "RemoveContainer" containerID="95fd6b2b782db18e44fcad5d695b957206ea146911d9035bc8f7a4506bddb058" Apr 02 13:45:48 crc kubenswrapper[4732]: E0402 13:45:48.317431 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95fd6b2b782db18e44fcad5d695b957206ea146911d9035bc8f7a4506bddb058\": container with ID starting with 95fd6b2b782db18e44fcad5d695b957206ea146911d9035bc8f7a4506bddb058 not found: ID does not exist" containerID="95fd6b2b782db18e44fcad5d695b957206ea146911d9035bc8f7a4506bddb058" Apr 02 13:45:48 crc kubenswrapper[4732]: I0402 13:45:48.317461 4732 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95fd6b2b782db18e44fcad5d695b957206ea146911d9035bc8f7a4506bddb058"} err="failed to get container status \"95fd6b2b782db18e44fcad5d695b957206ea146911d9035bc8f7a4506bddb058\": rpc error: code = NotFound desc = could not find container \"95fd6b2b782db18e44fcad5d695b957206ea146911d9035bc8f7a4506bddb058\": container with ID starting with 95fd6b2b782db18e44fcad5d695b957206ea146911d9035bc8f7a4506bddb058 not found: ID does not exist" Apr 02 13:45:48 crc kubenswrapper[4732]: I0402 13:45:48.317477 4732 scope.go:117] "RemoveContainer" containerID="858e53e28905cbacd387c14192b4f89524e7f215e3974a7181de6ee15ad9a7cc" Apr 02 13:45:48 crc kubenswrapper[4732]: E0402 13:45:48.317685 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"858e53e28905cbacd387c14192b4f89524e7f215e3974a7181de6ee15ad9a7cc\": container with ID starting with 858e53e28905cbacd387c14192b4f89524e7f215e3974a7181de6ee15ad9a7cc not found: ID does not exist" containerID="858e53e28905cbacd387c14192b4f89524e7f215e3974a7181de6ee15ad9a7cc" Apr 02 13:45:48 crc kubenswrapper[4732]: I0402 13:45:48.317713 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"858e53e28905cbacd387c14192b4f89524e7f215e3974a7181de6ee15ad9a7cc"} err="failed to get container status \"858e53e28905cbacd387c14192b4f89524e7f215e3974a7181de6ee15ad9a7cc\": rpc error: code = NotFound desc = could not find container \"858e53e28905cbacd387c14192b4f89524e7f215e3974a7181de6ee15ad9a7cc\": container with ID starting with 858e53e28905cbacd387c14192b4f89524e7f215e3974a7181de6ee15ad9a7cc not found: ID does not exist" Apr 02 13:45:48 crc kubenswrapper[4732]: I0402 13:45:48.317730 4732 scope.go:117] "RemoveContainer" containerID="b30e48272a118d9bbed1d735935deeb2d907a6763127da4680506b145d65e715" Apr 02 13:45:48 crc kubenswrapper[4732]: E0402 13:45:48.318055 4732 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b30e48272a118d9bbed1d735935deeb2d907a6763127da4680506b145d65e715\": container with ID starting with b30e48272a118d9bbed1d735935deeb2d907a6763127da4680506b145d65e715 not found: ID does not exist" containerID="b30e48272a118d9bbed1d735935deeb2d907a6763127da4680506b145d65e715" Apr 02 13:45:48 crc kubenswrapper[4732]: I0402 13:45:48.318085 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b30e48272a118d9bbed1d735935deeb2d907a6763127da4680506b145d65e715"} err="failed to get container status \"b30e48272a118d9bbed1d735935deeb2d907a6763127da4680506b145d65e715\": rpc error: code = NotFound desc = could not find container \"b30e48272a118d9bbed1d735935deeb2d907a6763127da4680506b145d65e715\": container with ID starting with b30e48272a118d9bbed1d735935deeb2d907a6763127da4680506b145d65e715 not found: ID does not exist" Apr 02 13:45:48 crc kubenswrapper[4732]: E0402 13:45:48.529304 4732 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:45:48 crc kubenswrapper[4732]: E0402 13:45:48.529885 4732 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:45:48 crc kubenswrapper[4732]: E0402 13:45:48.530196 4732 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:45:48 crc kubenswrapper[4732]: E0402 13:45:48.530510 4732 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:45:48 crc kubenswrapper[4732]: E0402 13:45:48.530766 4732 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:45:48 crc kubenswrapper[4732]: I0402 13:45:48.530817 4732 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Apr 02 13:45:48 crc kubenswrapper[4732]: E0402 13:45:48.531021 4732 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" interval="200ms" Apr 02 13:45:48 crc kubenswrapper[4732]: I0402 13:45:48.538955 4732 status_manager.go:851] "Failed to get status for pod" podUID="6e580e30-624e-4cbc-9095-fa5659ce546e" pod="openshift-kube-apiserver/installer-10-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-10-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:45:48 crc kubenswrapper[4732]: I0402 13:45:48.539362 4732 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:45:48 crc kubenswrapper[4732]: I0402 13:45:48.687048 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" 
path="/var/lib/kubelet/pods/71bb4a3aecc4ba5b26c4b7318770ce13/volumes" Apr 02 13:45:48 crc kubenswrapper[4732]: E0402 13:45:48.732439 4732 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" interval="400ms" Apr 02 13:45:48 crc kubenswrapper[4732]: E0402 13:45:48.872340 4732 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.138:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.18a28e3175be4c00 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:be484bf35d3aabad50f6e4a86d258a31,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:45:46.217344 +0000 UTC m=+503.121751573,LastTimestamp:2026-04-02 13:45:46.217344 +0000 UTC m=+503.121751573,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:45:49 crc kubenswrapper[4732]: E0402 13:45:49.133596 4732 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" interval="800ms" Apr 02 13:45:49 crc kubenswrapper[4732]: E0402 13:45:49.935280 
4732 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" interval="1.6s" Apr 02 13:45:51 crc kubenswrapper[4732]: E0402 13:45:51.536455 4732 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" interval="3.2s" Apr 02 13:45:54 crc kubenswrapper[4732]: I0402 13:45:54.609440 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-79499bfbc-frt6n" Apr 02 13:45:54 crc kubenswrapper[4732]: I0402 13:45:54.610695 4732 status_manager.go:851] "Failed to get status for pod" podUID="6e580e30-624e-4cbc-9095-fa5659ce546e" pod="openshift-kube-apiserver/installer-10-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-10-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:45:54 crc kubenswrapper[4732]: I0402 13:45:54.610901 4732 status_manager.go:851] "Failed to get status for pod" podUID="c7b6f3ee-8ae2-432c-a6da-44a7b883205a" pod="openshift-image-registry/image-registry-79499bfbc-frt6n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-79499bfbc-frt6n\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:45:54 crc kubenswrapper[4732]: E0402 13:45:54.623284 4732 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 
38.102.83.138:6443: connect: connection refused" pod="openshift-image-registry/image-registry-79499bfbc-frt6n" volumeName="registry-storage" Apr 02 13:45:54 crc kubenswrapper[4732]: I0402 13:45:54.683303 4732 status_manager.go:851] "Failed to get status for pod" podUID="6e580e30-624e-4cbc-9095-fa5659ce546e" pod="openshift-kube-apiserver/installer-10-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-10-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:45:54 crc kubenswrapper[4732]: I0402 13:45:54.683589 4732 status_manager.go:851] "Failed to get status for pod" podUID="c7b6f3ee-8ae2-432c-a6da-44a7b883205a" pod="openshift-image-registry/image-registry-79499bfbc-frt6n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-79499bfbc-frt6n\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:45:54 crc kubenswrapper[4732]: E0402 13:45:54.753120 4732 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" interval="6.4s" Apr 02 13:45:56 crc kubenswrapper[4732]: I0402 13:45:56.680188 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 02 13:45:56 crc kubenswrapper[4732]: I0402 13:45:56.681462 4732 status_manager.go:851] "Failed to get status for pod" podUID="c7b6f3ee-8ae2-432c-a6da-44a7b883205a" pod="openshift-image-registry/image-registry-79499bfbc-frt6n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-79499bfbc-frt6n\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:45:56 crc kubenswrapper[4732]: I0402 13:45:56.681846 4732 status_manager.go:851] "Failed to get status for pod" podUID="6e580e30-624e-4cbc-9095-fa5659ce546e" pod="openshift-kube-apiserver/installer-10-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-10-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:45:56 crc kubenswrapper[4732]: I0402 13:45:56.694415 4732 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5b38af35-c358-4061-a119-3485eb32b774" Apr 02 13:45:56 crc kubenswrapper[4732]: I0402 13:45:56.694449 4732 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5b38af35-c358-4061-a119-3485eb32b774" Apr 02 13:45:56 crc kubenswrapper[4732]: E0402 13:45:56.695008 4732 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 02 13:45:56 crc kubenswrapper[4732]: I0402 13:45:56.695412 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 02 13:45:57 crc kubenswrapper[4732]: I0402 13:45:57.288513 4732 generic.go:334] "Generic (PLEG): container finished" podID="4e6039c7a12c5a0c0ef5917dc7ee5582" containerID="3e705b44ea9269407921b4e2c12a475f27ad577094437a5ac000a6628c171724" exitCode=0 Apr 02 13:45:57 crc kubenswrapper[4732]: I0402 13:45:57.288598 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"4e6039c7a12c5a0c0ef5917dc7ee5582","Type":"ContainerDied","Data":"3e705b44ea9269407921b4e2c12a475f27ad577094437a5ac000a6628c171724"} Apr 02 13:45:57 crc kubenswrapper[4732]: I0402 13:45:57.288854 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"4e6039c7a12c5a0c0ef5917dc7ee5582","Type":"ContainerStarted","Data":"de4a6be30e8fe7c66de7aaac801eac1055276d6984d42edef9dd092d7cf20be6"} Apr 02 13:45:57 crc kubenswrapper[4732]: I0402 13:45:57.289639 4732 status_manager.go:851] "Failed to get status for pod" podUID="c7b6f3ee-8ae2-432c-a6da-44a7b883205a" pod="openshift-image-registry/image-registry-79499bfbc-frt6n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-79499bfbc-frt6n\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:45:57 crc kubenswrapper[4732]: I0402 13:45:57.289694 4732 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5b38af35-c358-4061-a119-3485eb32b774" Apr 02 13:45:57 crc kubenswrapper[4732]: I0402 13:45:57.289857 4732 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5b38af35-c358-4061-a119-3485eb32b774" Apr 02 13:45:57 crc kubenswrapper[4732]: E0402 13:45:57.290186 4732 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 02 13:45:57 crc kubenswrapper[4732]: I0402 13:45:57.290187 4732 status_manager.go:851] "Failed to get status for pod" podUID="6e580e30-624e-4cbc-9095-fa5659ce546e" pod="openshift-kube-apiserver/installer-10-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-10-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:45:58 crc kubenswrapper[4732]: I0402 13:45:58.298994 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"4e6039c7a12c5a0c0ef5917dc7ee5582","Type":"ContainerStarted","Data":"154f96ad2e07e914c8919ce03f5a46d1f2c4770c36a0f52d47b1c9f79ba64c1e"} Apr 02 13:45:58 crc kubenswrapper[4732]: I0402 13:45:58.299045 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"4e6039c7a12c5a0c0ef5917dc7ee5582","Type":"ContainerStarted","Data":"22672e53ab610750e5765dc8ce0aacf7b0848b1e72f7e1924976df8af0e68f30"} Apr 02 13:45:58 crc kubenswrapper[4732]: I0402 13:45:58.299058 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"4e6039c7a12c5a0c0ef5917dc7ee5582","Type":"ContainerStarted","Data":"060b077035d0d8f3b137497cde23a9a2def5bcac1994d53cb009323a5939cb36"} Apr 02 13:45:58 crc kubenswrapper[4732]: I0402 13:45:58.299070 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"4e6039c7a12c5a0c0ef5917dc7ee5582","Type":"ContainerStarted","Data":"96cc6a0134c07b1897cf08a443d481b40cafe36f9fc7219d1f0b472531c54e50"} Apr 02 13:45:59 crc kubenswrapper[4732]: I0402 13:45:59.316399 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"4e6039c7a12c5a0c0ef5917dc7ee5582","Type":"ContainerStarted","Data":"2c9bd70787ed9f0ffcf2b5aa836cdc2f70f83b8baed3300ce8244a1130f6edc6"} Apr 02 13:45:59 crc kubenswrapper[4732]: I0402 13:45:59.316777 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 02 13:45:59 crc kubenswrapper[4732]: I0402 13:45:59.316676 4732 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5b38af35-c358-4061-a119-3485eb32b774" Apr 02 13:45:59 crc kubenswrapper[4732]: I0402 13:45:59.316797 4732 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5b38af35-c358-4061-a119-3485eb32b774" Apr 02 13:45:59 crc kubenswrapper[4732]: I0402 13:45:59.319296 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Apr 02 13:45:59 crc kubenswrapper[4732]: I0402 13:45:59.320206 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Apr 02 13:45:59 crc kubenswrapper[4732]: I0402 13:45:59.321374 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Apr 02 13:45:59 crc kubenswrapper[4732]: I0402 13:45:59.321414 4732 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="fd58fa9b97b4f1b471d0e141c18378167e8054616db491b1e79d2530f9ecab7f" exitCode=1 Apr 02 13:45:59 crc kubenswrapper[4732]: I0402 13:45:59.321440 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"fd58fa9b97b4f1b471d0e141c18378167e8054616db491b1e79d2530f9ecab7f"} Apr 02 13:45:59 crc kubenswrapper[4732]: I0402 13:45:59.321475 4732 scope.go:117] "RemoveContainer" containerID="ef96f17edb5e968b50bab42d6ddcc7c4ffc0fae1b3c8bcdebc6150e18977740e" Apr 02 13:45:59 crc kubenswrapper[4732]: I0402 13:45:59.321974 4732 scope.go:117] "RemoveContainer" containerID="fd58fa9b97b4f1b471d0e141c18378167e8054616db491b1e79d2530f9ecab7f" Apr 02 13:45:59 crc kubenswrapper[4732]: E0402 13:45:59.322212 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-controller-manager pod=kube-controller-manager-crc_openshift-kube-controller-manager(f614b9022728cf315e60c057852e563e)\"" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" Apr 02 13:46:00 crc kubenswrapper[4732]: I0402 13:46:00.328600 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Apr 02 13:46:00 crc kubenswrapper[4732]: I0402 13:46:00.329223 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Apr 02 13:46:01 crc kubenswrapper[4732]: I0402 13:46:01.089396 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 02 13:46:01 crc kubenswrapper[4732]: I0402 13:46:01.089918 4732 scope.go:117] "RemoveContainer" containerID="fd58fa9b97b4f1b471d0e141c18378167e8054616db491b1e79d2530f9ecab7f" Apr 02 13:46:01 crc kubenswrapper[4732]: E0402 13:46:01.090265 4732 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-controller-manager pod=kube-controller-manager-crc_openshift-kube-controller-manager(f614b9022728cf315e60c057852e563e)\"" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" Apr 02 13:46:01 crc kubenswrapper[4732]: I0402 13:46:01.338700 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-11-retry-1-crc_6e4f3964-0282-48aa-a0a5-cad803e3812b/installer/0.log" Apr 02 13:46:01 crc kubenswrapper[4732]: I0402 13:46:01.339106 4732 generic.go:334] "Generic (PLEG): container finished" podID="6e4f3964-0282-48aa-a0a5-cad803e3812b" containerID="75d1c9855f320237c4f2255a2386b01f78bcb361951e5bff599684ff41ab29e1" exitCode=1 Apr 02 13:46:01 crc kubenswrapper[4732]: I0402 13:46:01.339149 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-11-retry-1-crc" event={"ID":"6e4f3964-0282-48aa-a0a5-cad803e3812b","Type":"ContainerDied","Data":"75d1c9855f320237c4f2255a2386b01f78bcb361951e5bff599684ff41ab29e1"} Apr 02 13:46:01 crc kubenswrapper[4732]: I0402 13:46:01.696428 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 02 13:46:01 crc kubenswrapper[4732]: I0402 13:46:01.696483 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 02 13:46:01 crc kubenswrapper[4732]: I0402 13:46:01.701469 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 02 13:46:01 crc kubenswrapper[4732]: I0402 13:46:01.720778 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 02 13:46:01 crc 
kubenswrapper[4732]: I0402 13:46:01.721381 4732 scope.go:117] "RemoveContainer" containerID="fd58fa9b97b4f1b471d0e141c18378167e8054616db491b1e79d2530f9ecab7f" Apr 02 13:46:01 crc kubenswrapper[4732]: E0402 13:46:01.721697 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-controller-manager pod=kube-controller-manager-crc_openshift-kube-controller-manager(f614b9022728cf315e60c057852e563e)\"" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" Apr 02 13:46:01 crc kubenswrapper[4732]: I0402 13:46:01.924860 4732 patch_prober.go:28] interesting pod/machine-config-daemon-6vtmw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 02 13:46:01 crc kubenswrapper[4732]: I0402 13:46:01.924928 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 02 13:46:03 crc kubenswrapper[4732]: I0402 13:46:02.557874 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-11-retry-1-crc_6e4f3964-0282-48aa-a0a5-cad803e3812b/installer/0.log" Apr 02 13:46:03 crc kubenswrapper[4732]: I0402 13:46:02.557966 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-11-retry-1-crc" Apr 02 13:46:03 crc kubenswrapper[4732]: I0402 13:46:02.633718 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6e4f3964-0282-48aa-a0a5-cad803e3812b-kubelet-dir\") pod \"6e4f3964-0282-48aa-a0a5-cad803e3812b\" (UID: \"6e4f3964-0282-48aa-a0a5-cad803e3812b\") " Apr 02 13:46:03 crc kubenswrapper[4732]: I0402 13:46:02.633810 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6e4f3964-0282-48aa-a0a5-cad803e3812b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "6e4f3964-0282-48aa-a0a5-cad803e3812b" (UID: "6e4f3964-0282-48aa-a0a5-cad803e3812b"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 13:46:03 crc kubenswrapper[4732]: I0402 13:46:02.633822 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6e4f3964-0282-48aa-a0a5-cad803e3812b-kube-api-access\") pod \"6e4f3964-0282-48aa-a0a5-cad803e3812b\" (UID: \"6e4f3964-0282-48aa-a0a5-cad803e3812b\") " Apr 02 13:46:03 crc kubenswrapper[4732]: I0402 13:46:02.633918 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6e4f3964-0282-48aa-a0a5-cad803e3812b-var-lock\") pod \"6e4f3964-0282-48aa-a0a5-cad803e3812b\" (UID: \"6e4f3964-0282-48aa-a0a5-cad803e3812b\") " Apr 02 13:46:03 crc kubenswrapper[4732]: I0402 13:46:02.634219 4732 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6e4f3964-0282-48aa-a0a5-cad803e3812b-kubelet-dir\") on node \"crc\" DevicePath \"\"" Apr 02 13:46:03 crc kubenswrapper[4732]: I0402 13:46:02.634246 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/6e4f3964-0282-48aa-a0a5-cad803e3812b-var-lock" (OuterVolumeSpecName: "var-lock") pod "6e4f3964-0282-48aa-a0a5-cad803e3812b" (UID: "6e4f3964-0282-48aa-a0a5-cad803e3812b"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 13:46:03 crc kubenswrapper[4732]: I0402 13:46:02.642865 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e4f3964-0282-48aa-a0a5-cad803e3812b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "6e4f3964-0282-48aa-a0a5-cad803e3812b" (UID: "6e4f3964-0282-48aa-a0a5-cad803e3812b"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:46:03 crc kubenswrapper[4732]: I0402 13:46:02.735465 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6e4f3964-0282-48aa-a0a5-cad803e3812b-kube-api-access\") on node \"crc\" DevicePath \"\"" Apr 02 13:46:03 crc kubenswrapper[4732]: I0402 13:46:02.735671 4732 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6e4f3964-0282-48aa-a0a5-cad803e3812b-var-lock\") on node \"crc\" DevicePath \"\"" Apr 02 13:46:03 crc kubenswrapper[4732]: I0402 13:46:03.353034 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-11-retry-1-crc_6e4f3964-0282-48aa-a0a5-cad803e3812b/installer/0.log" Apr 02 13:46:03 crc kubenswrapper[4732]: I0402 13:46:03.353099 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-11-retry-1-crc" event={"ID":"6e4f3964-0282-48aa-a0a5-cad803e3812b","Type":"ContainerDied","Data":"4989c64fcdc939e6f0924f3d7381eb420ebc7ca6d039f1734249bdfd6da7a1eb"} Apr 02 13:46:03 crc kubenswrapper[4732]: I0402 13:46:03.353130 4732 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="4989c64fcdc939e6f0924f3d7381eb420ebc7ca6d039f1734249bdfd6da7a1eb" Apr 02 13:46:03 crc kubenswrapper[4732]: I0402 13:46:03.353171 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-11-retry-1-crc" Apr 02 13:46:04 crc kubenswrapper[4732]: I0402 13:46:04.327136 4732 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 02 13:46:04 crc kubenswrapper[4732]: I0402 13:46:04.353136 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b38af35-c358-4061-a119-3485eb32b774\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:45:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:45:57Z\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:45:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-02T13:45:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver 
kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96cc6a0134c07b1897cf08a443d481b40cafe36f9fc7219d1f0b472531c54e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22672e53ab610750e5765dc8ce0aacf7b0848b1e72f7e1924976df8af0e68f30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://060b077035d0d8f3b137497cde23a9a2def5bcac1994d53cb009323a5939cb36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-op
erator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c9bd70787ed9f0ffcf2b5aa836cdc2f70f83b8baed3300ce8244a1130f6edc6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://154f96ad2e07e914c8919ce03f5a46d1f2c4770c36a0f52d47b1c9f79ba64c1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-02T13:45:58Z\\\"}}}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e705b44ea9269407921b4e2c12a475f27ad577094437a5ac000a6628c171724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e705b44ea9269407921b4e2c12a475f27ad577094437a5ac000a6628c171724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-04-02T13:45:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-04-02T13:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}]}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Pod \"kube-apiserver-crc\" is invalid: metadata.uid: Invalid value: \"5b38af35-c358-4061-a119-3485eb32b774\": field is immutable"
Apr 02 13:46:04 crc kubenswrapper[4732]: I0402 13:46:04.358166 4732 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5b38af35-c358-4061-a119-3485eb32b774"
Apr 02 13:46:04 crc kubenswrapper[4732]: I0402 13:46:04.358192 4732 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5b38af35-c358-4061-a119-3485eb32b774"
Apr 02 13:46:04 crc kubenswrapper[4732]: I0402 13:46:04.361668 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Apr 02 13:46:04 crc kubenswrapper[4732]: I0402 13:46:04.396020 4732 status_manager.go:861] "Pod was deleted and then recreated,
skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="4e6039c7a12c5a0c0ef5917dc7ee5582" podUID="22fd4835-69d0-47de-8792-5c8c0e8028ce"
Apr 02 13:46:05 crc kubenswrapper[4732]: I0402 13:46:05.578455 4732 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Apr 02 13:46:05 crc kubenswrapper[4732]: I0402 13:46:05.579929 4732 scope.go:117] "RemoveContainer" containerID="fd58fa9b97b4f1b471d0e141c18378167e8054616db491b1e79d2530f9ecab7f"
Apr 02 13:46:05 crc kubenswrapper[4732]: E0402 13:46:05.580291 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-controller-manager pod=kube-controller-manager-crc_openshift-kube-controller-manager(f614b9022728cf315e60c057852e563e)\"" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e"
Apr 02 13:46:05 crc kubenswrapper[4732]: I0402 13:46:05.586986 4732 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5b38af35-c358-4061-a119-3485eb32b774"
Apr 02 13:46:05 crc kubenswrapper[4732]: I0402 13:46:05.587078 4732 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5b38af35-c358-4061-a119-3485eb32b774"
Apr 02 13:46:10 crc kubenswrapper[4732]: I0402 13:46:10.406164 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Apr 02 13:46:11 crc kubenswrapper[4732]: I0402 13:46:11.017151 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Apr 02 13:46:11 crc kubenswrapper[4732]: I0402 13:46:11.018319 4732 reflector.go:368] Caches populated for *v1.ConfigMap from
object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Apr 02 13:46:11 crc kubenswrapper[4732]: I0402 13:46:11.069140 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Apr 02 13:46:11 crc kubenswrapper[4732]: I0402 13:46:11.533274 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Apr 02 13:46:11 crc kubenswrapper[4732]: I0402 13:46:11.764480 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Apr 02 13:46:12 crc kubenswrapper[4732]: I0402 13:46:12.585423 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Apr 02 13:46:13 crc kubenswrapper[4732]: I0402 13:46:13.037272 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Apr 02 13:46:13 crc kubenswrapper[4732]: I0402 13:46:13.670554 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Apr 02 13:46:13 crc kubenswrapper[4732]: I0402 13:46:13.721257 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Apr 02 13:46:13 crc kubenswrapper[4732]: I0402 13:46:13.987464 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Apr 02 13:46:14 crc kubenswrapper[4732]: I0402 13:46:14.041529 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Apr 02 13:46:14 crc kubenswrapper[4732]: I0402 13:46:14.225153 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Apr 02 13:46:14 crc kubenswrapper[4732]: I0402 13:46:14.236797 4732 reflector.go:368]
Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Apr 02 13:46:14 crc kubenswrapper[4732]: I0402 13:46:14.442524 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Apr 02 13:46:14 crc kubenswrapper[4732]: I0402 13:46:14.693100 4732 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="4e6039c7a12c5a0c0ef5917dc7ee5582" podUID="22fd4835-69d0-47de-8792-5c8c0e8028ce"
Apr 02 13:46:14 crc kubenswrapper[4732]: I0402 13:46:14.723057 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Apr 02 13:46:14 crc kubenswrapper[4732]: I0402 13:46:14.748177 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Apr 02 13:46:14 crc kubenswrapper[4732]: I0402 13:46:14.755727 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Apr 02 13:46:14 crc kubenswrapper[4732]: I0402 13:46:14.945294 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Apr 02 13:46:15 crc kubenswrapper[4732]: I0402 13:46:15.218889 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Apr 02 13:46:15 crc kubenswrapper[4732]: I0402 13:46:15.361811 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Apr 02 13:46:15 crc kubenswrapper[4732]: I0402 13:46:15.821042 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Apr 02 13:46:15 crc kubenswrapper[4732]: I0402 13:46:15.961192 4732 reflector.go:368] Caches populated for *v1.Secret from
object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Apr 02 13:46:16 crc kubenswrapper[4732]: I0402 13:46:16.094100 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Apr 02 13:46:16 crc kubenswrapper[4732]: I0402 13:46:16.094192 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Apr 02 13:46:16 crc kubenswrapper[4732]: I0402 13:46:16.285044 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Apr 02 13:46:16 crc kubenswrapper[4732]: I0402 13:46:16.323792 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Apr 02 13:46:16 crc kubenswrapper[4732]: I0402 13:46:16.347668 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Apr 02 13:46:16 crc kubenswrapper[4732]: I0402 13:46:16.521238 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Apr 02 13:46:16 crc kubenswrapper[4732]: I0402 13:46:16.576835 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Apr 02 13:46:16 crc kubenswrapper[4732]: I0402 13:46:16.680672 4732 scope.go:117] "RemoveContainer" containerID="fd58fa9b97b4f1b471d0e141c18378167e8054616db491b1e79d2530f9ecab7f"
Apr 02 13:46:16 crc kubenswrapper[4732]: I0402 13:46:16.723074 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Apr 02 13:46:16 crc kubenswrapper[4732]: I0402 13:46:16.921667 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Apr 02 13:46:17 crc kubenswrapper[4732]: I0402 13:46:17.083470 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Apr 02 13:46:17 crc kubenswrapper[4732]: I0402 13:46:17.115996 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Apr 02 13:46:17 crc kubenswrapper[4732]: I0402 13:46:17.116243 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Apr 02 13:46:17 crc kubenswrapper[4732]: I0402 13:46:17.376306 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Apr 02 13:46:17 crc kubenswrapper[4732]: I0402 13:46:17.432597 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Apr 02 13:46:17 crc kubenswrapper[4732]: I0402 13:46:17.580894 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Apr 02 13:46:17 crc kubenswrapper[4732]: I0402 13:46:17.611767 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Apr 02 13:46:17 crc kubenswrapper[4732]: I0402 13:46:17.655096 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log"
Apr 02 13:46:17 crc kubenswrapper[4732]: I0402 13:46:17.656215 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log"
Apr 02 13:46:17 crc kubenswrapper[4732]: I0402 13:46:17.657397 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a14d376cb655f7984c3fbe269826d57ea3e936ded391d3fbb9394a6b7960dad5"}
Apr 02 13:46:17 crc kubenswrapper[4732]: I0402 13:46:17.697345 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Apr 02 13:46:17 crc kubenswrapper[4732]: I0402 13:46:17.833928 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Apr 02 13:46:17 crc kubenswrapper[4732]: I0402 13:46:17.932829 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Apr 02 13:46:18 crc kubenswrapper[4732]: I0402 13:46:18.168504 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Apr 02 13:46:18 crc kubenswrapper[4732]: I0402 13:46:18.608964 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Apr 02 13:46:18 crc kubenswrapper[4732]: I0402 13:46:18.638336 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Apr 02 13:46:18 crc kubenswrapper[4732]: I0402 13:46:18.759035 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Apr 02 13:46:19 crc kubenswrapper[4732]: I0402 13:46:19.022843 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Apr 02 13:46:19 crc kubenswrapper[4732]: I0402 13:46:19.146201 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Apr 02
13:46:19 crc kubenswrapper[4732]: I0402 13:46:19.239433 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Apr 02 13:46:19 crc kubenswrapper[4732]: I0402 13:46:19.302177 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Apr 02 13:46:19 crc kubenswrapper[4732]: I0402 13:46:19.352520 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Apr 02 13:46:19 crc kubenswrapper[4732]: I0402 13:46:19.409352 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Apr 02 13:46:19 crc kubenswrapper[4732]: I0402 13:46:19.748984 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Apr 02 13:46:19 crc kubenswrapper[4732]: I0402 13:46:19.810662 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Apr 02 13:46:19 crc kubenswrapper[4732]: I0402 13:46:19.884089 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Apr 02 13:46:19 crc kubenswrapper[4732]: I0402 13:46:19.987475 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Apr 02 13:46:20 crc kubenswrapper[4732]: I0402 13:46:20.017349 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Apr 02 13:46:20 crc kubenswrapper[4732]: I0402 13:46:20.068473 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Apr 02 13:46:20 crc kubenswrapper[4732]: I0402 13:46:20.296704 4732 reflector.go:368] Caches populated for *v1.Secret from
object-"openshift-ingress-canary"/"canary-serving-cert"
Apr 02 13:46:20 crc kubenswrapper[4732]: I0402 13:46:20.370690 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Apr 02 13:46:20 crc kubenswrapper[4732]: I0402 13:46:20.470706 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Apr 02 13:46:20 crc kubenswrapper[4732]: I0402 13:46:20.494162 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Apr 02 13:46:20 crc kubenswrapper[4732]: I0402 13:46:20.495372 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Apr 02 13:46:20 crc kubenswrapper[4732]: I0402 13:46:20.553887 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Apr 02 13:46:20 crc kubenswrapper[4732]: I0402 13:46:20.592416 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Apr 02 13:46:20 crc kubenswrapper[4732]: I0402 13:46:20.615837 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Apr 02 13:46:20 crc kubenswrapper[4732]: I0402 13:46:20.639469 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Apr 02 13:46:20 crc kubenswrapper[4732]: I0402 13:46:20.658005 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Apr 02 13:46:20 crc kubenswrapper[4732]: I0402 13:46:20.734319 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Apr 02 13:46:20 crc kubenswrapper[4732]: I0402 13:46:20.858162 4732 reflector.go:368] Caches
populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Apr 02 13:46:21 crc kubenswrapper[4732]: I0402 13:46:21.015324 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Apr 02 13:46:21 crc kubenswrapper[4732]: I0402 13:46:21.035709 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Apr 02 13:46:21 crc kubenswrapper[4732]: I0402 13:46:21.068189 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Apr 02 13:46:21 crc kubenswrapper[4732]: I0402 13:46:21.089168 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Apr 02 13:46:21 crc kubenswrapper[4732]: I0402 13:46:21.093246 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Apr 02 13:46:21 crc kubenswrapper[4732]: I0402 13:46:21.148596 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Apr 02 13:46:21 crc kubenswrapper[4732]: I0402 13:46:21.266960 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Apr 02 13:46:21 crc kubenswrapper[4732]: I0402 13:46:21.307903 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Apr 02 13:46:21 crc kubenswrapper[4732]: I0402 13:46:21.405278 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Apr 02 13:46:21 crc kubenswrapper[4732]: I0402 13:46:21.455716 4732 reflector.go:368] Caches populated for *v1.ConfigMap from
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Apr 02 13:46:21 crc kubenswrapper[4732]: I0402 13:46:21.468596 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Apr 02 13:46:21 crc kubenswrapper[4732]: I0402 13:46:21.475479 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Apr 02 13:46:21 crc kubenswrapper[4732]: I0402 13:46:21.515280 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Apr 02 13:46:21 crc kubenswrapper[4732]: I0402 13:46:21.523299 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Apr 02 13:46:21 crc kubenswrapper[4732]: I0402 13:46:21.679750 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Apr 02 13:46:21 crc kubenswrapper[4732]: I0402 13:46:21.845964 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Apr 02 13:46:21 crc kubenswrapper[4732]: I0402 13:46:21.864362 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Apr 02 13:46:21 crc kubenswrapper[4732]: I0402 13:46:21.865495 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Apr 02 13:46:21 crc kubenswrapper[4732]: I0402 13:46:21.985754 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Apr 02 13:46:22 crc kubenswrapper[4732]: I0402 13:46:22.020208 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Apr 02 13:46:22 crc kubenswrapper[4732]: I0402
13:46:22.050648 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Apr 02 13:46:22 crc kubenswrapper[4732]: I0402 13:46:22.126043 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Apr 02 13:46:22 crc kubenswrapper[4732]: I0402 13:46:22.205253 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Apr 02 13:46:22 crc kubenswrapper[4732]: I0402 13:46:22.230593 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Apr 02 13:46:22 crc kubenswrapper[4732]: I0402 13:46:22.270139 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Apr 02 13:46:22 crc kubenswrapper[4732]: I0402 13:46:22.332117 4732 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Apr 02 13:46:22 crc kubenswrapper[4732]: I0402 13:46:22.360730 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Apr 02 13:46:22 crc kubenswrapper[4732]: I0402 13:46:22.403004 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Apr 02 13:46:22 crc kubenswrapper[4732]: I0402 13:46:22.477378 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Apr 02 13:46:22 crc kubenswrapper[4732]: I0402 13:46:22.585130 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Apr 02 13:46:22 crc kubenswrapper[4732]: I0402 13:46:22.617865 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Apr 02 13:46:22 crc
kubenswrapper[4732]: I0402 13:46:22.741926 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Apr 02 13:46:22 crc kubenswrapper[4732]: I0402 13:46:22.743078 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Apr 02 13:46:22 crc kubenswrapper[4732]: I0402 13:46:22.792916 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Apr 02 13:46:22 crc kubenswrapper[4732]: I0402 13:46:22.796298 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Apr 02 13:46:22 crc kubenswrapper[4732]: I0402 13:46:22.920158 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Apr 02 13:46:22 crc kubenswrapper[4732]: I0402 13:46:22.948825 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Apr 02 13:46:23 crc kubenswrapper[4732]: I0402 13:46:23.048488 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Apr 02 13:46:23 crc kubenswrapper[4732]: I0402 13:46:23.050461 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Apr 02 13:46:23 crc kubenswrapper[4732]: I0402 13:46:23.068524 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Apr 02 13:46:23 crc kubenswrapper[4732]: I0402 13:46:23.159069 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Apr 02 13:46:23 crc kubenswrapper[4732]: I0402 13:46:23.267288 4732 reflector.go:368] Caches populated for *v1.ConfigMap from
object-"openshift-ingress-operator"/"kube-root-ca.crt"
Apr 02 13:46:23 crc kubenswrapper[4732]: I0402 13:46:23.521068 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Apr 02 13:46:23 crc kubenswrapper[4732]: I0402 13:46:23.553854 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Apr 02 13:46:23 crc kubenswrapper[4732]: I0402 13:46:23.553862 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Apr 02 13:46:23 crc kubenswrapper[4732]: I0402 13:46:23.621091 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Apr 02 13:46:23 crc kubenswrapper[4732]: I0402 13:46:23.695644 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Apr 02 13:46:23 crc kubenswrapper[4732]: I0402 13:46:23.778214 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Apr 02 13:46:23 crc kubenswrapper[4732]: I0402 13:46:23.803249 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Apr 02 13:46:23 crc kubenswrapper[4732]: I0402 13:46:23.896342 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Apr 02 13:46:23 crc kubenswrapper[4732]: I0402 13:46:23.921345 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Apr 02 13:46:23 crc kubenswrapper[4732]: I0402 13:46:23.945530 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Apr 02 13:46:23 crc kubenswrapper[4732]: I0402 13:46:23.966960 4732 reflector.go:368] Caches
populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Apr 02 13:46:23 crc kubenswrapper[4732]: I0402 13:46:23.989523 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Apr 02 13:46:24 crc kubenswrapper[4732]: I0402 13:46:24.115879 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Apr 02 13:46:24 crc kubenswrapper[4732]: I0402 13:46:24.193693 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Apr 02 13:46:24 crc kubenswrapper[4732]: I0402 13:46:24.305385 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Apr 02 13:46:24 crc kubenswrapper[4732]: I0402 13:46:24.355008 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Apr 02 13:46:24 crc kubenswrapper[4732]: I0402 13:46:24.604696 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Apr 02 13:46:24 crc kubenswrapper[4732]: I0402 13:46:24.640828 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Apr 02 13:46:24 crc kubenswrapper[4732]: I0402 13:46:24.669507 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Apr 02 13:46:24 crc kubenswrapper[4732]: I0402 13:46:24.728091 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Apr 02 13:46:24 crc kubenswrapper[4732]: I0402 13:46:24.825966 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Apr 02
13:46:24.833248 4732 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Apr 02 13:46:24 crc kubenswrapper[4732]: I0402 13:46:24.841044 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Apr 02 13:46:24 crc kubenswrapper[4732]: I0402 13:46:24.843260 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Apr 02 13:46:24 crc kubenswrapper[4732]: I0402 13:46:24.843350 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Apr 02 13:46:24 crc kubenswrapper[4732]: I0402 13:46:24.848947 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 02 13:46:24 crc kubenswrapper[4732]: I0402 13:46:24.870503 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=20.870480438 podStartE2EDuration="20.870480438s" podCreationTimestamp="2026-04-02 13:46:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:46:24.867118377 +0000 UTC m=+541.771525940" watchObservedRunningTime="2026-04-02 13:46:24.870480438 +0000 UTC m=+541.774888031" Apr 02 13:46:24 crc kubenswrapper[4732]: I0402 13:46:24.879383 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Apr 02 13:46:24 crc kubenswrapper[4732]: I0402 13:46:24.921480 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Apr 02 13:46:24 crc kubenswrapper[4732]: I0402 13:46:24.991551 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Apr 02 13:46:24 crc kubenswrapper[4732]: 
I0402 13:46:24.999812 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Apr 02 13:46:25 crc kubenswrapper[4732]: I0402 13:46:25.192201 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Apr 02 13:46:25 crc kubenswrapper[4732]: I0402 13:46:25.209155 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Apr 02 13:46:25 crc kubenswrapper[4732]: I0402 13:46:25.245335 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Apr 02 13:46:25 crc kubenswrapper[4732]: I0402 13:46:25.253023 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Apr 02 13:46:25 crc kubenswrapper[4732]: I0402 13:46:25.340908 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Apr 02 13:46:25 crc kubenswrapper[4732]: I0402 13:46:25.448850 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Apr 02 13:46:25 crc kubenswrapper[4732]: I0402 13:46:25.470530 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Apr 02 13:46:25 crc kubenswrapper[4732]: I0402 13:46:25.507755 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Apr 02 13:46:25 crc kubenswrapper[4732]: I0402 13:46:25.632650 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Apr 02 13:46:25 crc kubenswrapper[4732]: I0402 13:46:25.685571 4732 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Apr 02 13:46:25 crc kubenswrapper[4732]: I0402 13:46:25.754699 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Apr 02 13:46:25 crc kubenswrapper[4732]: I0402 13:46:25.755267 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Apr 02 13:46:25 crc kubenswrapper[4732]: I0402 13:46:25.785485 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Apr 02 13:46:25 crc kubenswrapper[4732]: I0402 13:46:25.803098 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Apr 02 13:46:25 crc kubenswrapper[4732]: I0402 13:46:25.803353 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Apr 02 13:46:25 crc kubenswrapper[4732]: I0402 13:46:25.824677 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Apr 02 13:46:25 crc kubenswrapper[4732]: I0402 13:46:25.825272 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Apr 02 13:46:25 crc kubenswrapper[4732]: I0402 13:46:25.922269 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Apr 02 13:46:25 crc kubenswrapper[4732]: I0402 13:46:25.937216 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Apr 02 13:46:25 crc kubenswrapper[4732]: I0402 13:46:25.946589 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Apr 02 13:46:25 crc kubenswrapper[4732]: I0402 
13:46:25.988469 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Apr 02 13:46:25 crc kubenswrapper[4732]: I0402 13:46:25.988483 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Apr 02 13:46:26 crc kubenswrapper[4732]: I0402 13:46:26.075475 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Apr 02 13:46:26 crc kubenswrapper[4732]: I0402 13:46:26.110475 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Apr 02 13:46:26 crc kubenswrapper[4732]: I0402 13:46:26.182021 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Apr 02 13:46:26 crc kubenswrapper[4732]: I0402 13:46:26.363807 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Apr 02 13:46:26 crc kubenswrapper[4732]: I0402 13:46:26.376914 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Apr 02 13:46:26 crc kubenswrapper[4732]: I0402 13:46:26.402128 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Apr 02 13:46:26 crc kubenswrapper[4732]: I0402 13:46:26.473987 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Apr 02 13:46:26 crc kubenswrapper[4732]: I0402 13:46:26.554780 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Apr 02 13:46:26 crc kubenswrapper[4732]: I0402 13:46:26.578312 4732 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-operator"/"kube-root-ca.crt" Apr 02 13:46:26 crc kubenswrapper[4732]: I0402 13:46:26.591808 4732 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Apr 02 13:46:26 crc kubenswrapper[4732]: I0402 13:46:26.592047 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="be484bf35d3aabad50f6e4a86d258a31" containerName="startup-monitor" containerID="cri-o://2dc186267b608fd3df3bd16c72511cb9c3cff44015cba0742315493871361361" gracePeriod=5 Apr 02 13:46:26 crc kubenswrapper[4732]: I0402 13:46:26.706002 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Apr 02 13:46:26 crc kubenswrapper[4732]: I0402 13:46:26.758176 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Apr 02 13:46:26 crc kubenswrapper[4732]: I0402 13:46:26.914630 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Apr 02 13:46:26 crc kubenswrapper[4732]: I0402 13:46:26.981055 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Apr 02 13:46:27 crc kubenswrapper[4732]: I0402 13:46:27.055984 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Apr 02 13:46:27 crc kubenswrapper[4732]: I0402 13:46:27.105398 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Apr 02 13:46:27 crc kubenswrapper[4732]: I0402 13:46:27.135338 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Apr 02 13:46:27 crc 
kubenswrapper[4732]: I0402 13:46:27.161955 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Apr 02 13:46:27 crc kubenswrapper[4732]: I0402 13:46:27.307814 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Apr 02 13:46:27 crc kubenswrapper[4732]: I0402 13:46:27.319797 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Apr 02 13:46:27 crc kubenswrapper[4732]: I0402 13:46:27.373491 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Apr 02 13:46:27 crc kubenswrapper[4732]: I0402 13:46:27.410476 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Apr 02 13:46:27 crc kubenswrapper[4732]: I0402 13:46:27.454678 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Apr 02 13:46:27 crc kubenswrapper[4732]: I0402 13:46:27.559324 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Apr 02 13:46:27 crc kubenswrapper[4732]: I0402 13:46:27.611490 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Apr 02 13:46:27 crc kubenswrapper[4732]: I0402 13:46:27.638118 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Apr 02 13:46:27 crc kubenswrapper[4732]: I0402 13:46:27.640484 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Apr 02 13:46:27 crc kubenswrapper[4732]: I0402 13:46:27.755941 4732 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Apr 02 13:46:27 crc kubenswrapper[4732]: I0402 13:46:27.807158 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Apr 02 13:46:27 crc kubenswrapper[4732]: I0402 13:46:27.853118 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Apr 02 13:46:27 crc kubenswrapper[4732]: I0402 13:46:27.880783 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Apr 02 13:46:27 crc kubenswrapper[4732]: I0402 13:46:27.997063 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Apr 02 13:46:28 crc kubenswrapper[4732]: I0402 13:46:28.039946 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Apr 02 13:46:28 crc kubenswrapper[4732]: I0402 13:46:28.094638 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Apr 02 13:46:28 crc kubenswrapper[4732]: I0402 13:46:28.103237 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Apr 02 13:46:28 crc kubenswrapper[4732]: I0402 13:46:28.141994 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Apr 02 13:46:28 crc kubenswrapper[4732]: I0402 13:46:28.163279 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Apr 02 13:46:28 crc kubenswrapper[4732]: I0402 13:46:28.227724 4732 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Apr 02 13:46:28 crc 
kubenswrapper[4732]: I0402 13:46:28.305575 4732 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Apr 02 13:46:28 crc kubenswrapper[4732]: I0402 13:46:28.378834 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Apr 02 13:46:28 crc kubenswrapper[4732]: I0402 13:46:28.412266 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Apr 02 13:46:28 crc kubenswrapper[4732]: I0402 13:46:28.513260 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Apr 02 13:46:28 crc kubenswrapper[4732]: I0402 13:46:28.538922 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Apr 02 13:46:28 crc kubenswrapper[4732]: I0402 13:46:28.558930 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Apr 02 13:46:28 crc kubenswrapper[4732]: I0402 13:46:28.590557 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Apr 02 13:46:28 crc kubenswrapper[4732]: I0402 13:46:28.631847 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Apr 02 13:46:28 crc kubenswrapper[4732]: I0402 13:46:28.728193 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Apr 02 13:46:28 crc kubenswrapper[4732]: I0402 13:46:28.875138 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Apr 02 13:46:28 crc kubenswrapper[4732]: I0402 13:46:28.965468 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Apr 02 13:46:28 crc 
kubenswrapper[4732]: I0402 13:46:28.972791 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Apr 02 13:46:29 crc kubenswrapper[4732]: I0402 13:46:29.119067 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Apr 02 13:46:29 crc kubenswrapper[4732]: I0402 13:46:29.158293 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Apr 02 13:46:29 crc kubenswrapper[4732]: I0402 13:46:29.387068 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Apr 02 13:46:29 crc kubenswrapper[4732]: I0402 13:46:29.429300 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Apr 02 13:46:29 crc kubenswrapper[4732]: I0402 13:46:29.461240 4732 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Apr 02 13:46:29 crc kubenswrapper[4732]: I0402 13:46:29.509530 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Apr 02 13:46:29 crc kubenswrapper[4732]: I0402 13:46:29.564458 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qlgxl"] Apr 02 13:46:29 crc kubenswrapper[4732]: I0402 13:46:29.564820 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qlgxl" podUID="1827909b-49ea-4ba8-9995-f525d1d82f45" containerName="registry-server" containerID="cri-o://379848342ab71ffa30dea645fdced0d379fcf457ee71fe018e018fae2d13234b" gracePeriod=30 Apr 02 13:46:29 crc kubenswrapper[4732]: I0402 13:46:29.572647 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4m6rj"] Apr 02 
13:46:29 crc kubenswrapper[4732]: I0402 13:46:29.573196 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4m6rj" podUID="cf030ff0-459d-4453-975f-19ba4ff9641a" containerName="registry-server" containerID="cri-o://425cedb83d984da24336a079437685c8d001e2247a41453b3dda436bf0d02899" gracePeriod=30 Apr 02 13:46:29 crc kubenswrapper[4732]: I0402 13:46:29.582906 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-s6sm2"] Apr 02 13:46:29 crc kubenswrapper[4732]: I0402 13:46:29.583196 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-s6sm2" podUID="99e5508c-0d75-4f87-9c07-b53509e461aa" containerName="marketplace-operator" containerID="cri-o://9e2bd8a777276c58aff3c69d6184b247fdaf13ed2a075b2b5949f6250566deb5" gracePeriod=30 Apr 02 13:46:29 crc kubenswrapper[4732]: I0402 13:46:29.606298 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Apr 02 13:46:29 crc kubenswrapper[4732]: I0402 13:46:29.608055 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9lrnf"] Apr 02 13:46:29 crc kubenswrapper[4732]: I0402 13:46:29.608319 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9lrnf" podUID="51a0e365-014c-40e8-8749-7512f2c00758" containerName="registry-server" containerID="cri-o://f0b0b4ffe3ae302d71148a33d0d55749459f5819b20e8bc8265f0c4145447656" gracePeriod=30 Apr 02 13:46:29 crc kubenswrapper[4732]: I0402 13:46:29.623701 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lvckz"] Apr 02 13:46:29 crc kubenswrapper[4732]: I0402 13:46:29.624226 4732 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-operators-lvckz" podUID="33708fee-32a5-4418-81d0-226813150db7" containerName="registry-server" containerID="cri-o://28818acdc7b448e9a253990a4d438b9341394b606d83467632fbb10396de5953" gracePeriod=30 Apr 02 13:46:29 crc kubenswrapper[4732]: E0402 13:46:29.671369 4732 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf030ff0_459d_4453_975f_19ba4ff9641a.slice/crio-425cedb83d984da24336a079437685c8d001e2247a41453b3dda436bf0d02899.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99e5508c_0d75_4f87_9c07_b53509e461aa.slice/crio-9e2bd8a777276c58aff3c69d6184b247fdaf13ed2a075b2b5949f6250566deb5.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33708fee_32a5_4418_81d0_226813150db7.slice/crio-28818acdc7b448e9a253990a4d438b9341394b606d83467632fbb10396de5953.scope\": RecentStats: unable to find data in memory cache]" Apr 02 13:46:29 crc kubenswrapper[4732]: I0402 13:46:29.735577 4732 generic.go:334] "Generic (PLEG): container finished" podID="99e5508c-0d75-4f87-9c07-b53509e461aa" containerID="9e2bd8a777276c58aff3c69d6184b247fdaf13ed2a075b2b5949f6250566deb5" exitCode=0 Apr 02 13:46:29 crc kubenswrapper[4732]: I0402 13:46:29.735665 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-s6sm2" event={"ID":"99e5508c-0d75-4f87-9c07-b53509e461aa","Type":"ContainerDied","Data":"9e2bd8a777276c58aff3c69d6184b247fdaf13ed2a075b2b5949f6250566deb5"} Apr 02 13:46:29 crc kubenswrapper[4732]: I0402 13:46:29.738093 4732 generic.go:334] "Generic (PLEG): container finished" podID="51a0e365-014c-40e8-8749-7512f2c00758" containerID="f0b0b4ffe3ae302d71148a33d0d55749459f5819b20e8bc8265f0c4145447656" exitCode=0 Apr 02 13:46:29 crc 
kubenswrapper[4732]: I0402 13:46:29.738148 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9lrnf" event={"ID":"51a0e365-014c-40e8-8749-7512f2c00758","Type":"ContainerDied","Data":"f0b0b4ffe3ae302d71148a33d0d55749459f5819b20e8bc8265f0c4145447656"} Apr 02 13:46:29 crc kubenswrapper[4732]: I0402 13:46:29.739576 4732 generic.go:334] "Generic (PLEG): container finished" podID="1827909b-49ea-4ba8-9995-f525d1d82f45" containerID="379848342ab71ffa30dea645fdced0d379fcf457ee71fe018e018fae2d13234b" exitCode=0 Apr 02 13:46:29 crc kubenswrapper[4732]: I0402 13:46:29.739634 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qlgxl" event={"ID":"1827909b-49ea-4ba8-9995-f525d1d82f45","Type":"ContainerDied","Data":"379848342ab71ffa30dea645fdced0d379fcf457ee71fe018e018fae2d13234b"} Apr 02 13:46:29 crc kubenswrapper[4732]: I0402 13:46:29.742003 4732 generic.go:334] "Generic (PLEG): container finished" podID="cf030ff0-459d-4453-975f-19ba4ff9641a" containerID="425cedb83d984da24336a079437685c8d001e2247a41453b3dda436bf0d02899" exitCode=0 Apr 02 13:46:29 crc kubenswrapper[4732]: I0402 13:46:29.742023 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4m6rj" event={"ID":"cf030ff0-459d-4453-975f-19ba4ff9641a","Type":"ContainerDied","Data":"425cedb83d984da24336a079437685c8d001e2247a41453b3dda436bf0d02899"} Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.007311 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9lrnf" Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.025777 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-s6sm2" Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.031199 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.055320 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lvckz" Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.068339 4732 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.075296 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.108727 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.114877 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjk9n\" (UniqueName: \"kubernetes.io/projected/51a0e365-014c-40e8-8749-7512f2c00758-kube-api-access-zjk9n\") pod \"51a0e365-014c-40e8-8749-7512f2c00758\" (UID: \"51a0e365-014c-40e8-8749-7512f2c00758\") " Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.114953 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51a0e365-014c-40e8-8749-7512f2c00758-catalog-content\") pod \"51a0e365-014c-40e8-8749-7512f2c00758\" (UID: \"51a0e365-014c-40e8-8749-7512f2c00758\") " Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.115065 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51a0e365-014c-40e8-8749-7512f2c00758-utilities\") pod 
\"51a0e365-014c-40e8-8749-7512f2c00758\" (UID: \"51a0e365-014c-40e8-8749-7512f2c00758\") " Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.116912 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51a0e365-014c-40e8-8749-7512f2c00758-utilities" (OuterVolumeSpecName: "utilities") pod "51a0e365-014c-40e8-8749-7512f2c00758" (UID: "51a0e365-014c-40e8-8749-7512f2c00758"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.129669 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51a0e365-014c-40e8-8749-7512f2c00758-kube-api-access-zjk9n" (OuterVolumeSpecName: "kube-api-access-zjk9n") pod "51a0e365-014c-40e8-8749-7512f2c00758" (UID: "51a0e365-014c-40e8-8749-7512f2c00758"). InnerVolumeSpecName "kube-api-access-zjk9n". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.152169 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51a0e365-014c-40e8-8749-7512f2c00758-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "51a0e365-014c-40e8-8749-7512f2c00758" (UID: "51a0e365-014c-40e8-8749-7512f2c00758"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.216277 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/99e5508c-0d75-4f87-9c07-b53509e461aa-marketplace-operator-metrics\") pod \"99e5508c-0d75-4f87-9c07-b53509e461aa\" (UID: \"99e5508c-0d75-4f87-9c07-b53509e461aa\") " Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.216323 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tcvq\" (UniqueName: \"kubernetes.io/projected/33708fee-32a5-4418-81d0-226813150db7-kube-api-access-4tcvq\") pod \"33708fee-32a5-4418-81d0-226813150db7\" (UID: \"33708fee-32a5-4418-81d0-226813150db7\") " Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.216398 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33708fee-32a5-4418-81d0-226813150db7-utilities\") pod \"33708fee-32a5-4418-81d0-226813150db7\" (UID: \"33708fee-32a5-4418-81d0-226813150db7\") " Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.216441 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqtpb\" (UniqueName: \"kubernetes.io/projected/99e5508c-0d75-4f87-9c07-b53509e461aa-kube-api-access-kqtpb\") pod \"99e5508c-0d75-4f87-9c07-b53509e461aa\" (UID: \"99e5508c-0d75-4f87-9c07-b53509e461aa\") " Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.216471 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33708fee-32a5-4418-81d0-226813150db7-catalog-content\") pod \"33708fee-32a5-4418-81d0-226813150db7\" (UID: \"33708fee-32a5-4418-81d0-226813150db7\") " Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.216522 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/99e5508c-0d75-4f87-9c07-b53509e461aa-marketplace-trusted-ca\") pod \"99e5508c-0d75-4f87-9c07-b53509e461aa\" (UID: \"99e5508c-0d75-4f87-9c07-b53509e461aa\") " Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.216790 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjk9n\" (UniqueName: \"kubernetes.io/projected/51a0e365-014c-40e8-8749-7512f2c00758-kube-api-access-zjk9n\") on node \"crc\" DevicePath \"\"" Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.216813 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51a0e365-014c-40e8-8749-7512f2c00758-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.216826 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51a0e365-014c-40e8-8749-7512f2c00758-utilities\") on node \"crc\" DevicePath \"\"" Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.217520 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99e5508c-0d75-4f87-9c07-b53509e461aa-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "99e5508c-0d75-4f87-9c07-b53509e461aa" (UID: "99e5508c-0d75-4f87-9c07-b53509e461aa"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.218837 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33708fee-32a5-4418-81d0-226813150db7-utilities" (OuterVolumeSpecName: "utilities") pod "33708fee-32a5-4418-81d0-226813150db7" (UID: "33708fee-32a5-4418-81d0-226813150db7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.220436 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99e5508c-0d75-4f87-9c07-b53509e461aa-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "99e5508c-0d75-4f87-9c07-b53509e461aa" (UID: "99e5508c-0d75-4f87-9c07-b53509e461aa"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.221073 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33708fee-32a5-4418-81d0-226813150db7-kube-api-access-4tcvq" (OuterVolumeSpecName: "kube-api-access-4tcvq") pod "33708fee-32a5-4418-81d0-226813150db7" (UID: "33708fee-32a5-4418-81d0-226813150db7"). InnerVolumeSpecName "kube-api-access-4tcvq". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.221476 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99e5508c-0d75-4f87-9c07-b53509e461aa-kube-api-access-kqtpb" (OuterVolumeSpecName: "kube-api-access-kqtpb") pod "99e5508c-0d75-4f87-9c07-b53509e461aa" (UID: "99e5508c-0d75-4f87-9c07-b53509e461aa"). InnerVolumeSpecName "kube-api-access-kqtpb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.235958 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.248156 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.289478 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.312583 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.317499 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33708fee-32a5-4418-81d0-226813150db7-utilities\") on node \"crc\" DevicePath \"\"" Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.317582 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqtpb\" (UniqueName: \"kubernetes.io/projected/99e5508c-0d75-4f87-9c07-b53509e461aa-kube-api-access-kqtpb\") on node \"crc\" DevicePath \"\"" Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.317592 4732 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/99e5508c-0d75-4f87-9c07-b53509e461aa-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.317601 4732 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/99e5508c-0d75-4f87-9c07-b53509e461aa-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.317635 4732 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-4tcvq\" (UniqueName: \"kubernetes.io/projected/33708fee-32a5-4418-81d0-226813150db7-kube-api-access-4tcvq\") on node \"crc\" DevicePath \"\"" Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.356658 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33708fee-32a5-4418-81d0-226813150db7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "33708fee-32a5-4418-81d0-226813150db7" (UID: "33708fee-32a5-4418-81d0-226813150db7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.368379 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.400293 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4m6rj" Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.415373 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qlgxl" Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.418325 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33708fee-32a5-4418-81d0-226813150db7-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.480510 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.499466 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.519802 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1827909b-49ea-4ba8-9995-f525d1d82f45-utilities\") pod \"1827909b-49ea-4ba8-9995-f525d1d82f45\" (UID: \"1827909b-49ea-4ba8-9995-f525d1d82f45\") " Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.519875 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf030ff0-459d-4453-975f-19ba4ff9641a-catalog-content\") pod \"cf030ff0-459d-4453-975f-19ba4ff9641a\" (UID: \"cf030ff0-459d-4453-975f-19ba4ff9641a\") " Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.519911 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf030ff0-459d-4453-975f-19ba4ff9641a-utilities\") pod \"cf030ff0-459d-4453-975f-19ba4ff9641a\" (UID: \"cf030ff0-459d-4453-975f-19ba4ff9641a\") " Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.519932 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4z2m\" (UniqueName: 
\"kubernetes.io/projected/cf030ff0-459d-4453-975f-19ba4ff9641a-kube-api-access-r4z2m\") pod \"cf030ff0-459d-4453-975f-19ba4ff9641a\" (UID: \"cf030ff0-459d-4453-975f-19ba4ff9641a\") " Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.519950 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8zl9\" (UniqueName: \"kubernetes.io/projected/1827909b-49ea-4ba8-9995-f525d1d82f45-kube-api-access-m8zl9\") pod \"1827909b-49ea-4ba8-9995-f525d1d82f45\" (UID: \"1827909b-49ea-4ba8-9995-f525d1d82f45\") " Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.520019 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1827909b-49ea-4ba8-9995-f525d1d82f45-catalog-content\") pod \"1827909b-49ea-4ba8-9995-f525d1d82f45\" (UID: \"1827909b-49ea-4ba8-9995-f525d1d82f45\") " Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.524652 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf030ff0-459d-4453-975f-19ba4ff9641a-utilities" (OuterVolumeSpecName: "utilities") pod "cf030ff0-459d-4453-975f-19ba4ff9641a" (UID: "cf030ff0-459d-4453-975f-19ba4ff9641a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.524880 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1827909b-49ea-4ba8-9995-f525d1d82f45-utilities" (OuterVolumeSpecName: "utilities") pod "1827909b-49ea-4ba8-9995-f525d1d82f45" (UID: "1827909b-49ea-4ba8-9995-f525d1d82f45"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.526546 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1827909b-49ea-4ba8-9995-f525d1d82f45-kube-api-access-m8zl9" (OuterVolumeSpecName: "kube-api-access-m8zl9") pod "1827909b-49ea-4ba8-9995-f525d1d82f45" (UID: "1827909b-49ea-4ba8-9995-f525d1d82f45"). InnerVolumeSpecName "kube-api-access-m8zl9". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.527310 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf030ff0-459d-4453-975f-19ba4ff9641a-kube-api-access-r4z2m" (OuterVolumeSpecName: "kube-api-access-r4z2m") pod "cf030ff0-459d-4453-975f-19ba4ff9641a" (UID: "cf030ff0-459d-4453-975f-19ba4ff9641a"). InnerVolumeSpecName "kube-api-access-r4z2m". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.585895 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf030ff0-459d-4453-975f-19ba4ff9641a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cf030ff0-459d-4453-975f-19ba4ff9641a" (UID: "cf030ff0-459d-4453-975f-19ba4ff9641a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.592012 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.596454 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1827909b-49ea-4ba8-9995-f525d1d82f45-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1827909b-49ea-4ba8-9995-f525d1d82f45" (UID: "1827909b-49ea-4ba8-9995-f525d1d82f45"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.620842 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf030ff0-459d-4453-975f-19ba4ff9641a-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.620875 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf030ff0-459d-4453-975f-19ba4ff9641a-utilities\") on node \"crc\" DevicePath \"\"" Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.620884 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4z2m\" (UniqueName: \"kubernetes.io/projected/cf030ff0-459d-4453-975f-19ba4ff9641a-kube-api-access-r4z2m\") on node \"crc\" DevicePath \"\"" Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.620895 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8zl9\" (UniqueName: \"kubernetes.io/projected/1827909b-49ea-4ba8-9995-f525d1d82f45-kube-api-access-m8zl9\") on node \"crc\" DevicePath \"\"" Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.620903 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1827909b-49ea-4ba8-9995-f525d1d82f45-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.620911 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1827909b-49ea-4ba8-9995-f525d1d82f45-utilities\") on node \"crc\" DevicePath \"\"" Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.638016 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.659223 4732 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.710976 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.747348 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-s6sm2" Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.747343 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-s6sm2" event={"ID":"99e5508c-0d75-4f87-9c07-b53509e461aa","Type":"ContainerDied","Data":"57b5ddb6afef1b202b4d135b6404a217d2103eeb512b85bddd0f568d4a52fc1a"} Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.748174 4732 scope.go:117] "RemoveContainer" containerID="9e2bd8a777276c58aff3c69d6184b247fdaf13ed2a075b2b5949f6250566deb5" Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.749651 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9lrnf" Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.749655 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9lrnf" event={"ID":"51a0e365-014c-40e8-8749-7512f2c00758","Type":"ContainerDied","Data":"a5b64c74cd0af37c8f6db5871a77443a73c101f6fd86bf5106af224fbcade0e1"} Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.753245 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qlgxl" Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.753256 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qlgxl" event={"ID":"1827909b-49ea-4ba8-9995-f525d1d82f45","Type":"ContainerDied","Data":"da97d71055b4b784da56de209f467f22027d238d2a64236a68128d6712f04476"} Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.758308 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4m6rj" event={"ID":"cf030ff0-459d-4453-975f-19ba4ff9641a","Type":"ContainerDied","Data":"8e710b90b254ff538be0e156b50bf321fb22653a2383e4ccd08086e5cab87427"} Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.758421 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4m6rj" Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.761743 4732 generic.go:334] "Generic (PLEG): container finished" podID="33708fee-32a5-4418-81d0-226813150db7" containerID="28818acdc7b448e9a253990a4d438b9341394b606d83467632fbb10396de5953" exitCode=0 Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.761786 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lvckz" event={"ID":"33708fee-32a5-4418-81d0-226813150db7","Type":"ContainerDied","Data":"28818acdc7b448e9a253990a4d438b9341394b606d83467632fbb10396de5953"} Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.761815 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lvckz" event={"ID":"33708fee-32a5-4418-81d0-226813150db7","Type":"ContainerDied","Data":"bec5d9172a678578a4a25802436f82279e97fe20393ed0ec28b58bbd4d7353be"} Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.761872 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lvckz" Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.772111 4732 scope.go:117] "RemoveContainer" containerID="f0b0b4ffe3ae302d71148a33d0d55749459f5819b20e8bc8265f0c4145447656" Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.786453 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-s6sm2"] Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.791920 4732 scope.go:117] "RemoveContainer" containerID="300bdd0e2ae6c21103f4f45e67123d7de8eaa3cbef3e2562926b7864e0bb52df" Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.800540 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-s6sm2"] Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.805290 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lvckz"] Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.809828 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lvckz"] Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.814993 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4m6rj"] Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.817824 4732 scope.go:117] "RemoveContainer" containerID="07f5d8e0c009a92486cfbb49db1e9a3555daf3090c6f9416221f132da132091b" Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.823820 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4m6rj"] Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.841132 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qlgxl"] Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.846319 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/certified-operators-qlgxl"] Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.846426 4732 scope.go:117] "RemoveContainer" containerID="379848342ab71ffa30dea645fdced0d379fcf457ee71fe018e018fae2d13234b" Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.848628 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9lrnf"] Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.852304 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9lrnf"] Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.862579 4732 scope.go:117] "RemoveContainer" containerID="cc2f5fdab330cb1eec128d2812084addca824200ee57c2d5d69ce92760196169" Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.878726 4732 scope.go:117] "RemoveContainer" containerID="7dcbae5ed3cdbf1bd8f3cfa18f78e2c48935aa93532bd6393fda2191c7650c52" Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.893423 4732 scope.go:117] "RemoveContainer" containerID="425cedb83d984da24336a079437685c8d001e2247a41453b3dda436bf0d02899" Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.916759 4732 scope.go:117] "RemoveContainer" containerID="c0ae4d4789a9b315a1d1c0369d0032385d80233a8d22f3bbbcfa5e78dc247fcb" Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.936472 4732 scope.go:117] "RemoveContainer" containerID="898bc99e31c66d55335c3093239515a3af51a99013591088cd8ecde3d9445f69" Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.950463 4732 scope.go:117] "RemoveContainer" containerID="28818acdc7b448e9a253990a4d438b9341394b606d83467632fbb10396de5953" Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.966424 4732 scope.go:117] "RemoveContainer" containerID="a01d592b0ba86583dfa1fdecff24f378de8e39e237965b26e4cb0da5fcc7b523" Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.978270 4732 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Apr 02 13:46:30 crc kubenswrapper[4732]: I0402 13:46:30.988573 4732 scope.go:117] "RemoveContainer" containerID="966e60f520d1c8037b0fde5c77272cfcf79e1f1b7e6ec9164a02737e3ffa1266" Apr 02 13:46:31 crc kubenswrapper[4732]: I0402 13:46:31.009074 4732 scope.go:117] "RemoveContainer" containerID="28818acdc7b448e9a253990a4d438b9341394b606d83467632fbb10396de5953" Apr 02 13:46:31 crc kubenswrapper[4732]: E0402 13:46:31.009373 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28818acdc7b448e9a253990a4d438b9341394b606d83467632fbb10396de5953\": container with ID starting with 28818acdc7b448e9a253990a4d438b9341394b606d83467632fbb10396de5953 not found: ID does not exist" containerID="28818acdc7b448e9a253990a4d438b9341394b606d83467632fbb10396de5953" Apr 02 13:46:31 crc kubenswrapper[4732]: I0402 13:46:31.009414 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28818acdc7b448e9a253990a4d438b9341394b606d83467632fbb10396de5953"} err="failed to get container status \"28818acdc7b448e9a253990a4d438b9341394b606d83467632fbb10396de5953\": rpc error: code = NotFound desc = could not find container \"28818acdc7b448e9a253990a4d438b9341394b606d83467632fbb10396de5953\": container with ID starting with 28818acdc7b448e9a253990a4d438b9341394b606d83467632fbb10396de5953 not found: ID does not exist" Apr 02 13:46:31 crc kubenswrapper[4732]: I0402 13:46:31.009442 4732 scope.go:117] "RemoveContainer" containerID="a01d592b0ba86583dfa1fdecff24f378de8e39e237965b26e4cb0da5fcc7b523" Apr 02 13:46:31 crc kubenswrapper[4732]: E0402 13:46:31.009773 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a01d592b0ba86583dfa1fdecff24f378de8e39e237965b26e4cb0da5fcc7b523\": container with ID starting with 
a01d592b0ba86583dfa1fdecff24f378de8e39e237965b26e4cb0da5fcc7b523 not found: ID does not exist" containerID="a01d592b0ba86583dfa1fdecff24f378de8e39e237965b26e4cb0da5fcc7b523" Apr 02 13:46:31 crc kubenswrapper[4732]: I0402 13:46:31.009832 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a01d592b0ba86583dfa1fdecff24f378de8e39e237965b26e4cb0da5fcc7b523"} err="failed to get container status \"a01d592b0ba86583dfa1fdecff24f378de8e39e237965b26e4cb0da5fcc7b523\": rpc error: code = NotFound desc = could not find container \"a01d592b0ba86583dfa1fdecff24f378de8e39e237965b26e4cb0da5fcc7b523\": container with ID starting with a01d592b0ba86583dfa1fdecff24f378de8e39e237965b26e4cb0da5fcc7b523 not found: ID does not exist" Apr 02 13:46:31 crc kubenswrapper[4732]: I0402 13:46:31.009925 4732 scope.go:117] "RemoveContainer" containerID="966e60f520d1c8037b0fde5c77272cfcf79e1f1b7e6ec9164a02737e3ffa1266" Apr 02 13:46:31 crc kubenswrapper[4732]: E0402 13:46:31.010243 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"966e60f520d1c8037b0fde5c77272cfcf79e1f1b7e6ec9164a02737e3ffa1266\": container with ID starting with 966e60f520d1c8037b0fde5c77272cfcf79e1f1b7e6ec9164a02737e3ffa1266 not found: ID does not exist" containerID="966e60f520d1c8037b0fde5c77272cfcf79e1f1b7e6ec9164a02737e3ffa1266" Apr 02 13:46:31 crc kubenswrapper[4732]: I0402 13:46:31.010297 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"966e60f520d1c8037b0fde5c77272cfcf79e1f1b7e6ec9164a02737e3ffa1266"} err="failed to get container status \"966e60f520d1c8037b0fde5c77272cfcf79e1f1b7e6ec9164a02737e3ffa1266\": rpc error: code = NotFound desc = could not find container \"966e60f520d1c8037b0fde5c77272cfcf79e1f1b7e6ec9164a02737e3ffa1266\": container with ID starting with 966e60f520d1c8037b0fde5c77272cfcf79e1f1b7e6ec9164a02737e3ffa1266 not found: ID does not 
exist" Apr 02 13:46:31 crc kubenswrapper[4732]: I0402 13:46:31.010813 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Apr 02 13:46:31 crc kubenswrapper[4732]: I0402 13:46:31.045764 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Apr 02 13:46:31 crc kubenswrapper[4732]: I0402 13:46:31.281821 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Apr 02 13:46:31 crc kubenswrapper[4732]: I0402 13:46:31.479069 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Apr 02 13:46:31 crc kubenswrapper[4732]: I0402 13:46:31.724696 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 02 13:46:31 crc kubenswrapper[4732]: I0402 13:46:31.769478 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_be484bf35d3aabad50f6e4a86d258a31/startup-monitor/0.log" Apr 02 13:46:31 crc kubenswrapper[4732]: I0402 13:46:31.769532 4732 generic.go:334] "Generic (PLEG): container finished" podID="be484bf35d3aabad50f6e4a86d258a31" containerID="2dc186267b608fd3df3bd16c72511cb9c3cff44015cba0742315493871361361" exitCode=137 Apr 02 13:46:31 crc kubenswrapper[4732]: I0402 13:46:31.924236 4732 patch_prober.go:28] interesting pod/machine-config-daemon-6vtmw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 02 13:46:31 crc kubenswrapper[4732]: I0402 13:46:31.924298 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 02 13:46:31 crc kubenswrapper[4732]: I0402 13:46:31.924345 4732 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" Apr 02 13:46:31 crc kubenswrapper[4732]: I0402 13:46:31.924939 4732 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6679fff77ada4a54a69b7189491d8feac3c5def6519c359d285b772063d2ad8d"} pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Apr 02 13:46:31 crc kubenswrapper[4732]: I0402 13:46:31.924991 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" containerName="machine-config-daemon" containerID="cri-o://6679fff77ada4a54a69b7189491d8feac3c5def6519c359d285b772063d2ad8d" gracePeriod=600 Apr 02 13:46:31 crc kubenswrapper[4732]: I0402 13:46:31.928679 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Apr 02 13:46:32 crc kubenswrapper[4732]: I0402 13:46:32.180104 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_be484bf35d3aabad50f6e4a86d258a31/startup-monitor/0.log" Apr 02 13:46:32 crc kubenswrapper[4732]: I0402 13:46:32.180356 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 02 13:46:32 crc kubenswrapper[4732]: I0402 13:46:32.187037 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Apr 02 13:46:32 crc kubenswrapper[4732]: I0402 13:46:32.346354 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/be484bf35d3aabad50f6e4a86d258a31-var-lock\") pod \"be484bf35d3aabad50f6e4a86d258a31\" (UID: \"be484bf35d3aabad50f6e4a86d258a31\") " Apr 02 13:46:32 crc kubenswrapper[4732]: I0402 13:46:32.346449 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/be484bf35d3aabad50f6e4a86d258a31-manifests\") pod \"be484bf35d3aabad50f6e4a86d258a31\" (UID: \"be484bf35d3aabad50f6e4a86d258a31\") " Apr 02 13:46:32 crc kubenswrapper[4732]: I0402 13:46:32.346471 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/be484bf35d3aabad50f6e4a86d258a31-var-lock" (OuterVolumeSpecName: "var-lock") pod "be484bf35d3aabad50f6e4a86d258a31" (UID: "be484bf35d3aabad50f6e4a86d258a31"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 13:46:32 crc kubenswrapper[4732]: I0402 13:46:32.346485 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/be484bf35d3aabad50f6e4a86d258a31-resource-dir\") pod \"be484bf35d3aabad50f6e4a86d258a31\" (UID: \"be484bf35d3aabad50f6e4a86d258a31\") " Apr 02 13:46:32 crc kubenswrapper[4732]: I0402 13:46:32.346525 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/be484bf35d3aabad50f6e4a86d258a31-manifests" (OuterVolumeSpecName: "manifests") pod "be484bf35d3aabad50f6e4a86d258a31" (UID: "be484bf35d3aabad50f6e4a86d258a31"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 13:46:32 crc kubenswrapper[4732]: I0402 13:46:32.346528 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/be484bf35d3aabad50f6e4a86d258a31-var-log\") pod \"be484bf35d3aabad50f6e4a86d258a31\" (UID: \"be484bf35d3aabad50f6e4a86d258a31\") " Apr 02 13:46:32 crc kubenswrapper[4732]: I0402 13:46:32.346558 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/be484bf35d3aabad50f6e4a86d258a31-var-log" (OuterVolumeSpecName: "var-log") pod "be484bf35d3aabad50f6e4a86d258a31" (UID: "be484bf35d3aabad50f6e4a86d258a31"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 13:46:32 crc kubenswrapper[4732]: I0402 13:46:32.346594 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/be484bf35d3aabad50f6e4a86d258a31-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "be484bf35d3aabad50f6e4a86d258a31" (UID: "be484bf35d3aabad50f6e4a86d258a31"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 13:46:32 crc kubenswrapper[4732]: I0402 13:46:32.346665 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/be484bf35d3aabad50f6e4a86d258a31-pod-resource-dir\") pod \"be484bf35d3aabad50f6e4a86d258a31\" (UID: \"be484bf35d3aabad50f6e4a86d258a31\") " Apr 02 13:46:32 crc kubenswrapper[4732]: I0402 13:46:32.347725 4732 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/be484bf35d3aabad50f6e4a86d258a31-var-lock\") on node \"crc\" DevicePath \"\"" Apr 02 13:46:32 crc kubenswrapper[4732]: I0402 13:46:32.347746 4732 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/be484bf35d3aabad50f6e4a86d258a31-manifests\") on node \"crc\" DevicePath \"\"" Apr 02 13:46:32 crc kubenswrapper[4732]: I0402 13:46:32.347755 4732 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/be484bf35d3aabad50f6e4a86d258a31-resource-dir\") on node \"crc\" DevicePath \"\"" Apr 02 13:46:32 crc kubenswrapper[4732]: I0402 13:46:32.347766 4732 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/be484bf35d3aabad50f6e4a86d258a31-var-log\") on node \"crc\" DevicePath \"\"" Apr 02 13:46:32 crc kubenswrapper[4732]: I0402 13:46:32.355162 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/be484bf35d3aabad50f6e4a86d258a31-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "be484bf35d3aabad50f6e4a86d258a31" (UID: "be484bf35d3aabad50f6e4a86d258a31"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 13:46:32 crc kubenswrapper[4732]: I0402 13:46:32.448995 4732 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/be484bf35d3aabad50f6e4a86d258a31-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Apr 02 13:46:32 crc kubenswrapper[4732]: I0402 13:46:32.571649 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Apr 02 13:46:32 crc kubenswrapper[4732]: I0402 13:46:32.672050 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Apr 02 13:46:32 crc kubenswrapper[4732]: I0402 13:46:32.689958 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1827909b-49ea-4ba8-9995-f525d1d82f45" path="/var/lib/kubelet/pods/1827909b-49ea-4ba8-9995-f525d1d82f45/volumes" Apr 02 13:46:32 crc kubenswrapper[4732]: I0402 13:46:32.691139 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33708fee-32a5-4418-81d0-226813150db7" path="/var/lib/kubelet/pods/33708fee-32a5-4418-81d0-226813150db7/volumes" Apr 02 13:46:32 crc kubenswrapper[4732]: I0402 13:46:32.691809 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51a0e365-014c-40e8-8749-7512f2c00758" path="/var/lib/kubelet/pods/51a0e365-014c-40e8-8749-7512f2c00758/volumes" Apr 02 13:46:32 crc kubenswrapper[4732]: I0402 13:46:32.693227 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99e5508c-0d75-4f87-9c07-b53509e461aa" path="/var/lib/kubelet/pods/99e5508c-0d75-4f87-9c07-b53509e461aa/volumes" Apr 02 13:46:32 crc kubenswrapper[4732]: I0402 13:46:32.693827 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be484bf35d3aabad50f6e4a86d258a31" path="/var/lib/kubelet/pods/be484bf35d3aabad50f6e4a86d258a31/volumes" Apr 02 13:46:32 crc kubenswrapper[4732]: I0402 13:46:32.694336 4732 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf030ff0-459d-4453-975f-19ba4ff9641a" path="/var/lib/kubelet/pods/cf030ff0-459d-4453-975f-19ba4ff9641a/volumes" Apr 02 13:46:32 crc kubenswrapper[4732]: I0402 13:46:32.780234 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_be484bf35d3aabad50f6e4a86d258a31/startup-monitor/0.log" Apr 02 13:46:32 crc kubenswrapper[4732]: I0402 13:46:32.780308 4732 scope.go:117] "RemoveContainer" containerID="2dc186267b608fd3df3bd16c72511cb9c3cff44015cba0742315493871361361" Apr 02 13:46:32 crc kubenswrapper[4732]: I0402 13:46:32.780373 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 02 13:46:32 crc kubenswrapper[4732]: I0402 13:46:32.785765 4732 generic.go:334] "Generic (PLEG): container finished" podID="38409e5e-4545-49da-8f6c-4bfb30582878" containerID="6679fff77ada4a54a69b7189491d8feac3c5def6519c359d285b772063d2ad8d" exitCode=0 Apr 02 13:46:32 crc kubenswrapper[4732]: I0402 13:46:32.785931 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" event={"ID":"38409e5e-4545-49da-8f6c-4bfb30582878","Type":"ContainerDied","Data":"6679fff77ada4a54a69b7189491d8feac3c5def6519c359d285b772063d2ad8d"} Apr 02 13:46:32 crc kubenswrapper[4732]: I0402 13:46:32.786057 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" event={"ID":"38409e5e-4545-49da-8f6c-4bfb30582878","Type":"ContainerStarted","Data":"af70e3cad0ce1292e5dbf16bae2fa3fb252384621cc6f2a55ab8798328ebbc50"} Apr 02 13:46:32 crc kubenswrapper[4732]: I0402 13:46:32.803679 4732 scope.go:117] "RemoveContainer" containerID="09dd989931f88ecf396194a22057cfe8c07809db42ff1c2b861bf3aa5db1673f" Apr 02 13:46:32 crc kubenswrapper[4732]: I0402 13:46:32.813044 4732 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Apr 02 13:46:33 crc kubenswrapper[4732]: I0402 13:46:33.011053 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Apr 02 13:46:33 crc kubenswrapper[4732]: I0402 13:46:33.292287 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Apr 02 13:46:34 crc kubenswrapper[4732]: I0402 13:46:34.180361 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Apr 02 13:46:34 crc kubenswrapper[4732]: I0402 13:46:34.546233 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29585626-cqxvf"] Apr 02 13:46:34 crc kubenswrapper[4732]: E0402 13:46:34.546466 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99e5508c-0d75-4f87-9c07-b53509e461aa" containerName="marketplace-operator" Apr 02 13:46:34 crc kubenswrapper[4732]: I0402 13:46:34.546490 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="99e5508c-0d75-4f87-9c07-b53509e461aa" containerName="marketplace-operator" Apr 02 13:46:34 crc kubenswrapper[4732]: E0402 13:46:34.546507 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1827909b-49ea-4ba8-9995-f525d1d82f45" containerName="registry-server" Apr 02 13:46:34 crc kubenswrapper[4732]: I0402 13:46:34.546515 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="1827909b-49ea-4ba8-9995-f525d1d82f45" containerName="registry-server" Apr 02 13:46:34 crc kubenswrapper[4732]: E0402 13:46:34.546528 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51a0e365-014c-40e8-8749-7512f2c00758" containerName="registry-server" Apr 02 13:46:34 crc kubenswrapper[4732]: I0402 13:46:34.546536 4732 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="51a0e365-014c-40e8-8749-7512f2c00758" containerName="registry-server" Apr 02 13:46:34 crc kubenswrapper[4732]: E0402 13:46:34.546548 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1827909b-49ea-4ba8-9995-f525d1d82f45" containerName="extract-utilities" Apr 02 13:46:34 crc kubenswrapper[4732]: I0402 13:46:34.546556 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="1827909b-49ea-4ba8-9995-f525d1d82f45" containerName="extract-utilities" Apr 02 13:46:34 crc kubenswrapper[4732]: E0402 13:46:34.546566 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e4f3964-0282-48aa-a0a5-cad803e3812b" containerName="installer" Apr 02 13:46:34 crc kubenswrapper[4732]: I0402 13:46:34.546574 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e4f3964-0282-48aa-a0a5-cad803e3812b" containerName="installer" Apr 02 13:46:34 crc kubenswrapper[4732]: E0402 13:46:34.546588 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e580e30-624e-4cbc-9095-fa5659ce546e" containerName="installer" Apr 02 13:46:34 crc kubenswrapper[4732]: I0402 13:46:34.546596 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e580e30-624e-4cbc-9095-fa5659ce546e" containerName="installer" Apr 02 13:46:34 crc kubenswrapper[4732]: E0402 13:46:34.546652 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf030ff0-459d-4453-975f-19ba4ff9641a" containerName="extract-content" Apr 02 13:46:34 crc kubenswrapper[4732]: I0402 13:46:34.546662 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf030ff0-459d-4453-975f-19ba4ff9641a" containerName="extract-content" Apr 02 13:46:34 crc kubenswrapper[4732]: E0402 13:46:34.546674 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51a0e365-014c-40e8-8749-7512f2c00758" containerName="extract-content" Apr 02 13:46:34 crc kubenswrapper[4732]: I0402 13:46:34.546681 4732 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="51a0e365-014c-40e8-8749-7512f2c00758" containerName="extract-content" Apr 02 13:46:34 crc kubenswrapper[4732]: E0402 13:46:34.546691 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51a0e365-014c-40e8-8749-7512f2c00758" containerName="extract-utilities" Apr 02 13:46:34 crc kubenswrapper[4732]: I0402 13:46:34.546698 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="51a0e365-014c-40e8-8749-7512f2c00758" containerName="extract-utilities" Apr 02 13:46:34 crc kubenswrapper[4732]: E0402 13:46:34.546711 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf030ff0-459d-4453-975f-19ba4ff9641a" containerName="extract-utilities" Apr 02 13:46:34 crc kubenswrapper[4732]: I0402 13:46:34.546718 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf030ff0-459d-4453-975f-19ba4ff9641a" containerName="extract-utilities" Apr 02 13:46:34 crc kubenswrapper[4732]: E0402 13:46:34.546727 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33708fee-32a5-4418-81d0-226813150db7" containerName="extract-content" Apr 02 13:46:34 crc kubenswrapper[4732]: I0402 13:46:34.546734 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="33708fee-32a5-4418-81d0-226813150db7" containerName="extract-content" Apr 02 13:46:34 crc kubenswrapper[4732]: E0402 13:46:34.546743 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf030ff0-459d-4453-975f-19ba4ff9641a" containerName="registry-server" Apr 02 13:46:34 crc kubenswrapper[4732]: I0402 13:46:34.546750 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf030ff0-459d-4453-975f-19ba4ff9641a" containerName="registry-server" Apr 02 13:46:34 crc kubenswrapper[4732]: E0402 13:46:34.546762 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be484bf35d3aabad50f6e4a86d258a31" containerName="startup-monitor" Apr 02 13:46:34 crc kubenswrapper[4732]: I0402 13:46:34.546769 4732 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="be484bf35d3aabad50f6e4a86d258a31" containerName="startup-monitor" Apr 02 13:46:34 crc kubenswrapper[4732]: E0402 13:46:34.546778 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33708fee-32a5-4418-81d0-226813150db7" containerName="extract-utilities" Apr 02 13:46:34 crc kubenswrapper[4732]: I0402 13:46:34.546786 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="33708fee-32a5-4418-81d0-226813150db7" containerName="extract-utilities" Apr 02 13:46:34 crc kubenswrapper[4732]: E0402 13:46:34.546801 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33708fee-32a5-4418-81d0-226813150db7" containerName="registry-server" Apr 02 13:46:34 crc kubenswrapper[4732]: I0402 13:46:34.546809 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="33708fee-32a5-4418-81d0-226813150db7" containerName="registry-server" Apr 02 13:46:34 crc kubenswrapper[4732]: E0402 13:46:34.546823 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1827909b-49ea-4ba8-9995-f525d1d82f45" containerName="extract-content" Apr 02 13:46:34 crc kubenswrapper[4732]: I0402 13:46:34.546830 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="1827909b-49ea-4ba8-9995-f525d1d82f45" containerName="extract-content" Apr 02 13:46:34 crc kubenswrapper[4732]: I0402 13:46:34.546926 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="33708fee-32a5-4418-81d0-226813150db7" containerName="registry-server" Apr 02 13:46:34 crc kubenswrapper[4732]: I0402 13:46:34.546937 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="99e5508c-0d75-4f87-9c07-b53509e461aa" containerName="marketplace-operator" Apr 02 13:46:34 crc kubenswrapper[4732]: I0402 13:46:34.546944 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="1827909b-49ea-4ba8-9995-f525d1d82f45" containerName="registry-server" Apr 02 13:46:34 crc kubenswrapper[4732]: I0402 13:46:34.546954 4732 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="6e580e30-624e-4cbc-9095-fa5659ce546e" containerName="installer" Apr 02 13:46:34 crc kubenswrapper[4732]: I0402 13:46:34.546962 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="be484bf35d3aabad50f6e4a86d258a31" containerName="startup-monitor" Apr 02 13:46:34 crc kubenswrapper[4732]: I0402 13:46:34.546972 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf030ff0-459d-4453-975f-19ba4ff9641a" containerName="registry-server" Apr 02 13:46:34 crc kubenswrapper[4732]: I0402 13:46:34.546984 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="51a0e365-014c-40e8-8749-7512f2c00758" containerName="registry-server" Apr 02 13:46:34 crc kubenswrapper[4732]: I0402 13:46:34.546996 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e4f3964-0282-48aa-a0a5-cad803e3812b" containerName="installer" Apr 02 13:46:34 crc kubenswrapper[4732]: I0402 13:46:34.547470 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29585626-cqxvf" Apr 02 13:46:34 crc kubenswrapper[4732]: I0402 13:46:34.550426 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 02 13:46:34 crc kubenswrapper[4732]: I0402 13:46:34.550474 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 02 13:46:34 crc kubenswrapper[4732]: I0402 13:46:34.556205 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-69g42" Apr 02 13:46:34 crc kubenswrapper[4732]: I0402 13:46:34.557213 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585626-cqxvf"] Apr 02 13:46:34 crc kubenswrapper[4732]: I0402 13:46:34.608509 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-d9c747dbb-jqfln"] Apr 02 13:46:34 crc kubenswrapper[4732]: I0402 13:46:34.645910 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-w7mzq"] Apr 02 13:46:34 crc kubenswrapper[4732]: I0402 13:46:34.646723 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-w7mzq" Apr 02 13:46:34 crc kubenswrapper[4732]: I0402 13:46:34.649342 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Apr 02 13:46:34 crc kubenswrapper[4732]: I0402 13:46:34.649718 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Apr 02 13:46:34 crc kubenswrapper[4732]: I0402 13:46:34.650068 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Apr 02 13:46:34 crc kubenswrapper[4732]: I0402 13:46:34.650687 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Apr 02 13:46:34 crc kubenswrapper[4732]: I0402 13:46:34.662006 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Apr 02 13:46:34 crc kubenswrapper[4732]: I0402 13:46:34.662198 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-w7mzq"] Apr 02 13:46:34 crc kubenswrapper[4732]: I0402 13:46:34.682917 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c2c9f0ff-65e0-4d5f-8518-4461263be6c2-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-w7mzq\" (UID: \"c2c9f0ff-65e0-4d5f-8518-4461263be6c2\") " pod="openshift-marketplace/marketplace-operator-79b997595-w7mzq" Apr 02 13:46:34 crc kubenswrapper[4732]: I0402 13:46:34.683127 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c2c9f0ff-65e0-4d5f-8518-4461263be6c2-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-w7mzq\" (UID: 
\"c2c9f0ff-65e0-4d5f-8518-4461263be6c2\") " pod="openshift-marketplace/marketplace-operator-79b997595-w7mzq" Apr 02 13:46:34 crc kubenswrapper[4732]: I0402 13:46:34.683178 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m67gc\" (UniqueName: \"kubernetes.io/projected/068f0778-94e5-4eca-b150-9ba914b8b879-kube-api-access-m67gc\") pod \"auto-csr-approver-29585626-cqxvf\" (UID: \"068f0778-94e5-4eca-b150-9ba914b8b879\") " pod="openshift-infra/auto-csr-approver-29585626-cqxvf" Apr 02 13:46:34 crc kubenswrapper[4732]: I0402 13:46:34.683203 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2b7q5\" (UniqueName: \"kubernetes.io/projected/c2c9f0ff-65e0-4d5f-8518-4461263be6c2-kube-api-access-2b7q5\") pod \"marketplace-operator-79b997595-w7mzq\" (UID: \"c2c9f0ff-65e0-4d5f-8518-4461263be6c2\") " pod="openshift-marketplace/marketplace-operator-79b997595-w7mzq" Apr 02 13:46:34 crc kubenswrapper[4732]: I0402 13:46:34.784506 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2b7q5\" (UniqueName: \"kubernetes.io/projected/c2c9f0ff-65e0-4d5f-8518-4461263be6c2-kube-api-access-2b7q5\") pod \"marketplace-operator-79b997595-w7mzq\" (UID: \"c2c9f0ff-65e0-4d5f-8518-4461263be6c2\") " pod="openshift-marketplace/marketplace-operator-79b997595-w7mzq" Apr 02 13:46:34 crc kubenswrapper[4732]: I0402 13:46:34.784674 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c2c9f0ff-65e0-4d5f-8518-4461263be6c2-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-w7mzq\" (UID: \"c2c9f0ff-65e0-4d5f-8518-4461263be6c2\") " pod="openshift-marketplace/marketplace-operator-79b997595-w7mzq" Apr 02 13:46:34 crc kubenswrapper[4732]: I0402 13:46:34.785861 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c2c9f0ff-65e0-4d5f-8518-4461263be6c2-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-w7mzq\" (UID: \"c2c9f0ff-65e0-4d5f-8518-4461263be6c2\") " pod="openshift-marketplace/marketplace-operator-79b997595-w7mzq" Apr 02 13:46:34 crc kubenswrapper[4732]: I0402 13:46:34.786028 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c2c9f0ff-65e0-4d5f-8518-4461263be6c2-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-w7mzq\" (UID: \"c2c9f0ff-65e0-4d5f-8518-4461263be6c2\") " pod="openshift-marketplace/marketplace-operator-79b997595-w7mzq" Apr 02 13:46:34 crc kubenswrapper[4732]: I0402 13:46:34.786939 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m67gc\" (UniqueName: \"kubernetes.io/projected/068f0778-94e5-4eca-b150-9ba914b8b879-kube-api-access-m67gc\") pod \"auto-csr-approver-29585626-cqxvf\" (UID: \"068f0778-94e5-4eca-b150-9ba914b8b879\") " pod="openshift-infra/auto-csr-approver-29585626-cqxvf" Apr 02 13:46:34 crc kubenswrapper[4732]: I0402 13:46:34.795736 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c2c9f0ff-65e0-4d5f-8518-4461263be6c2-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-w7mzq\" (UID: \"c2c9f0ff-65e0-4d5f-8518-4461263be6c2\") " pod="openshift-marketplace/marketplace-operator-79b997595-w7mzq" Apr 02 13:46:34 crc kubenswrapper[4732]: I0402 13:46:34.801923 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2b7q5\" (UniqueName: \"kubernetes.io/projected/c2c9f0ff-65e0-4d5f-8518-4461263be6c2-kube-api-access-2b7q5\") pod \"marketplace-operator-79b997595-w7mzq\" (UID: \"c2c9f0ff-65e0-4d5f-8518-4461263be6c2\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-w7mzq" Apr 02 13:46:34 crc kubenswrapper[4732]: I0402 13:46:34.813924 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m67gc\" (UniqueName: \"kubernetes.io/projected/068f0778-94e5-4eca-b150-9ba914b8b879-kube-api-access-m67gc\") pod \"auto-csr-approver-29585626-cqxvf\" (UID: \"068f0778-94e5-4eca-b150-9ba914b8b879\") " pod="openshift-infra/auto-csr-approver-29585626-cqxvf" Apr 02 13:46:34 crc kubenswrapper[4732]: I0402 13:46:34.871982 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585626-cqxvf" Apr 02 13:46:34 crc kubenswrapper[4732]: I0402 13:46:34.969477 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-w7mzq" Apr 02 13:46:35 crc kubenswrapper[4732]: I0402 13:46:35.328115 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585626-cqxvf"] Apr 02 13:46:35 crc kubenswrapper[4732]: I0402 13:46:35.337296 4732 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 02 13:46:35 crc kubenswrapper[4732]: I0402 13:46:35.427495 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-w7mzq"] Apr 02 13:46:35 crc kubenswrapper[4732]: W0402 13:46:35.433528 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2c9f0ff_65e0_4d5f_8518_4461263be6c2.slice/crio-04e5501bb97d7991e5d9ebeca40d062823e0b4dba8707f949821c095248f7061 WatchSource:0}: Error finding container 04e5501bb97d7991e5d9ebeca40d062823e0b4dba8707f949821c095248f7061: Status 404 returned error can't find the container with id 04e5501bb97d7991e5d9ebeca40d062823e0b4dba8707f949821c095248f7061 Apr 02 13:46:35 crc kubenswrapper[4732]: I0402 13:46:35.815856 
4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-w7mzq" event={"ID":"c2c9f0ff-65e0-4d5f-8518-4461263be6c2","Type":"ContainerStarted","Data":"be6e2beea77bd8e0357d133153e457886032731228f47bf38590131353aa9925"} Apr 02 13:46:35 crc kubenswrapper[4732]: I0402 13:46:35.816186 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-w7mzq" event={"ID":"c2c9f0ff-65e0-4d5f-8518-4461263be6c2","Type":"ContainerStarted","Data":"04e5501bb97d7991e5d9ebeca40d062823e0b4dba8707f949821c095248f7061"} Apr 02 13:46:35 crc kubenswrapper[4732]: I0402 13:46:35.816219 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-w7mzq" Apr 02 13:46:35 crc kubenswrapper[4732]: I0402 13:46:35.817955 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-w7mzq" Apr 02 13:46:35 crc kubenswrapper[4732]: I0402 13:46:35.818302 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585626-cqxvf" event={"ID":"068f0778-94e5-4eca-b150-9ba914b8b879","Type":"ContainerStarted","Data":"eae477b108589f0d3514ca2c346604136ac6a929f6751e5e5090dabaf52bc1ec"} Apr 02 13:46:35 crc kubenswrapper[4732]: I0402 13:46:35.834397 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-w7mzq" podStartSLOduration=1.8343804000000001 podStartE2EDuration="1.8343804s" podCreationTimestamp="2026-04-02 13:46:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:46:35.831773799 +0000 UTC m=+552.736181362" watchObservedRunningTime="2026-04-02 13:46:35.8343804 +0000 UTC m=+552.738787953" Apr 02 13:46:36 crc kubenswrapper[4732]: I0402 13:46:36.824574 4732 
generic.go:334] "Generic (PLEG): container finished" podID="068f0778-94e5-4eca-b150-9ba914b8b879" containerID="e42ab516580b36a04af43f8a58d74b6d65395a948ab9bd93f4a6d73b00933406" exitCode=0 Apr 02 13:46:36 crc kubenswrapper[4732]: I0402 13:46:36.824791 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585626-cqxvf" event={"ID":"068f0778-94e5-4eca-b150-9ba914b8b879","Type":"ContainerDied","Data":"e42ab516580b36a04af43f8a58d74b6d65395a948ab9bd93f4a6d73b00933406"} Apr 02 13:46:38 crc kubenswrapper[4732]: I0402 13:46:38.054942 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585626-cqxvf" Apr 02 13:46:38 crc kubenswrapper[4732]: I0402 13:46:38.227367 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m67gc\" (UniqueName: \"kubernetes.io/projected/068f0778-94e5-4eca-b150-9ba914b8b879-kube-api-access-m67gc\") pod \"068f0778-94e5-4eca-b150-9ba914b8b879\" (UID: \"068f0778-94e5-4eca-b150-9ba914b8b879\") " Apr 02 13:46:38 crc kubenswrapper[4732]: I0402 13:46:38.236758 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/068f0778-94e5-4eca-b150-9ba914b8b879-kube-api-access-m67gc" (OuterVolumeSpecName: "kube-api-access-m67gc") pod "068f0778-94e5-4eca-b150-9ba914b8b879" (UID: "068f0778-94e5-4eca-b150-9ba914b8b879"). InnerVolumeSpecName "kube-api-access-m67gc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:46:38 crc kubenswrapper[4732]: I0402 13:46:38.328795 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m67gc\" (UniqueName: \"kubernetes.io/projected/068f0778-94e5-4eca-b150-9ba914b8b879-kube-api-access-m67gc\") on node \"crc\" DevicePath \"\"" Apr 02 13:46:38 crc kubenswrapper[4732]: I0402 13:46:38.836352 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585626-cqxvf" event={"ID":"068f0778-94e5-4eca-b150-9ba914b8b879","Type":"ContainerDied","Data":"eae477b108589f0d3514ca2c346604136ac6a929f6751e5e5090dabaf52bc1ec"} Apr 02 13:46:38 crc kubenswrapper[4732]: I0402 13:46:38.836396 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eae477b108589f0d3514ca2c346604136ac6a929f6751e5e5090dabaf52bc1ec" Apr 02 13:46:38 crc kubenswrapper[4732]: I0402 13:46:38.836416 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585626-cqxvf" Apr 02 13:46:39 crc kubenswrapper[4732]: I0402 13:46:39.111454 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29585620-t897v"] Apr 02 13:46:39 crc kubenswrapper[4732]: I0402 13:46:39.111510 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29585620-t897v"] Apr 02 13:46:40 crc kubenswrapper[4732]: I0402 13:46:40.687497 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a82c61a-7d7e-4401-963a-1f1fe908002c" path="/var/lib/kubelet/pods/9a82c61a-7d7e-4401-963a-1f1fe908002c/volumes" Apr 02 13:46:59 crc kubenswrapper[4732]: I0402 13:46:59.646315 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-d9c747dbb-jqfln" podUID="57f69fbf-b90f-411e-8e0c-70a25bb01566" containerName="registry" 
containerID="cri-o://0ed979adaba6f9d50a6af5062245f390e4ff9c86cf2c4cb22731d7cc664ad8ff" gracePeriod=30 Apr 02 13:46:59 crc kubenswrapper[4732]: I0402 13:46:59.970650 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-d9c747dbb-jqfln" event={"ID":"57f69fbf-b90f-411e-8e0c-70a25bb01566","Type":"ContainerDied","Data":"0ed979adaba6f9d50a6af5062245f390e4ff9c86cf2c4cb22731d7cc664ad8ff"} Apr 02 13:46:59 crc kubenswrapper[4732]: I0402 13:46:59.970673 4732 generic.go:334] "Generic (PLEG): container finished" podID="57f69fbf-b90f-411e-8e0c-70a25bb01566" containerID="0ed979adaba6f9d50a6af5062245f390e4ff9c86cf2c4cb22731d7cc664ad8ff" exitCode=0 Apr 02 13:46:59 crc kubenswrapper[4732]: I0402 13:46:59.970997 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-d9c747dbb-jqfln" event={"ID":"57f69fbf-b90f-411e-8e0c-70a25bb01566","Type":"ContainerDied","Data":"bef1f039b81372c2a0160ad8768c3351ed910c6e18562099f965ed401a3b2142"} Apr 02 13:46:59 crc kubenswrapper[4732]: I0402 13:46:59.971008 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bef1f039b81372c2a0160ad8768c3351ed910c6e18562099f965ed401a3b2142" Apr 02 13:46:59 crc kubenswrapper[4732]: I0402 13:46:59.972168 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-d9c747dbb-jqfln" Apr 02 13:47:00 crc kubenswrapper[4732]: I0402 13:47:00.107760 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/57f69fbf-b90f-411e-8e0c-70a25bb01566-bound-sa-token\") pod \"57f69fbf-b90f-411e-8e0c-70a25bb01566\" (UID: \"57f69fbf-b90f-411e-8e0c-70a25bb01566\") " Apr 02 13:47:00 crc kubenswrapper[4732]: I0402 13:47:00.108095 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/57f69fbf-b90f-411e-8e0c-70a25bb01566-installation-pull-secrets\") pod \"57f69fbf-b90f-411e-8e0c-70a25bb01566\" (UID: \"57f69fbf-b90f-411e-8e0c-70a25bb01566\") " Apr 02 13:47:00 crc kubenswrapper[4732]: I0402 13:47:00.108130 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/57f69fbf-b90f-411e-8e0c-70a25bb01566-registry-certificates\") pod \"57f69fbf-b90f-411e-8e0c-70a25bb01566\" (UID: \"57f69fbf-b90f-411e-8e0c-70a25bb01566\") " Apr 02 13:47:00 crc kubenswrapper[4732]: I0402 13:47:00.108163 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-564q9\" (UniqueName: \"kubernetes.io/projected/57f69fbf-b90f-411e-8e0c-70a25bb01566-kube-api-access-564q9\") pod \"57f69fbf-b90f-411e-8e0c-70a25bb01566\" (UID: \"57f69fbf-b90f-411e-8e0c-70a25bb01566\") " Apr 02 13:47:00 crc kubenswrapper[4732]: I0402 13:47:00.108181 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/57f69fbf-b90f-411e-8e0c-70a25bb01566-registry-tls\") pod \"57f69fbf-b90f-411e-8e0c-70a25bb01566\" (UID: \"57f69fbf-b90f-411e-8e0c-70a25bb01566\") " Apr 02 13:47:00 crc kubenswrapper[4732]: I0402 13:47:00.108288 4732 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"57f69fbf-b90f-411e-8e0c-70a25bb01566\" (UID: \"57f69fbf-b90f-411e-8e0c-70a25bb01566\") " Apr 02 13:47:00 crc kubenswrapper[4732]: I0402 13:47:00.108317 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/57f69fbf-b90f-411e-8e0c-70a25bb01566-ca-trust-extracted\") pod \"57f69fbf-b90f-411e-8e0c-70a25bb01566\" (UID: \"57f69fbf-b90f-411e-8e0c-70a25bb01566\") " Apr 02 13:47:00 crc kubenswrapper[4732]: I0402 13:47:00.108352 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/57f69fbf-b90f-411e-8e0c-70a25bb01566-trusted-ca\") pod \"57f69fbf-b90f-411e-8e0c-70a25bb01566\" (UID: \"57f69fbf-b90f-411e-8e0c-70a25bb01566\") " Apr 02 13:47:00 crc kubenswrapper[4732]: I0402 13:47:00.109200 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57f69fbf-b90f-411e-8e0c-70a25bb01566-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "57f69fbf-b90f-411e-8e0c-70a25bb01566" (UID: "57f69fbf-b90f-411e-8e0c-70a25bb01566"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:47:00 crc kubenswrapper[4732]: I0402 13:47:00.113434 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57f69fbf-b90f-411e-8e0c-70a25bb01566-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "57f69fbf-b90f-411e-8e0c-70a25bb01566" (UID: "57f69fbf-b90f-411e-8e0c-70a25bb01566"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:47:00 crc kubenswrapper[4732]: I0402 13:47:00.114608 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57f69fbf-b90f-411e-8e0c-70a25bb01566-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "57f69fbf-b90f-411e-8e0c-70a25bb01566" (UID: "57f69fbf-b90f-411e-8e0c-70a25bb01566"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:47:00 crc kubenswrapper[4732]: I0402 13:47:00.115410 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57f69fbf-b90f-411e-8e0c-70a25bb01566-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "57f69fbf-b90f-411e-8e0c-70a25bb01566" (UID: "57f69fbf-b90f-411e-8e0c-70a25bb01566"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:47:00 crc kubenswrapper[4732]: I0402 13:47:00.117109 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57f69fbf-b90f-411e-8e0c-70a25bb01566-kube-api-access-564q9" (OuterVolumeSpecName: "kube-api-access-564q9") pod "57f69fbf-b90f-411e-8e0c-70a25bb01566" (UID: "57f69fbf-b90f-411e-8e0c-70a25bb01566"). InnerVolumeSpecName "kube-api-access-564q9". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:47:00 crc kubenswrapper[4732]: I0402 13:47:00.125234 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57f69fbf-b90f-411e-8e0c-70a25bb01566-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "57f69fbf-b90f-411e-8e0c-70a25bb01566" (UID: "57f69fbf-b90f-411e-8e0c-70a25bb01566"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 13:47:00 crc kubenswrapper[4732]: I0402 13:47:00.131252 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57f69fbf-b90f-411e-8e0c-70a25bb01566-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "57f69fbf-b90f-411e-8e0c-70a25bb01566" (UID: "57f69fbf-b90f-411e-8e0c-70a25bb01566"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 13:47:00 crc kubenswrapper[4732]: I0402 13:47:00.135796 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "57f69fbf-b90f-411e-8e0c-70a25bb01566" (UID: "57f69fbf-b90f-411e-8e0c-70a25bb01566"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Apr 02 13:47:00 crc kubenswrapper[4732]: I0402 13:47:00.209448 4732 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/57f69fbf-b90f-411e-8e0c-70a25bb01566-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Apr 02 13:47:00 crc kubenswrapper[4732]: I0402 13:47:00.209487 4732 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/57f69fbf-b90f-411e-8e0c-70a25bb01566-registry-certificates\") on node \"crc\" DevicePath \"\"" Apr 02 13:47:00 crc kubenswrapper[4732]: I0402 13:47:00.209498 4732 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/57f69fbf-b90f-411e-8e0c-70a25bb01566-registry-tls\") on node \"crc\" DevicePath \"\"" Apr 02 13:47:00 crc kubenswrapper[4732]: I0402 13:47:00.209520 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-564q9\" (UniqueName: 
\"kubernetes.io/projected/57f69fbf-b90f-411e-8e0c-70a25bb01566-kube-api-access-564q9\") on node \"crc\" DevicePath \"\"" Apr 02 13:47:00 crc kubenswrapper[4732]: I0402 13:47:00.209533 4732 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/57f69fbf-b90f-411e-8e0c-70a25bb01566-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Apr 02 13:47:00 crc kubenswrapper[4732]: I0402 13:47:00.209543 4732 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/57f69fbf-b90f-411e-8e0c-70a25bb01566-trusted-ca\") on node \"crc\" DevicePath \"\"" Apr 02 13:47:00 crc kubenswrapper[4732]: I0402 13:47:00.209554 4732 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/57f69fbf-b90f-411e-8e0c-70a25bb01566-bound-sa-token\") on node \"crc\" DevicePath \"\"" Apr 02 13:47:00 crc kubenswrapper[4732]: I0402 13:47:00.976468 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-d9c747dbb-jqfln" Apr 02 13:47:01 crc kubenswrapper[4732]: I0402 13:47:01.631371 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-d9c747dbb-jqfln"] Apr 02 13:47:01 crc kubenswrapper[4732]: I0402 13:47:01.636591 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-d9c747dbb-jqfln"] Apr 02 13:47:02 crc kubenswrapper[4732]: I0402 13:47:02.686017 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57f69fbf-b90f-411e-8e0c-70a25bb01566" path="/var/lib/kubelet/pods/57f69fbf-b90f-411e-8e0c-70a25bb01566/volumes" Apr 02 13:47:07 crc kubenswrapper[4732]: I0402 13:47:07.360695 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Apr 02 13:47:13 crc kubenswrapper[4732]: I0402 13:47:13.687367 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Apr 02 13:47:16 crc kubenswrapper[4732]: I0402 13:47:16.942506 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Apr 02 13:47:17 crc kubenswrapper[4732]: I0402 13:47:17.240200 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Apr 02 13:47:20 crc kubenswrapper[4732]: I0402 13:47:20.469311 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2mvbv"] Apr 02 13:47:20 crc kubenswrapper[4732]: E0402 13:47:20.469561 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57f69fbf-b90f-411e-8e0c-70a25bb01566" containerName="registry" Apr 02 13:47:20 crc kubenswrapper[4732]: I0402 13:47:20.469574 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="57f69fbf-b90f-411e-8e0c-70a25bb01566" containerName="registry" Apr 02 13:47:20 
crc kubenswrapper[4732]: E0402 13:47:20.469587 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="068f0778-94e5-4eca-b150-9ba914b8b879" containerName="oc" Apr 02 13:47:20 crc kubenswrapper[4732]: I0402 13:47:20.469594 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="068f0778-94e5-4eca-b150-9ba914b8b879" containerName="oc" Apr 02 13:47:20 crc kubenswrapper[4732]: I0402 13:47:20.469733 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="068f0778-94e5-4eca-b150-9ba914b8b879" containerName="oc" Apr 02 13:47:20 crc kubenswrapper[4732]: I0402 13:47:20.469750 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="57f69fbf-b90f-411e-8e0c-70a25bb01566" containerName="registry" Apr 02 13:47:20 crc kubenswrapper[4732]: I0402 13:47:20.470583 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2mvbv" Apr 02 13:47:20 crc kubenswrapper[4732]: I0402 13:47:20.473489 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Apr 02 13:47:20 crc kubenswrapper[4732]: I0402 13:47:20.480601 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2mvbv"] Apr 02 13:47:20 crc kubenswrapper[4732]: I0402 13:47:20.492030 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Apr 02 13:47:20 crc kubenswrapper[4732]: I0402 13:47:20.567561 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w49wv\" (UniqueName: \"kubernetes.io/projected/7eb89e74-cc6f-4763-a2ed-e4a3e8c2f89b-kube-api-access-w49wv\") pod \"community-operators-2mvbv\" (UID: \"7eb89e74-cc6f-4763-a2ed-e4a3e8c2f89b\") " pod="openshift-marketplace/community-operators-2mvbv" Apr 02 13:47:20 crc kubenswrapper[4732]: I0402 13:47:20.567838 4732 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7eb89e74-cc6f-4763-a2ed-e4a3e8c2f89b-catalog-content\") pod \"community-operators-2mvbv\" (UID: \"7eb89e74-cc6f-4763-a2ed-e4a3e8c2f89b\") " pod="openshift-marketplace/community-operators-2mvbv" Apr 02 13:47:20 crc kubenswrapper[4732]: I0402 13:47:20.567967 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7eb89e74-cc6f-4763-a2ed-e4a3e8c2f89b-utilities\") pod \"community-operators-2mvbv\" (UID: \"7eb89e74-cc6f-4763-a2ed-e4a3e8c2f89b\") " pod="openshift-marketplace/community-operators-2mvbv" Apr 02 13:47:20 crc kubenswrapper[4732]: I0402 13:47:20.660259 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6265v"] Apr 02 13:47:20 crc kubenswrapper[4732]: I0402 13:47:20.661161 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6265v" Apr 02 13:47:20 crc kubenswrapper[4732]: I0402 13:47:20.662775 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Apr 02 13:47:20 crc kubenswrapper[4732]: I0402 13:47:20.669587 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7eb89e74-cc6f-4763-a2ed-e4a3e8c2f89b-catalog-content\") pod \"community-operators-2mvbv\" (UID: \"7eb89e74-cc6f-4763-a2ed-e4a3e8c2f89b\") " pod="openshift-marketplace/community-operators-2mvbv" Apr 02 13:47:20 crc kubenswrapper[4732]: I0402 13:47:20.669692 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7eb89e74-cc6f-4763-a2ed-e4a3e8c2f89b-utilities\") pod \"community-operators-2mvbv\" (UID: \"7eb89e74-cc6f-4763-a2ed-e4a3e8c2f89b\") " pod="openshift-marketplace/community-operators-2mvbv" Apr 02 13:47:20 crc kubenswrapper[4732]: I0402 13:47:20.669737 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w49wv\" (UniqueName: \"kubernetes.io/projected/7eb89e74-cc6f-4763-a2ed-e4a3e8c2f89b-kube-api-access-w49wv\") pod \"community-operators-2mvbv\" (UID: \"7eb89e74-cc6f-4763-a2ed-e4a3e8c2f89b\") " pod="openshift-marketplace/community-operators-2mvbv" Apr 02 13:47:20 crc kubenswrapper[4732]: I0402 13:47:20.670105 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7eb89e74-cc6f-4763-a2ed-e4a3e8c2f89b-catalog-content\") pod \"community-operators-2mvbv\" (UID: \"7eb89e74-cc6f-4763-a2ed-e4a3e8c2f89b\") " pod="openshift-marketplace/community-operators-2mvbv" Apr 02 13:47:20 crc kubenswrapper[4732]: I0402 13:47:20.670120 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7eb89e74-cc6f-4763-a2ed-e4a3e8c2f89b-utilities\") pod \"community-operators-2mvbv\" (UID: \"7eb89e74-cc6f-4763-a2ed-e4a3e8c2f89b\") " pod="openshift-marketplace/community-operators-2mvbv" Apr 02 13:47:20 crc kubenswrapper[4732]: I0402 13:47:20.670285 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6265v"] Apr 02 13:47:20 crc kubenswrapper[4732]: I0402 13:47:20.692570 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w49wv\" (UniqueName: \"kubernetes.io/projected/7eb89e74-cc6f-4763-a2ed-e4a3e8c2f89b-kube-api-access-w49wv\") pod \"community-operators-2mvbv\" (UID: \"7eb89e74-cc6f-4763-a2ed-e4a3e8c2f89b\") " pod="openshift-marketplace/community-operators-2mvbv" Apr 02 13:47:20 crc kubenswrapper[4732]: I0402 13:47:20.771065 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkg5t\" (UniqueName: \"kubernetes.io/projected/5d2e3468-f731-45d7-bc5f-5c32a739a196-kube-api-access-rkg5t\") pod \"certified-operators-6265v\" (UID: \"5d2e3468-f731-45d7-bc5f-5c32a739a196\") " pod="openshift-marketplace/certified-operators-6265v" Apr 02 13:47:20 crc kubenswrapper[4732]: I0402 13:47:20.771193 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d2e3468-f731-45d7-bc5f-5c32a739a196-utilities\") pod \"certified-operators-6265v\" (UID: \"5d2e3468-f731-45d7-bc5f-5c32a739a196\") " pod="openshift-marketplace/certified-operators-6265v" Apr 02 13:47:20 crc kubenswrapper[4732]: I0402 13:47:20.771245 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d2e3468-f731-45d7-bc5f-5c32a739a196-catalog-content\") pod \"certified-operators-6265v\" (UID: 
\"5d2e3468-f731-45d7-bc5f-5c32a739a196\") " pod="openshift-marketplace/certified-operators-6265v" Apr 02 13:47:20 crc kubenswrapper[4732]: I0402 13:47:20.788846 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2mvbv" Apr 02 13:47:20 crc kubenswrapper[4732]: I0402 13:47:20.872702 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d2e3468-f731-45d7-bc5f-5c32a739a196-utilities\") pod \"certified-operators-6265v\" (UID: \"5d2e3468-f731-45d7-bc5f-5c32a739a196\") " pod="openshift-marketplace/certified-operators-6265v" Apr 02 13:47:20 crc kubenswrapper[4732]: I0402 13:47:20.872761 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d2e3468-f731-45d7-bc5f-5c32a739a196-catalog-content\") pod \"certified-operators-6265v\" (UID: \"5d2e3468-f731-45d7-bc5f-5c32a739a196\") " pod="openshift-marketplace/certified-operators-6265v" Apr 02 13:47:20 crc kubenswrapper[4732]: I0402 13:47:20.872819 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkg5t\" (UniqueName: \"kubernetes.io/projected/5d2e3468-f731-45d7-bc5f-5c32a739a196-kube-api-access-rkg5t\") pod \"certified-operators-6265v\" (UID: \"5d2e3468-f731-45d7-bc5f-5c32a739a196\") " pod="openshift-marketplace/certified-operators-6265v" Apr 02 13:47:20 crc kubenswrapper[4732]: I0402 13:47:20.873863 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d2e3468-f731-45d7-bc5f-5c32a739a196-utilities\") pod \"certified-operators-6265v\" (UID: \"5d2e3468-f731-45d7-bc5f-5c32a739a196\") " pod="openshift-marketplace/certified-operators-6265v" Apr 02 13:47:20 crc kubenswrapper[4732]: I0402 13:47:20.874070 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d2e3468-f731-45d7-bc5f-5c32a739a196-catalog-content\") pod \"certified-operators-6265v\" (UID: \"5d2e3468-f731-45d7-bc5f-5c32a739a196\") " pod="openshift-marketplace/certified-operators-6265v" Apr 02 13:47:20 crc kubenswrapper[4732]: I0402 13:47:20.896002 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkg5t\" (UniqueName: \"kubernetes.io/projected/5d2e3468-f731-45d7-bc5f-5c32a739a196-kube-api-access-rkg5t\") pod \"certified-operators-6265v\" (UID: \"5d2e3468-f731-45d7-bc5f-5c32a739a196\") " pod="openshift-marketplace/certified-operators-6265v" Apr 02 13:47:20 crc kubenswrapper[4732]: I0402 13:47:20.985153 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6265v" Apr 02 13:47:20 crc kubenswrapper[4732]: I0402 13:47:20.997831 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2mvbv"] Apr 02 13:47:21 crc kubenswrapper[4732]: W0402 13:47:21.013484 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7eb89e74_cc6f_4763_a2ed_e4a3e8c2f89b.slice/crio-fb5d2fc8d23b9c040dd2f129213f604540eb2aa827f8164907d7dfd1ca589e16 WatchSource:0}: Error finding container fb5d2fc8d23b9c040dd2f129213f604540eb2aa827f8164907d7dfd1ca589e16: Status 404 returned error can't find the container with id fb5d2fc8d23b9c040dd2f129213f604540eb2aa827f8164907d7dfd1ca589e16 Apr 02 13:47:21 crc kubenswrapper[4732]: I0402 13:47:21.078466 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2mvbv" event={"ID":"7eb89e74-cc6f-4763-a2ed-e4a3e8c2f89b","Type":"ContainerStarted","Data":"fb5d2fc8d23b9c040dd2f129213f604540eb2aa827f8164907d7dfd1ca589e16"} Apr 02 13:47:21 crc kubenswrapper[4732]: I0402 13:47:21.386847 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/certified-operators-6265v"] Apr 02 13:47:21 crc kubenswrapper[4732]: W0402 13:47:21.397846 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d2e3468_f731_45d7_bc5f_5c32a739a196.slice/crio-7a8f3431e8ad34c880c0b380828c8d6e786d8787f01d3a2409977a1522973258 WatchSource:0}: Error finding container 7a8f3431e8ad34c880c0b380828c8d6e786d8787f01d3a2409977a1522973258: Status 404 returned error can't find the container with id 7a8f3431e8ad34c880c0b380828c8d6e786d8787f01d3a2409977a1522973258 Apr 02 13:47:22 crc kubenswrapper[4732]: I0402 13:47:22.103475 4732 generic.go:334] "Generic (PLEG): container finished" podID="5d2e3468-f731-45d7-bc5f-5c32a739a196" containerID="5d81d40165dd6d52fa014e12c3fa4d5b642833ff901ffc8bc5886add4cadfc47" exitCode=0 Apr 02 13:47:22 crc kubenswrapper[4732]: I0402 13:47:22.103543 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6265v" event={"ID":"5d2e3468-f731-45d7-bc5f-5c32a739a196","Type":"ContainerDied","Data":"5d81d40165dd6d52fa014e12c3fa4d5b642833ff901ffc8bc5886add4cadfc47"} Apr 02 13:47:22 crc kubenswrapper[4732]: I0402 13:47:22.103598 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6265v" event={"ID":"5d2e3468-f731-45d7-bc5f-5c32a739a196","Type":"ContainerStarted","Data":"7a8f3431e8ad34c880c0b380828c8d6e786d8787f01d3a2409977a1522973258"} Apr 02 13:47:22 crc kubenswrapper[4732]: I0402 13:47:22.105272 4732 generic.go:334] "Generic (PLEG): container finished" podID="7eb89e74-cc6f-4763-a2ed-e4a3e8c2f89b" containerID="b2b6bf3fbd04c568377b8331c590f7a81b61f4b99dea103fd20e3a0258a95e0d" exitCode=0 Apr 02 13:47:22 crc kubenswrapper[4732]: I0402 13:47:22.105313 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2mvbv" 
event={"ID":"7eb89e74-cc6f-4763-a2ed-e4a3e8c2f89b","Type":"ContainerDied","Data":"b2b6bf3fbd04c568377b8331c590f7a81b61f4b99dea103fd20e3a0258a95e0d"} Apr 02 13:47:22 crc kubenswrapper[4732]: I0402 13:47:22.860322 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8jlqh"] Apr 02 13:47:22 crc kubenswrapper[4732]: I0402 13:47:22.862123 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8jlqh" Apr 02 13:47:22 crc kubenswrapper[4732]: I0402 13:47:22.865996 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Apr 02 13:47:22 crc kubenswrapper[4732]: I0402 13:47:22.869297 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8jlqh"] Apr 02 13:47:23 crc kubenswrapper[4732]: I0402 13:47:23.010310 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76b4f705-b92d-4b11-8ddf-a4252a98c37d-catalog-content\") pod \"redhat-marketplace-8jlqh\" (UID: \"76b4f705-b92d-4b11-8ddf-a4252a98c37d\") " pod="openshift-marketplace/redhat-marketplace-8jlqh" Apr 02 13:47:23 crc kubenswrapper[4732]: I0402 13:47:23.010380 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h75c2\" (UniqueName: \"kubernetes.io/projected/76b4f705-b92d-4b11-8ddf-a4252a98c37d-kube-api-access-h75c2\") pod \"redhat-marketplace-8jlqh\" (UID: \"76b4f705-b92d-4b11-8ddf-a4252a98c37d\") " pod="openshift-marketplace/redhat-marketplace-8jlqh" Apr 02 13:47:23 crc kubenswrapper[4732]: I0402 13:47:23.010412 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76b4f705-b92d-4b11-8ddf-a4252a98c37d-utilities\") pod 
\"redhat-marketplace-8jlqh\" (UID: \"76b4f705-b92d-4b11-8ddf-a4252a98c37d\") " pod="openshift-marketplace/redhat-marketplace-8jlqh" Apr 02 13:47:23 crc kubenswrapper[4732]: I0402 13:47:23.111277 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76b4f705-b92d-4b11-8ddf-a4252a98c37d-catalog-content\") pod \"redhat-marketplace-8jlqh\" (UID: \"76b4f705-b92d-4b11-8ddf-a4252a98c37d\") " pod="openshift-marketplace/redhat-marketplace-8jlqh" Apr 02 13:47:23 crc kubenswrapper[4732]: I0402 13:47:23.111342 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h75c2\" (UniqueName: \"kubernetes.io/projected/76b4f705-b92d-4b11-8ddf-a4252a98c37d-kube-api-access-h75c2\") pod \"redhat-marketplace-8jlqh\" (UID: \"76b4f705-b92d-4b11-8ddf-a4252a98c37d\") " pod="openshift-marketplace/redhat-marketplace-8jlqh" Apr 02 13:47:23 crc kubenswrapper[4732]: I0402 13:47:23.111402 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76b4f705-b92d-4b11-8ddf-a4252a98c37d-utilities\") pod \"redhat-marketplace-8jlqh\" (UID: \"76b4f705-b92d-4b11-8ddf-a4252a98c37d\") " pod="openshift-marketplace/redhat-marketplace-8jlqh" Apr 02 13:47:23 crc kubenswrapper[4732]: I0402 13:47:23.111863 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76b4f705-b92d-4b11-8ddf-a4252a98c37d-catalog-content\") pod \"redhat-marketplace-8jlqh\" (UID: \"76b4f705-b92d-4b11-8ddf-a4252a98c37d\") " pod="openshift-marketplace/redhat-marketplace-8jlqh" Apr 02 13:47:23 crc kubenswrapper[4732]: I0402 13:47:23.111872 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76b4f705-b92d-4b11-8ddf-a4252a98c37d-utilities\") pod \"redhat-marketplace-8jlqh\" (UID: 
\"76b4f705-b92d-4b11-8ddf-a4252a98c37d\") " pod="openshift-marketplace/redhat-marketplace-8jlqh" Apr 02 13:47:23 crc kubenswrapper[4732]: I0402 13:47:23.113273 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6265v" event={"ID":"5d2e3468-f731-45d7-bc5f-5c32a739a196","Type":"ContainerStarted","Data":"45ae4907542bec6ab2072f13b5b2c2bc9f170fffd5fd8cd669f557641b2c618b"} Apr 02 13:47:23 crc kubenswrapper[4732]: I0402 13:47:23.115303 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2mvbv" event={"ID":"7eb89e74-cc6f-4763-a2ed-e4a3e8c2f89b","Type":"ContainerStarted","Data":"f409342776e75306da098222c9e5e17a25ed0e6ba28bb510c5fe6db76e57be12"} Apr 02 13:47:23 crc kubenswrapper[4732]: I0402 13:47:23.140762 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h75c2\" (UniqueName: \"kubernetes.io/projected/76b4f705-b92d-4b11-8ddf-a4252a98c37d-kube-api-access-h75c2\") pod \"redhat-marketplace-8jlqh\" (UID: \"76b4f705-b92d-4b11-8ddf-a4252a98c37d\") " pod="openshift-marketplace/redhat-marketplace-8jlqh" Apr 02 13:47:23 crc kubenswrapper[4732]: I0402 13:47:23.186093 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8jlqh" Apr 02 13:47:23 crc kubenswrapper[4732]: I0402 13:47:23.264492 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6c8xq"] Apr 02 13:47:23 crc kubenswrapper[4732]: I0402 13:47:23.265785 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6c8xq" Apr 02 13:47:23 crc kubenswrapper[4732]: I0402 13:47:23.267869 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Apr 02 13:47:23 crc kubenswrapper[4732]: I0402 13:47:23.275373 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6c8xq"] Apr 02 13:47:23 crc kubenswrapper[4732]: I0402 13:47:23.355364 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8jlqh"] Apr 02 13:47:23 crc kubenswrapper[4732]: W0402 13:47:23.361472 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76b4f705_b92d_4b11_8ddf_a4252a98c37d.slice/crio-e9eba08e6ed721735235cd57b30addcf7f0bb4b0b6dd2a6601025f06dc8a5cae WatchSource:0}: Error finding container e9eba08e6ed721735235cd57b30addcf7f0bb4b0b6dd2a6601025f06dc8a5cae: Status 404 returned error can't find the container with id e9eba08e6ed721735235cd57b30addcf7f0bb4b0b6dd2a6601025f06dc8a5cae Apr 02 13:47:23 crc kubenswrapper[4732]: I0402 13:47:23.414627 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdbj7\" (UniqueName: \"kubernetes.io/projected/a9edc76f-550a-4fb7-b8e5-24a34beb38f8-kube-api-access-zdbj7\") pod \"redhat-operators-6c8xq\" (UID: \"a9edc76f-550a-4fb7-b8e5-24a34beb38f8\") " pod="openshift-marketplace/redhat-operators-6c8xq" Apr 02 13:47:23 crc kubenswrapper[4732]: I0402 13:47:23.414732 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9edc76f-550a-4fb7-b8e5-24a34beb38f8-utilities\") pod \"redhat-operators-6c8xq\" (UID: \"a9edc76f-550a-4fb7-b8e5-24a34beb38f8\") " pod="openshift-marketplace/redhat-operators-6c8xq" Apr 02 13:47:23 crc 
kubenswrapper[4732]: I0402 13:47:23.414780 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9edc76f-550a-4fb7-b8e5-24a34beb38f8-catalog-content\") pod \"redhat-operators-6c8xq\" (UID: \"a9edc76f-550a-4fb7-b8e5-24a34beb38f8\") " pod="openshift-marketplace/redhat-operators-6c8xq" Apr 02 13:47:23 crc kubenswrapper[4732]: I0402 13:47:23.516265 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9edc76f-550a-4fb7-b8e5-24a34beb38f8-utilities\") pod \"redhat-operators-6c8xq\" (UID: \"a9edc76f-550a-4fb7-b8e5-24a34beb38f8\") " pod="openshift-marketplace/redhat-operators-6c8xq" Apr 02 13:47:23 crc kubenswrapper[4732]: I0402 13:47:23.516360 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9edc76f-550a-4fb7-b8e5-24a34beb38f8-catalog-content\") pod \"redhat-operators-6c8xq\" (UID: \"a9edc76f-550a-4fb7-b8e5-24a34beb38f8\") " pod="openshift-marketplace/redhat-operators-6c8xq" Apr 02 13:47:23 crc kubenswrapper[4732]: I0402 13:47:23.516407 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdbj7\" (UniqueName: \"kubernetes.io/projected/a9edc76f-550a-4fb7-b8e5-24a34beb38f8-kube-api-access-zdbj7\") pod \"redhat-operators-6c8xq\" (UID: \"a9edc76f-550a-4fb7-b8e5-24a34beb38f8\") " pod="openshift-marketplace/redhat-operators-6c8xq" Apr 02 13:47:23 crc kubenswrapper[4732]: I0402 13:47:23.516937 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9edc76f-550a-4fb7-b8e5-24a34beb38f8-utilities\") pod \"redhat-operators-6c8xq\" (UID: \"a9edc76f-550a-4fb7-b8e5-24a34beb38f8\") " pod="openshift-marketplace/redhat-operators-6c8xq" Apr 02 13:47:23 crc kubenswrapper[4732]: I0402 13:47:23.516996 
4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9edc76f-550a-4fb7-b8e5-24a34beb38f8-catalog-content\") pod \"redhat-operators-6c8xq\" (UID: \"a9edc76f-550a-4fb7-b8e5-24a34beb38f8\") " pod="openshift-marketplace/redhat-operators-6c8xq" Apr 02 13:47:23 crc kubenswrapper[4732]: I0402 13:47:23.533596 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdbj7\" (UniqueName: \"kubernetes.io/projected/a9edc76f-550a-4fb7-b8e5-24a34beb38f8-kube-api-access-zdbj7\") pod \"redhat-operators-6c8xq\" (UID: \"a9edc76f-550a-4fb7-b8e5-24a34beb38f8\") " pod="openshift-marketplace/redhat-operators-6c8xq" Apr 02 13:47:23 crc kubenswrapper[4732]: I0402 13:47:23.596264 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6c8xq" Apr 02 13:47:23 crc kubenswrapper[4732]: I0402 13:47:23.815178 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6c8xq"] Apr 02 13:47:24 crc kubenswrapper[4732]: I0402 13:47:24.122699 4732 generic.go:334] "Generic (PLEG): container finished" podID="5d2e3468-f731-45d7-bc5f-5c32a739a196" containerID="45ae4907542bec6ab2072f13b5b2c2bc9f170fffd5fd8cd669f557641b2c618b" exitCode=0 Apr 02 13:47:24 crc kubenswrapper[4732]: I0402 13:47:24.122774 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6265v" event={"ID":"5d2e3468-f731-45d7-bc5f-5c32a739a196","Type":"ContainerDied","Data":"45ae4907542bec6ab2072f13b5b2c2bc9f170fffd5fd8cd669f557641b2c618b"} Apr 02 13:47:24 crc kubenswrapper[4732]: I0402 13:47:24.125039 4732 generic.go:334] "Generic (PLEG): container finished" podID="76b4f705-b92d-4b11-8ddf-a4252a98c37d" containerID="a079eeccfe89020c86d52585038a86b1739a8a12e821b37d9bf9de45942ce084" exitCode=0 Apr 02 13:47:24 crc kubenswrapper[4732]: I0402 13:47:24.125302 4732 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8jlqh" event={"ID":"76b4f705-b92d-4b11-8ddf-a4252a98c37d","Type":"ContainerDied","Data":"a079eeccfe89020c86d52585038a86b1739a8a12e821b37d9bf9de45942ce084"} Apr 02 13:47:24 crc kubenswrapper[4732]: I0402 13:47:24.125350 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8jlqh" event={"ID":"76b4f705-b92d-4b11-8ddf-a4252a98c37d","Type":"ContainerStarted","Data":"e9eba08e6ed721735235cd57b30addcf7f0bb4b0b6dd2a6601025f06dc8a5cae"} Apr 02 13:47:24 crc kubenswrapper[4732]: I0402 13:47:24.128232 4732 generic.go:334] "Generic (PLEG): container finished" podID="a9edc76f-550a-4fb7-b8e5-24a34beb38f8" containerID="0c0709b8ab176553c17093675d38ceb634c67c3962dbca740cd1444380dd16a8" exitCode=0 Apr 02 13:47:24 crc kubenswrapper[4732]: I0402 13:47:24.128324 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6c8xq" event={"ID":"a9edc76f-550a-4fb7-b8e5-24a34beb38f8","Type":"ContainerDied","Data":"0c0709b8ab176553c17093675d38ceb634c67c3962dbca740cd1444380dd16a8"} Apr 02 13:47:24 crc kubenswrapper[4732]: I0402 13:47:24.128358 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6c8xq" event={"ID":"a9edc76f-550a-4fb7-b8e5-24a34beb38f8","Type":"ContainerStarted","Data":"95ce32db34c67e396cac46f125255e354efa091ef0c7ef82382af09b8513658c"} Apr 02 13:47:24 crc kubenswrapper[4732]: I0402 13:47:24.131939 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2mvbv" event={"ID":"7eb89e74-cc6f-4763-a2ed-e4a3e8c2f89b","Type":"ContainerDied","Data":"f409342776e75306da098222c9e5e17a25ed0e6ba28bb510c5fe6db76e57be12"} Apr 02 13:47:24 crc kubenswrapper[4732]: I0402 13:47:24.132411 4732 generic.go:334] "Generic (PLEG): container finished" podID="7eb89e74-cc6f-4763-a2ed-e4a3e8c2f89b" 
containerID="f409342776e75306da098222c9e5e17a25ed0e6ba28bb510c5fe6db76e57be12" exitCode=0 Apr 02 13:47:25 crc kubenswrapper[4732]: I0402 13:47:25.145547 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6c8xq" event={"ID":"a9edc76f-550a-4fb7-b8e5-24a34beb38f8","Type":"ContainerStarted","Data":"3e705a21eb2565c81efc11c83d8e0787a49f9f96bffeade869087511629c5b6b"} Apr 02 13:47:25 crc kubenswrapper[4732]: I0402 13:47:25.148991 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2mvbv" event={"ID":"7eb89e74-cc6f-4763-a2ed-e4a3e8c2f89b","Type":"ContainerStarted","Data":"c9a5b3f087591ca669549185f641ccfb9a370a90efaff7bb30d3f9216e15e0b6"} Apr 02 13:47:25 crc kubenswrapper[4732]: I0402 13:47:25.151512 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6265v" event={"ID":"5d2e3468-f731-45d7-bc5f-5c32a739a196","Type":"ContainerStarted","Data":"16f7f609d00329b7faccb7054989127e7ea9edb325a64450e3c057b01a7f0f0a"} Apr 02 13:47:25 crc kubenswrapper[4732]: I0402 13:47:25.153052 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8jlqh" event={"ID":"76b4f705-b92d-4b11-8ddf-a4252a98c37d","Type":"ContainerStarted","Data":"a6ecabb6899a5b71113961093b61de33504d61d0b912ca3fb40f59df78c19533"} Apr 02 13:47:25 crc kubenswrapper[4732]: I0402 13:47:25.186578 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Apr 02 13:47:25 crc kubenswrapper[4732]: I0402 13:47:25.213243 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2mvbv" podStartSLOduration=2.836726231 podStartE2EDuration="5.213228165s" podCreationTimestamp="2026-04-02 13:47:20 +0000 UTC" firstStartedPulling="2026-04-02 13:47:22.108009026 +0000 UTC m=+599.012416579" 
lastFinishedPulling="2026-04-02 13:47:24.48451096 +0000 UTC m=+601.388918513" observedRunningTime="2026-04-02 13:47:25.209408761 +0000 UTC m=+602.113816334" watchObservedRunningTime="2026-04-02 13:47:25.213228165 +0000 UTC m=+602.117635718" Apr 02 13:47:25 crc kubenswrapper[4732]: I0402 13:47:25.222989 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6265v" podStartSLOduration=2.633115004 podStartE2EDuration="5.222974962s" podCreationTimestamp="2026-04-02 13:47:20 +0000 UTC" firstStartedPulling="2026-04-02 13:47:22.105851636 +0000 UTC m=+599.010259189" lastFinishedPulling="2026-04-02 13:47:24.695711594 +0000 UTC m=+601.600119147" observedRunningTime="2026-04-02 13:47:25.221683817 +0000 UTC m=+602.126091410" watchObservedRunningTime="2026-04-02 13:47:25.222974962 +0000 UTC m=+602.127382515" Apr 02 13:47:26 crc kubenswrapper[4732]: I0402 13:47:26.160587 4732 generic.go:334] "Generic (PLEG): container finished" podID="a9edc76f-550a-4fb7-b8e5-24a34beb38f8" containerID="3e705a21eb2565c81efc11c83d8e0787a49f9f96bffeade869087511629c5b6b" exitCode=0 Apr 02 13:47:26 crc kubenswrapper[4732]: I0402 13:47:26.160679 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6c8xq" event={"ID":"a9edc76f-550a-4fb7-b8e5-24a34beb38f8","Type":"ContainerDied","Data":"3e705a21eb2565c81efc11c83d8e0787a49f9f96bffeade869087511629c5b6b"} Apr 02 13:47:26 crc kubenswrapper[4732]: I0402 13:47:26.167603 4732 generic.go:334] "Generic (PLEG): container finished" podID="76b4f705-b92d-4b11-8ddf-a4252a98c37d" containerID="a6ecabb6899a5b71113961093b61de33504d61d0b912ca3fb40f59df78c19533" exitCode=0 Apr 02 13:47:26 crc kubenswrapper[4732]: I0402 13:47:26.167806 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8jlqh" 
event={"ID":"76b4f705-b92d-4b11-8ddf-a4252a98c37d","Type":"ContainerDied","Data":"a6ecabb6899a5b71113961093b61de33504d61d0b912ca3fb40f59df78c19533"} Apr 02 13:47:26 crc kubenswrapper[4732]: I0402 13:47:26.510531 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Apr 02 13:47:27 crc kubenswrapper[4732]: I0402 13:47:27.175091 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6c8xq" event={"ID":"a9edc76f-550a-4fb7-b8e5-24a34beb38f8","Type":"ContainerStarted","Data":"76848bb4992e39db8b30cb769c61916b185a374bc7e93a6edf4e62d2b5eace04"} Apr 02 13:47:27 crc kubenswrapper[4732]: I0402 13:47:27.177158 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8jlqh" event={"ID":"76b4f705-b92d-4b11-8ddf-a4252a98c37d","Type":"ContainerStarted","Data":"6b2ccb284372fc0380355e316f789d0b7189fe8a7afabd8acbc63bd4ac3e883b"} Apr 02 13:47:27 crc kubenswrapper[4732]: I0402 13:47:27.190829 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6c8xq" podStartSLOduration=1.747071561 podStartE2EDuration="4.190813729s" podCreationTimestamp="2026-04-02 13:47:23 +0000 UTC" firstStartedPulling="2026-04-02 13:47:24.129338465 +0000 UTC m=+601.033746028" lastFinishedPulling="2026-04-02 13:47:26.573080653 +0000 UTC m=+603.477488196" observedRunningTime="2026-04-02 13:47:27.188961338 +0000 UTC m=+604.093368901" watchObservedRunningTime="2026-04-02 13:47:27.190813729 +0000 UTC m=+604.095221282" Apr 02 13:47:27 crc kubenswrapper[4732]: I0402 13:47:27.207002 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8jlqh" podStartSLOduration=2.41106804 podStartE2EDuration="5.206985932s" podCreationTimestamp="2026-04-02 13:47:22 +0000 UTC" firstStartedPulling="2026-04-02 13:47:24.126143457 +0000 UTC m=+601.030551010" 
lastFinishedPulling="2026-04-02 13:47:26.922061349 +0000 UTC m=+603.826468902" observedRunningTime="2026-04-02 13:47:27.205912622 +0000 UTC m=+604.110320185" watchObservedRunningTime="2026-04-02 13:47:27.206985932 +0000 UTC m=+604.111393495" Apr 02 13:47:30 crc kubenswrapper[4732]: I0402 13:47:30.789720 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2mvbv" Apr 02 13:47:30 crc kubenswrapper[4732]: I0402 13:47:30.789976 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2mvbv" Apr 02 13:47:30 crc kubenswrapper[4732]: I0402 13:47:30.855083 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2mvbv" Apr 02 13:47:30 crc kubenswrapper[4732]: I0402 13:47:30.985838 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6265v" Apr 02 13:47:30 crc kubenswrapper[4732]: I0402 13:47:30.985915 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6265v" Apr 02 13:47:31 crc kubenswrapper[4732]: I0402 13:47:31.031031 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6265v" Apr 02 13:47:31 crc kubenswrapper[4732]: I0402 13:47:31.232296 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2mvbv" Apr 02 13:47:31 crc kubenswrapper[4732]: I0402 13:47:31.232357 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6265v" Apr 02 13:47:31 crc kubenswrapper[4732]: I0402 13:47:31.617143 4732 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","burstable","pod57f69fbf-b90f-411e-8e0c-70a25bb01566"] err="unable to 
destroy cgroup paths for cgroup [kubepods burstable pod57f69fbf-b90f-411e-8e0c-70a25bb01566] : Timed out while waiting for systemd to remove kubepods-burstable-pod57f69fbf_b90f_411e_8e0c_70a25bb01566.slice" Apr 02 13:47:33 crc kubenswrapper[4732]: I0402 13:47:33.187246 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8jlqh" Apr 02 13:47:33 crc kubenswrapper[4732]: I0402 13:47:33.187544 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8jlqh" Apr 02 13:47:33 crc kubenswrapper[4732]: I0402 13:47:33.234744 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8jlqh" Apr 02 13:47:33 crc kubenswrapper[4732]: I0402 13:47:33.596531 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6c8xq" Apr 02 13:47:33 crc kubenswrapper[4732]: I0402 13:47:33.596946 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6c8xq" Apr 02 13:47:33 crc kubenswrapper[4732]: I0402 13:47:33.633750 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6c8xq" Apr 02 13:47:34 crc kubenswrapper[4732]: I0402 13:47:34.269773 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8jlqh" Apr 02 13:47:34 crc kubenswrapper[4732]: I0402 13:47:34.272066 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6c8xq" Apr 02 13:47:36 crc kubenswrapper[4732]: I0402 13:47:36.088284 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Apr 02 13:47:39 crc kubenswrapper[4732]: I0402 
13:47:39.054226 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-11-retry-2-crc"] Apr 02 13:47:39 crc kubenswrapper[4732]: I0402 13:47:39.055015 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-11-retry-2-crc" Apr 02 13:47:39 crc kubenswrapper[4732]: I0402 13:47:39.057156 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Apr 02 13:47:39 crc kubenswrapper[4732]: I0402 13:47:39.057197 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Apr 02 13:47:39 crc kubenswrapper[4732]: I0402 13:47:39.065715 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-11-retry-2-crc"] Apr 02 13:47:39 crc kubenswrapper[4732]: I0402 13:47:39.204521 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c8910d44-74f9-42e1-9497-8db09cc29b58-kube-api-access\") pod \"installer-11-retry-2-crc\" (UID: \"c8910d44-74f9-42e1-9497-8db09cc29b58\") " pod="openshift-kube-controller-manager/installer-11-retry-2-crc" Apr 02 13:47:39 crc kubenswrapper[4732]: I0402 13:47:39.204655 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c8910d44-74f9-42e1-9497-8db09cc29b58-kubelet-dir\") pod \"installer-11-retry-2-crc\" (UID: \"c8910d44-74f9-42e1-9497-8db09cc29b58\") " pod="openshift-kube-controller-manager/installer-11-retry-2-crc" Apr 02 13:47:39 crc kubenswrapper[4732]: I0402 13:47:39.204693 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/c8910d44-74f9-42e1-9497-8db09cc29b58-var-lock\") pod \"installer-11-retry-2-crc\" (UID: \"c8910d44-74f9-42e1-9497-8db09cc29b58\") " pod="openshift-kube-controller-manager/installer-11-retry-2-crc" Apr 02 13:47:39 crc kubenswrapper[4732]: I0402 13:47:39.306865 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c8910d44-74f9-42e1-9497-8db09cc29b58-var-lock\") pod \"installer-11-retry-2-crc\" (UID: \"c8910d44-74f9-42e1-9497-8db09cc29b58\") " pod="openshift-kube-controller-manager/installer-11-retry-2-crc" Apr 02 13:47:39 crc kubenswrapper[4732]: I0402 13:47:39.307038 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c8910d44-74f9-42e1-9497-8db09cc29b58-var-lock\") pod \"installer-11-retry-2-crc\" (UID: \"c8910d44-74f9-42e1-9497-8db09cc29b58\") " pod="openshift-kube-controller-manager/installer-11-retry-2-crc" Apr 02 13:47:39 crc kubenswrapper[4732]: I0402 13:47:39.307246 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c8910d44-74f9-42e1-9497-8db09cc29b58-kube-api-access\") pod \"installer-11-retry-2-crc\" (UID: \"c8910d44-74f9-42e1-9497-8db09cc29b58\") " pod="openshift-kube-controller-manager/installer-11-retry-2-crc" Apr 02 13:47:39 crc kubenswrapper[4732]: I0402 13:47:39.307325 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c8910d44-74f9-42e1-9497-8db09cc29b58-kubelet-dir\") pod \"installer-11-retry-2-crc\" (UID: \"c8910d44-74f9-42e1-9497-8db09cc29b58\") " pod="openshift-kube-controller-manager/installer-11-retry-2-crc" Apr 02 13:47:39 crc kubenswrapper[4732]: I0402 13:47:39.307424 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/c8910d44-74f9-42e1-9497-8db09cc29b58-kubelet-dir\") pod \"installer-11-retry-2-crc\" (UID: \"c8910d44-74f9-42e1-9497-8db09cc29b58\") " pod="openshift-kube-controller-manager/installer-11-retry-2-crc" Apr 02 13:47:39 crc kubenswrapper[4732]: I0402 13:47:39.342499 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c8910d44-74f9-42e1-9497-8db09cc29b58-kube-api-access\") pod \"installer-11-retry-2-crc\" (UID: \"c8910d44-74f9-42e1-9497-8db09cc29b58\") " pod="openshift-kube-controller-manager/installer-11-retry-2-crc" Apr 02 13:47:39 crc kubenswrapper[4732]: I0402 13:47:39.382205 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-11-retry-2-crc" Apr 02 13:47:39 crc kubenswrapper[4732]: I0402 13:47:39.600446 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-11-retry-2-crc"] Apr 02 13:47:40 crc kubenswrapper[4732]: I0402 13:47:40.252338 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-11-retry-2-crc" event={"ID":"c8910d44-74f9-42e1-9497-8db09cc29b58","Type":"ContainerStarted","Data":"2f2bf3cc272c926c88db8592fc5ac1e541eadc2c56222b00f1204f1a8d173707"} Apr 02 13:47:42 crc kubenswrapper[4732]: I0402 13:47:42.264865 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-11-retry-2-crc" event={"ID":"c8910d44-74f9-42e1-9497-8db09cc29b58","Type":"ContainerStarted","Data":"8aeb34400e98e11ec2c5267b7736bdbe7c7cc961e3916d9e555585b3145b6af4"} Apr 02 13:47:42 crc kubenswrapper[4732]: I0402 13:47:42.282469 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-11-retry-2-crc" podStartSLOduration=3.282451826 podStartE2EDuration="3.282451826s" podCreationTimestamp="2026-04-02 13:47:39 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:47:42.28004876 +0000 UTC m=+619.184456313" watchObservedRunningTime="2026-04-02 13:47:42.282451826 +0000 UTC m=+619.186859379" Apr 02 13:47:45 crc kubenswrapper[4732]: I0402 13:47:45.819496 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Apr 02 13:48:00 crc kubenswrapper[4732]: I0402 13:48:00.139420 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29585628-h96tl"] Apr 02 13:48:00 crc kubenswrapper[4732]: I0402 13:48:00.141014 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585628-h96tl" Apr 02 13:48:00 crc kubenswrapper[4732]: I0402 13:48:00.142762 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 02 13:48:00 crc kubenswrapper[4732]: I0402 13:48:00.142844 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-69g42" Apr 02 13:48:00 crc kubenswrapper[4732]: I0402 13:48:00.143260 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 02 13:48:00 crc kubenswrapper[4732]: I0402 13:48:00.145164 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585628-h96tl"] Apr 02 13:48:00 crc kubenswrapper[4732]: I0402 13:48:00.196108 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h87l\" (UniqueName: \"kubernetes.io/projected/cde86b49-a33e-43f2-95ec-2b2864d315d5-kube-api-access-2h87l\") pod \"auto-csr-approver-29585628-h96tl\" (UID: \"cde86b49-a33e-43f2-95ec-2b2864d315d5\") " pod="openshift-infra/auto-csr-approver-29585628-h96tl" Apr 02 13:48:00 crc 
kubenswrapper[4732]: I0402 13:48:00.296975 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2h87l\" (UniqueName: \"kubernetes.io/projected/cde86b49-a33e-43f2-95ec-2b2864d315d5-kube-api-access-2h87l\") pod \"auto-csr-approver-29585628-h96tl\" (UID: \"cde86b49-a33e-43f2-95ec-2b2864d315d5\") " pod="openshift-infra/auto-csr-approver-29585628-h96tl" Apr 02 13:48:00 crc kubenswrapper[4732]: I0402 13:48:00.319812 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2h87l\" (UniqueName: \"kubernetes.io/projected/cde86b49-a33e-43f2-95ec-2b2864d315d5-kube-api-access-2h87l\") pod \"auto-csr-approver-29585628-h96tl\" (UID: \"cde86b49-a33e-43f2-95ec-2b2864d315d5\") " pod="openshift-infra/auto-csr-approver-29585628-h96tl" Apr 02 13:48:00 crc kubenswrapper[4732]: I0402 13:48:00.457360 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585628-h96tl" Apr 02 13:48:00 crc kubenswrapper[4732]: I0402 13:48:00.669435 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585628-h96tl"] Apr 02 13:48:00 crc kubenswrapper[4732]: W0402 13:48:00.674824 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcde86b49_a33e_43f2_95ec_2b2864d315d5.slice/crio-7bcd91475ef1319d7b14d570b71933ba36f4714ab6f3246a0401f9110b88f736 WatchSource:0}: Error finding container 7bcd91475ef1319d7b14d570b71933ba36f4714ab6f3246a0401f9110b88f736: Status 404 returned error can't find the container with id 7bcd91475ef1319d7b14d570b71933ba36f4714ab6f3246a0401f9110b88f736 Apr 02 13:48:01 crc kubenswrapper[4732]: I0402 13:48:01.385209 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585628-h96tl" 
event={"ID":"cde86b49-a33e-43f2-95ec-2b2864d315d5","Type":"ContainerStarted","Data":"7bcd91475ef1319d7b14d570b71933ba36f4714ab6f3246a0401f9110b88f736"} Apr 02 13:48:02 crc kubenswrapper[4732]: I0402 13:48:02.393632 4732 generic.go:334] "Generic (PLEG): container finished" podID="cde86b49-a33e-43f2-95ec-2b2864d315d5" containerID="2c42808e60f473901e6fb42946d6504b0c3182d7fc3b6eeb7b9bc6e4a6e3a37e" exitCode=0 Apr 02 13:48:02 crc kubenswrapper[4732]: I0402 13:48:02.393715 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585628-h96tl" event={"ID":"cde86b49-a33e-43f2-95ec-2b2864d315d5","Type":"ContainerDied","Data":"2c42808e60f473901e6fb42946d6504b0c3182d7fc3b6eeb7b9bc6e4a6e3a37e"} Apr 02 13:48:03 crc kubenswrapper[4732]: I0402 13:48:03.651026 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585628-h96tl" Apr 02 13:48:03 crc kubenswrapper[4732]: I0402 13:48:03.840063 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2h87l\" (UniqueName: \"kubernetes.io/projected/cde86b49-a33e-43f2-95ec-2b2864d315d5-kube-api-access-2h87l\") pod \"cde86b49-a33e-43f2-95ec-2b2864d315d5\" (UID: \"cde86b49-a33e-43f2-95ec-2b2864d315d5\") " Apr 02 13:48:03 crc kubenswrapper[4732]: I0402 13:48:03.856423 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cde86b49-a33e-43f2-95ec-2b2864d315d5-kube-api-access-2h87l" (OuterVolumeSpecName: "kube-api-access-2h87l") pod "cde86b49-a33e-43f2-95ec-2b2864d315d5" (UID: "cde86b49-a33e-43f2-95ec-2b2864d315d5"). InnerVolumeSpecName "kube-api-access-2h87l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:48:03 crc kubenswrapper[4732]: I0402 13:48:03.942122 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2h87l\" (UniqueName: \"kubernetes.io/projected/cde86b49-a33e-43f2-95ec-2b2864d315d5-kube-api-access-2h87l\") on node \"crc\" DevicePath \"\"" Apr 02 13:48:04 crc kubenswrapper[4732]: I0402 13:48:04.408742 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585628-h96tl" event={"ID":"cde86b49-a33e-43f2-95ec-2b2864d315d5","Type":"ContainerDied","Data":"7bcd91475ef1319d7b14d570b71933ba36f4714ab6f3246a0401f9110b88f736"} Apr 02 13:48:04 crc kubenswrapper[4732]: I0402 13:48:04.408977 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7bcd91475ef1319d7b14d570b71933ba36f4714ab6f3246a0401f9110b88f736" Apr 02 13:48:04 crc kubenswrapper[4732]: I0402 13:48:04.409028 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585628-h96tl" Apr 02 13:48:04 crc kubenswrapper[4732]: I0402 13:48:04.701978 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29585622-lvtgw"] Apr 02 13:48:04 crc kubenswrapper[4732]: I0402 13:48:04.705288 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29585622-lvtgw"] Apr 02 13:48:06 crc kubenswrapper[4732]: I0402 13:48:06.686270 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65013539-f3b4-4513-881a-14408a922424" path="/var/lib/kubelet/pods/65013539-f3b4-4513-881a-14408a922424/volumes" Apr 02 13:48:14 crc kubenswrapper[4732]: I0402 13:48:14.116810 4732 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Apr 02 13:48:14 crc kubenswrapper[4732]: I0402 13:48:14.117817 4732 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager-recovery-controller" containerID="cri-o://5272db3013bda1d5fcb66bd2669d7714381d3bd39802499b4a27873cbccd6ff9" gracePeriod=30 Apr 02 13:48:14 crc kubenswrapper[4732]: I0402 13:48:14.117863 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://e597b0db2773d2bb9d7e673759dee264ef036a174be679e2ccfa8ba03cf5c6c9" gracePeriod=30 Apr 02 13:48:14 crc kubenswrapper[4732]: I0402 13:48:14.117936 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://a14d376cb655f7984c3fbe269826d57ea3e936ded391d3fbb9394a6b7960dad5" gracePeriod=30 Apr 02 13:48:14 crc kubenswrapper[4732]: I0402 13:48:14.117735 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager-cert-syncer" containerID="cri-o://1b40621c4cae0af5b1de321842441756156529e8dcb1aff14d7f5dc7db637127" gracePeriod=30 Apr 02 13:48:14 crc kubenswrapper[4732]: I0402 13:48:14.118969 4732 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Apr 02 13:48:14 crc kubenswrapper[4732]: E0402 13:48:14.119278 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager-recovery-controller" Apr 02 13:48:14 crc kubenswrapper[4732]: I0402 13:48:14.119305 4732 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager-recovery-controller" Apr 02 13:48:14 crc kubenswrapper[4732]: E0402 13:48:14.119324 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" Apr 02 13:48:14 crc kubenswrapper[4732]: I0402 13:48:14.119337 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" Apr 02 13:48:14 crc kubenswrapper[4732]: E0402 13:48:14.119355 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cde86b49-a33e-43f2-95ec-2b2864d315d5" containerName="oc" Apr 02 13:48:14 crc kubenswrapper[4732]: I0402 13:48:14.119368 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="cde86b49-a33e-43f2-95ec-2b2864d315d5" containerName="oc" Apr 02 13:48:14 crc kubenswrapper[4732]: E0402 13:48:14.119388 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" Apr 02 13:48:14 crc kubenswrapper[4732]: I0402 13:48:14.119400 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" Apr 02 13:48:14 crc kubenswrapper[4732]: E0402 13:48:14.119414 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" Apr 02 13:48:14 crc kubenswrapper[4732]: I0402 13:48:14.119427 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" Apr 02 13:48:14 crc kubenswrapper[4732]: E0402 13:48:14.119450 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" Apr 02 13:48:14 crc kubenswrapper[4732]: I0402 13:48:14.119462 4732 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" Apr 02 13:48:14 crc kubenswrapper[4732]: E0402 13:48:14.119479 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" Apr 02 13:48:14 crc kubenswrapper[4732]: I0402 13:48:14.119491 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" Apr 02 13:48:14 crc kubenswrapper[4732]: E0402 13:48:14.119505 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" Apr 02 13:48:14 crc kubenswrapper[4732]: I0402 13:48:14.119517 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" Apr 02 13:48:14 crc kubenswrapper[4732]: E0402 13:48:14.119538 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager-cert-syncer" Apr 02 13:48:14 crc kubenswrapper[4732]: I0402 13:48:14.119550 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager-cert-syncer" Apr 02 13:48:14 crc kubenswrapper[4732]: I0402 13:48:14.119752 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" Apr 02 13:48:14 crc kubenswrapper[4732]: I0402 13:48:14.119779 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" Apr 02 13:48:14 crc kubenswrapper[4732]: I0402 13:48:14.119806 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="cde86b49-a33e-43f2-95ec-2b2864d315d5" containerName="oc" Apr 02 13:48:14 crc kubenswrapper[4732]: 
I0402 13:48:14.119829 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager-recovery-controller" Apr 02 13:48:14 crc kubenswrapper[4732]: I0402 13:48:14.119847 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" Apr 02 13:48:14 crc kubenswrapper[4732]: I0402 13:48:14.119865 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager-cert-syncer" Apr 02 13:48:14 crc kubenswrapper[4732]: I0402 13:48:14.119888 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" Apr 02 13:48:14 crc kubenswrapper[4732]: I0402 13:48:14.119912 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" Apr 02 13:48:14 crc kubenswrapper[4732]: I0402 13:48:14.120326 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" Apr 02 13:48:14 crc kubenswrapper[4732]: I0402 13:48:14.264075 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/235e9295064844132a05dc40ef3a886a-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"235e9295064844132a05dc40ef3a886a\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 02 13:48:14 crc kubenswrapper[4732]: I0402 13:48:14.264391 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/235e9295064844132a05dc40ef3a886a-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"235e9295064844132a05dc40ef3a886a\") " 
pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Apr 02 13:48:14 crc kubenswrapper[4732]: I0402 13:48:14.291977 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log"
Apr 02 13:48:14 crc kubenswrapper[4732]: I0402 13:48:14.292955 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log"
Apr 02 13:48:14 crc kubenswrapper[4732]: I0402 13:48:14.293893 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager-cert-syncer/0.log"
Apr 02 13:48:14 crc kubenswrapper[4732]: I0402 13:48:14.293973 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Apr 02 13:48:14 crc kubenswrapper[4732]: I0402 13:48:14.297696 4732 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-crc" oldPodUID="f614b9022728cf315e60c057852e563e" podUID="235e9295064844132a05dc40ef3a886a"
Apr 02 13:48:14 crc kubenswrapper[4732]: I0402 13:48:14.366683 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/235e9295064844132a05dc40ef3a886a-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"235e9295064844132a05dc40ef3a886a\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Apr 02 13:48:14 crc kubenswrapper[4732]: I0402 13:48:14.367099 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/235e9295064844132a05dc40ef3a886a-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"235e9295064844132a05dc40ef3a886a\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Apr 02 13:48:14 crc kubenswrapper[4732]: I0402 13:48:14.366852 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/235e9295064844132a05dc40ef3a886a-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"235e9295064844132a05dc40ef3a886a\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Apr 02 13:48:14 crc kubenswrapper[4732]: I0402 13:48:14.367173 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/235e9295064844132a05dc40ef3a886a-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"235e9295064844132a05dc40ef3a886a\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Apr 02 13:48:14 crc kubenswrapper[4732]: I0402 13:48:14.468191 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"f614b9022728cf315e60c057852e563e\" (UID: \"f614b9022728cf315e60c057852e563e\") "
Apr 02 13:48:14 crc kubenswrapper[4732]: I0402 13:48:14.468322 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"f614b9022728cf315e60c057852e563e\" (UID: \"f614b9022728cf315e60c057852e563e\") "
Apr 02 13:48:14 crc kubenswrapper[4732]: I0402 13:48:14.468335 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f614b9022728cf315e60c057852e563e" (UID: "f614b9022728cf315e60c057852e563e"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Apr 02 13:48:14 crc kubenswrapper[4732]: I0402 13:48:14.468500 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f614b9022728cf315e60c057852e563e" (UID: "f614b9022728cf315e60c057852e563e"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Apr 02 13:48:14 crc kubenswrapper[4732]: I0402 13:48:14.468741 4732 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") on node \"crc\" DevicePath \"\""
Apr 02 13:48:14 crc kubenswrapper[4732]: I0402 13:48:14.468767 4732 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") on node \"crc\" DevicePath \"\""
Apr 02 13:48:14 crc kubenswrapper[4732]: I0402 13:48:14.474895 4732 generic.go:334] "Generic (PLEG): container finished" podID="c8910d44-74f9-42e1-9497-8db09cc29b58" containerID="8aeb34400e98e11ec2c5267b7736bdbe7c7cc961e3916d9e555585b3145b6af4" exitCode=0
Apr 02 13:48:14 crc kubenswrapper[4732]: I0402 13:48:14.474985 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-11-retry-2-crc" event={"ID":"c8910d44-74f9-42e1-9497-8db09cc29b58","Type":"ContainerDied","Data":"8aeb34400e98e11ec2c5267b7736bdbe7c7cc961e3916d9e555585b3145b6af4"}
Apr 02 13:48:14 crc kubenswrapper[4732]: I0402 13:48:14.478439 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log"
Apr 02 13:48:14 crc kubenswrapper[4732]: I0402 13:48:14.479301 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log"
Apr 02 13:48:14 crc kubenswrapper[4732]: I0402 13:48:14.480161 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager-cert-syncer/0.log"
Apr 02 13:48:14 crc kubenswrapper[4732]: I0402 13:48:14.480279 4732 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="a14d376cb655f7984c3fbe269826d57ea3e936ded391d3fbb9394a6b7960dad5" exitCode=0
Apr 02 13:48:14 crc kubenswrapper[4732]: I0402 13:48:14.480372 4732 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="e597b0db2773d2bb9d7e673759dee264ef036a174be679e2ccfa8ba03cf5c6c9" exitCode=0
Apr 02 13:48:14 crc kubenswrapper[4732]: I0402 13:48:14.480519 4732 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="5272db3013bda1d5fcb66bd2669d7714381d3bd39802499b4a27873cbccd6ff9" exitCode=0
Apr 02 13:48:14 crc kubenswrapper[4732]: I0402 13:48:14.480404 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Apr 02 13:48:14 crc kubenswrapper[4732]: I0402 13:48:14.480599 4732 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="1b40621c4cae0af5b1de321842441756156529e8dcb1aff14d7f5dc7db637127" exitCode=2
Apr 02 13:48:14 crc kubenswrapper[4732]: I0402 13:48:14.480770 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a13c7b3c22cb8c918dd27619d80784051c2ea4d1bdf2344919bea132c59daa7b"
Apr 02 13:48:14 crc kubenswrapper[4732]: I0402 13:48:14.480340 4732 scope.go:117] "RemoveContainer" containerID="fd58fa9b97b4f1b471d0e141c18378167e8054616db491b1e79d2530f9ecab7f"
Apr 02 13:48:14 crc kubenswrapper[4732]: I0402 13:48:14.494189 4732 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-crc" oldPodUID="f614b9022728cf315e60c057852e563e" podUID="235e9295064844132a05dc40ef3a886a"
Apr 02 13:48:14 crc kubenswrapper[4732]: I0402 13:48:14.509199 4732 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-crc" oldPodUID="f614b9022728cf315e60c057852e563e" podUID="235e9295064844132a05dc40ef3a886a"
Apr 02 13:48:14 crc kubenswrapper[4732]: I0402 13:48:14.509677 4732 scope.go:117] "RemoveContainer" containerID="8320a4ebc7edb8ed6702aa82a986ae487f9903f3f52ca9e7388135c88a1a8dea"
Apr 02 13:48:14 crc kubenswrapper[4732]: I0402 13:48:14.689205 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f614b9022728cf315e60c057852e563e" path="/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/volumes"
Apr 02 13:48:15 crc kubenswrapper[4732]: I0402 13:48:15.491826 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager-cert-syncer/0.log"
Apr 02 13:48:15 crc kubenswrapper[4732]: I0402 13:48:15.771195 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-11-retry-2-crc"
Apr 02 13:48:15 crc kubenswrapper[4732]: I0402 13:48:15.889264 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c8910d44-74f9-42e1-9497-8db09cc29b58-kubelet-dir\") pod \"c8910d44-74f9-42e1-9497-8db09cc29b58\" (UID: \"c8910d44-74f9-42e1-9497-8db09cc29b58\") "
Apr 02 13:48:15 crc kubenswrapper[4732]: I0402 13:48:15.889350 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c8910d44-74f9-42e1-9497-8db09cc29b58-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "c8910d44-74f9-42e1-9497-8db09cc29b58" (UID: "c8910d44-74f9-42e1-9497-8db09cc29b58"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Apr 02 13:48:15 crc kubenswrapper[4732]: I0402 13:48:15.889380 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c8910d44-74f9-42e1-9497-8db09cc29b58-kube-api-access\") pod \"c8910d44-74f9-42e1-9497-8db09cc29b58\" (UID: \"c8910d44-74f9-42e1-9497-8db09cc29b58\") "
Apr 02 13:48:15 crc kubenswrapper[4732]: I0402 13:48:15.889481 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c8910d44-74f9-42e1-9497-8db09cc29b58-var-lock\") pod \"c8910d44-74f9-42e1-9497-8db09cc29b58\" (UID: \"c8910d44-74f9-42e1-9497-8db09cc29b58\") "
Apr 02 13:48:15 crc kubenswrapper[4732]: I0402 13:48:15.889544 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c8910d44-74f9-42e1-9497-8db09cc29b58-var-lock" (OuterVolumeSpecName: "var-lock") pod "c8910d44-74f9-42e1-9497-8db09cc29b58" (UID: "c8910d44-74f9-42e1-9497-8db09cc29b58"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Apr 02 13:48:15 crc kubenswrapper[4732]: I0402 13:48:15.889664 4732 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c8910d44-74f9-42e1-9497-8db09cc29b58-var-lock\") on node \"crc\" DevicePath \"\""
Apr 02 13:48:15 crc kubenswrapper[4732]: I0402 13:48:15.889678 4732 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c8910d44-74f9-42e1-9497-8db09cc29b58-kubelet-dir\") on node \"crc\" DevicePath \"\""
Apr 02 13:48:15 crc kubenswrapper[4732]: I0402 13:48:15.896841 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8910d44-74f9-42e1-9497-8db09cc29b58-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c8910d44-74f9-42e1-9497-8db09cc29b58" (UID: "c8910d44-74f9-42e1-9497-8db09cc29b58"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 13:48:15 crc kubenswrapper[4732]: I0402 13:48:15.990899 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c8910d44-74f9-42e1-9497-8db09cc29b58-kube-api-access\") on node \"crc\" DevicePath \"\""
Apr 02 13:48:16 crc kubenswrapper[4732]: I0402 13:48:16.505133 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-11-retry-2-crc" event={"ID":"c8910d44-74f9-42e1-9497-8db09cc29b58","Type":"ContainerDied","Data":"2f2bf3cc272c926c88db8592fc5ac1e541eadc2c56222b00f1204f1a8d173707"}
Apr 02 13:48:16 crc kubenswrapper[4732]: I0402 13:48:16.505185 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f2bf3cc272c926c88db8592fc5ac1e541eadc2c56222b00f1204f1a8d173707"
Apr 02 13:48:16 crc kubenswrapper[4732]: I0402 13:48:16.505251 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-11-retry-2-crc"
Apr 02 13:48:24 crc kubenswrapper[4732]: I0402 13:48:24.688978 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Apr 02 13:48:24 crc kubenswrapper[4732]: I0402 13:48:24.703458 4732 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="9ce5c3a8-e8a4-4074-96da-cfc340c2873f"
Apr 02 13:48:24 crc kubenswrapper[4732]: I0402 13:48:24.703723 4732 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="9ce5c3a8-e8a4-4074-96da-cfc340c2873f"
Apr 02 13:48:24 crc kubenswrapper[4732]: I0402 13:48:24.712373 4732 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Apr 02 13:48:24 crc kubenswrapper[4732]: I0402 13:48:24.718963 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"]
Apr 02 13:48:24 crc kubenswrapper[4732]: I0402 13:48:24.726447 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"]
Apr 02 13:48:24 crc kubenswrapper[4732]: I0402 13:48:24.735707 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Apr 02 13:48:24 crc kubenswrapper[4732]: I0402 13:48:24.743847 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"]
Apr 02 13:48:25 crc kubenswrapper[4732]: I0402 13:48:25.567054 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"235e9295064844132a05dc40ef3a886a","Type":"ContainerStarted","Data":"73c0dd13f9fc718080de9764a6d5dbfc2998a838fb22d3778e131c10889b83ec"}
Apr 02 13:48:25 crc kubenswrapper[4732]: I0402 13:48:25.567436 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"235e9295064844132a05dc40ef3a886a","Type":"ContainerStarted","Data":"cf6d40837f21f6428d52f3bb75f667c0ebf856b0af510889a9695093c21b6885"}
Apr 02 13:48:25 crc kubenswrapper[4732]: I0402 13:48:25.567446 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"235e9295064844132a05dc40ef3a886a","Type":"ContainerStarted","Data":"5c716ed2094065ff92ec97fb32bca6973e69e1cd0018780759102dde6b4df062"}
Apr 02 13:48:25 crc kubenswrapper[4732]: I0402 13:48:25.567454 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"235e9295064844132a05dc40ef3a886a","Type":"ContainerStarted","Data":"853f32b8ecb8d68c2afb0765bbfd69b4645b7d30511096592b337a1c09167f3b"}
Apr 02 13:48:26 crc kubenswrapper[4732]: I0402 13:48:26.576679 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"235e9295064844132a05dc40ef3a886a","Type":"ContainerStarted","Data":"07bfdaa7a1fbf8b3c33ccd90050537f2c5efaab4563f16feace73117b1bcdb91"}
Apr 02 13:48:26 crc kubenswrapper[4732]: I0402 13:48:26.597061 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=2.597036893 podStartE2EDuration="2.597036893s" podCreationTimestamp="2026-04-02 13:48:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:48:26.596405666 +0000 UTC m=+663.500813229" watchObservedRunningTime="2026-04-02 13:48:26.597036893 +0000 UTC m=+663.501444466"
Apr 02 13:48:34 crc kubenswrapper[4732]: I0402 13:48:34.736380 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Apr 02 13:48:34 crc kubenswrapper[4732]: I0402 13:48:34.737256 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Apr 02 13:48:34 crc kubenswrapper[4732]: I0402 13:48:34.737294 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Apr 02 13:48:34 crc kubenswrapper[4732]: I0402 13:48:34.737319 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Apr 02 13:48:34 crc kubenswrapper[4732]: I0402 13:48:34.741605 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Apr 02 13:48:34 crc kubenswrapper[4732]: I0402 13:48:34.742366 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Apr 02 13:48:35 crc kubenswrapper[4732]: I0402 13:48:35.640505 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Apr 02 13:48:35 crc kubenswrapper[4732]: I0402 13:48:35.641485 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Apr 02 13:48:45 crc kubenswrapper[4732]: I0402 13:48:45.266876 4732 scope.go:117] "RemoveContainer" containerID="0ed979adaba6f9d50a6af5062245f390e4ff9c86cf2c4cb22731d7cc664ad8ff"
Apr 02 13:48:45 crc kubenswrapper[4732]: I0402 13:48:45.288519 4732 scope.go:117] "RemoveContainer" containerID="e597b0db2773d2bb9d7e673759dee264ef036a174be679e2ccfa8ba03cf5c6c9"
Apr 02 13:48:45 crc kubenswrapper[4732]: I0402 13:48:45.303686 4732 scope.go:117] "RemoveContainer" containerID="1b40621c4cae0af5b1de321842441756156529e8dcb1aff14d7f5dc7db637127"
Apr 02 13:48:45 crc kubenswrapper[4732]: I0402 13:48:45.318642 4732 scope.go:117] "RemoveContainer" containerID="5272db3013bda1d5fcb66bd2669d7714381d3bd39802499b4a27873cbccd6ff9"
Apr 02 13:48:54 crc kubenswrapper[4732]: I0402 13:48:54.482092 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-8-crc"]
Apr 02 13:48:54 crc kubenswrapper[4732]: E0402 13:48:54.482854 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8910d44-74f9-42e1-9497-8db09cc29b58" containerName="installer"
Apr 02 13:48:54 crc kubenswrapper[4732]: I0402 13:48:54.482868 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8910d44-74f9-42e1-9497-8db09cc29b58" containerName="installer"
Apr 02 13:48:54 crc kubenswrapper[4732]: I0402 13:48:54.482985 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8910d44-74f9-42e1-9497-8db09cc29b58" containerName="installer"
Apr 02 13:48:54 crc kubenswrapper[4732]: I0402 13:48:54.483404 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-8-crc"
Apr 02 13:48:54 crc kubenswrapper[4732]: I0402 13:48:54.486279 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler"/"kube-root-ca.crt"
Apr 02 13:48:54 crc kubenswrapper[4732]: I0402 13:48:54.486937 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler"/"installer-sa-dockercfg-5vhrm"
Apr 02 13:48:54 crc kubenswrapper[4732]: I0402 13:48:54.499494 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-8-crc"]
Apr 02 13:48:54 crc kubenswrapper[4732]: I0402 13:48:54.620361 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7cc10dea-f213-4bc6-b048-6496c3d0902c-kube-api-access\") pod \"installer-8-crc\" (UID: \"7cc10dea-f213-4bc6-b048-6496c3d0902c\") " pod="openshift-kube-scheduler/installer-8-crc"
Apr 02 13:48:54 crc kubenswrapper[4732]: I0402 13:48:54.620510 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7cc10dea-f213-4bc6-b048-6496c3d0902c-kubelet-dir\") pod \"installer-8-crc\" (UID: \"7cc10dea-f213-4bc6-b048-6496c3d0902c\") " pod="openshift-kube-scheduler/installer-8-crc"
Apr 02 13:48:54 crc kubenswrapper[4732]: I0402 13:48:54.620563 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7cc10dea-f213-4bc6-b048-6496c3d0902c-var-lock\") pod \"installer-8-crc\" (UID: \"7cc10dea-f213-4bc6-b048-6496c3d0902c\") " pod="openshift-kube-scheduler/installer-8-crc"
Apr 02 13:48:54 crc kubenswrapper[4732]: I0402 13:48:54.722258 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7cc10dea-f213-4bc6-b048-6496c3d0902c-var-lock\") pod \"installer-8-crc\" (UID: \"7cc10dea-f213-4bc6-b048-6496c3d0902c\") " pod="openshift-kube-scheduler/installer-8-crc"
Apr 02 13:48:54 crc kubenswrapper[4732]: I0402 13:48:54.722341 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7cc10dea-f213-4bc6-b048-6496c3d0902c-kube-api-access\") pod \"installer-8-crc\" (UID: \"7cc10dea-f213-4bc6-b048-6496c3d0902c\") " pod="openshift-kube-scheduler/installer-8-crc"
Apr 02 13:48:54 crc kubenswrapper[4732]: I0402 13:48:54.722387 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7cc10dea-f213-4bc6-b048-6496c3d0902c-var-lock\") pod \"installer-8-crc\" (UID: \"7cc10dea-f213-4bc6-b048-6496c3d0902c\") " pod="openshift-kube-scheduler/installer-8-crc"
Apr 02 13:48:54 crc kubenswrapper[4732]: I0402 13:48:54.722417 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7cc10dea-f213-4bc6-b048-6496c3d0902c-kubelet-dir\") pod \"installer-8-crc\" (UID: \"7cc10dea-f213-4bc6-b048-6496c3d0902c\") " pod="openshift-kube-scheduler/installer-8-crc"
Apr 02 13:48:54 crc kubenswrapper[4732]: I0402 13:48:54.722473 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7cc10dea-f213-4bc6-b048-6496c3d0902c-kubelet-dir\") pod \"installer-8-crc\" (UID: \"7cc10dea-f213-4bc6-b048-6496c3d0902c\") " pod="openshift-kube-scheduler/installer-8-crc"
Apr 02 13:48:54 crc kubenswrapper[4732]: I0402 13:48:54.747651 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7cc10dea-f213-4bc6-b048-6496c3d0902c-kube-api-access\") pod \"installer-8-crc\" (UID: \"7cc10dea-f213-4bc6-b048-6496c3d0902c\") " pod="openshift-kube-scheduler/installer-8-crc"
Apr 02 13:48:54 crc kubenswrapper[4732]: I0402 13:48:54.817070 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-8-crc"
Apr 02 13:48:55 crc kubenswrapper[4732]: I0402 13:48:55.025335 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-8-crc"]
Apr 02 13:48:55 crc kubenswrapper[4732]: I0402 13:48:55.774200 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-8-crc" event={"ID":"7cc10dea-f213-4bc6-b048-6496c3d0902c","Type":"ContainerStarted","Data":"54050dbbf6412043b6e5d222ececd5f51a16684f6129a96cd451629585650b91"}
Apr 02 13:48:55 crc kubenswrapper[4732]: I0402 13:48:55.774486 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-8-crc" event={"ID":"7cc10dea-f213-4bc6-b048-6496c3d0902c","Type":"ContainerStarted","Data":"b03cc473c4c3c98750329e51dcdae0995ba2665f6638f5798da5f2c52cf917bc"}
Apr 02 13:48:55 crc kubenswrapper[4732]: I0402 13:48:55.796256 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-8-crc" podStartSLOduration=1.796224939 podStartE2EDuration="1.796224939s" podCreationTimestamp="2026-04-02 13:48:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:48:55.791234524 +0000 UTC m=+692.695642137" watchObservedRunningTime="2026-04-02 13:48:55.796224939 +0000 UTC m=+692.700632522"
Apr 02 13:48:59 crc kubenswrapper[4732]: I0402 13:48:59.173961 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-12-crc"]
Apr 02 13:48:59 crc kubenswrapper[4732]: I0402 13:48:59.175201 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-12-crc"
Apr 02 13:48:59 crc kubenswrapper[4732]: I0402 13:48:59.178051 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n"
Apr 02 13:48:59 crc kubenswrapper[4732]: I0402 13:48:59.178493 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt"
Apr 02 13:48:59 crc kubenswrapper[4732]: I0402 13:48:59.179394 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4554477d-c36c-424b-82ee-d3811ba77903-kubelet-dir\") pod \"revision-pruner-12-crc\" (UID: \"4554477d-c36c-424b-82ee-d3811ba77903\") " pod="openshift-kube-controller-manager/revision-pruner-12-crc"
Apr 02 13:48:59 crc kubenswrapper[4732]: I0402 13:48:59.179467 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4554477d-c36c-424b-82ee-d3811ba77903-kube-api-access\") pod \"revision-pruner-12-crc\" (UID: \"4554477d-c36c-424b-82ee-d3811ba77903\") " pod="openshift-kube-controller-manager/revision-pruner-12-crc"
Apr 02 13:48:59 crc kubenswrapper[4732]: I0402 13:48:59.183039 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-12-crc"]
Apr 02 13:48:59 crc kubenswrapper[4732]: I0402 13:48:59.280956 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4554477d-c36c-424b-82ee-d3811ba77903-kube-api-access\") pod \"revision-pruner-12-crc\" (UID: \"4554477d-c36c-424b-82ee-d3811ba77903\") " pod="openshift-kube-controller-manager/revision-pruner-12-crc"
Apr 02 13:48:59 crc kubenswrapper[4732]: I0402 13:48:59.281053 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4554477d-c36c-424b-82ee-d3811ba77903-kubelet-dir\") pod \"revision-pruner-12-crc\" (UID: \"4554477d-c36c-424b-82ee-d3811ba77903\") " pod="openshift-kube-controller-manager/revision-pruner-12-crc"
Apr 02 13:48:59 crc kubenswrapper[4732]: I0402 13:48:59.281118 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4554477d-c36c-424b-82ee-d3811ba77903-kubelet-dir\") pod \"revision-pruner-12-crc\" (UID: \"4554477d-c36c-424b-82ee-d3811ba77903\") " pod="openshift-kube-controller-manager/revision-pruner-12-crc"
Apr 02 13:48:59 crc kubenswrapper[4732]: I0402 13:48:59.298724 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4554477d-c36c-424b-82ee-d3811ba77903-kube-api-access\") pod \"revision-pruner-12-crc\" (UID: \"4554477d-c36c-424b-82ee-d3811ba77903\") " pod="openshift-kube-controller-manager/revision-pruner-12-crc"
Apr 02 13:48:59 crc kubenswrapper[4732]: I0402 13:48:59.496095 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-12-crc"
Apr 02 13:48:59 crc kubenswrapper[4732]: I0402 13:48:59.650521 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-12-crc"]
Apr 02 13:48:59 crc kubenswrapper[4732]: W0402 13:48:59.657108 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod4554477d_c36c_424b_82ee_d3811ba77903.slice/crio-c09ace67ab92799d9fe29ff526b33c905a1b73c22abd52f4cd923dfb3644ee21 WatchSource:0}: Error finding container c09ace67ab92799d9fe29ff526b33c905a1b73c22abd52f4cd923dfb3644ee21: Status 404 returned error can't find the container with id c09ace67ab92799d9fe29ff526b33c905a1b73c22abd52f4cd923dfb3644ee21
Apr 02 13:48:59 crc kubenswrapper[4732]: I0402 13:48:59.803061 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-12-crc" event={"ID":"4554477d-c36c-424b-82ee-d3811ba77903","Type":"ContainerStarted","Data":"c09ace67ab92799d9fe29ff526b33c905a1b73c22abd52f4cd923dfb3644ee21"}
Apr 02 13:49:00 crc kubenswrapper[4732]: I0402 13:49:00.814421 4732 generic.go:334] "Generic (PLEG): container finished" podID="4554477d-c36c-424b-82ee-d3811ba77903" containerID="5b3a08b8f38215cde2607d4a93b3202d4fe21767e84be2b6b6eff3f8c33b04f9" exitCode=0
Apr 02 13:49:00 crc kubenswrapper[4732]: I0402 13:49:00.815038 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-12-crc" event={"ID":"4554477d-c36c-424b-82ee-d3811ba77903","Type":"ContainerDied","Data":"5b3a08b8f38215cde2607d4a93b3202d4fe21767e84be2b6b6eff3f8c33b04f9"}
Apr 02 13:49:01 crc kubenswrapper[4732]: I0402 13:49:01.773457 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-12-crc"]
Apr 02 13:49:01 crc kubenswrapper[4732]: I0402 13:49:01.774369 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-12-crc"
Apr 02 13:49:01 crc kubenswrapper[4732]: I0402 13:49:01.785440 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-12-crc"]
Apr 02 13:49:01 crc kubenswrapper[4732]: I0402 13:49:01.914052 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a8a46753-7553-4d02-9a3e-fdea2b938401-kube-api-access\") pod \"installer-12-crc\" (UID: \"a8a46753-7553-4d02-9a3e-fdea2b938401\") " pod="openshift-kube-controller-manager/installer-12-crc"
Apr 02 13:49:01 crc kubenswrapper[4732]: I0402 13:49:01.914142 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a8a46753-7553-4d02-9a3e-fdea2b938401-var-lock\") pod \"installer-12-crc\" (UID: \"a8a46753-7553-4d02-9a3e-fdea2b938401\") " pod="openshift-kube-controller-manager/installer-12-crc"
Apr 02 13:49:01 crc kubenswrapper[4732]: I0402 13:49:01.914163 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a8a46753-7553-4d02-9a3e-fdea2b938401-kubelet-dir\") pod \"installer-12-crc\" (UID: \"a8a46753-7553-4d02-9a3e-fdea2b938401\") " pod="openshift-kube-controller-manager/installer-12-crc"
Apr 02 13:49:01 crc kubenswrapper[4732]: I0402 13:49:01.924510 4732 patch_prober.go:28] interesting pod/machine-config-daemon-6vtmw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Apr 02 13:49:01 crc kubenswrapper[4732]: I0402 13:49:01.924599 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Apr 02 13:49:02 crc kubenswrapper[4732]: I0402 13:49:02.015290 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a8a46753-7553-4d02-9a3e-fdea2b938401-var-lock\") pod \"installer-12-crc\" (UID: \"a8a46753-7553-4d02-9a3e-fdea2b938401\") " pod="openshift-kube-controller-manager/installer-12-crc"
Apr 02 13:49:02 crc kubenswrapper[4732]: I0402 13:49:02.015655 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a8a46753-7553-4d02-9a3e-fdea2b938401-kubelet-dir\") pod \"installer-12-crc\" (UID: \"a8a46753-7553-4d02-9a3e-fdea2b938401\") " pod="openshift-kube-controller-manager/installer-12-crc"
Apr 02 13:49:02 crc kubenswrapper[4732]: I0402 13:49:02.015705 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a8a46753-7553-4d02-9a3e-fdea2b938401-kube-api-access\") pod \"installer-12-crc\" (UID: \"a8a46753-7553-4d02-9a3e-fdea2b938401\") " pod="openshift-kube-controller-manager/installer-12-crc"
Apr 02 13:49:02 crc kubenswrapper[4732]: I0402 13:49:02.015447 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a8a46753-7553-4d02-9a3e-fdea2b938401-var-lock\") pod \"installer-12-crc\" (UID: \"a8a46753-7553-4d02-9a3e-fdea2b938401\") " pod="openshift-kube-controller-manager/installer-12-crc"
Apr 02 13:49:02 crc kubenswrapper[4732]: I0402 13:49:02.015810 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a8a46753-7553-4d02-9a3e-fdea2b938401-kubelet-dir\") pod \"installer-12-crc\" (UID: \"a8a46753-7553-4d02-9a3e-fdea2b938401\") " pod="openshift-kube-controller-manager/installer-12-crc"
Apr 02 13:49:02 crc kubenswrapper[4732]: I0402 13:49:02.031091 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-12-crc"
Apr 02 13:49:02 crc kubenswrapper[4732]: I0402 13:49:02.034563 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a8a46753-7553-4d02-9a3e-fdea2b938401-kube-api-access\") pod \"installer-12-crc\" (UID: \"a8a46753-7553-4d02-9a3e-fdea2b938401\") " pod="openshift-kube-controller-manager/installer-12-crc"
Apr 02 13:49:02 crc kubenswrapper[4732]: I0402 13:49:02.103807 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-12-crc"
Apr 02 13:49:02 crc kubenswrapper[4732]: I0402 13:49:02.218445 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4554477d-c36c-424b-82ee-d3811ba77903-kube-api-access\") pod \"4554477d-c36c-424b-82ee-d3811ba77903\" (UID: \"4554477d-c36c-424b-82ee-d3811ba77903\") "
Apr 02 13:49:02 crc kubenswrapper[4732]: I0402 13:49:02.218805 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4554477d-c36c-424b-82ee-d3811ba77903-kubelet-dir\") pod \"4554477d-c36c-424b-82ee-d3811ba77903\" (UID: \"4554477d-c36c-424b-82ee-d3811ba77903\") "
Apr 02 13:49:02 crc kubenswrapper[4732]: I0402 13:49:02.218892 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4554477d-c36c-424b-82ee-d3811ba77903-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "4554477d-c36c-424b-82ee-d3811ba77903" (UID: "4554477d-c36c-424b-82ee-d3811ba77903"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Apr 02 13:49:02 crc kubenswrapper[4732]: I0402 13:49:02.219283 4732 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4554477d-c36c-424b-82ee-d3811ba77903-kubelet-dir\") on node \"crc\" DevicePath \"\""
Apr 02 13:49:02 crc kubenswrapper[4732]: I0402 13:49:02.221865 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4554477d-c36c-424b-82ee-d3811ba77903-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "4554477d-c36c-424b-82ee-d3811ba77903" (UID: "4554477d-c36c-424b-82ee-d3811ba77903"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 13:49:02 crc kubenswrapper[4732]: I0402 13:49:02.319689 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4554477d-c36c-424b-82ee-d3811ba77903-kube-api-access\") on node \"crc\" DevicePath \"\""
Apr 02 13:49:02 crc kubenswrapper[4732]: I0402 13:49:02.490651 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-12-crc"]
Apr 02 13:49:02 crc kubenswrapper[4732]: W0402 13:49:02.502854 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda8a46753_7553_4d02_9a3e_fdea2b938401.slice/crio-daba196e90d4806ca0760ecdda0aaf291e45124b21467d1765e9c408cd593ec0 WatchSource:0}: Error finding container daba196e90d4806ca0760ecdda0aaf291e45124b21467d1765e9c408cd593ec0: Status 404 returned error can't find the container with id daba196e90d4806ca0760ecdda0aaf291e45124b21467d1765e9c408cd593ec0
Apr 02 13:49:02 crc kubenswrapper[4732]: I0402 13:49:02.832166 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-12-crc"
Apr 02 13:49:02 crc kubenswrapper[4732]: I0402 13:49:02.832170 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-12-crc" event={"ID":"4554477d-c36c-424b-82ee-d3811ba77903","Type":"ContainerDied","Data":"c09ace67ab92799d9fe29ff526b33c905a1b73c22abd52f4cd923dfb3644ee21"}
Apr 02 13:49:02 crc kubenswrapper[4732]: I0402 13:49:02.832606 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c09ace67ab92799d9fe29ff526b33c905a1b73c22abd52f4cd923dfb3644ee21"
Apr 02 13:49:02 crc kubenswrapper[4732]: I0402 13:49:02.833623 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-12-crc" event={"ID":"a8a46753-7553-4d02-9a3e-fdea2b938401","Type":"ContainerStarted","Data":"daba196e90d4806ca0760ecdda0aaf291e45124b21467d1765e9c408cd593ec0"}
Apr 02 13:49:03 crc kubenswrapper[4732]: I0402 13:49:03.844503 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-12-crc" event={"ID":"a8a46753-7553-4d02-9a3e-fdea2b938401","Type":"ContainerStarted","Data":"077e20f616dd0bbdfcfe3e19a35fffadc2ca7d93a2048a68a6bd7023d19b0b38"}
Apr 02 13:49:03 crc kubenswrapper[4732]: I0402 13:49:03.866809 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-12-crc" podStartSLOduration=2.866790337 podStartE2EDuration="2.866790337s" podCreationTimestamp="2026-04-02 13:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:49:03.865750209 +0000 UTC m=+700.770157772" watchObservedRunningTime="2026-04-02 13:49:03.866790337 +0000 UTC m=+700.771197890"
Apr 02 13:49:07 crc kubenswrapper[4732]: I0402 13:49:07.084232 4732 kubelet.go:2421] "SyncLoop ADD" source="api"
pods=["openshift-kube-apiserver/revision-pruner-11-crc"] Apr 02 13:49:07 crc kubenswrapper[4732]: E0402 13:49:07.084896 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4554477d-c36c-424b-82ee-d3811ba77903" containerName="pruner" Apr 02 13:49:07 crc kubenswrapper[4732]: I0402 13:49:07.084918 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="4554477d-c36c-424b-82ee-d3811ba77903" containerName="pruner" Apr 02 13:49:07 crc kubenswrapper[4732]: I0402 13:49:07.085103 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="4554477d-c36c-424b-82ee-d3811ba77903" containerName="pruner" Apr 02 13:49:07 crc kubenswrapper[4732]: I0402 13:49:07.085762 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-11-crc" Apr 02 13:49:07 crc kubenswrapper[4732]: I0402 13:49:07.094570 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-11-crc"] Apr 02 13:49:07 crc kubenswrapper[4732]: I0402 13:49:07.099047 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Apr 02 13:49:07 crc kubenswrapper[4732]: I0402 13:49:07.099090 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Apr 02 13:49:07 crc kubenswrapper[4732]: I0402 13:49:07.283348 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/aa065b94-4a5e-4905-ad8e-72cbec230c0c-kubelet-dir\") pod \"revision-pruner-11-crc\" (UID: \"aa065b94-4a5e-4905-ad8e-72cbec230c0c\") " pod="openshift-kube-apiserver/revision-pruner-11-crc" Apr 02 13:49:07 crc kubenswrapper[4732]: I0402 13:49:07.283468 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/aa065b94-4a5e-4905-ad8e-72cbec230c0c-kube-api-access\") pod \"revision-pruner-11-crc\" (UID: \"aa065b94-4a5e-4905-ad8e-72cbec230c0c\") " pod="openshift-kube-apiserver/revision-pruner-11-crc" Apr 02 13:49:07 crc kubenswrapper[4732]: I0402 13:49:07.385495 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/aa065b94-4a5e-4905-ad8e-72cbec230c0c-kube-api-access\") pod \"revision-pruner-11-crc\" (UID: \"aa065b94-4a5e-4905-ad8e-72cbec230c0c\") " pod="openshift-kube-apiserver/revision-pruner-11-crc" Apr 02 13:49:07 crc kubenswrapper[4732]: I0402 13:49:07.385700 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/aa065b94-4a5e-4905-ad8e-72cbec230c0c-kubelet-dir\") pod \"revision-pruner-11-crc\" (UID: \"aa065b94-4a5e-4905-ad8e-72cbec230c0c\") " pod="openshift-kube-apiserver/revision-pruner-11-crc" Apr 02 13:49:07 crc kubenswrapper[4732]: I0402 13:49:07.385826 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/aa065b94-4a5e-4905-ad8e-72cbec230c0c-kubelet-dir\") pod \"revision-pruner-11-crc\" (UID: \"aa065b94-4a5e-4905-ad8e-72cbec230c0c\") " pod="openshift-kube-apiserver/revision-pruner-11-crc" Apr 02 13:49:07 crc kubenswrapper[4732]: I0402 13:49:07.412207 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/aa065b94-4a5e-4905-ad8e-72cbec230c0c-kube-api-access\") pod \"revision-pruner-11-crc\" (UID: \"aa065b94-4a5e-4905-ad8e-72cbec230c0c\") " pod="openshift-kube-apiserver/revision-pruner-11-crc" Apr 02 13:49:07 crc kubenswrapper[4732]: I0402 13:49:07.414263 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-11-crc" Apr 02 13:49:07 crc kubenswrapper[4732]: I0402 13:49:07.612919 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-11-crc"] Apr 02 13:49:07 crc kubenswrapper[4732]: W0402 13:49:07.617179 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podaa065b94_4a5e_4905_ad8e_72cbec230c0c.slice/crio-a633bf45efe4bb997f6e44f1af45c2b1405dec0ffab8f516013115e9482dcdc3 WatchSource:0}: Error finding container a633bf45efe4bb997f6e44f1af45c2b1405dec0ffab8f516013115e9482dcdc3: Status 404 returned error can't find the container with id a633bf45efe4bb997f6e44f1af45c2b1405dec0ffab8f516013115e9482dcdc3 Apr 02 13:49:07 crc kubenswrapper[4732]: I0402 13:49:07.879945 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-11-crc" event={"ID":"aa065b94-4a5e-4905-ad8e-72cbec230c0c","Type":"ContainerStarted","Data":"a633bf45efe4bb997f6e44f1af45c2b1405dec0ffab8f516013115e9482dcdc3"} Apr 02 13:49:08 crc kubenswrapper[4732]: I0402 13:49:08.888943 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-11-crc" event={"ID":"aa065b94-4a5e-4905-ad8e-72cbec230c0c","Type":"ContainerStarted","Data":"6d3e8aecc87d6e06d46168ab8ef277b74c5ba79bea9a93173db14173b564a7b2"} Apr 02 13:49:08 crc kubenswrapper[4732]: I0402 13:49:08.910541 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-11-crc" podStartSLOduration=1.910513466 podStartE2EDuration="1.910513466s" podCreationTimestamp="2026-04-02 13:49:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:49:08.904035171 +0000 UTC m=+705.808442754" watchObservedRunningTime="2026-04-02 13:49:08.910513466 +0000 UTC m=+705.814921080" Apr 02 13:49:09 
crc kubenswrapper[4732]: I0402 13:49:09.896490 4732 generic.go:334] "Generic (PLEG): container finished" podID="aa065b94-4a5e-4905-ad8e-72cbec230c0c" containerID="6d3e8aecc87d6e06d46168ab8ef277b74c5ba79bea9a93173db14173b564a7b2" exitCode=0 Apr 02 13:49:09 crc kubenswrapper[4732]: I0402 13:49:09.896531 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-11-crc" event={"ID":"aa065b94-4a5e-4905-ad8e-72cbec230c0c","Type":"ContainerDied","Data":"6d3e8aecc87d6e06d46168ab8ef277b74c5ba79bea9a93173db14173b564a7b2"} Apr 02 13:49:11 crc kubenswrapper[4732]: I0402 13:49:11.191415 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-11-crc" Apr 02 13:49:11 crc kubenswrapper[4732]: I0402 13:49:11.356103 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/aa065b94-4a5e-4905-ad8e-72cbec230c0c-kube-api-access\") pod \"aa065b94-4a5e-4905-ad8e-72cbec230c0c\" (UID: \"aa065b94-4a5e-4905-ad8e-72cbec230c0c\") " Apr 02 13:49:11 crc kubenswrapper[4732]: I0402 13:49:11.356195 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/aa065b94-4a5e-4905-ad8e-72cbec230c0c-kubelet-dir\") pod \"aa065b94-4a5e-4905-ad8e-72cbec230c0c\" (UID: \"aa065b94-4a5e-4905-ad8e-72cbec230c0c\") " Apr 02 13:49:11 crc kubenswrapper[4732]: I0402 13:49:11.356501 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aa065b94-4a5e-4905-ad8e-72cbec230c0c-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "aa065b94-4a5e-4905-ad8e-72cbec230c0c" (UID: "aa065b94-4a5e-4905-ad8e-72cbec230c0c"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 13:49:11 crc kubenswrapper[4732]: I0402 13:49:11.361532 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa065b94-4a5e-4905-ad8e-72cbec230c0c-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "aa065b94-4a5e-4905-ad8e-72cbec230c0c" (UID: "aa065b94-4a5e-4905-ad8e-72cbec230c0c"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:49:11 crc kubenswrapper[4732]: I0402 13:49:11.457732 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/aa065b94-4a5e-4905-ad8e-72cbec230c0c-kube-api-access\") on node \"crc\" DevicePath \"\"" Apr 02 13:49:11 crc kubenswrapper[4732]: I0402 13:49:11.457783 4732 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/aa065b94-4a5e-4905-ad8e-72cbec230c0c-kubelet-dir\") on node \"crc\" DevicePath \"\"" Apr 02 13:49:11 crc kubenswrapper[4732]: I0402 13:49:11.913236 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-11-crc" event={"ID":"aa065b94-4a5e-4905-ad8e-72cbec230c0c","Type":"ContainerDied","Data":"a633bf45efe4bb997f6e44f1af45c2b1405dec0ffab8f516013115e9482dcdc3"} Apr 02 13:49:11 crc kubenswrapper[4732]: I0402 13:49:11.913276 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a633bf45efe4bb997f6e44f1af45c2b1405dec0ffab8f516013115e9482dcdc3" Apr 02 13:49:11 crc kubenswrapper[4732]: I0402 13:49:11.913318 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-11-crc" Apr 02 13:49:14 crc kubenswrapper[4732]: I0402 13:49:14.285931 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-11-crc"] Apr 02 13:49:14 crc kubenswrapper[4732]: E0402 13:49:14.286816 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa065b94-4a5e-4905-ad8e-72cbec230c0c" containerName="pruner" Apr 02 13:49:14 crc kubenswrapper[4732]: I0402 13:49:14.286843 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa065b94-4a5e-4905-ad8e-72cbec230c0c" containerName="pruner" Apr 02 13:49:14 crc kubenswrapper[4732]: I0402 13:49:14.287024 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa065b94-4a5e-4905-ad8e-72cbec230c0c" containerName="pruner" Apr 02 13:49:14 crc kubenswrapper[4732]: I0402 13:49:14.287654 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-11-crc" Apr 02 13:49:14 crc kubenswrapper[4732]: I0402 13:49:14.293063 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Apr 02 13:49:14 crc kubenswrapper[4732]: I0402 13:49:14.293360 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Apr 02 13:49:14 crc kubenswrapper[4732]: I0402 13:49:14.299405 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-11-crc"] Apr 02 13:49:14 crc kubenswrapper[4732]: I0402 13:49:14.399209 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7a85873f-8730-4bb9-8d05-621568f7774c-kubelet-dir\") pod \"installer-11-crc\" (UID: \"7a85873f-8730-4bb9-8d05-621568f7774c\") " pod="openshift-kube-apiserver/installer-11-crc" Apr 02 13:49:14 crc kubenswrapper[4732]: I0402 13:49:14.399322 4732 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7a85873f-8730-4bb9-8d05-621568f7774c-var-lock\") pod \"installer-11-crc\" (UID: \"7a85873f-8730-4bb9-8d05-621568f7774c\") " pod="openshift-kube-apiserver/installer-11-crc" Apr 02 13:49:14 crc kubenswrapper[4732]: I0402 13:49:14.399353 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7a85873f-8730-4bb9-8d05-621568f7774c-kube-api-access\") pod \"installer-11-crc\" (UID: \"7a85873f-8730-4bb9-8d05-621568f7774c\") " pod="openshift-kube-apiserver/installer-11-crc" Apr 02 13:49:14 crc kubenswrapper[4732]: I0402 13:49:14.500517 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7a85873f-8730-4bb9-8d05-621568f7774c-var-lock\") pod \"installer-11-crc\" (UID: \"7a85873f-8730-4bb9-8d05-621568f7774c\") " pod="openshift-kube-apiserver/installer-11-crc" Apr 02 13:49:14 crc kubenswrapper[4732]: I0402 13:49:14.500578 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7a85873f-8730-4bb9-8d05-621568f7774c-kube-api-access\") pod \"installer-11-crc\" (UID: \"7a85873f-8730-4bb9-8d05-621568f7774c\") " pod="openshift-kube-apiserver/installer-11-crc" Apr 02 13:49:14 crc kubenswrapper[4732]: I0402 13:49:14.500670 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7a85873f-8730-4bb9-8d05-621568f7774c-kubelet-dir\") pod \"installer-11-crc\" (UID: \"7a85873f-8730-4bb9-8d05-621568f7774c\") " pod="openshift-kube-apiserver/installer-11-crc" Apr 02 13:49:14 crc kubenswrapper[4732]: I0402 13:49:14.500761 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" 
(UniqueName: \"kubernetes.io/host-path/7a85873f-8730-4bb9-8d05-621568f7774c-kubelet-dir\") pod \"installer-11-crc\" (UID: \"7a85873f-8730-4bb9-8d05-621568f7774c\") " pod="openshift-kube-apiserver/installer-11-crc" Apr 02 13:49:14 crc kubenswrapper[4732]: I0402 13:49:14.501206 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7a85873f-8730-4bb9-8d05-621568f7774c-var-lock\") pod \"installer-11-crc\" (UID: \"7a85873f-8730-4bb9-8d05-621568f7774c\") " pod="openshift-kube-apiserver/installer-11-crc" Apr 02 13:49:14 crc kubenswrapper[4732]: I0402 13:49:14.522337 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7a85873f-8730-4bb9-8d05-621568f7774c-kube-api-access\") pod \"installer-11-crc\" (UID: \"7a85873f-8730-4bb9-8d05-621568f7774c\") " pod="openshift-kube-apiserver/installer-11-crc" Apr 02 13:49:14 crc kubenswrapper[4732]: I0402 13:49:14.617321 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-11-crc" Apr 02 13:49:14 crc kubenswrapper[4732]: I0402 13:49:14.839120 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-11-crc"] Apr 02 13:49:14 crc kubenswrapper[4732]: I0402 13:49:14.934941 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-11-crc" event={"ID":"7a85873f-8730-4bb9-8d05-621568f7774c","Type":"ContainerStarted","Data":"c152fbfbba34aed54d2bcd7880cddcc82c85a25be13ae81d7b902157ec5826cd"} Apr 02 13:49:15 crc kubenswrapper[4732]: I0402 13:49:15.945434 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-11-crc" event={"ID":"7a85873f-8730-4bb9-8d05-621568f7774c","Type":"ContainerStarted","Data":"bcb0f498aeabf86c37e1fb053e81393e5ce2f47637442681ca1d116d03ba6945"} Apr 02 13:49:15 crc kubenswrapper[4732]: I0402 13:49:15.970294 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-11-crc" podStartSLOduration=1.970266324 podStartE2EDuration="1.970266324s" podCreationTimestamp="2026-04-02 13:49:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:49:15.962672849 +0000 UTC m=+712.867080412" watchObservedRunningTime="2026-04-02 13:49:15.970266324 +0000 UTC m=+712.874673887" Apr 02 13:49:26 crc kubenswrapper[4732]: I0402 13:49:26.608271 4732 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Apr 02 13:49:26 crc kubenswrapper[4732]: I0402 13:49:26.610145 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="815516d0756bb9282f4d0a28cef72670" containerName="kube-scheduler" containerID="cri-o://a06c8fb9cbb6779ec889350af216d97764fbc5b1426de87ec0d07df751528119" gracePeriod=30 
Apr 02 13:49:26 crc kubenswrapper[4732]: I0402 13:49:26.610410 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="815516d0756bb9282f4d0a28cef72670" containerName="kube-scheduler-recovery-controller" containerID="cri-o://1e4dabd56041117649788f504b9c27c19def6d0699d4cb6d55cd4d16aed89a13" gracePeriod=30 Apr 02 13:49:26 crc kubenswrapper[4732]: I0402 13:49:26.610539 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="815516d0756bb9282f4d0a28cef72670" containerName="kube-scheduler-cert-syncer" containerID="cri-o://ca9c3b985666916a34338e9f991c3bea2b33029e002346a256d5f70d34c5a009" gracePeriod=30 Apr 02 13:49:26 crc kubenswrapper[4732]: I0402 13:49:26.610548 4732 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Apr 02 13:49:26 crc kubenswrapper[4732]: E0402 13:49:26.611022 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="815516d0756bb9282f4d0a28cef72670" containerName="kube-scheduler-cert-syncer" Apr 02 13:49:26 crc kubenswrapper[4732]: I0402 13:49:26.611088 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="815516d0756bb9282f4d0a28cef72670" containerName="kube-scheduler-cert-syncer" Apr 02 13:49:26 crc kubenswrapper[4732]: E0402 13:49:26.611105 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="815516d0756bb9282f4d0a28cef72670" containerName="wait-for-host-port" Apr 02 13:49:26 crc kubenswrapper[4732]: I0402 13:49:26.611115 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="815516d0756bb9282f4d0a28cef72670" containerName="wait-for-host-port" Apr 02 13:49:26 crc kubenswrapper[4732]: E0402 13:49:26.611135 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="815516d0756bb9282f4d0a28cef72670" containerName="kube-scheduler-recovery-controller" Apr 02 13:49:26 
crc kubenswrapper[4732]: I0402 13:49:26.611144 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="815516d0756bb9282f4d0a28cef72670" containerName="kube-scheduler-recovery-controller" Apr 02 13:49:26 crc kubenswrapper[4732]: E0402 13:49:26.611157 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="815516d0756bb9282f4d0a28cef72670" containerName="kube-scheduler" Apr 02 13:49:26 crc kubenswrapper[4732]: I0402 13:49:26.611166 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="815516d0756bb9282f4d0a28cef72670" containerName="kube-scheduler" Apr 02 13:49:26 crc kubenswrapper[4732]: I0402 13:49:26.611287 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="815516d0756bb9282f4d0a28cef72670" containerName="kube-scheduler" Apr 02 13:49:26 crc kubenswrapper[4732]: I0402 13:49:26.611299 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="815516d0756bb9282f4d0a28cef72670" containerName="kube-scheduler-recovery-controller" Apr 02 13:49:26 crc kubenswrapper[4732]: I0402 13:49:26.611315 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="815516d0756bb9282f4d0a28cef72670" containerName="kube-scheduler-cert-syncer" Apr 02 13:49:26 crc kubenswrapper[4732]: I0402 13:49:26.760570 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/d8fd3797d07faa04d98c33c6c96ee09f-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"d8fd3797d07faa04d98c33c6c96ee09f\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Apr 02 13:49:26 crc kubenswrapper[4732]: I0402 13:49:26.760674 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/d8fd3797d07faa04d98c33c6c96ee09f-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"d8fd3797d07faa04d98c33c6c96ee09f\") " 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Apr 02 13:49:26 crc kubenswrapper[4732]: I0402 13:49:26.791690 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-crc_815516d0756bb9282f4d0a28cef72670/kube-scheduler-cert-syncer/0.log" Apr 02 13:49:26 crc kubenswrapper[4732]: I0402 13:49:26.792588 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Apr 02 13:49:26 crc kubenswrapper[4732]: I0402 13:49:26.795605 4732 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" oldPodUID="815516d0756bb9282f4d0a28cef72670" podUID="d8fd3797d07faa04d98c33c6c96ee09f" Apr 02 13:49:26 crc kubenswrapper[4732]: I0402 13:49:26.862318 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/815516d0756bb9282f4d0a28cef72670-cert-dir\") pod \"815516d0756bb9282f4d0a28cef72670\" (UID: \"815516d0756bb9282f4d0a28cef72670\") " Apr 02 13:49:26 crc kubenswrapper[4732]: I0402 13:49:26.862384 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/815516d0756bb9282f4d0a28cef72670-resource-dir\") pod \"815516d0756bb9282f4d0a28cef72670\" (UID: \"815516d0756bb9282f4d0a28cef72670\") " Apr 02 13:49:26 crc kubenswrapper[4732]: I0402 13:49:26.862409 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/815516d0756bb9282f4d0a28cef72670-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "815516d0756bb9282f4d0a28cef72670" (UID: "815516d0756bb9282f4d0a28cef72670"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 13:49:26 crc kubenswrapper[4732]: I0402 13:49:26.862457 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/815516d0756bb9282f4d0a28cef72670-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "815516d0756bb9282f4d0a28cef72670" (UID: "815516d0756bb9282f4d0a28cef72670"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 13:49:26 crc kubenswrapper[4732]: I0402 13:49:26.862492 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/d8fd3797d07faa04d98c33c6c96ee09f-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"d8fd3797d07faa04d98c33c6c96ee09f\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Apr 02 13:49:26 crc kubenswrapper[4732]: I0402 13:49:26.862538 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/d8fd3797d07faa04d98c33c6c96ee09f-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"d8fd3797d07faa04d98c33c6c96ee09f\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Apr 02 13:49:26 crc kubenswrapper[4732]: I0402 13:49:26.862581 4732 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/815516d0756bb9282f4d0a28cef72670-resource-dir\") on node \"crc\" DevicePath \"\"" Apr 02 13:49:26 crc kubenswrapper[4732]: I0402 13:49:26.862592 4732 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/815516d0756bb9282f4d0a28cef72670-cert-dir\") on node \"crc\" DevicePath \"\"" Apr 02 13:49:26 crc kubenswrapper[4732]: I0402 13:49:26.862638 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/d8fd3797d07faa04d98c33c6c96ee09f-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"d8fd3797d07faa04d98c33c6c96ee09f\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Apr 02 13:49:26 crc kubenswrapper[4732]: I0402 13:49:26.862665 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/d8fd3797d07faa04d98c33c6c96ee09f-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"d8fd3797d07faa04d98c33c6c96ee09f\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Apr 02 13:49:27 crc kubenswrapper[4732]: I0402 13:49:27.021991 4732 generic.go:334] "Generic (PLEG): container finished" podID="7cc10dea-f213-4bc6-b048-6496c3d0902c" containerID="54050dbbf6412043b6e5d222ececd5f51a16684f6129a96cd451629585650b91" exitCode=0 Apr 02 13:49:27 crc kubenswrapper[4732]: I0402 13:49:27.022122 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-8-crc" event={"ID":"7cc10dea-f213-4bc6-b048-6496c3d0902c","Type":"ContainerDied","Data":"54050dbbf6412043b6e5d222ececd5f51a16684f6129a96cd451629585650b91"} Apr 02 13:49:27 crc kubenswrapper[4732]: I0402 13:49:27.024535 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-crc_815516d0756bb9282f4d0a28cef72670/kube-scheduler-cert-syncer/0.log" Apr 02 13:49:27 crc kubenswrapper[4732]: I0402 13:49:27.025846 4732 generic.go:334] "Generic (PLEG): container finished" podID="815516d0756bb9282f4d0a28cef72670" containerID="1e4dabd56041117649788f504b9c27c19def6d0699d4cb6d55cd4d16aed89a13" exitCode=0 Apr 02 13:49:27 crc kubenswrapper[4732]: I0402 13:49:27.025920 4732 generic.go:334] "Generic (PLEG): container finished" podID="815516d0756bb9282f4d0a28cef72670" containerID="ca9c3b985666916a34338e9f991c3bea2b33029e002346a256d5f70d34c5a009" exitCode=2 Apr 02 13:49:27 crc kubenswrapper[4732]: I0402 13:49:27.025934 4732 
generic.go:334] "Generic (PLEG): container finished" podID="815516d0756bb9282f4d0a28cef72670" containerID="a06c8fb9cbb6779ec889350af216d97764fbc5b1426de87ec0d07df751528119" exitCode=0 Apr 02 13:49:27 crc kubenswrapper[4732]: I0402 13:49:27.025975 4732 scope.go:117] "RemoveContainer" containerID="1e4dabd56041117649788f504b9c27c19def6d0699d4cb6d55cd4d16aed89a13" Apr 02 13:49:27 crc kubenswrapper[4732]: I0402 13:49:27.027064 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Apr 02 13:49:27 crc kubenswrapper[4732]: I0402 13:49:27.046606 4732 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" oldPodUID="815516d0756bb9282f4d0a28cef72670" podUID="d8fd3797d07faa04d98c33c6c96ee09f" Apr 02 13:49:27 crc kubenswrapper[4732]: I0402 13:49:27.049725 4732 scope.go:117] "RemoveContainer" containerID="ca9c3b985666916a34338e9f991c3bea2b33029e002346a256d5f70d34c5a009" Apr 02 13:49:27 crc kubenswrapper[4732]: I0402 13:49:27.069984 4732 scope.go:117] "RemoveContainer" containerID="a06c8fb9cbb6779ec889350af216d97764fbc5b1426de87ec0d07df751528119" Apr 02 13:49:27 crc kubenswrapper[4732]: I0402 13:49:27.083453 4732 scope.go:117] "RemoveContainer" containerID="f32f0b52a1cae674c9138498583a3bcfbdf8a79fb8948fd230a9af6aded4f582" Apr 02 13:49:27 crc kubenswrapper[4732]: I0402 13:49:27.098412 4732 scope.go:117] "RemoveContainer" containerID="1e4dabd56041117649788f504b9c27c19def6d0699d4cb6d55cd4d16aed89a13" Apr 02 13:49:27 crc kubenswrapper[4732]: E0402 13:49:27.098799 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e4dabd56041117649788f504b9c27c19def6d0699d4cb6d55cd4d16aed89a13\": container with ID starting with 1e4dabd56041117649788f504b9c27c19def6d0699d4cb6d55cd4d16aed89a13 not found: ID does not exist" 
containerID="1e4dabd56041117649788f504b9c27c19def6d0699d4cb6d55cd4d16aed89a13" Apr 02 13:49:27 crc kubenswrapper[4732]: I0402 13:49:27.098838 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e4dabd56041117649788f504b9c27c19def6d0699d4cb6d55cd4d16aed89a13"} err="failed to get container status \"1e4dabd56041117649788f504b9c27c19def6d0699d4cb6d55cd4d16aed89a13\": rpc error: code = NotFound desc = could not find container \"1e4dabd56041117649788f504b9c27c19def6d0699d4cb6d55cd4d16aed89a13\": container with ID starting with 1e4dabd56041117649788f504b9c27c19def6d0699d4cb6d55cd4d16aed89a13 not found: ID does not exist" Apr 02 13:49:27 crc kubenswrapper[4732]: I0402 13:49:27.098864 4732 scope.go:117] "RemoveContainer" containerID="ca9c3b985666916a34338e9f991c3bea2b33029e002346a256d5f70d34c5a009" Apr 02 13:49:27 crc kubenswrapper[4732]: E0402 13:49:27.099198 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca9c3b985666916a34338e9f991c3bea2b33029e002346a256d5f70d34c5a009\": container with ID starting with ca9c3b985666916a34338e9f991c3bea2b33029e002346a256d5f70d34c5a009 not found: ID does not exist" containerID="ca9c3b985666916a34338e9f991c3bea2b33029e002346a256d5f70d34c5a009" Apr 02 13:49:27 crc kubenswrapper[4732]: I0402 13:49:27.099306 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca9c3b985666916a34338e9f991c3bea2b33029e002346a256d5f70d34c5a009"} err="failed to get container status \"ca9c3b985666916a34338e9f991c3bea2b33029e002346a256d5f70d34c5a009\": rpc error: code = NotFound desc = could not find container \"ca9c3b985666916a34338e9f991c3bea2b33029e002346a256d5f70d34c5a009\": container with ID starting with ca9c3b985666916a34338e9f991c3bea2b33029e002346a256d5f70d34c5a009 not found: ID does not exist" Apr 02 13:49:27 crc kubenswrapper[4732]: I0402 13:49:27.099405 4732 scope.go:117] 
"RemoveContainer" containerID="a06c8fb9cbb6779ec889350af216d97764fbc5b1426de87ec0d07df751528119" Apr 02 13:49:27 crc kubenswrapper[4732]: E0402 13:49:27.099793 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a06c8fb9cbb6779ec889350af216d97764fbc5b1426de87ec0d07df751528119\": container with ID starting with a06c8fb9cbb6779ec889350af216d97764fbc5b1426de87ec0d07df751528119 not found: ID does not exist" containerID="a06c8fb9cbb6779ec889350af216d97764fbc5b1426de87ec0d07df751528119" Apr 02 13:49:27 crc kubenswrapper[4732]: I0402 13:49:27.099821 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a06c8fb9cbb6779ec889350af216d97764fbc5b1426de87ec0d07df751528119"} err="failed to get container status \"a06c8fb9cbb6779ec889350af216d97764fbc5b1426de87ec0d07df751528119\": rpc error: code = NotFound desc = could not find container \"a06c8fb9cbb6779ec889350af216d97764fbc5b1426de87ec0d07df751528119\": container with ID starting with a06c8fb9cbb6779ec889350af216d97764fbc5b1426de87ec0d07df751528119 not found: ID does not exist" Apr 02 13:49:27 crc kubenswrapper[4732]: I0402 13:49:27.099836 4732 scope.go:117] "RemoveContainer" containerID="f32f0b52a1cae674c9138498583a3bcfbdf8a79fb8948fd230a9af6aded4f582" Apr 02 13:49:27 crc kubenswrapper[4732]: E0402 13:49:27.100054 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f32f0b52a1cae674c9138498583a3bcfbdf8a79fb8948fd230a9af6aded4f582\": container with ID starting with f32f0b52a1cae674c9138498583a3bcfbdf8a79fb8948fd230a9af6aded4f582 not found: ID does not exist" containerID="f32f0b52a1cae674c9138498583a3bcfbdf8a79fb8948fd230a9af6aded4f582" Apr 02 13:49:27 crc kubenswrapper[4732]: I0402 13:49:27.100077 4732 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f32f0b52a1cae674c9138498583a3bcfbdf8a79fb8948fd230a9af6aded4f582"} err="failed to get container status \"f32f0b52a1cae674c9138498583a3bcfbdf8a79fb8948fd230a9af6aded4f582\": rpc error: code = NotFound desc = could not find container \"f32f0b52a1cae674c9138498583a3bcfbdf8a79fb8948fd230a9af6aded4f582\": container with ID starting with f32f0b52a1cae674c9138498583a3bcfbdf8a79fb8948fd230a9af6aded4f582 not found: ID does not exist" Apr 02 13:49:27 crc kubenswrapper[4732]: I0402 13:49:27.100089 4732 scope.go:117] "RemoveContainer" containerID="1e4dabd56041117649788f504b9c27c19def6d0699d4cb6d55cd4d16aed89a13" Apr 02 13:49:27 crc kubenswrapper[4732]: I0402 13:49:27.100258 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e4dabd56041117649788f504b9c27c19def6d0699d4cb6d55cd4d16aed89a13"} err="failed to get container status \"1e4dabd56041117649788f504b9c27c19def6d0699d4cb6d55cd4d16aed89a13\": rpc error: code = NotFound desc = could not find container \"1e4dabd56041117649788f504b9c27c19def6d0699d4cb6d55cd4d16aed89a13\": container with ID starting with 1e4dabd56041117649788f504b9c27c19def6d0699d4cb6d55cd4d16aed89a13 not found: ID does not exist" Apr 02 13:49:27 crc kubenswrapper[4732]: I0402 13:49:27.100275 4732 scope.go:117] "RemoveContainer" containerID="ca9c3b985666916a34338e9f991c3bea2b33029e002346a256d5f70d34c5a009" Apr 02 13:49:27 crc kubenswrapper[4732]: I0402 13:49:27.100454 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca9c3b985666916a34338e9f991c3bea2b33029e002346a256d5f70d34c5a009"} err="failed to get container status \"ca9c3b985666916a34338e9f991c3bea2b33029e002346a256d5f70d34c5a009\": rpc error: code = NotFound desc = could not find container \"ca9c3b985666916a34338e9f991c3bea2b33029e002346a256d5f70d34c5a009\": container with ID starting with ca9c3b985666916a34338e9f991c3bea2b33029e002346a256d5f70d34c5a009 not found: ID does not 
exist" Apr 02 13:49:27 crc kubenswrapper[4732]: I0402 13:49:27.100552 4732 scope.go:117] "RemoveContainer" containerID="a06c8fb9cbb6779ec889350af216d97764fbc5b1426de87ec0d07df751528119" Apr 02 13:49:27 crc kubenswrapper[4732]: I0402 13:49:27.100860 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a06c8fb9cbb6779ec889350af216d97764fbc5b1426de87ec0d07df751528119"} err="failed to get container status \"a06c8fb9cbb6779ec889350af216d97764fbc5b1426de87ec0d07df751528119\": rpc error: code = NotFound desc = could not find container \"a06c8fb9cbb6779ec889350af216d97764fbc5b1426de87ec0d07df751528119\": container with ID starting with a06c8fb9cbb6779ec889350af216d97764fbc5b1426de87ec0d07df751528119 not found: ID does not exist" Apr 02 13:49:27 crc kubenswrapper[4732]: I0402 13:49:27.100882 4732 scope.go:117] "RemoveContainer" containerID="f32f0b52a1cae674c9138498583a3bcfbdf8a79fb8948fd230a9af6aded4f582" Apr 02 13:49:27 crc kubenswrapper[4732]: I0402 13:49:27.101312 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f32f0b52a1cae674c9138498583a3bcfbdf8a79fb8948fd230a9af6aded4f582"} err="failed to get container status \"f32f0b52a1cae674c9138498583a3bcfbdf8a79fb8948fd230a9af6aded4f582\": rpc error: code = NotFound desc = could not find container \"f32f0b52a1cae674c9138498583a3bcfbdf8a79fb8948fd230a9af6aded4f582\": container with ID starting with f32f0b52a1cae674c9138498583a3bcfbdf8a79fb8948fd230a9af6aded4f582 not found: ID does not exist" Apr 02 13:49:27 crc kubenswrapper[4732]: I0402 13:49:27.101336 4732 scope.go:117] "RemoveContainer" containerID="1e4dabd56041117649788f504b9c27c19def6d0699d4cb6d55cd4d16aed89a13" Apr 02 13:49:27 crc kubenswrapper[4732]: I0402 13:49:27.101555 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e4dabd56041117649788f504b9c27c19def6d0699d4cb6d55cd4d16aed89a13"} err="failed to get container status 
\"1e4dabd56041117649788f504b9c27c19def6d0699d4cb6d55cd4d16aed89a13\": rpc error: code = NotFound desc = could not find container \"1e4dabd56041117649788f504b9c27c19def6d0699d4cb6d55cd4d16aed89a13\": container with ID starting with 1e4dabd56041117649788f504b9c27c19def6d0699d4cb6d55cd4d16aed89a13 not found: ID does not exist" Apr 02 13:49:27 crc kubenswrapper[4732]: I0402 13:49:27.101691 4732 scope.go:117] "RemoveContainer" containerID="ca9c3b985666916a34338e9f991c3bea2b33029e002346a256d5f70d34c5a009" Apr 02 13:49:27 crc kubenswrapper[4732]: I0402 13:49:27.102039 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca9c3b985666916a34338e9f991c3bea2b33029e002346a256d5f70d34c5a009"} err="failed to get container status \"ca9c3b985666916a34338e9f991c3bea2b33029e002346a256d5f70d34c5a009\": rpc error: code = NotFound desc = could not find container \"ca9c3b985666916a34338e9f991c3bea2b33029e002346a256d5f70d34c5a009\": container with ID starting with ca9c3b985666916a34338e9f991c3bea2b33029e002346a256d5f70d34c5a009 not found: ID does not exist" Apr 02 13:49:27 crc kubenswrapper[4732]: I0402 13:49:27.102060 4732 scope.go:117] "RemoveContainer" containerID="a06c8fb9cbb6779ec889350af216d97764fbc5b1426de87ec0d07df751528119" Apr 02 13:49:27 crc kubenswrapper[4732]: I0402 13:49:27.102366 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a06c8fb9cbb6779ec889350af216d97764fbc5b1426de87ec0d07df751528119"} err="failed to get container status \"a06c8fb9cbb6779ec889350af216d97764fbc5b1426de87ec0d07df751528119\": rpc error: code = NotFound desc = could not find container \"a06c8fb9cbb6779ec889350af216d97764fbc5b1426de87ec0d07df751528119\": container with ID starting with a06c8fb9cbb6779ec889350af216d97764fbc5b1426de87ec0d07df751528119 not found: ID does not exist" Apr 02 13:49:27 crc kubenswrapper[4732]: I0402 13:49:27.102392 4732 scope.go:117] "RemoveContainer" 
containerID="f32f0b52a1cae674c9138498583a3bcfbdf8a79fb8948fd230a9af6aded4f582" Apr 02 13:49:27 crc kubenswrapper[4732]: I0402 13:49:27.102558 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f32f0b52a1cae674c9138498583a3bcfbdf8a79fb8948fd230a9af6aded4f582"} err="failed to get container status \"f32f0b52a1cae674c9138498583a3bcfbdf8a79fb8948fd230a9af6aded4f582\": rpc error: code = NotFound desc = could not find container \"f32f0b52a1cae674c9138498583a3bcfbdf8a79fb8948fd230a9af6aded4f582\": container with ID starting with f32f0b52a1cae674c9138498583a3bcfbdf8a79fb8948fd230a9af6aded4f582 not found: ID does not exist" Apr 02 13:49:28 crc kubenswrapper[4732]: I0402 13:49:28.284857 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-8-crc" Apr 02 13:49:28 crc kubenswrapper[4732]: I0402 13:49:28.305381 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7cc10dea-f213-4bc6-b048-6496c3d0902c-kube-api-access\") pod \"7cc10dea-f213-4bc6-b048-6496c3d0902c\" (UID: \"7cc10dea-f213-4bc6-b048-6496c3d0902c\") " Apr 02 13:49:28 crc kubenswrapper[4732]: I0402 13:49:28.305435 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7cc10dea-f213-4bc6-b048-6496c3d0902c-var-lock\") pod \"7cc10dea-f213-4bc6-b048-6496c3d0902c\" (UID: \"7cc10dea-f213-4bc6-b048-6496c3d0902c\") " Apr 02 13:49:28 crc kubenswrapper[4732]: I0402 13:49:28.305496 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7cc10dea-f213-4bc6-b048-6496c3d0902c-kubelet-dir\") pod \"7cc10dea-f213-4bc6-b048-6496c3d0902c\" (UID: \"7cc10dea-f213-4bc6-b048-6496c3d0902c\") " Apr 02 13:49:28 crc kubenswrapper[4732]: I0402 13:49:28.306027 4732 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7cc10dea-f213-4bc6-b048-6496c3d0902c-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "7cc10dea-f213-4bc6-b048-6496c3d0902c" (UID: "7cc10dea-f213-4bc6-b048-6496c3d0902c"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 13:49:28 crc kubenswrapper[4732]: I0402 13:49:28.305851 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7cc10dea-f213-4bc6-b048-6496c3d0902c-var-lock" (OuterVolumeSpecName: "var-lock") pod "7cc10dea-f213-4bc6-b048-6496c3d0902c" (UID: "7cc10dea-f213-4bc6-b048-6496c3d0902c"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 13:49:28 crc kubenswrapper[4732]: I0402 13:49:28.322954 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cc10dea-f213-4bc6-b048-6496c3d0902c-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "7cc10dea-f213-4bc6-b048-6496c3d0902c" (UID: "7cc10dea-f213-4bc6-b048-6496c3d0902c"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:49:28 crc kubenswrapper[4732]: I0402 13:49:28.407695 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7cc10dea-f213-4bc6-b048-6496c3d0902c-kube-api-access\") on node \"crc\" DevicePath \"\"" Apr 02 13:49:28 crc kubenswrapper[4732]: I0402 13:49:28.407742 4732 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7cc10dea-f213-4bc6-b048-6496c3d0902c-var-lock\") on node \"crc\" DevicePath \"\"" Apr 02 13:49:28 crc kubenswrapper[4732]: I0402 13:49:28.407754 4732 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7cc10dea-f213-4bc6-b048-6496c3d0902c-kubelet-dir\") on node \"crc\" DevicePath \"\"" Apr 02 13:49:28 crc kubenswrapper[4732]: I0402 13:49:28.688120 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="815516d0756bb9282f4d0a28cef72670" path="/var/lib/kubelet/pods/815516d0756bb9282f4d0a28cef72670/volumes" Apr 02 13:49:29 crc kubenswrapper[4732]: I0402 13:49:29.042936 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-8-crc" event={"ID":"7cc10dea-f213-4bc6-b048-6496c3d0902c","Type":"ContainerDied","Data":"b03cc473c4c3c98750329e51dcdae0995ba2665f6638f5798da5f2c52cf917bc"} Apr 02 13:49:29 crc kubenswrapper[4732]: I0402 13:49:29.042983 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b03cc473c4c3c98750329e51dcdae0995ba2665f6638f5798da5f2c52cf917bc" Apr 02 13:49:29 crc kubenswrapper[4732]: I0402 13:49:29.043572 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-8-crc" Apr 02 13:49:31 crc kubenswrapper[4732]: I0402 13:49:31.924961 4732 patch_prober.go:28] interesting pod/machine-config-daemon-6vtmw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 02 13:49:31 crc kubenswrapper[4732]: I0402 13:49:31.926510 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 02 13:49:35 crc kubenswrapper[4732]: I0402 13:49:35.698575 4732 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Apr 02 13:49:35 crc kubenswrapper[4732]: I0402 13:49:35.699287 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="235e9295064844132a05dc40ef3a886a" containerName="kube-controller-manager" containerID="cri-o://5c716ed2094065ff92ec97fb32bca6973e69e1cd0018780759102dde6b4df062" gracePeriod=30 Apr 02 13:49:35 crc kubenswrapper[4732]: I0402 13:49:35.699364 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="235e9295064844132a05dc40ef3a886a" containerName="kube-controller-manager-cert-syncer" containerID="cri-o://73c0dd13f9fc718080de9764a6d5dbfc2998a838fb22d3778e131c10889b83ec" gracePeriod=30 Apr 02 13:49:35 crc kubenswrapper[4732]: I0402 13:49:35.699382 4732 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="235e9295064844132a05dc40ef3a886a" containerName="cluster-policy-controller" containerID="cri-o://cf6d40837f21f6428d52f3bb75f667c0ebf856b0af510889a9695093c21b6885" gracePeriod=30 Apr 02 13:49:35 crc kubenswrapper[4732]: I0402 13:49:35.699398 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="235e9295064844132a05dc40ef3a886a" containerName="kube-controller-manager-recovery-controller" containerID="cri-o://07bfdaa7a1fbf8b3c33ccd90050537f2c5efaab4563f16feace73117b1bcdb91" gracePeriod=30 Apr 02 13:49:35 crc kubenswrapper[4732]: I0402 13:49:35.701017 4732 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Apr 02 13:49:35 crc kubenswrapper[4732]: E0402 13:49:35.701318 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cc10dea-f213-4bc6-b048-6496c3d0902c" containerName="installer" Apr 02 13:49:35 crc kubenswrapper[4732]: I0402 13:49:35.701337 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cc10dea-f213-4bc6-b048-6496c3d0902c" containerName="installer" Apr 02 13:49:35 crc kubenswrapper[4732]: E0402 13:49:35.701355 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="235e9295064844132a05dc40ef3a886a" containerName="kube-controller-manager" Apr 02 13:49:35 crc kubenswrapper[4732]: I0402 13:49:35.701367 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="235e9295064844132a05dc40ef3a886a" containerName="kube-controller-manager" Apr 02 13:49:35 crc kubenswrapper[4732]: E0402 13:49:35.701385 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="235e9295064844132a05dc40ef3a886a" containerName="cluster-policy-controller" Apr 02 13:49:35 crc kubenswrapper[4732]: I0402 13:49:35.701397 4732 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="235e9295064844132a05dc40ef3a886a" containerName="cluster-policy-controller" Apr 02 13:49:35 crc kubenswrapper[4732]: E0402 13:49:35.701423 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="235e9295064844132a05dc40ef3a886a" containerName="kube-controller-manager-recovery-controller" Apr 02 13:49:35 crc kubenswrapper[4732]: I0402 13:49:35.701437 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="235e9295064844132a05dc40ef3a886a" containerName="kube-controller-manager-recovery-controller" Apr 02 13:49:35 crc kubenswrapper[4732]: E0402 13:49:35.701461 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="235e9295064844132a05dc40ef3a886a" containerName="kube-controller-manager-cert-syncer" Apr 02 13:49:35 crc kubenswrapper[4732]: I0402 13:49:35.701474 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="235e9295064844132a05dc40ef3a886a" containerName="kube-controller-manager-cert-syncer" Apr 02 13:49:35 crc kubenswrapper[4732]: I0402 13:49:35.701667 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="235e9295064844132a05dc40ef3a886a" containerName="kube-controller-manager" Apr 02 13:49:35 crc kubenswrapper[4732]: I0402 13:49:35.701688 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="235e9295064844132a05dc40ef3a886a" containerName="cluster-policy-controller" Apr 02 13:49:35 crc kubenswrapper[4732]: I0402 13:49:35.701707 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cc10dea-f213-4bc6-b048-6496c3d0902c" containerName="installer" Apr 02 13:49:35 crc kubenswrapper[4732]: I0402 13:49:35.701731 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="235e9295064844132a05dc40ef3a886a" containerName="kube-controller-manager-recovery-controller" Apr 02 13:49:35 crc kubenswrapper[4732]: I0402 13:49:35.701757 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="235e9295064844132a05dc40ef3a886a" 
containerName="kube-controller-manager-cert-syncer" Apr 02 13:49:35 crc kubenswrapper[4732]: I0402 13:49:35.838248 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/c32a96981201f35bdc64ba062620676a-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"c32a96981201f35bdc64ba062620676a\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 02 13:49:35 crc kubenswrapper[4732]: I0402 13:49:35.838533 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/c32a96981201f35bdc64ba062620676a-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"c32a96981201f35bdc64ba062620676a\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 02 13:49:35 crc kubenswrapper[4732]: I0402 13:49:35.902689 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_235e9295064844132a05dc40ef3a886a/kube-controller-manager-cert-syncer/0.log" Apr 02 13:49:35 crc kubenswrapper[4732]: I0402 13:49:35.903817 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 02 13:49:35 crc kubenswrapper[4732]: I0402 13:49:35.906976 4732 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-crc" oldPodUID="235e9295064844132a05dc40ef3a886a" podUID="c32a96981201f35bdc64ba062620676a" Apr 02 13:49:35 crc kubenswrapper[4732]: I0402 13:49:35.939821 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/c32a96981201f35bdc64ba062620676a-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"c32a96981201f35bdc64ba062620676a\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 02 13:49:35 crc kubenswrapper[4732]: I0402 13:49:35.939875 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/c32a96981201f35bdc64ba062620676a-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"c32a96981201f35bdc64ba062620676a\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 02 13:49:35 crc kubenswrapper[4732]: I0402 13:49:35.939972 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/c32a96981201f35bdc64ba062620676a-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"c32a96981201f35bdc64ba062620676a\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 02 13:49:35 crc kubenswrapper[4732]: I0402 13:49:35.940383 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/c32a96981201f35bdc64ba062620676a-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"c32a96981201f35bdc64ba062620676a\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 02 13:49:36 crc 
kubenswrapper[4732]: I0402 13:49:36.040544 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/235e9295064844132a05dc40ef3a886a-resource-dir\") pod \"235e9295064844132a05dc40ef3a886a\" (UID: \"235e9295064844132a05dc40ef3a886a\") " Apr 02 13:49:36 crc kubenswrapper[4732]: I0402 13:49:36.040649 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/235e9295064844132a05dc40ef3a886a-cert-dir\") pod \"235e9295064844132a05dc40ef3a886a\" (UID: \"235e9295064844132a05dc40ef3a886a\") " Apr 02 13:49:36 crc kubenswrapper[4732]: I0402 13:49:36.040885 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/235e9295064844132a05dc40ef3a886a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "235e9295064844132a05dc40ef3a886a" (UID: "235e9295064844132a05dc40ef3a886a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 13:49:36 crc kubenswrapper[4732]: I0402 13:49:36.040937 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/235e9295064844132a05dc40ef3a886a-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "235e9295064844132a05dc40ef3a886a" (UID: "235e9295064844132a05dc40ef3a886a"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 13:49:36 crc kubenswrapper[4732]: I0402 13:49:36.100572 4732 generic.go:334] "Generic (PLEG): container finished" podID="a8a46753-7553-4d02-9a3e-fdea2b938401" containerID="077e20f616dd0bbdfcfe3e19a35fffadc2ca7d93a2048a68a6bd7023d19b0b38" exitCode=0 Apr 02 13:49:36 crc kubenswrapper[4732]: I0402 13:49:36.100692 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-12-crc" event={"ID":"a8a46753-7553-4d02-9a3e-fdea2b938401","Type":"ContainerDied","Data":"077e20f616dd0bbdfcfe3e19a35fffadc2ca7d93a2048a68a6bd7023d19b0b38"} Apr 02 13:49:36 crc kubenswrapper[4732]: I0402 13:49:36.106996 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_235e9295064844132a05dc40ef3a886a/kube-controller-manager-cert-syncer/0.log" Apr 02 13:49:36 crc kubenswrapper[4732]: I0402 13:49:36.108851 4732 generic.go:334] "Generic (PLEG): container finished" podID="235e9295064844132a05dc40ef3a886a" containerID="07bfdaa7a1fbf8b3c33ccd90050537f2c5efaab4563f16feace73117b1bcdb91" exitCode=0 Apr 02 13:49:36 crc kubenswrapper[4732]: I0402 13:49:36.108896 4732 generic.go:334] "Generic (PLEG): container finished" podID="235e9295064844132a05dc40ef3a886a" containerID="73c0dd13f9fc718080de9764a6d5dbfc2998a838fb22d3778e131c10889b83ec" exitCode=2 Apr 02 13:49:36 crc kubenswrapper[4732]: I0402 13:49:36.108912 4732 generic.go:334] "Generic (PLEG): container finished" podID="235e9295064844132a05dc40ef3a886a" containerID="cf6d40837f21f6428d52f3bb75f667c0ebf856b0af510889a9695093c21b6885" exitCode=0 Apr 02 13:49:36 crc kubenswrapper[4732]: I0402 13:49:36.108929 4732 generic.go:334] "Generic (PLEG): container finished" podID="235e9295064844132a05dc40ef3a886a" containerID="5c716ed2094065ff92ec97fb32bca6973e69e1cd0018780759102dde6b4df062" exitCode=0 Apr 02 13:49:36 crc kubenswrapper[4732]: I0402 13:49:36.108947 4732 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 02 13:49:36 crc kubenswrapper[4732]: I0402 13:49:36.108971 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="853f32b8ecb8d68c2afb0765bbfd69b4645b7d30511096592b337a1c09167f3b" Apr 02 13:49:36 crc kubenswrapper[4732]: I0402 13:49:36.124028 4732 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-crc" oldPodUID="235e9295064844132a05dc40ef3a886a" podUID="c32a96981201f35bdc64ba062620676a" Apr 02 13:49:36 crc kubenswrapper[4732]: I0402 13:49:36.130821 4732 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-crc" oldPodUID="235e9295064844132a05dc40ef3a886a" podUID="c32a96981201f35bdc64ba062620676a" Apr 02 13:49:36 crc kubenswrapper[4732]: I0402 13:49:36.142333 4732 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/235e9295064844132a05dc40ef3a886a-resource-dir\") on node \"crc\" DevicePath \"\"" Apr 02 13:49:36 crc kubenswrapper[4732]: I0402 13:49:36.142451 4732 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/235e9295064844132a05dc40ef3a886a-cert-dir\") on node \"crc\" DevicePath \"\"" Apr 02 13:49:36 crc kubenswrapper[4732]: I0402 13:49:36.686707 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="235e9295064844132a05dc40ef3a886a" path="/var/lib/kubelet/pods/235e9295064844132a05dc40ef3a886a/volumes" Apr 02 13:49:37 crc kubenswrapper[4732]: I0402 13:49:37.344291 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-12-crc" Apr 02 13:49:37 crc kubenswrapper[4732]: I0402 13:49:37.458432 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a8a46753-7553-4d02-9a3e-fdea2b938401-kube-api-access\") pod \"a8a46753-7553-4d02-9a3e-fdea2b938401\" (UID: \"a8a46753-7553-4d02-9a3e-fdea2b938401\") " Apr 02 13:49:37 crc kubenswrapper[4732]: I0402 13:49:37.458525 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a8a46753-7553-4d02-9a3e-fdea2b938401-kubelet-dir\") pod \"a8a46753-7553-4d02-9a3e-fdea2b938401\" (UID: \"a8a46753-7553-4d02-9a3e-fdea2b938401\") " Apr 02 13:49:37 crc kubenswrapper[4732]: I0402 13:49:37.458581 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a8a46753-7553-4d02-9a3e-fdea2b938401-var-lock\") pod \"a8a46753-7553-4d02-9a3e-fdea2b938401\" (UID: \"a8a46753-7553-4d02-9a3e-fdea2b938401\") " Apr 02 13:49:37 crc kubenswrapper[4732]: I0402 13:49:37.458777 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a8a46753-7553-4d02-9a3e-fdea2b938401-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a8a46753-7553-4d02-9a3e-fdea2b938401" (UID: "a8a46753-7553-4d02-9a3e-fdea2b938401"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 13:49:37 crc kubenswrapper[4732]: I0402 13:49:37.458845 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a8a46753-7553-4d02-9a3e-fdea2b938401-var-lock" (OuterVolumeSpecName: "var-lock") pod "a8a46753-7553-4d02-9a3e-fdea2b938401" (UID: "a8a46753-7553-4d02-9a3e-fdea2b938401"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 13:49:37 crc kubenswrapper[4732]: I0402 13:49:37.458999 4732 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a8a46753-7553-4d02-9a3e-fdea2b938401-kubelet-dir\") on node \"crc\" DevicePath \"\"" Apr 02 13:49:37 crc kubenswrapper[4732]: I0402 13:49:37.459023 4732 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a8a46753-7553-4d02-9a3e-fdea2b938401-var-lock\") on node \"crc\" DevicePath \"\"" Apr 02 13:49:37 crc kubenswrapper[4732]: I0402 13:49:37.466771 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8a46753-7553-4d02-9a3e-fdea2b938401-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a8a46753-7553-4d02-9a3e-fdea2b938401" (UID: "a8a46753-7553-4d02-9a3e-fdea2b938401"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:49:37 crc kubenswrapper[4732]: I0402 13:49:37.559863 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a8a46753-7553-4d02-9a3e-fdea2b938401-kube-api-access\") on node \"crc\" DevicePath \"\"" Apr 02 13:49:38 crc kubenswrapper[4732]: I0402 13:49:38.135539 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-12-crc" Apr 02 13:49:38 crc kubenswrapper[4732]: I0402 13:49:38.140107 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-12-crc" event={"ID":"a8a46753-7553-4d02-9a3e-fdea2b938401","Type":"ContainerDied","Data":"daba196e90d4806ca0760ecdda0aaf291e45124b21467d1765e9c408cd593ec0"} Apr 02 13:49:38 crc kubenswrapper[4732]: I0402 13:49:38.140234 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="daba196e90d4806ca0760ecdda0aaf291e45124b21467d1765e9c408cd593ec0" Apr 02 13:49:39 crc kubenswrapper[4732]: I0402 13:49:39.680010 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Apr 02 13:49:39 crc kubenswrapper[4732]: I0402 13:49:39.705705 4732 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="42dfcae1-0261-43f8-97e3-fe64c9532cc1" Apr 02 13:49:39 crc kubenswrapper[4732]: I0402 13:49:39.705745 4732 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="42dfcae1-0261-43f8-97e3-fe64c9532cc1" Apr 02 13:49:39 crc kubenswrapper[4732]: I0402 13:49:39.715451 4732 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Apr 02 13:49:39 crc kubenswrapper[4732]: I0402 13:49:39.725207 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Apr 02 13:49:39 crc kubenswrapper[4732]: I0402 13:49:39.728951 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Apr 02 13:49:39 crc kubenswrapper[4732]: I0402 13:49:39.734348 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Apr 02 13:49:39 crc kubenswrapper[4732]: I0402 13:49:39.741327 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Apr 02 13:49:40 crc kubenswrapper[4732]: I0402 13:49:40.153579 4732 generic.go:334] "Generic (PLEG): container finished" podID="d8fd3797d07faa04d98c33c6c96ee09f" containerID="976b00991ba38d2c1854d7dda699ab78aaf707022190e0ccee565c7c4e72bd7b" exitCode=0 Apr 02 13:49:40 crc kubenswrapper[4732]: I0402 13:49:40.153705 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"d8fd3797d07faa04d98c33c6c96ee09f","Type":"ContainerDied","Data":"976b00991ba38d2c1854d7dda699ab78aaf707022190e0ccee565c7c4e72bd7b"} Apr 02 13:49:40 crc kubenswrapper[4732]: I0402 13:49:40.153778 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"d8fd3797d07faa04d98c33c6c96ee09f","Type":"ContainerStarted","Data":"7f0a512a7021294bcd990c67776f75c25857dd746f2f851b43878f63f9b00eca"} Apr 02 13:49:41 crc kubenswrapper[4732]: I0402 13:49:41.161721 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"d8fd3797d07faa04d98c33c6c96ee09f","Type":"ContainerStarted","Data":"4ed059d73d5177f68a46f298ec3eb39456b441969b2af7dda98d95ad86334462"} Apr 02 13:49:41 crc kubenswrapper[4732]: I0402 13:49:41.162036 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"d8fd3797d07faa04d98c33c6c96ee09f","Type":"ContainerStarted","Data":"475b46f5f5254dffe2ff508dca82a81dc2726ca71651984fdbc73b19dab3f944"} Apr 02 
13:49:41 crc kubenswrapper[4732]: I0402 13:49:41.162049 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"d8fd3797d07faa04d98c33c6c96ee09f","Type":"ContainerStarted","Data":"a73db20091e3d124fdd48c59da82503a6d86f2f91e2943ab663e8086e51aea46"} Apr 02 13:49:41 crc kubenswrapper[4732]: I0402 13:49:41.162270 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Apr 02 13:49:41 crc kubenswrapper[4732]: I0402 13:49:41.177657 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=2.177638104 podStartE2EDuration="2.177638104s" podCreationTimestamp="2026-04-02 13:49:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:49:41.174832198 +0000 UTC m=+738.079239771" watchObservedRunningTime="2026-04-02 13:49:41.177638104 +0000 UTC m=+738.082045657" Apr 02 13:49:45 crc kubenswrapper[4732]: I0402 13:49:45.377651 4732 scope.go:117] "RemoveContainer" containerID="4f97db3ca878363f668def2b878f242e7584741f7042e02345e8ff27a6849eaa" Apr 02 13:49:45 crc kubenswrapper[4732]: I0402 13:49:45.410084 4732 scope.go:117] "RemoveContainer" containerID="4fc7dc4786a810a2f97f61576688118cfe6a02779cb70bcb97a45de56c11f618" Apr 02 13:49:50 crc kubenswrapper[4732]: I0402 13:49:50.679791 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 02 13:49:50 crc kubenswrapper[4732]: I0402 13:49:50.693908 4732 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="0be6f2ba-4800-4cde-a4d5-cc947c0a04d6" Apr 02 13:49:50 crc kubenswrapper[4732]: I0402 13:49:50.693934 4732 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="0be6f2ba-4800-4cde-a4d5-cc947c0a04d6" Apr 02 13:49:50 crc kubenswrapper[4732]: I0402 13:49:50.700334 4732 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 02 13:49:50 crc kubenswrapper[4732]: I0402 13:49:50.717489 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 02 13:49:50 crc kubenswrapper[4732]: I0402 13:49:50.718748 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Apr 02 13:49:50 crc kubenswrapper[4732]: I0402 13:49:50.733688 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Apr 02 13:49:50 crc kubenswrapper[4732]: I0402 13:49:50.736822 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Apr 02 13:49:50 crc kubenswrapper[4732]: W0402 13:49:50.740450 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc32a96981201f35bdc64ba062620676a.slice/crio-a36367c0a97fca2791873091f79a0c426509fb35e281ddde288f1e630a81a347 WatchSource:0}: Error finding container a36367c0a97fca2791873091f79a0c426509fb35e281ddde288f1e630a81a347: Status 404 returned error can't find the container with id 
a36367c0a97fca2791873091f79a0c426509fb35e281ddde288f1e630a81a347 Apr 02 13:49:51 crc kubenswrapper[4732]: I0402 13:49:51.232681 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"c32a96981201f35bdc64ba062620676a","Type":"ContainerStarted","Data":"5dcb8da2fa054d1659926ae252b72af447a46132346d2936d32b53279b88ab6d"} Apr 02 13:49:51 crc kubenswrapper[4732]: I0402 13:49:51.232739 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"c32a96981201f35bdc64ba062620676a","Type":"ContainerStarted","Data":"d0fd335b7ea7cbbb67d3ec7dec5f65dadbe0a0e151cc666cc0f13812a984394b"} Apr 02 13:49:51 crc kubenswrapper[4732]: I0402 13:49:51.232755 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"c32a96981201f35bdc64ba062620676a","Type":"ContainerStarted","Data":"a36367c0a97fca2791873091f79a0c426509fb35e281ddde288f1e630a81a347"} Apr 02 13:49:52 crc kubenswrapper[4732]: I0402 13:49:52.244889 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"c32a96981201f35bdc64ba062620676a","Type":"ContainerStarted","Data":"8e16254843bf9a9daf5bfdcbbf6b32aaf2a57ecb0e7ae0360a0f0bd1cd440db4"} Apr 02 13:49:52 crc kubenswrapper[4732]: I0402 13:49:52.245251 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"c32a96981201f35bdc64ba062620676a","Type":"ContainerStarted","Data":"4f7fa6b6b139bfb359390f1adcd49312e8ae790fb7682aa0695cf9848a02c7f8"} Apr 02 13:49:52 crc kubenswrapper[4732]: I0402 13:49:52.277165 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=2.277145145 
podStartE2EDuration="2.277145145s" podCreationTimestamp="2026-04-02 13:49:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:49:52.264760791 +0000 UTC m=+749.169168384" watchObservedRunningTime="2026-04-02 13:49:52.277145145 +0000 UTC m=+749.181552708" Apr 02 13:49:53 crc kubenswrapper[4732]: I0402 13:49:53.090435 4732 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Apr 02 13:49:53 crc kubenswrapper[4732]: I0402 13:49:53.090866 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4e6039c7a12c5a0c0ef5917dc7ee5582" containerName="kube-apiserver" containerID="cri-o://96cc6a0134c07b1897cf08a443d481b40cafe36f9fc7219d1f0b472531c54e50" gracePeriod=15 Apr 02 13:49:53 crc kubenswrapper[4732]: I0402 13:49:53.090944 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4e6039c7a12c5a0c0ef5917dc7ee5582" containerName="kube-apiserver-check-endpoints" containerID="cri-o://2c9bd70787ed9f0ffcf2b5aa836cdc2f70f83b8baed3300ce8244a1130f6edc6" gracePeriod=15 Apr 02 13:49:53 crc kubenswrapper[4732]: I0402 13:49:53.091011 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4e6039c7a12c5a0c0ef5917dc7ee5582" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://154f96ad2e07e914c8919ce03f5a46d1f2c4770c36a0f52d47b1c9f79ba64c1e" gracePeriod=15 Apr 02 13:49:53 crc kubenswrapper[4732]: I0402 13:49:53.091094 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4e6039c7a12c5a0c0ef5917dc7ee5582" containerName="kube-apiserver-cert-regeneration-controller" 
containerID="cri-o://22672e53ab610750e5765dc8ce0aacf7b0848b1e72f7e1924976df8af0e68f30" gracePeriod=15 Apr 02 13:49:53 crc kubenswrapper[4732]: I0402 13:49:53.091103 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4e6039c7a12c5a0c0ef5917dc7ee5582" containerName="kube-apiserver-cert-syncer" containerID="cri-o://060b077035d0d8f3b137497cde23a9a2def5bcac1994d53cb009323a5939cb36" gracePeriod=15 Apr 02 13:49:53 crc kubenswrapper[4732]: I0402 13:49:53.092859 4732 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Apr 02 13:49:53 crc kubenswrapper[4732]: E0402 13:49:53.093118 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8a46753-7553-4d02-9a3e-fdea2b938401" containerName="installer" Apr 02 13:49:53 crc kubenswrapper[4732]: I0402 13:49:53.093144 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8a46753-7553-4d02-9a3e-fdea2b938401" containerName="installer" Apr 02 13:49:53 crc kubenswrapper[4732]: E0402 13:49:53.093162 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e6039c7a12c5a0c0ef5917dc7ee5582" containerName="kube-apiserver" Apr 02 13:49:53 crc kubenswrapper[4732]: I0402 13:49:53.093174 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e6039c7a12c5a0c0ef5917dc7ee5582" containerName="kube-apiserver" Apr 02 13:49:53 crc kubenswrapper[4732]: E0402 13:49:53.093194 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e6039c7a12c5a0c0ef5917dc7ee5582" containerName="kube-apiserver-cert-regeneration-controller" Apr 02 13:49:53 crc kubenswrapper[4732]: I0402 13:49:53.093206 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e6039c7a12c5a0c0ef5917dc7ee5582" containerName="kube-apiserver-cert-regeneration-controller" Apr 02 13:49:53 crc kubenswrapper[4732]: E0402 13:49:53.093223 4732 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4e6039c7a12c5a0c0ef5917dc7ee5582" containerName="setup" Apr 02 13:49:53 crc kubenswrapper[4732]: I0402 13:49:53.093232 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e6039c7a12c5a0c0ef5917dc7ee5582" containerName="setup" Apr 02 13:49:53 crc kubenswrapper[4732]: E0402 13:49:53.093250 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e6039c7a12c5a0c0ef5917dc7ee5582" containerName="kube-apiserver-cert-syncer" Apr 02 13:49:53 crc kubenswrapper[4732]: I0402 13:49:53.093262 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e6039c7a12c5a0c0ef5917dc7ee5582" containerName="kube-apiserver-cert-syncer" Apr 02 13:49:53 crc kubenswrapper[4732]: E0402 13:49:53.093278 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e6039c7a12c5a0c0ef5917dc7ee5582" containerName="kube-apiserver-check-endpoints" Apr 02 13:49:53 crc kubenswrapper[4732]: I0402 13:49:53.093290 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e6039c7a12c5a0c0ef5917dc7ee5582" containerName="kube-apiserver-check-endpoints" Apr 02 13:49:53 crc kubenswrapper[4732]: E0402 13:49:53.093310 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e6039c7a12c5a0c0ef5917dc7ee5582" containerName="kube-apiserver-insecure-readyz" Apr 02 13:49:53 crc kubenswrapper[4732]: I0402 13:49:53.093321 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e6039c7a12c5a0c0ef5917dc7ee5582" containerName="kube-apiserver-insecure-readyz" Apr 02 13:49:53 crc kubenswrapper[4732]: I0402 13:49:53.093468 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e6039c7a12c5a0c0ef5917dc7ee5582" containerName="kube-apiserver-insecure-readyz" Apr 02 13:49:53 crc kubenswrapper[4732]: I0402 13:49:53.093486 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e6039c7a12c5a0c0ef5917dc7ee5582" containerName="kube-apiserver-cert-syncer" Apr 02 13:49:53 crc kubenswrapper[4732]: I0402 13:49:53.093502 4732 
memory_manager.go:354] "RemoveStaleState removing state" podUID="4e6039c7a12c5a0c0ef5917dc7ee5582" containerName="kube-apiserver-check-endpoints" Apr 02 13:49:53 crc kubenswrapper[4732]: I0402 13:49:53.093516 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e6039c7a12c5a0c0ef5917dc7ee5582" containerName="kube-apiserver-cert-regeneration-controller" Apr 02 13:49:53 crc kubenswrapper[4732]: I0402 13:49:53.093536 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e6039c7a12c5a0c0ef5917dc7ee5582" containerName="kube-apiserver" Apr 02 13:49:53 crc kubenswrapper[4732]: I0402 13:49:53.093550 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8a46753-7553-4d02-9a3e-fdea2b938401" containerName="installer" Apr 02 13:49:53 crc kubenswrapper[4732]: I0402 13:49:53.096330 4732 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Apr 02 13:49:53 crc kubenswrapper[4732]: I0402 13:49:53.097822 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 02 13:49:53 crc kubenswrapper[4732]: I0402 13:49:53.105965 4732 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="4e6039c7a12c5a0c0ef5917dc7ee5582" podUID="3f04c31653fd2d52d145a959c922a0d3" Apr 02 13:49:53 crc kubenswrapper[4732]: I0402 13:49:53.249463 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3f04c31653fd2d52d145a959c922a0d3-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"3f04c31653fd2d52d145a959c922a0d3\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 02 13:49:53 crc kubenswrapper[4732]: I0402 13:49:53.249844 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/aaec8d0ffd277c0e93001246672220ba-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"aaec8d0ffd277c0e93001246672220ba\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 02 13:49:53 crc kubenswrapper[4732]: I0402 13:49:53.249888 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/aaec8d0ffd277c0e93001246672220ba-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"aaec8d0ffd277c0e93001246672220ba\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 02 13:49:53 crc kubenswrapper[4732]: I0402 13:49:53.249916 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/aaec8d0ffd277c0e93001246672220ba-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"aaec8d0ffd277c0e93001246672220ba\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 02 13:49:53 crc kubenswrapper[4732]: I0402 13:49:53.249959 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/aaec8d0ffd277c0e93001246672220ba-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"aaec8d0ffd277c0e93001246672220ba\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 02 13:49:53 crc kubenswrapper[4732]: I0402 13:49:53.250002 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/aaec8d0ffd277c0e93001246672220ba-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"aaec8d0ffd277c0e93001246672220ba\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 02 13:49:53 crc kubenswrapper[4732]: I0402 13:49:53.250169 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3f04c31653fd2d52d145a959c922a0d3-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"3f04c31653fd2d52d145a959c922a0d3\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 02 13:49:53 crc kubenswrapper[4732]: I0402 13:49:53.250221 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3f04c31653fd2d52d145a959c922a0d3-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"3f04c31653fd2d52d145a959c922a0d3\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 02 13:49:53 crc kubenswrapper[4732]: I0402 13:49:53.255341 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_4e6039c7a12c5a0c0ef5917dc7ee5582/kube-apiserver-cert-syncer/0.log" Apr 02 13:49:53 crc kubenswrapper[4732]: I0402 13:49:53.256012 4732 generic.go:334] 
"Generic (PLEG): container finished" podID="4e6039c7a12c5a0c0ef5917dc7ee5582" containerID="2c9bd70787ed9f0ffcf2b5aa836cdc2f70f83b8baed3300ce8244a1130f6edc6" exitCode=0 Apr 02 13:49:53 crc kubenswrapper[4732]: I0402 13:49:53.256038 4732 generic.go:334] "Generic (PLEG): container finished" podID="4e6039c7a12c5a0c0ef5917dc7ee5582" containerID="154f96ad2e07e914c8919ce03f5a46d1f2c4770c36a0f52d47b1c9f79ba64c1e" exitCode=0 Apr 02 13:49:53 crc kubenswrapper[4732]: I0402 13:49:53.256046 4732 generic.go:334] "Generic (PLEG): container finished" podID="4e6039c7a12c5a0c0ef5917dc7ee5582" containerID="22672e53ab610750e5765dc8ce0aacf7b0848b1e72f7e1924976df8af0e68f30" exitCode=0 Apr 02 13:49:53 crc kubenswrapper[4732]: I0402 13:49:53.256055 4732 generic.go:334] "Generic (PLEG): container finished" podID="4e6039c7a12c5a0c0ef5917dc7ee5582" containerID="060b077035d0d8f3b137497cde23a9a2def5bcac1994d53cb009323a5939cb36" exitCode=2 Apr 02 13:49:53 crc kubenswrapper[4732]: I0402 13:49:53.257436 4732 generic.go:334] "Generic (PLEG): container finished" podID="7a85873f-8730-4bb9-8d05-621568f7774c" containerID="bcb0f498aeabf86c37e1fb053e81393e5ce2f47637442681ca1d116d03ba6945" exitCode=0 Apr 02 13:49:53 crc kubenswrapper[4732]: I0402 13:49:53.257518 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-11-crc" event={"ID":"7a85873f-8730-4bb9-8d05-621568f7774c","Type":"ContainerDied","Data":"bcb0f498aeabf86c37e1fb053e81393e5ce2f47637442681ca1d116d03ba6945"} Apr 02 13:49:53 crc kubenswrapper[4732]: I0402 13:49:53.258436 4732 status_manager.go:851] "Failed to get status for pod" podUID="7a85873f-8730-4bb9-8d05-621568f7774c" pod="openshift-kube-apiserver/installer-11-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-11-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:49:53 crc kubenswrapper[4732]: I0402 13:49:53.351210 4732 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/aaec8d0ffd277c0e93001246672220ba-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"aaec8d0ffd277c0e93001246672220ba\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 02 13:49:53 crc kubenswrapper[4732]: I0402 13:49:53.351253 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/aaec8d0ffd277c0e93001246672220ba-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"aaec8d0ffd277c0e93001246672220ba\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 02 13:49:53 crc kubenswrapper[4732]: I0402 13:49:53.351258 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/aaec8d0ffd277c0e93001246672220ba-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"aaec8d0ffd277c0e93001246672220ba\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 02 13:49:53 crc kubenswrapper[4732]: I0402 13:49:53.351286 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/aaec8d0ffd277c0e93001246672220ba-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"aaec8d0ffd277c0e93001246672220ba\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 02 13:49:53 crc kubenswrapper[4732]: I0402 13:49:53.351315 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/aaec8d0ffd277c0e93001246672220ba-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"aaec8d0ffd277c0e93001246672220ba\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 02 13:49:53 crc kubenswrapper[4732]: I0402 13:49:53.351329 4732 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/aaec8d0ffd277c0e93001246672220ba-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"aaec8d0ffd277c0e93001246672220ba\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 02 13:49:53 crc kubenswrapper[4732]: I0402 13:49:53.351362 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/aaec8d0ffd277c0e93001246672220ba-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"aaec8d0ffd277c0e93001246672220ba\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 02 13:49:53 crc kubenswrapper[4732]: I0402 13:49:53.351387 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3f04c31653fd2d52d145a959c922a0d3-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"3f04c31653fd2d52d145a959c922a0d3\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 02 13:49:53 crc kubenswrapper[4732]: I0402 13:49:53.351394 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/aaec8d0ffd277c0e93001246672220ba-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"aaec8d0ffd277c0e93001246672220ba\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 02 13:49:53 crc kubenswrapper[4732]: I0402 13:49:53.351406 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3f04c31653fd2d52d145a959c922a0d3-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"3f04c31653fd2d52d145a959c922a0d3\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 02 13:49:53 crc kubenswrapper[4732]: I0402 13:49:53.351418 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" 
(UniqueName: \"kubernetes.io/host-path/aaec8d0ffd277c0e93001246672220ba-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"aaec8d0ffd277c0e93001246672220ba\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 02 13:49:53 crc kubenswrapper[4732]: I0402 13:49:53.351424 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3f04c31653fd2d52d145a959c922a0d3-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"3f04c31653fd2d52d145a959c922a0d3\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 02 13:49:53 crc kubenswrapper[4732]: I0402 13:49:53.351420 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/aaec8d0ffd277c0e93001246672220ba-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"aaec8d0ffd277c0e93001246672220ba\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 02 13:49:53 crc kubenswrapper[4732]: I0402 13:49:53.351481 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3f04c31653fd2d52d145a959c922a0d3-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"3f04c31653fd2d52d145a959c922a0d3\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 02 13:49:53 crc kubenswrapper[4732]: I0402 13:49:53.351438 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3f04c31653fd2d52d145a959c922a0d3-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"3f04c31653fd2d52d145a959c922a0d3\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 02 13:49:53 crc kubenswrapper[4732]: I0402 13:49:53.351491 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3f04c31653fd2d52d145a959c922a0d3-audit-dir\") pod \"kube-apiserver-crc\" (UID: 
\"3f04c31653fd2d52d145a959c922a0d3\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 02 13:49:54 crc kubenswrapper[4732]: I0402 13:49:54.546403 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-11-crc" Apr 02 13:49:54 crc kubenswrapper[4732]: I0402 13:49:54.547400 4732 status_manager.go:851] "Failed to get status for pod" podUID="7a85873f-8730-4bb9-8d05-621568f7774c" pod="openshift-kube-apiserver/installer-11-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-11-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:49:54 crc kubenswrapper[4732]: I0402 13:49:54.669008 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7a85873f-8730-4bb9-8d05-621568f7774c-kubelet-dir\") pod \"7a85873f-8730-4bb9-8d05-621568f7774c\" (UID: \"7a85873f-8730-4bb9-8d05-621568f7774c\") " Apr 02 13:49:54 crc kubenswrapper[4732]: I0402 13:49:54.669113 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7a85873f-8730-4bb9-8d05-621568f7774c-var-lock\") pod \"7a85873f-8730-4bb9-8d05-621568f7774c\" (UID: \"7a85873f-8730-4bb9-8d05-621568f7774c\") " Apr 02 13:49:54 crc kubenswrapper[4732]: I0402 13:49:54.669140 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7a85873f-8730-4bb9-8d05-621568f7774c-kube-api-access\") pod \"7a85873f-8730-4bb9-8d05-621568f7774c\" (UID: \"7a85873f-8730-4bb9-8d05-621568f7774c\") " Apr 02 13:49:54 crc kubenswrapper[4732]: I0402 13:49:54.669259 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a85873f-8730-4bb9-8d05-621568f7774c-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod 
"7a85873f-8730-4bb9-8d05-621568f7774c" (UID: "7a85873f-8730-4bb9-8d05-621568f7774c"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 13:49:54 crc kubenswrapper[4732]: I0402 13:49:54.669283 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a85873f-8730-4bb9-8d05-621568f7774c-var-lock" (OuterVolumeSpecName: "var-lock") pod "7a85873f-8730-4bb9-8d05-621568f7774c" (UID: "7a85873f-8730-4bb9-8d05-621568f7774c"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 13:49:54 crc kubenswrapper[4732]: I0402 13:49:54.669530 4732 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7a85873f-8730-4bb9-8d05-621568f7774c-kubelet-dir\") on node \"crc\" DevicePath \"\"" Apr 02 13:49:54 crc kubenswrapper[4732]: I0402 13:49:54.669557 4732 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7a85873f-8730-4bb9-8d05-621568f7774c-var-lock\") on node \"crc\" DevicePath \"\"" Apr 02 13:49:54 crc kubenswrapper[4732]: I0402 13:49:54.678017 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a85873f-8730-4bb9-8d05-621568f7774c-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "7a85873f-8730-4bb9-8d05-621568f7774c" (UID: "7a85873f-8730-4bb9-8d05-621568f7774c"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:49:54 crc kubenswrapper[4732]: I0402 13:49:54.690634 4732 status_manager.go:851] "Failed to get status for pod" podUID="7a85873f-8730-4bb9-8d05-621568f7774c" pod="openshift-kube-apiserver/installer-11-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-11-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:49:54 crc kubenswrapper[4732]: I0402 13:49:54.770461 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7a85873f-8730-4bb9-8d05-621568f7774c-kube-api-access\") on node \"crc\" DevicePath \"\"" Apr 02 13:49:55 crc kubenswrapper[4732]: I0402 13:49:55.280538 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-11-crc" event={"ID":"7a85873f-8730-4bb9-8d05-621568f7774c","Type":"ContainerDied","Data":"c152fbfbba34aed54d2bcd7880cddcc82c85a25be13ae81d7b902157ec5826cd"} Apr 02 13:49:55 crc kubenswrapper[4732]: I0402 13:49:55.281025 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c152fbfbba34aed54d2bcd7880cddcc82c85a25be13ae81d7b902157ec5826cd" Apr 02 13:49:55 crc kubenswrapper[4732]: I0402 13:49:55.280654 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-11-crc" Apr 02 13:49:55 crc kubenswrapper[4732]: I0402 13:49:55.285262 4732 status_manager.go:851] "Failed to get status for pod" podUID="7a85873f-8730-4bb9-8d05-621568f7774c" pod="openshift-kube-apiserver/installer-11-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-11-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:49:55 crc kubenswrapper[4732]: I0402 13:49:55.477656 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_4e6039c7a12c5a0c0ef5917dc7ee5582/kube-apiserver-cert-syncer/0.log" Apr 02 13:49:55 crc kubenswrapper[4732]: I0402 13:49:55.478794 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 02 13:49:55 crc kubenswrapper[4732]: I0402 13:49:55.479489 4732 status_manager.go:851] "Failed to get status for pod" podUID="7a85873f-8730-4bb9-8d05-621568f7774c" pod="openshift-kube-apiserver/installer-11-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-11-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:49:55 crc kubenswrapper[4732]: I0402 13:49:55.480013 4732 status_manager.go:851] "Failed to get status for pod" podUID="4e6039c7a12c5a0c0ef5917dc7ee5582" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:49:55 crc kubenswrapper[4732]: I0402 13:49:55.580892 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4e6039c7a12c5a0c0ef5917dc7ee5582-audit-dir\") pod \"4e6039c7a12c5a0c0ef5917dc7ee5582\" (UID: \"4e6039c7a12c5a0c0ef5917dc7ee5582\") " Apr 
02 13:49:55 crc kubenswrapper[4732]: I0402 13:49:55.580982 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/4e6039c7a12c5a0c0ef5917dc7ee5582-cert-dir\") pod \"4e6039c7a12c5a0c0ef5917dc7ee5582\" (UID: \"4e6039c7a12c5a0c0ef5917dc7ee5582\") " Apr 02 13:49:55 crc kubenswrapper[4732]: I0402 13:49:55.581017 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4e6039c7a12c5a0c0ef5917dc7ee5582-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "4e6039c7a12c5a0c0ef5917dc7ee5582" (UID: "4e6039c7a12c5a0c0ef5917dc7ee5582"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 13:49:55 crc kubenswrapper[4732]: I0402 13:49:55.581033 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/4e6039c7a12c5a0c0ef5917dc7ee5582-resource-dir\") pod \"4e6039c7a12c5a0c0ef5917dc7ee5582\" (UID: \"4e6039c7a12c5a0c0ef5917dc7ee5582\") " Apr 02 13:49:55 crc kubenswrapper[4732]: I0402 13:49:55.581075 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4e6039c7a12c5a0c0ef5917dc7ee5582-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "4e6039c7a12c5a0c0ef5917dc7ee5582" (UID: "4e6039c7a12c5a0c0ef5917dc7ee5582"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 13:49:55 crc kubenswrapper[4732]: I0402 13:49:55.581166 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4e6039c7a12c5a0c0ef5917dc7ee5582-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "4e6039c7a12c5a0c0ef5917dc7ee5582" (UID: "4e6039c7a12c5a0c0ef5917dc7ee5582"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 13:49:55 crc kubenswrapper[4732]: I0402 13:49:55.581428 4732 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/4e6039c7a12c5a0c0ef5917dc7ee5582-cert-dir\") on node \"crc\" DevicePath \"\"" Apr 02 13:49:55 crc kubenswrapper[4732]: I0402 13:49:55.581448 4732 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/4e6039c7a12c5a0c0ef5917dc7ee5582-resource-dir\") on node \"crc\" DevicePath \"\"" Apr 02 13:49:55 crc kubenswrapper[4732]: I0402 13:49:55.581457 4732 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4e6039c7a12c5a0c0ef5917dc7ee5582-audit-dir\") on node \"crc\" DevicePath \"\"" Apr 02 13:49:56 crc kubenswrapper[4732]: I0402 13:49:56.293063 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_4e6039c7a12c5a0c0ef5917dc7ee5582/kube-apiserver-cert-syncer/0.log" Apr 02 13:49:56 crc kubenswrapper[4732]: I0402 13:49:56.294234 4732 generic.go:334] "Generic (PLEG): container finished" podID="4e6039c7a12c5a0c0ef5917dc7ee5582" containerID="96cc6a0134c07b1897cf08a443d481b40cafe36f9fc7219d1f0b472531c54e50" exitCode=0 Apr 02 13:49:56 crc kubenswrapper[4732]: I0402 13:49:56.294289 4732 scope.go:117] "RemoveContainer" containerID="2c9bd70787ed9f0ffcf2b5aa836cdc2f70f83b8baed3300ce8244a1130f6edc6" Apr 02 13:49:56 crc kubenswrapper[4732]: I0402 13:49:56.294428 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 02 13:49:56 crc kubenswrapper[4732]: I0402 13:49:56.312707 4732 scope.go:117] "RemoveContainer" containerID="154f96ad2e07e914c8919ce03f5a46d1f2c4770c36a0f52d47b1c9f79ba64c1e" Apr 02 13:49:56 crc kubenswrapper[4732]: I0402 13:49:56.315937 4732 status_manager.go:851] "Failed to get status for pod" podUID="7a85873f-8730-4bb9-8d05-621568f7774c" pod="openshift-kube-apiserver/installer-11-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-11-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:49:56 crc kubenswrapper[4732]: I0402 13:49:56.316410 4732 status_manager.go:851] "Failed to get status for pod" podUID="4e6039c7a12c5a0c0ef5917dc7ee5582" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:49:56 crc kubenswrapper[4732]: I0402 13:49:56.334995 4732 scope.go:117] "RemoveContainer" containerID="22672e53ab610750e5765dc8ce0aacf7b0848b1e72f7e1924976df8af0e68f30" Apr 02 13:49:56 crc kubenswrapper[4732]: I0402 13:49:56.353990 4732 scope.go:117] "RemoveContainer" containerID="060b077035d0d8f3b137497cde23a9a2def5bcac1994d53cb009323a5939cb36" Apr 02 13:49:56 crc kubenswrapper[4732]: I0402 13:49:56.375178 4732 scope.go:117] "RemoveContainer" containerID="96cc6a0134c07b1897cf08a443d481b40cafe36f9fc7219d1f0b472531c54e50" Apr 02 13:49:56 crc kubenswrapper[4732]: I0402 13:49:56.393475 4732 scope.go:117] "RemoveContainer" containerID="3e705b44ea9269407921b4e2c12a475f27ad577094437a5ac000a6628c171724" Apr 02 13:49:56 crc kubenswrapper[4732]: I0402 13:49:56.411576 4732 scope.go:117] "RemoveContainer" containerID="2c9bd70787ed9f0ffcf2b5aa836cdc2f70f83b8baed3300ce8244a1130f6edc6" Apr 02 13:49:56 crc kubenswrapper[4732]: E0402 13:49:56.412979 4732 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c9bd70787ed9f0ffcf2b5aa836cdc2f70f83b8baed3300ce8244a1130f6edc6\": container with ID starting with 2c9bd70787ed9f0ffcf2b5aa836cdc2f70f83b8baed3300ce8244a1130f6edc6 not found: ID does not exist" containerID="2c9bd70787ed9f0ffcf2b5aa836cdc2f70f83b8baed3300ce8244a1130f6edc6" Apr 02 13:49:56 crc kubenswrapper[4732]: I0402 13:49:56.413016 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c9bd70787ed9f0ffcf2b5aa836cdc2f70f83b8baed3300ce8244a1130f6edc6"} err="failed to get container status \"2c9bd70787ed9f0ffcf2b5aa836cdc2f70f83b8baed3300ce8244a1130f6edc6\": rpc error: code = NotFound desc = could not find container \"2c9bd70787ed9f0ffcf2b5aa836cdc2f70f83b8baed3300ce8244a1130f6edc6\": container with ID starting with 2c9bd70787ed9f0ffcf2b5aa836cdc2f70f83b8baed3300ce8244a1130f6edc6 not found: ID does not exist" Apr 02 13:49:56 crc kubenswrapper[4732]: I0402 13:49:56.413041 4732 scope.go:117] "RemoveContainer" containerID="154f96ad2e07e914c8919ce03f5a46d1f2c4770c36a0f52d47b1c9f79ba64c1e" Apr 02 13:49:56 crc kubenswrapper[4732]: E0402 13:49:56.413280 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"154f96ad2e07e914c8919ce03f5a46d1f2c4770c36a0f52d47b1c9f79ba64c1e\": container with ID starting with 154f96ad2e07e914c8919ce03f5a46d1f2c4770c36a0f52d47b1c9f79ba64c1e not found: ID does not exist" containerID="154f96ad2e07e914c8919ce03f5a46d1f2c4770c36a0f52d47b1c9f79ba64c1e" Apr 02 13:49:56 crc kubenswrapper[4732]: I0402 13:49:56.413298 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"154f96ad2e07e914c8919ce03f5a46d1f2c4770c36a0f52d47b1c9f79ba64c1e"} err="failed to get container status \"154f96ad2e07e914c8919ce03f5a46d1f2c4770c36a0f52d47b1c9f79ba64c1e\": rpc error: code = NotFound desc = could 
not find container \"154f96ad2e07e914c8919ce03f5a46d1f2c4770c36a0f52d47b1c9f79ba64c1e\": container with ID starting with 154f96ad2e07e914c8919ce03f5a46d1f2c4770c36a0f52d47b1c9f79ba64c1e not found: ID does not exist" Apr 02 13:49:56 crc kubenswrapper[4732]: I0402 13:49:56.413312 4732 scope.go:117] "RemoveContainer" containerID="22672e53ab610750e5765dc8ce0aacf7b0848b1e72f7e1924976df8af0e68f30" Apr 02 13:49:56 crc kubenswrapper[4732]: E0402 13:49:56.413503 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22672e53ab610750e5765dc8ce0aacf7b0848b1e72f7e1924976df8af0e68f30\": container with ID starting with 22672e53ab610750e5765dc8ce0aacf7b0848b1e72f7e1924976df8af0e68f30 not found: ID does not exist" containerID="22672e53ab610750e5765dc8ce0aacf7b0848b1e72f7e1924976df8af0e68f30" Apr 02 13:49:56 crc kubenswrapper[4732]: I0402 13:49:56.413524 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22672e53ab610750e5765dc8ce0aacf7b0848b1e72f7e1924976df8af0e68f30"} err="failed to get container status \"22672e53ab610750e5765dc8ce0aacf7b0848b1e72f7e1924976df8af0e68f30\": rpc error: code = NotFound desc = could not find container \"22672e53ab610750e5765dc8ce0aacf7b0848b1e72f7e1924976df8af0e68f30\": container with ID starting with 22672e53ab610750e5765dc8ce0aacf7b0848b1e72f7e1924976df8af0e68f30 not found: ID does not exist" Apr 02 13:49:56 crc kubenswrapper[4732]: I0402 13:49:56.413540 4732 scope.go:117] "RemoveContainer" containerID="060b077035d0d8f3b137497cde23a9a2def5bcac1994d53cb009323a5939cb36" Apr 02 13:49:56 crc kubenswrapper[4732]: E0402 13:49:56.413726 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"060b077035d0d8f3b137497cde23a9a2def5bcac1994d53cb009323a5939cb36\": container with ID starting with 060b077035d0d8f3b137497cde23a9a2def5bcac1994d53cb009323a5939cb36 not found: 
ID does not exist" containerID="060b077035d0d8f3b137497cde23a9a2def5bcac1994d53cb009323a5939cb36" Apr 02 13:49:56 crc kubenswrapper[4732]: I0402 13:49:56.413740 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"060b077035d0d8f3b137497cde23a9a2def5bcac1994d53cb009323a5939cb36"} err="failed to get container status \"060b077035d0d8f3b137497cde23a9a2def5bcac1994d53cb009323a5939cb36\": rpc error: code = NotFound desc = could not find container \"060b077035d0d8f3b137497cde23a9a2def5bcac1994d53cb009323a5939cb36\": container with ID starting with 060b077035d0d8f3b137497cde23a9a2def5bcac1994d53cb009323a5939cb36 not found: ID does not exist" Apr 02 13:49:56 crc kubenswrapper[4732]: I0402 13:49:56.413752 4732 scope.go:117] "RemoveContainer" containerID="96cc6a0134c07b1897cf08a443d481b40cafe36f9fc7219d1f0b472531c54e50" Apr 02 13:49:56 crc kubenswrapper[4732]: E0402 13:49:56.414104 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96cc6a0134c07b1897cf08a443d481b40cafe36f9fc7219d1f0b472531c54e50\": container with ID starting with 96cc6a0134c07b1897cf08a443d481b40cafe36f9fc7219d1f0b472531c54e50 not found: ID does not exist" containerID="96cc6a0134c07b1897cf08a443d481b40cafe36f9fc7219d1f0b472531c54e50" Apr 02 13:49:56 crc kubenswrapper[4732]: I0402 13:49:56.414120 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96cc6a0134c07b1897cf08a443d481b40cafe36f9fc7219d1f0b472531c54e50"} err="failed to get container status \"96cc6a0134c07b1897cf08a443d481b40cafe36f9fc7219d1f0b472531c54e50\": rpc error: code = NotFound desc = could not find container \"96cc6a0134c07b1897cf08a443d481b40cafe36f9fc7219d1f0b472531c54e50\": container with ID starting with 96cc6a0134c07b1897cf08a443d481b40cafe36f9fc7219d1f0b472531c54e50 not found: ID does not exist" Apr 02 13:49:56 crc kubenswrapper[4732]: I0402 13:49:56.414132 4732 
scope.go:117] "RemoveContainer" containerID="3e705b44ea9269407921b4e2c12a475f27ad577094437a5ac000a6628c171724" Apr 02 13:49:56 crc kubenswrapper[4732]: E0402 13:49:56.414296 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e705b44ea9269407921b4e2c12a475f27ad577094437a5ac000a6628c171724\": container with ID starting with 3e705b44ea9269407921b4e2c12a475f27ad577094437a5ac000a6628c171724 not found: ID does not exist" containerID="3e705b44ea9269407921b4e2c12a475f27ad577094437a5ac000a6628c171724" Apr 02 13:49:56 crc kubenswrapper[4732]: I0402 13:49:56.414314 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e705b44ea9269407921b4e2c12a475f27ad577094437a5ac000a6628c171724"} err="failed to get container status \"3e705b44ea9269407921b4e2c12a475f27ad577094437a5ac000a6628c171724\": rpc error: code = NotFound desc = could not find container \"3e705b44ea9269407921b4e2c12a475f27ad577094437a5ac000a6628c171724\": container with ID starting with 3e705b44ea9269407921b4e2c12a475f27ad577094437a5ac000a6628c171724 not found: ID does not exist" Apr 02 13:49:56 crc kubenswrapper[4732]: I0402 13:49:56.685819 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e6039c7a12c5a0c0ef5917dc7ee5582" path="/var/lib/kubelet/pods/4e6039c7a12c5a0c0ef5917dc7ee5582/volumes" Apr 02 13:49:58 crc kubenswrapper[4732]: E0402 13:49:58.060285 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:49:58Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:49:58Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:49:58Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:49:58Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:49:58 crc kubenswrapper[4732]: E0402 13:49:58.061415 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:49:58 crc kubenswrapper[4732]: E0402 13:49:58.061835 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:49:58 crc kubenswrapper[4732]: E0402 13:49:58.062493 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 
13:49:58 crc kubenswrapper[4732]: E0402 13:49:58.063065 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:49:58 crc kubenswrapper[4732]: E0402 13:49:58.063099 4732 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Apr 02 13:49:58 crc kubenswrapper[4732]: E0402 13:49:58.139582 4732 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.138:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 02 13:49:58 crc kubenswrapper[4732]: I0402 13:49:58.140045 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 02 13:49:58 crc kubenswrapper[4732]: W0402 13:49:58.161766 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaaec8d0ffd277c0e93001246672220ba.slice/crio-2d125a7a453423abbf7169044d25aa41dd54409a8f5f80d736451cf16e856bdf WatchSource:0}: Error finding container 2d125a7a453423abbf7169044d25aa41dd54409a8f5f80d736451cf16e856bdf: Status 404 returned error can't find the container with id 2d125a7a453423abbf7169044d25aa41dd54409a8f5f80d736451cf16e856bdf Apr 02 13:49:58 crc kubenswrapper[4732]: E0402 13:49:58.165247 4732 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.138:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.18a28e6c1efba713 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:aaec8d0ffd277c0e93001246672220ba,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:49:58.164817683 +0000 UTC m=+755.069225256,LastTimestamp:2026-04-02 13:49:58.164817683 +0000 UTC m=+755.069225256,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 02 13:49:58 crc kubenswrapper[4732]: I0402 13:49:58.311189 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"aaec8d0ffd277c0e93001246672220ba","Type":"ContainerStarted","Data":"2d125a7a453423abbf7169044d25aa41dd54409a8f5f80d736451cf16e856bdf"} Apr 02 13:49:59 crc kubenswrapper[4732]: E0402 13:49:59.222521 4732 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:49:59 crc kubenswrapper[4732]: E0402 13:49:59.223985 4732 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:49:59 crc kubenswrapper[4732]: E0402 13:49:59.224437 4732 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 
13:49:59 crc kubenswrapper[4732]: E0402 13:49:59.224694 4732 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:49:59 crc kubenswrapper[4732]: E0402 13:49:59.224905 4732 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:49:59 crc kubenswrapper[4732]: I0402 13:49:59.224952 4732 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Apr 02 13:49:59 crc kubenswrapper[4732]: E0402 13:49:59.225194 4732 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" interval="200ms" Apr 02 13:49:59 crc kubenswrapper[4732]: I0402 13:49:59.319717 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"aaec8d0ffd277c0e93001246672220ba","Type":"ContainerStarted","Data":"05b73558d0a4805040005db74b72895321403993df5016a328226833ee7de701"} Apr 02 13:49:59 crc kubenswrapper[4732]: E0402 13:49:59.320348 4732 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.138:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 02 13:49:59 crc kubenswrapper[4732]: I0402 13:49:59.320847 4732 status_manager.go:851] "Failed to get status for pod" podUID="7a85873f-8730-4bb9-8d05-621568f7774c" pod="openshift-kube-apiserver/installer-11-crc" 
err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-11-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:49:59 crc kubenswrapper[4732]: E0402 13:49:59.426393 4732 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" interval="400ms" Apr 02 13:49:59 crc kubenswrapper[4732]: E0402 13:49:59.827439 4732 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" interval="800ms" Apr 02 13:50:00 crc kubenswrapper[4732]: E0402 13:50:00.326916 4732 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.138:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 02 13:50:00 crc kubenswrapper[4732]: E0402 13:50:00.628720 4732 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" interval="1.6s" Apr 02 13:50:00 crc kubenswrapper[4732]: I0402 13:50:00.718702 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 02 13:50:00 crc kubenswrapper[4732]: I0402 13:50:00.718765 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 02 13:50:00 crc kubenswrapper[4732]: I0402 13:50:00.718779 4732 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 02 13:50:00 crc kubenswrapper[4732]: I0402 13:50:00.718790 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 02 13:50:00 crc kubenswrapper[4732]: I0402 13:50:00.722904 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 02 13:50:00 crc kubenswrapper[4732]: I0402 13:50:00.723299 4732 status_manager.go:851] "Failed to get status for pod" podUID="7a85873f-8730-4bb9-8d05-621568f7774c" pod="openshift-kube-apiserver/installer-11-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-11-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:50:00 crc kubenswrapper[4732]: I0402 13:50:00.723511 4732 status_manager.go:851] "Failed to get status for pod" podUID="c32a96981201f35bdc64ba062620676a" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:50:00 crc kubenswrapper[4732]: I0402 13:50:00.724660 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 02 13:50:00 crc kubenswrapper[4732]: I0402 13:50:00.725245 4732 status_manager.go:851] "Failed to get status for pod" podUID="c32a96981201f35bdc64ba062620676a" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:50:00 crc 
kubenswrapper[4732]: I0402 13:50:00.725506 4732 status_manager.go:851] "Failed to get status for pod" podUID="7a85873f-8730-4bb9-8d05-621568f7774c" pod="openshift-kube-apiserver/installer-11-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-11-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:50:01 crc kubenswrapper[4732]: I0402 13:50:01.338414 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 02 13:50:01 crc kubenswrapper[4732]: I0402 13:50:01.339086 4732 status_manager.go:851] "Failed to get status for pod" podUID="7a85873f-8730-4bb9-8d05-621568f7774c" pod="openshift-kube-apiserver/installer-11-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-11-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:50:01 crc kubenswrapper[4732]: I0402 13:50:01.339388 4732 status_manager.go:851] "Failed to get status for pod" podUID="c32a96981201f35bdc64ba062620676a" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:50:01 crc kubenswrapper[4732]: I0402 13:50:01.339455 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 02 13:50:01 crc kubenswrapper[4732]: I0402 13:50:01.339715 4732 status_manager.go:851] "Failed to get status for pod" podUID="7a85873f-8730-4bb9-8d05-621568f7774c" pod="openshift-kube-apiserver/installer-11-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-11-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Apr 02 13:50:01 crc 
kubenswrapper[4732]: I0402 13:50:01.340030 4732 status_manager.go:851] "Failed to get status for pod" podUID="c32a96981201f35bdc64ba062620676a" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.138:6443: connect: connection refused"
Apr 02 13:50:01 crc kubenswrapper[4732]: I0402 13:50:01.924461 4732 patch_prober.go:28] interesting pod/machine-config-daemon-6vtmw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Apr 02 13:50:01 crc kubenswrapper[4732]: I0402 13:50:01.924525 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Apr 02 13:50:01 crc kubenswrapper[4732]: I0402 13:50:01.924569 4732 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw"
Apr 02 13:50:01 crc kubenswrapper[4732]: I0402 13:50:01.925088 4732 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"af70e3cad0ce1292e5dbf16bae2fa3fb252384621cc6f2a55ab8798328ebbc50"} pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Apr 02 13:50:01 crc kubenswrapper[4732]: I0402 13:50:01.925145 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" containerName="machine-config-daemon" containerID="cri-o://af70e3cad0ce1292e5dbf16bae2fa3fb252384621cc6f2a55ab8798328ebbc50" gracePeriod=600
Apr 02 13:50:02 crc kubenswrapper[4732]: E0402 13:50:02.229555 4732 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" interval="3.2s"
Apr 02 13:50:02 crc kubenswrapper[4732]: I0402 13:50:02.341646 4732 generic.go:334] "Generic (PLEG): container finished" podID="38409e5e-4545-49da-8f6c-4bfb30582878" containerID="af70e3cad0ce1292e5dbf16bae2fa3fb252384621cc6f2a55ab8798328ebbc50" exitCode=0
Apr 02 13:50:02 crc kubenswrapper[4732]: I0402 13:50:02.341682 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" event={"ID":"38409e5e-4545-49da-8f6c-4bfb30582878","Type":"ContainerDied","Data":"af70e3cad0ce1292e5dbf16bae2fa3fb252384621cc6f2a55ab8798328ebbc50"}
Apr 02 13:50:02 crc kubenswrapper[4732]: I0402 13:50:02.341745 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" event={"ID":"38409e5e-4545-49da-8f6c-4bfb30582878","Type":"ContainerStarted","Data":"7878e048a4d64163e8a31b7ae9f684fbec512dadbd638965377c01f618d3ee60"}
Apr 02 13:50:02 crc kubenswrapper[4732]: I0402 13:50:02.341768 4732 scope.go:117] "RemoveContainer" containerID="6679fff77ada4a54a69b7189491d8feac3c5def6519c359d285b772063d2ad8d"
Apr 02 13:50:02 crc kubenswrapper[4732]: I0402 13:50:02.342527 4732 status_manager.go:851] "Failed to get status for pod" podUID="7a85873f-8730-4bb9-8d05-621568f7774c" pod="openshift-kube-apiserver/installer-11-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-11-crc\": dial tcp 38.102.83.138:6443: connect: connection refused"
Apr 02 13:50:02 crc kubenswrapper[4732]: I0402 13:50:02.342881 4732 status_manager.go:851] "Failed to get status for pod" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-6vtmw\": dial tcp 38.102.83.138:6443: connect: connection refused"
Apr 02 13:50:02 crc kubenswrapper[4732]: I0402 13:50:02.343176 4732 status_manager.go:851] "Failed to get status for pod" podUID="c32a96981201f35bdc64ba062620676a" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.138:6443: connect: connection refused"
Apr 02 13:50:04 crc kubenswrapper[4732]: E0402 13:50:04.676130 4732 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.138:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.18a28e6c1efba713 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:aaec8d0ffd277c0e93001246672220ba,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-02 13:49:58.164817683 +0000 UTC m=+755.069225256,LastTimestamp:2026-04-02 13:49:58.164817683 +0000 UTC m=+755.069225256,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Apr 02 13:50:04 crc kubenswrapper[4732]: I0402 13:50:04.685866 4732 status_manager.go:851] "Failed to get status for pod" podUID="c32a96981201f35bdc64ba062620676a" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.138:6443: connect: connection refused"
Apr 02 13:50:04 crc kubenswrapper[4732]: I0402 13:50:04.686160 4732 status_manager.go:851] "Failed to get status for pod" podUID="7a85873f-8730-4bb9-8d05-621568f7774c" pod="openshift-kube-apiserver/installer-11-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-11-crc\": dial tcp 38.102.83.138:6443: connect: connection refused"
Apr 02 13:50:04 crc kubenswrapper[4732]: I0402 13:50:04.686447 4732 status_manager.go:851] "Failed to get status for pod" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-6vtmw\": dial tcp 38.102.83.138:6443: connect: connection refused"
Apr 02 13:50:05 crc kubenswrapper[4732]: E0402 13:50:05.431078 4732 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" interval="6.4s"
Apr 02 13:50:07 crc kubenswrapper[4732]: I0402 13:50:07.683457 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Apr 02 13:50:07 crc kubenswrapper[4732]: I0402 13:50:07.684695 4732 status_manager.go:851] "Failed to get status for pod" podUID="7a85873f-8730-4bb9-8d05-621568f7774c" pod="openshift-kube-apiserver/installer-11-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-11-crc\": dial tcp 38.102.83.138:6443: connect: connection refused"
Apr 02 13:50:07 crc kubenswrapper[4732]: I0402 13:50:07.684947 4732 status_manager.go:851] "Failed to get status for pod" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-6vtmw\": dial tcp 38.102.83.138:6443: connect: connection refused"
Apr 02 13:50:07 crc kubenswrapper[4732]: I0402 13:50:07.687745 4732 status_manager.go:851] "Failed to get status for pod" podUID="c32a96981201f35bdc64ba062620676a" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.138:6443: connect: connection refused"
Apr 02 13:50:07 crc kubenswrapper[4732]: I0402 13:50:07.698759 4732 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="22fd4835-69d0-47de-8792-5c8c0e8028ce"
Apr 02 13:50:07 crc kubenswrapper[4732]: I0402 13:50:07.698793 4732 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="22fd4835-69d0-47de-8792-5c8c0e8028ce"
Apr 02 13:50:07 crc kubenswrapper[4732]: E0402 13:50:07.699279 4732 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Apr 02 13:50:07 crc kubenswrapper[4732]: I0402 13:50:07.699889 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Apr 02 13:50:07 crc kubenswrapper[4732]: W0402 13:50:07.728225 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f04c31653fd2d52d145a959c922a0d3.slice/crio-de8445d995b5ff33d559bccd852acae3ab5a51feb7efdb9a22b0f07104a91668 WatchSource:0}: Error finding container de8445d995b5ff33d559bccd852acae3ab5a51feb7efdb9a22b0f07104a91668: Status 404 returned error can't find the container with id de8445d995b5ff33d559bccd852acae3ab5a51feb7efdb9a22b0f07104a91668
Apr 02 13:50:08 crc kubenswrapper[4732]: I0402 13:50:08.407900 4732 generic.go:334] "Generic (PLEG): container finished" podID="3f04c31653fd2d52d145a959c922a0d3" containerID="ebf0655c059b9eefbb7438dc7f690f0bff9cd8feb092254e5d5b8874ddea64c9" exitCode=0
Apr 02 13:50:08 crc kubenswrapper[4732]: I0402 13:50:08.408035 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"3f04c31653fd2d52d145a959c922a0d3","Type":"ContainerDied","Data":"ebf0655c059b9eefbb7438dc7f690f0bff9cd8feb092254e5d5b8874ddea64c9"}
Apr 02 13:50:08 crc kubenswrapper[4732]: I0402 13:50:08.408294 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"3f04c31653fd2d52d145a959c922a0d3","Type":"ContainerStarted","Data":"de8445d995b5ff33d559bccd852acae3ab5a51feb7efdb9a22b0f07104a91668"}
Apr 02 13:50:08 crc kubenswrapper[4732]: I0402 13:50:08.408638 4732 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="22fd4835-69d0-47de-8792-5c8c0e8028ce"
Apr 02 13:50:08 crc kubenswrapper[4732]: I0402 13:50:08.408655 4732 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="22fd4835-69d0-47de-8792-5c8c0e8028ce"
Apr 02 13:50:08 crc kubenswrapper[4732]: E0402 13:50:08.409348 4732 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Apr 02 13:50:08 crc kubenswrapper[4732]: I0402 13:50:08.409457 4732 status_manager.go:851] "Failed to get status for pod" podUID="7a85873f-8730-4bb9-8d05-621568f7774c" pod="openshift-kube-apiserver/installer-11-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-11-crc\": dial tcp 38.102.83.138:6443: connect: connection refused"
Apr 02 13:50:08 crc kubenswrapper[4732]: I0402 13:50:08.410439 4732 status_manager.go:851] "Failed to get status for pod" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-6vtmw\": dial tcp 38.102.83.138:6443: connect: connection refused"
Apr 02 13:50:08 crc kubenswrapper[4732]: I0402 13:50:08.411199 4732 status_manager.go:851] "Failed to get status for pod" podUID="c32a96981201f35bdc64ba062620676a" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.138:6443: connect: connection refused"
Apr 02 13:50:08 crc kubenswrapper[4732]: E0402 13:50:08.438807 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:50:08Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:50:08Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:50:08Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-02T13:50:08Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused"
Apr 02 13:50:08 crc kubenswrapper[4732]: E0402 13:50:08.439278 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused"
Apr 02 13:50:08 crc kubenswrapper[4732]: E0402 13:50:08.439884 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused"
Apr 02 13:50:08 crc kubenswrapper[4732]: E0402 13:50:08.440236 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused"
Apr 02 13:50:08 crc kubenswrapper[4732]: E0402 13:50:08.440599 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused"
Apr 02 13:50:08 crc kubenswrapper[4732]: E0402 13:50:08.440636 4732 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Apr 02 13:50:09 crc kubenswrapper[4732]: I0402 13:50:09.416971 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"3f04c31653fd2d52d145a959c922a0d3","Type":"ContainerStarted","Data":"f580db146ff456a8235c4bd98a113259444c429190d447ebdd2a5d41668d7dfd"}
Apr 02 13:50:09 crc kubenswrapper[4732]: I0402 13:50:09.417720 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"3f04c31653fd2d52d145a959c922a0d3","Type":"ContainerStarted","Data":"d8e3b328ad99085fd3bf91573392f68b61beaf1faa61ce949557bedcb7492d70"}
Apr 02 13:50:09 crc kubenswrapper[4732]: I0402 13:50:09.417763 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"3f04c31653fd2d52d145a959c922a0d3","Type":"ContainerStarted","Data":"5d9aad6120bef5262305d6a46f4b11e8ac8daf1f6cf07355234e868951298df6"}
Apr 02 13:50:09 crc kubenswrapper[4732]: I0402 13:50:09.417777 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"3f04c31653fd2d52d145a959c922a0d3","Type":"ContainerStarted","Data":"4f2533c7ad4eec8191e5d76907f79813974b66951c9efbe2cebac3f1897c688c"}
Apr 02 13:50:10 crc kubenswrapper[4732]: I0402 13:50:10.425191 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"3f04c31653fd2d52d145a959c922a0d3","Type":"ContainerStarted","Data":"ae1c4246a052968817e626cf3c2d3f1f41a2d67d11ac59d47035d3badab7bb20"}
Apr 02 13:50:10 crc kubenswrapper[4732]: I0402 13:50:10.425361 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Apr 02 13:50:10 crc kubenswrapper[4732]: I0402 13:50:10.425458 4732 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="22fd4835-69d0-47de-8792-5c8c0e8028ce"
Apr 02 13:50:10 crc kubenswrapper[4732]: I0402 13:50:10.425479 4732 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="22fd4835-69d0-47de-8792-5c8c0e8028ce"
Apr 02 13:50:12 crc kubenswrapper[4732]: I0402 13:50:12.700288 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Apr 02 13:50:12 crc kubenswrapper[4732]: I0402 13:50:12.700685 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Apr 02 13:50:12 crc kubenswrapper[4732]: I0402 13:50:12.703935 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Apr 02 13:50:15 crc kubenswrapper[4732]: I0402 13:50:15.434837 4732 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc"
Apr 02 13:50:15 crc kubenswrapper[4732]: I0402 13:50:15.453516 4732 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="22fd4835-69d0-47de-8792-5c8c0e8028ce"
Apr 02 13:50:15 crc kubenswrapper[4732]: I0402 13:50:15.453551 4732 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="22fd4835-69d0-47de-8792-5c8c0e8028ce"
Apr 02 13:50:15 crc kubenswrapper[4732]: I0402 13:50:15.458293 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Apr 02 13:50:15 crc kubenswrapper[4732]: I0402 13:50:15.472451 4732 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="3f04c31653fd2d52d145a959c922a0d3" podUID="0ed6791c-8091-49d3-9f8a-5d8b211f1bde"
Apr 02 13:50:16 crc kubenswrapper[4732]: I0402 13:50:16.459040 4732 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="22fd4835-69d0-47de-8792-5c8c0e8028ce"
Apr 02 13:50:16 crc kubenswrapper[4732]: I0402 13:50:16.459269 4732 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="22fd4835-69d0-47de-8792-5c8c0e8028ce"
Apr 02 13:50:24 crc kubenswrapper[4732]: I0402 13:50:24.690163 4732 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="3f04c31653fd2d52d145a959c922a0d3" podUID="0ed6791c-8091-49d3-9f8a-5d8b211f1bde"
Apr 02 13:50:24 crc kubenswrapper[4732]: I0402 13:50:24.998332 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Apr 02 13:50:25 crc kubenswrapper[4732]: I0402 13:50:25.128700 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Apr 02 13:50:25 crc kubenswrapper[4732]: I0402 13:50:25.144175 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Apr 02 13:50:26 crc kubenswrapper[4732]: I0402 13:50:26.144145 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Apr 02 13:50:26 crc kubenswrapper[4732]: I0402 13:50:26.436238 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Apr 02 13:50:26 crc kubenswrapper[4732]: I0402 13:50:26.493732 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Apr 02 13:50:26 crc kubenswrapper[4732]: I0402 13:50:26.660195 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Apr 02 13:50:26 crc kubenswrapper[4732]: I0402 13:50:26.809329 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Apr 02 13:50:27 crc kubenswrapper[4732]: I0402 13:50:27.044608 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Apr 02 13:50:27 crc kubenswrapper[4732]: I0402 13:50:27.070171 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Apr 02 13:50:27 crc kubenswrapper[4732]: I0402 13:50:27.098533 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Apr 02 13:50:27 crc kubenswrapper[4732]: I0402 13:50:27.251844 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Apr 02 13:50:27 crc kubenswrapper[4732]: I0402 13:50:27.349173 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Apr 02 13:50:27 crc kubenswrapper[4732]: I0402 13:50:27.729259 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Apr 02 13:50:27 crc kubenswrapper[4732]: I0402 13:50:27.784433 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Apr 02 13:50:27 crc kubenswrapper[4732]: I0402 13:50:27.812263 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Apr 02 13:50:27 crc kubenswrapper[4732]: I0402 13:50:27.827885 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Apr 02 13:50:27 crc kubenswrapper[4732]: I0402 13:50:27.848876 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Apr 02 13:50:27 crc kubenswrapper[4732]: I0402 13:50:27.982560 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Apr 02 13:50:28 crc kubenswrapper[4732]: I0402 13:50:28.360006 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Apr 02 13:50:28 crc kubenswrapper[4732]: I0402 13:50:28.377896 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Apr 02 13:50:28 crc kubenswrapper[4732]: I0402 13:50:28.414201 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Apr 02 13:50:28 crc kubenswrapper[4732]: I0402 13:50:28.459579 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Apr 02 13:50:28 crc kubenswrapper[4732]: I0402 13:50:28.487917 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Apr 02 13:50:28 crc kubenswrapper[4732]: I0402 13:50:28.509510 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Apr 02 13:50:28 crc kubenswrapper[4732]: I0402 13:50:28.521528 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Apr 02 13:50:28 crc kubenswrapper[4732]: I0402 13:50:28.760339 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Apr 02 13:50:28 crc kubenswrapper[4732]: I0402 13:50:28.777168 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Apr 02 13:50:28 crc kubenswrapper[4732]: I0402 13:50:28.964980 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Apr 02 13:50:28 crc kubenswrapper[4732]: I0402 13:50:28.965549 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Apr 02 13:50:29 crc kubenswrapper[4732]: I0402 13:50:29.112204 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Apr 02 13:50:29 crc kubenswrapper[4732]: I0402 13:50:29.159307 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Apr 02 13:50:29 crc kubenswrapper[4732]: I0402 13:50:29.233603 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Apr 02 13:50:29 crc kubenswrapper[4732]: I0402 13:50:29.294156 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Apr 02 13:50:29 crc kubenswrapper[4732]: I0402 13:50:29.315484 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Apr 02 13:50:29 crc kubenswrapper[4732]: I0402 13:50:29.341818 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Apr 02 13:50:29 crc kubenswrapper[4732]: I0402 13:50:29.354841 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Apr 02 13:50:29 crc kubenswrapper[4732]: I0402 13:50:29.422842 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Apr 02 13:50:29 crc kubenswrapper[4732]: I0402 13:50:29.494826 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Apr 02 13:50:29 crc kubenswrapper[4732]: I0402 13:50:29.510072 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Apr 02 13:50:29 crc kubenswrapper[4732]: I0402 13:50:29.511462 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Apr 02 13:50:29 crc kubenswrapper[4732]: I0402 13:50:29.526813 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Apr 02 13:50:29 crc kubenswrapper[4732]: I0402 13:50:29.527850 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Apr 02 13:50:29 crc kubenswrapper[4732]: I0402 13:50:29.613204 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Apr 02 13:50:29 crc kubenswrapper[4732]: I0402 13:50:29.614164 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Apr 02 13:50:29 crc kubenswrapper[4732]: I0402 13:50:29.678904 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Apr 02 13:50:29 crc kubenswrapper[4732]: I0402 13:50:29.709917 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Apr 02 13:50:29 crc kubenswrapper[4732]: I0402 13:50:29.736527 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Apr 02 13:50:29 crc kubenswrapper[4732]: I0402 13:50:29.765178 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Apr 02 13:50:29 crc kubenswrapper[4732]: I0402 13:50:29.789831 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Apr 02 13:50:29 crc kubenswrapper[4732]: I0402 13:50:29.790796 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Apr 02 13:50:29 crc kubenswrapper[4732]: I0402 13:50:29.871005 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Apr 02 13:50:29 crc kubenswrapper[4732]: I0402 13:50:29.876034 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Apr 02 13:50:29 crc kubenswrapper[4732]: I0402 13:50:29.907699 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Apr 02 13:50:29 crc kubenswrapper[4732]: I0402 13:50:29.919852 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Apr 02 13:50:29 crc kubenswrapper[4732]: I0402 13:50:29.983343 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Apr 02 13:50:30 crc kubenswrapper[4732]: I0402 13:50:30.021602 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Apr 02 13:50:30 crc kubenswrapper[4732]: I0402 13:50:30.061259 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Apr 02 13:50:30 crc kubenswrapper[4732]: I0402 13:50:30.098904 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Apr 02 13:50:30 crc kubenswrapper[4732]: I0402 13:50:30.104306 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Apr 02 13:50:30 crc kubenswrapper[4732]: I0402 13:50:30.171333 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Apr 02 13:50:30 crc kubenswrapper[4732]: I0402 13:50:30.172697 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Apr 02 13:50:30 crc kubenswrapper[4732]: I0402 13:50:30.187019 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Apr 02 13:50:30 crc kubenswrapper[4732]: I0402 13:50:30.399539 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Apr 02 13:50:30 crc kubenswrapper[4732]: I0402 13:50:30.440769 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Apr 02 13:50:30 crc kubenswrapper[4732]: I0402 13:50:30.460803 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Apr 02 13:50:30 crc kubenswrapper[4732]: I0402 13:50:30.498500 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Apr 02 13:50:30 crc kubenswrapper[4732]: I0402 13:50:30.519382 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Apr 02 13:50:30 crc kubenswrapper[4732]: I0402 13:50:30.530605 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Apr 02 13:50:30 crc kubenswrapper[4732]: I0402 13:50:30.567129 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Apr 02 13:50:30 crc kubenswrapper[4732]: I0402 13:50:30.596016 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Apr 02 13:50:30 crc kubenswrapper[4732]: I0402 13:50:30.661819 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Apr 02 13:50:30 crc kubenswrapper[4732]: I0402 13:50:30.741932 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Apr 02 13:50:30 crc kubenswrapper[4732]: I0402 13:50:30.814398 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Apr 02 13:50:30 crc kubenswrapper[4732]: I0402 13:50:30.846422 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Apr 02 13:50:30 crc kubenswrapper[4732]: I0402 13:50:30.853252 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Apr 02 13:50:30 crc kubenswrapper[4732]: I0402 13:50:30.875346 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Apr 02 13:50:30 crc kubenswrapper[4732]: I0402 13:50:30.905922 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Apr 02 13:50:30 crc kubenswrapper[4732]: I0402 13:50:30.933897 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Apr 02 13:50:30 crc kubenswrapper[4732]: I0402 13:50:30.962211 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Apr 02 13:50:31 crc kubenswrapper[4732]: I0402 13:50:31.154411 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Apr 02 13:50:31 crc kubenswrapper[4732]: I0402 13:50:31.154895 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Apr 02 13:50:31 crc kubenswrapper[4732]: I0402 13:50:31.203530 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Apr 02 13:50:31 crc kubenswrapper[4732]: I0402 13:50:31.426257 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Apr 02 13:50:31 crc kubenswrapper[4732]: I0402 13:50:31.554028 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Apr 02 13:50:31 crc kubenswrapper[4732]: I0402 13:50:31.632870 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Apr 02 13:50:31 crc kubenswrapper[4732]: I0402 13:50:31.675859 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Apr 02 13:50:31 crc kubenswrapper[4732]: I0402 13:50:31.812708 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Apr 02 13:50:31 crc kubenswrapper[4732]: I0402 13:50:31.818332 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Apr 02 13:50:31 crc kubenswrapper[4732]: I0402 13:50:31.891475 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Apr 02 13:50:31 crc kubenswrapper[4732]: I0402 13:50:31.912216 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Apr 02 13:50:31 crc kubenswrapper[4732]: I0402 13:50:31.917594 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Apr 02 13:50:31 crc kubenswrapper[4732]: I0402 13:50:31.995098 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Apr 02 13:50:32 crc kubenswrapper[4732]: I0402 13:50:32.146473 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Apr 02 13:50:32 crc kubenswrapper[4732]: I0402 13:50:32.187864 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Apr 02 13:50:32 crc kubenswrapper[4732]: I0402 13:50:32.191638 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Apr 02 13:50:32 crc kubenswrapper[4732]: I0402 13:50:32.241880 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Apr 02 13:50:32 crc kubenswrapper[4732]: I0402 13:50:32.249850 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Apr 02 13:50:32 crc kubenswrapper[4732]: I0402 13:50:32.257657 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Apr 02 13:50:32 crc kubenswrapper[4732]: I0402 13:50:32.422312
4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Apr 02 13:50:32 crc kubenswrapper[4732]: I0402 13:50:32.450787 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Apr 02 13:50:32 crc kubenswrapper[4732]: I0402 13:50:32.471881 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Apr 02 13:50:32 crc kubenswrapper[4732]: I0402 13:50:32.478876 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Apr 02 13:50:32 crc kubenswrapper[4732]: I0402 13:50:32.520949 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Apr 02 13:50:32 crc kubenswrapper[4732]: I0402 13:50:32.542877 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Apr 02 13:50:32 crc kubenswrapper[4732]: I0402 13:50:32.561763 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Apr 02 13:50:32 crc kubenswrapper[4732]: I0402 13:50:32.572773 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Apr 02 13:50:32 crc kubenswrapper[4732]: I0402 13:50:32.639332 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Apr 02 13:50:32 crc kubenswrapper[4732]: I0402 13:50:32.754269 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Apr 02 13:50:32 crc kubenswrapper[4732]: I0402 13:50:32.804876 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Apr 02 13:50:32 crc kubenswrapper[4732]: I0402 13:50:32.829588 
4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Apr 02 13:50:32 crc kubenswrapper[4732]: I0402 13:50:32.853127 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Apr 02 13:50:32 crc kubenswrapper[4732]: I0402 13:50:32.878280 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Apr 02 13:50:32 crc kubenswrapper[4732]: I0402 13:50:32.885383 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Apr 02 13:50:32 crc kubenswrapper[4732]: I0402 13:50:32.926665 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Apr 02 13:50:33 crc kubenswrapper[4732]: I0402 13:50:33.031656 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Apr 02 13:50:33 crc kubenswrapper[4732]: I0402 13:50:33.050566 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Apr 02 13:50:33 crc kubenswrapper[4732]: I0402 13:50:33.053258 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Apr 02 13:50:33 crc kubenswrapper[4732]: I0402 13:50:33.053259 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Apr 02 13:50:33 crc kubenswrapper[4732]: I0402 13:50:33.115544 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Apr 02 13:50:33 crc kubenswrapper[4732]: I0402 13:50:33.168993 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Apr 02 13:50:33 crc kubenswrapper[4732]: 
I0402 13:50:33.176187 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Apr 02 13:50:33 crc kubenswrapper[4732]: I0402 13:50:33.250108 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Apr 02 13:50:33 crc kubenswrapper[4732]: I0402 13:50:33.397996 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Apr 02 13:50:33 crc kubenswrapper[4732]: I0402 13:50:33.469301 4732 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Apr 02 13:50:33 crc kubenswrapper[4732]: I0402 13:50:33.470519 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Apr 02 13:50:33 crc kubenswrapper[4732]: I0402 13:50:33.568519 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Apr 02 13:50:33 crc kubenswrapper[4732]: I0402 13:50:33.605221 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Apr 02 13:50:33 crc kubenswrapper[4732]: I0402 13:50:33.611446 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Apr 02 13:50:33 crc kubenswrapper[4732]: I0402 13:50:33.672467 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Apr 02 13:50:33 crc kubenswrapper[4732]: I0402 13:50:33.683023 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Apr 02 13:50:33 crc kubenswrapper[4732]: I0402 13:50:33.683269 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Apr 02 13:50:33 crc kubenswrapper[4732]: I0402 13:50:33.716553 
4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Apr 02 13:50:33 crc kubenswrapper[4732]: I0402 13:50:33.759649 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Apr 02 13:50:33 crc kubenswrapper[4732]: I0402 13:50:33.798608 4732 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Apr 02 13:50:33 crc kubenswrapper[4732]: I0402 13:50:33.846396 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Apr 02 13:50:33 crc kubenswrapper[4732]: I0402 13:50:33.846481 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Apr 02 13:50:33 crc kubenswrapper[4732]: I0402 13:50:33.860237 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Apr 02 13:50:33 crc kubenswrapper[4732]: I0402 13:50:33.862266 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Apr 02 13:50:33 crc kubenswrapper[4732]: I0402 13:50:33.875195 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Apr 02 13:50:33 crc kubenswrapper[4732]: I0402 13:50:33.903413 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Apr 02 13:50:33 crc kubenswrapper[4732]: I0402 13:50:33.980310 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Apr 02 13:50:33 crc kubenswrapper[4732]: I0402 13:50:33.993513 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Apr 02 13:50:34 
crc kubenswrapper[4732]: I0402 13:50:34.127448 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Apr 02 13:50:34 crc kubenswrapper[4732]: I0402 13:50:34.197719 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Apr 02 13:50:34 crc kubenswrapper[4732]: I0402 13:50:34.207107 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Apr 02 13:50:34 crc kubenswrapper[4732]: I0402 13:50:34.231760 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Apr 02 13:50:34 crc kubenswrapper[4732]: I0402 13:50:34.241525 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Apr 02 13:50:34 crc kubenswrapper[4732]: I0402 13:50:34.291182 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Apr 02 13:50:34 crc kubenswrapper[4732]: I0402 13:50:34.309209 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Apr 02 13:50:34 crc kubenswrapper[4732]: I0402 13:50:34.324518 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Apr 02 13:50:34 crc kubenswrapper[4732]: I0402 13:50:34.377576 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Apr 02 13:50:34 crc kubenswrapper[4732]: I0402 13:50:34.403086 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Apr 02 13:50:34 crc kubenswrapper[4732]: I0402 13:50:34.440736 4732 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Apr 02 13:50:34 crc kubenswrapper[4732]: I0402 13:50:34.459355 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Apr 02 13:50:34 crc kubenswrapper[4732]: I0402 13:50:34.561844 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Apr 02 13:50:34 crc kubenswrapper[4732]: I0402 13:50:34.566269 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Apr 02 13:50:34 crc kubenswrapper[4732]: I0402 13:50:34.605721 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Apr 02 13:50:34 crc kubenswrapper[4732]: I0402 13:50:34.642017 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Apr 02 13:50:34 crc kubenswrapper[4732]: I0402 13:50:34.720834 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Apr 02 13:50:34 crc kubenswrapper[4732]: I0402 13:50:34.779936 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Apr 02 13:50:34 crc kubenswrapper[4732]: I0402 13:50:34.887313 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Apr 02 13:50:35 crc kubenswrapper[4732]: I0402 13:50:35.023576 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Apr 02 13:50:35 crc kubenswrapper[4732]: I0402 13:50:35.031440 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Apr 02 13:50:35 crc 
kubenswrapper[4732]: I0402 13:50:35.173285 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Apr 02 13:50:35 crc kubenswrapper[4732]: I0402 13:50:35.182925 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Apr 02 13:50:35 crc kubenswrapper[4732]: I0402 13:50:35.227424 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Apr 02 13:50:35 crc kubenswrapper[4732]: I0402 13:50:35.273522 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Apr 02 13:50:35 crc kubenswrapper[4732]: I0402 13:50:35.328754 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Apr 02 13:50:35 crc kubenswrapper[4732]: I0402 13:50:35.463574 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Apr 02 13:50:35 crc kubenswrapper[4732]: I0402 13:50:35.524699 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Apr 02 13:50:35 crc kubenswrapper[4732]: I0402 13:50:35.599408 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Apr 02 13:50:35 crc kubenswrapper[4732]: I0402 13:50:35.672456 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Apr 02 13:50:35 crc kubenswrapper[4732]: I0402 13:50:35.696053 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Apr 02 13:50:35 crc kubenswrapper[4732]: I0402 13:50:35.719522 4732 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Apr 02 13:50:35 crc kubenswrapper[4732]: I0402 13:50:35.749532 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Apr 02 13:50:35 crc kubenswrapper[4732]: I0402 13:50:35.794147 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Apr 02 13:50:35 crc kubenswrapper[4732]: I0402 13:50:35.969745 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Apr 02 13:50:36 crc kubenswrapper[4732]: I0402 13:50:36.094398 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Apr 02 13:50:36 crc kubenswrapper[4732]: I0402 13:50:36.160871 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Apr 02 13:50:36 crc kubenswrapper[4732]: I0402 13:50:36.162553 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Apr 02 13:50:36 crc kubenswrapper[4732]: I0402 13:50:36.169360 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Apr 02 13:50:36 crc kubenswrapper[4732]: I0402 13:50:36.171112 4732 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Apr 02 13:50:36 crc kubenswrapper[4732]: I0402 13:50:36.184302 4732 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Apr 02 13:50:36 crc kubenswrapper[4732]: I0402 13:50:36.206555 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Apr 02 13:50:36 crc kubenswrapper[4732]: I0402 
13:50:36.306634 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Apr 02 13:50:36 crc kubenswrapper[4732]: I0402 13:50:36.329473 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Apr 02 13:50:36 crc kubenswrapper[4732]: I0402 13:50:36.334058 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Apr 02 13:50:36 crc kubenswrapper[4732]: I0402 13:50:36.382458 4732 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Apr 02 13:50:36 crc kubenswrapper[4732]: I0402 13:50:36.433427 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Apr 02 13:50:36 crc kubenswrapper[4732]: I0402 13:50:36.527656 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Apr 02 13:50:36 crc kubenswrapper[4732]: I0402 13:50:36.590886 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Apr 02 13:50:36 crc kubenswrapper[4732]: I0402 13:50:36.678428 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Apr 02 13:50:36 crc kubenswrapper[4732]: I0402 13:50:36.717739 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Apr 02 13:50:36 crc kubenswrapper[4732]: I0402 13:50:36.769407 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Apr 02 13:50:36 crc kubenswrapper[4732]: I0402 13:50:36.773726 4732 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Apr 02 13:50:36 crc kubenswrapper[4732]: I0402 13:50:36.820070 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Apr 02 13:50:36 crc kubenswrapper[4732]: I0402 13:50:36.874233 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Apr 02 13:50:36 crc kubenswrapper[4732]: I0402 13:50:36.880315 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Apr 02 13:50:36 crc kubenswrapper[4732]: I0402 13:50:36.889426 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Apr 02 13:50:36 crc kubenswrapper[4732]: I0402 13:50:36.939157 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Apr 02 13:50:36 crc kubenswrapper[4732]: I0402 13:50:36.982426 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Apr 02 13:50:37 crc kubenswrapper[4732]: I0402 13:50:37.129841 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Apr 02 13:50:37 crc kubenswrapper[4732]: I0402 13:50:37.203591 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Apr 02 13:50:37 crc kubenswrapper[4732]: I0402 13:50:37.249037 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Apr 02 13:50:37 crc kubenswrapper[4732]: I0402 13:50:37.260672 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Apr 02 13:50:37 crc kubenswrapper[4732]: I0402 13:50:37.287158 4732 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Apr 02 13:50:37 crc kubenswrapper[4732]: I0402 13:50:37.293693 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Apr 02 13:50:37 crc kubenswrapper[4732]: I0402 13:50:37.356368 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Apr 02 13:50:37 crc kubenswrapper[4732]: I0402 13:50:37.388145 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Apr 02 13:50:37 crc kubenswrapper[4732]: I0402 13:50:37.418838 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Apr 02 13:50:37 crc kubenswrapper[4732]: I0402 13:50:37.453695 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Apr 02 13:50:37 crc kubenswrapper[4732]: I0402 13:50:37.478507 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Apr 02 13:50:37 crc kubenswrapper[4732]: I0402 13:50:37.560137 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Apr 02 13:50:37 crc kubenswrapper[4732]: I0402 13:50:37.799665 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Apr 02 13:50:37 crc kubenswrapper[4732]: I0402 13:50:37.931789 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Apr 02 13:50:37 crc kubenswrapper[4732]: I0402 13:50:37.968903 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Apr 02 13:50:37 crc kubenswrapper[4732]: I0402 13:50:37.986143 4732 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Apr 02 13:50:38 crc kubenswrapper[4732]: I0402 13:50:38.019320 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Apr 02 13:50:38 crc kubenswrapper[4732]: I0402 13:50:38.029271 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Apr 02 13:50:38 crc kubenswrapper[4732]: I0402 13:50:38.152908 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Apr 02 13:50:38 crc kubenswrapper[4732]: I0402 13:50:38.251153 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Apr 02 13:50:38 crc kubenswrapper[4732]: I0402 13:50:38.278708 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Apr 02 13:50:38 crc kubenswrapper[4732]: I0402 13:50:38.298160 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Apr 02 13:50:38 crc kubenswrapper[4732]: I0402 13:50:38.316725 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Apr 02 13:50:38 crc kubenswrapper[4732]: I0402 13:50:38.385099 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Apr 02 13:50:38 crc kubenswrapper[4732]: I0402 13:50:38.456271 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Apr 02 13:50:38 crc kubenswrapper[4732]: I0402 13:50:38.830181 4732 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Apr 02 13:50:38 crc kubenswrapper[4732]: I0402 13:50:38.965539 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Apr 02 13:50:39 crc kubenswrapper[4732]: I0402 13:50:39.084409 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Apr 02 13:50:39 crc kubenswrapper[4732]: I0402 13:50:39.140068 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Apr 02 13:50:39 crc kubenswrapper[4732]: I0402 13:50:39.282269 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Apr 02 13:50:39 crc kubenswrapper[4732]: I0402 13:50:39.301554 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Apr 02 13:50:39 crc kubenswrapper[4732]: I0402 13:50:39.305382 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Apr 02 13:50:39 crc kubenswrapper[4732]: I0402 13:50:39.352400 4732 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Apr 02 13:50:39 crc kubenswrapper[4732]: I0402 13:50:39.357142 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Apr 02 13:50:39 crc kubenswrapper[4732]: I0402 13:50:39.357208 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-infra/auto-csr-approver-29585630-gh7px"] Apr 02 13:50:39 crc kubenswrapper[4732]: E0402 13:50:39.357472 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a85873f-8730-4bb9-8d05-621568f7774c" containerName="installer" Apr 02 13:50:39 crc kubenswrapper[4732]: I0402 13:50:39.357494 4732 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7a85873f-8730-4bb9-8d05-621568f7774c" containerName="installer" Apr 02 13:50:39 crc kubenswrapper[4732]: I0402 13:50:39.357600 4732 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="22fd4835-69d0-47de-8792-5c8c0e8028ce" Apr 02 13:50:39 crc kubenswrapper[4732]: I0402 13:50:39.357651 4732 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="22fd4835-69d0-47de-8792-5c8c0e8028ce" Apr 02 13:50:39 crc kubenswrapper[4732]: I0402 13:50:39.357680 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a85873f-8730-4bb9-8d05-621568f7774c" containerName="installer" Apr 02 13:50:39 crc kubenswrapper[4732]: I0402 13:50:39.358086 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585630-gh7px" Apr 02 13:50:39 crc kubenswrapper[4732]: I0402 13:50:39.360154 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-69g42" Apr 02 13:50:39 crc kubenswrapper[4732]: I0402 13:50:39.360661 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 02 13:50:39 crc kubenswrapper[4732]: I0402 13:50:39.360899 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 02 13:50:39 crc kubenswrapper[4732]: I0402 13:50:39.362979 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 02 13:50:39 crc kubenswrapper[4732]: I0402 13:50:39.384532 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=24.384514016 podStartE2EDuration="24.384514016s" podCreationTimestamp="2026-04-02 13:50:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-04-02 13:50:39.380691942 +0000 UTC m=+796.285099525" watchObservedRunningTime="2026-04-02 13:50:39.384514016 +0000 UTC m=+796.288921569" Apr 02 13:50:39 crc kubenswrapper[4732]: I0402 13:50:39.395957 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Apr 02 13:50:39 crc kubenswrapper[4732]: I0402 13:50:39.481524 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfld5\" (UniqueName: \"kubernetes.io/projected/c553e4fa-2f84-4e9c-be8f-bd59677b63b5-kube-api-access-sfld5\") pod \"auto-csr-approver-29585630-gh7px\" (UID: \"c553e4fa-2f84-4e9c-be8f-bd59677b63b5\") " pod="openshift-infra/auto-csr-approver-29585630-gh7px" Apr 02 13:50:39 crc kubenswrapper[4732]: I0402 13:50:39.520411 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Apr 02 13:50:39 crc kubenswrapper[4732]: I0402 13:50:39.582928 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfld5\" (UniqueName: \"kubernetes.io/projected/c553e4fa-2f84-4e9c-be8f-bd59677b63b5-kube-api-access-sfld5\") pod \"auto-csr-approver-29585630-gh7px\" (UID: \"c553e4fa-2f84-4e9c-be8f-bd59677b63b5\") " pod="openshift-infra/auto-csr-approver-29585630-gh7px" Apr 02 13:50:39 crc kubenswrapper[4732]: I0402 13:50:39.609356 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfld5\" (UniqueName: \"kubernetes.io/projected/c553e4fa-2f84-4e9c-be8f-bd59677b63b5-kube-api-access-sfld5\") pod \"auto-csr-approver-29585630-gh7px\" (UID: \"c553e4fa-2f84-4e9c-be8f-bd59677b63b5\") " pod="openshift-infra/auto-csr-approver-29585630-gh7px" Apr 02 13:50:39 crc kubenswrapper[4732]: I0402 13:50:39.676625 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29585630-gh7px" Apr 02 13:50:39 crc kubenswrapper[4732]: I0402 13:50:39.677123 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Apr 02 13:50:39 crc kubenswrapper[4732]: I0402 13:50:39.717022 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Apr 02 13:50:39 crc kubenswrapper[4732]: I0402 13:50:39.717257 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Apr 02 13:50:39 crc kubenswrapper[4732]: I0402 13:50:39.831042 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Apr 02 13:50:39 crc kubenswrapper[4732]: I0402 13:50:39.844310 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585630-gh7px"] Apr 02 13:50:40 crc kubenswrapper[4732]: I0402 13:50:40.071461 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Apr 02 13:50:40 crc kubenswrapper[4732]: I0402 13:50:40.122425 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Apr 02 13:50:40 crc kubenswrapper[4732]: I0402 13:50:40.148762 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Apr 02 13:50:40 crc kubenswrapper[4732]: I0402 13:50:40.226337 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Apr 02 13:50:40 crc kubenswrapper[4732]: I0402 13:50:40.399032 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Apr 02 13:50:40 crc kubenswrapper[4732]: I0402 
13:50:40.403182 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Apr 02 13:50:40 crc kubenswrapper[4732]: I0402 13:50:40.412950 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Apr 02 13:50:40 crc kubenswrapper[4732]: I0402 13:50:40.422038 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Apr 02 13:50:40 crc kubenswrapper[4732]: I0402 13:50:40.609699 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585630-gh7px" event={"ID":"c553e4fa-2f84-4e9c-be8f-bd59677b63b5","Type":"ContainerStarted","Data":"adda3e11eeb064356d0618ac1e94cc6c0e11c20c024e7a0a45281c2418b9e71a"} Apr 02 13:50:41 crc kubenswrapper[4732]: I0402 13:50:41.217189 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Apr 02 13:50:41 crc kubenswrapper[4732]: I0402 13:50:41.444242 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Apr 02 13:50:41 crc kubenswrapper[4732]: I0402 13:50:41.617422 4732 generic.go:334] "Generic (PLEG): container finished" podID="c553e4fa-2f84-4e9c-be8f-bd59677b63b5" containerID="0ad2dae1a4cd4b1fd0cf7f3610ac1b2608ff38d2b83dc2c4bbc87f7fe141ef48" exitCode=0 Apr 02 13:50:41 crc kubenswrapper[4732]: I0402 13:50:41.617473 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585630-gh7px" event={"ID":"c553e4fa-2f84-4e9c-be8f-bd59677b63b5","Type":"ContainerDied","Data":"0ad2dae1a4cd4b1fd0cf7f3610ac1b2608ff38d2b83dc2c4bbc87f7fe141ef48"} Apr 02 13:50:41 crc kubenswrapper[4732]: I0402 13:50:41.845571 4732 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"config" Apr 02 13:50:42 crc kubenswrapper[4732]: I0402 13:50:42.859481 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585630-gh7px" Apr 02 13:50:42 crc kubenswrapper[4732]: I0402 13:50:42.926629 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfld5\" (UniqueName: \"kubernetes.io/projected/c553e4fa-2f84-4e9c-be8f-bd59677b63b5-kube-api-access-sfld5\") pod \"c553e4fa-2f84-4e9c-be8f-bd59677b63b5\" (UID: \"c553e4fa-2f84-4e9c-be8f-bd59677b63b5\") " Apr 02 13:50:42 crc kubenswrapper[4732]: I0402 13:50:42.932087 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c553e4fa-2f84-4e9c-be8f-bd59677b63b5-kube-api-access-sfld5" (OuterVolumeSpecName: "kube-api-access-sfld5") pod "c553e4fa-2f84-4e9c-be8f-bd59677b63b5" (UID: "c553e4fa-2f84-4e9c-be8f-bd59677b63b5"). InnerVolumeSpecName "kube-api-access-sfld5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:50:43 crc kubenswrapper[4732]: I0402 13:50:43.028367 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfld5\" (UniqueName: \"kubernetes.io/projected/c553e4fa-2f84-4e9c-be8f-bd59677b63b5-kube-api-access-sfld5\") on node \"crc\" DevicePath \"\"" Apr 02 13:50:43 crc kubenswrapper[4732]: I0402 13:50:43.637964 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585630-gh7px" event={"ID":"c553e4fa-2f84-4e9c-be8f-bd59677b63b5","Type":"ContainerDied","Data":"adda3e11eeb064356d0618ac1e94cc6c0e11c20c024e7a0a45281c2418b9e71a"} Apr 02 13:50:43 crc kubenswrapper[4732]: I0402 13:50:43.638229 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="adda3e11eeb064356d0618ac1e94cc6c0e11c20c024e7a0a45281c2418b9e71a" Apr 02 13:50:43 crc kubenswrapper[4732]: I0402 13:50:43.638003 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585630-gh7px" Apr 02 13:50:48 crc kubenswrapper[4732]: I0402 13:50:48.977910 4732 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Apr 02 13:50:48 crc kubenswrapper[4732]: I0402 13:50:48.978750 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="aaec8d0ffd277c0e93001246672220ba" containerName="startup-monitor" containerID="cri-o://05b73558d0a4805040005db74b72895321403993df5016a328226833ee7de701" gracePeriod=5 Apr 02 13:50:54 crc kubenswrapper[4732]: I0402 13:50:54.544667 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_aaec8d0ffd277c0e93001246672220ba/startup-monitor/0.log" Apr 02 13:50:54 crc kubenswrapper[4732]: I0402 13:50:54.545174 4732 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 02 13:50:54 crc kubenswrapper[4732]: I0402 13:50:54.666476 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/aaec8d0ffd277c0e93001246672220ba-var-log\") pod \"aaec8d0ffd277c0e93001246672220ba\" (UID: \"aaec8d0ffd277c0e93001246672220ba\") " Apr 02 13:50:54 crc kubenswrapper[4732]: I0402 13:50:54.666564 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/aaec8d0ffd277c0e93001246672220ba-resource-dir\") pod \"aaec8d0ffd277c0e93001246672220ba\" (UID: \"aaec8d0ffd277c0e93001246672220ba\") " Apr 02 13:50:54 crc kubenswrapper[4732]: I0402 13:50:54.666591 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aaec8d0ffd277c0e93001246672220ba-var-log" (OuterVolumeSpecName: "var-log") pod "aaec8d0ffd277c0e93001246672220ba" (UID: "aaec8d0ffd277c0e93001246672220ba"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 13:50:54 crc kubenswrapper[4732]: I0402 13:50:54.666663 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/aaec8d0ffd277c0e93001246672220ba-var-lock\") pod \"aaec8d0ffd277c0e93001246672220ba\" (UID: \"aaec8d0ffd277c0e93001246672220ba\") " Apr 02 13:50:54 crc kubenswrapper[4732]: I0402 13:50:54.666725 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aaec8d0ffd277c0e93001246672220ba-var-lock" (OuterVolumeSpecName: "var-lock") pod "aaec8d0ffd277c0e93001246672220ba" (UID: "aaec8d0ffd277c0e93001246672220ba"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 13:50:54 crc kubenswrapper[4732]: I0402 13:50:54.666726 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aaec8d0ffd277c0e93001246672220ba-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "aaec8d0ffd277c0e93001246672220ba" (UID: "aaec8d0ffd277c0e93001246672220ba"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 13:50:54 crc kubenswrapper[4732]: I0402 13:50:54.666780 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/aaec8d0ffd277c0e93001246672220ba-manifests\") pod \"aaec8d0ffd277c0e93001246672220ba\" (UID: \"aaec8d0ffd277c0e93001246672220ba\") " Apr 02 13:50:54 crc kubenswrapper[4732]: I0402 13:50:54.666805 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/aaec8d0ffd277c0e93001246672220ba-pod-resource-dir\") pod \"aaec8d0ffd277c0e93001246672220ba\" (UID: \"aaec8d0ffd277c0e93001246672220ba\") " Apr 02 13:50:54 crc kubenswrapper[4732]: I0402 13:50:54.666862 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aaec8d0ffd277c0e93001246672220ba-manifests" (OuterVolumeSpecName: "manifests") pod "aaec8d0ffd277c0e93001246672220ba" (UID: "aaec8d0ffd277c0e93001246672220ba"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 13:50:54 crc kubenswrapper[4732]: I0402 13:50:54.666990 4732 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/aaec8d0ffd277c0e93001246672220ba-resource-dir\") on node \"crc\" DevicePath \"\"" Apr 02 13:50:54 crc kubenswrapper[4732]: I0402 13:50:54.667003 4732 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/aaec8d0ffd277c0e93001246672220ba-var-lock\") on node \"crc\" DevicePath \"\"" Apr 02 13:50:54 crc kubenswrapper[4732]: I0402 13:50:54.667010 4732 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/aaec8d0ffd277c0e93001246672220ba-manifests\") on node \"crc\" DevicePath \"\"" Apr 02 13:50:54 crc kubenswrapper[4732]: I0402 13:50:54.667018 4732 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/aaec8d0ffd277c0e93001246672220ba-var-log\") on node \"crc\" DevicePath \"\"" Apr 02 13:50:54 crc kubenswrapper[4732]: I0402 13:50:54.674519 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aaec8d0ffd277c0e93001246672220ba-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "aaec8d0ffd277c0e93001246672220ba" (UID: "aaec8d0ffd277c0e93001246672220ba"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 13:50:54 crc kubenswrapper[4732]: I0402 13:50:54.687196 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aaec8d0ffd277c0e93001246672220ba" path="/var/lib/kubelet/pods/aaec8d0ffd277c0e93001246672220ba/volumes" Apr 02 13:50:54 crc kubenswrapper[4732]: I0402 13:50:54.719912 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_aaec8d0ffd277c0e93001246672220ba/startup-monitor/0.log" Apr 02 13:50:54 crc kubenswrapper[4732]: I0402 13:50:54.719969 4732 generic.go:334] "Generic (PLEG): container finished" podID="aaec8d0ffd277c0e93001246672220ba" containerID="05b73558d0a4805040005db74b72895321403993df5016a328226833ee7de701" exitCode=137 Apr 02 13:50:54 crc kubenswrapper[4732]: I0402 13:50:54.720013 4732 scope.go:117] "RemoveContainer" containerID="05b73558d0a4805040005db74b72895321403993df5016a328226833ee7de701" Apr 02 13:50:54 crc kubenswrapper[4732]: I0402 13:50:54.720072 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 02 13:50:54 crc kubenswrapper[4732]: I0402 13:50:54.736832 4732 scope.go:117] "RemoveContainer" containerID="05b73558d0a4805040005db74b72895321403993df5016a328226833ee7de701" Apr 02 13:50:54 crc kubenswrapper[4732]: E0402 13:50:54.737248 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05b73558d0a4805040005db74b72895321403993df5016a328226833ee7de701\": container with ID starting with 05b73558d0a4805040005db74b72895321403993df5016a328226833ee7de701 not found: ID does not exist" containerID="05b73558d0a4805040005db74b72895321403993df5016a328226833ee7de701" Apr 02 13:50:54 crc kubenswrapper[4732]: I0402 13:50:54.737283 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05b73558d0a4805040005db74b72895321403993df5016a328226833ee7de701"} err="failed to get container status \"05b73558d0a4805040005db74b72895321403993df5016a328226833ee7de701\": rpc error: code = NotFound desc = could not find container \"05b73558d0a4805040005db74b72895321403993df5016a328226833ee7de701\": container with ID starting with 05b73558d0a4805040005db74b72895321403993df5016a328226833ee7de701 not found: ID does not exist" Apr 02 13:50:54 crc kubenswrapper[4732]: I0402 13:50:54.768496 4732 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/aaec8d0ffd277c0e93001246672220ba-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Apr 02 13:50:56 crc kubenswrapper[4732]: I0402 13:50:56.640393 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29585624-8cm2s"] Apr 02 13:50:56 crc kubenswrapper[4732]: I0402 13:50:56.644683 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29585624-8cm2s"] Apr 02 13:50:56 crc kubenswrapper[4732]: I0402 
13:50:56.688563 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18370e02-d95b-4be8-a495-ee908136bee4" path="/var/lib/kubelet/pods/18370e02-d95b-4be8-a495-ee908136bee4/volumes" Apr 02 13:50:57 crc kubenswrapper[4732]: I0402 13:50:57.322190 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Apr 02 13:51:42 crc kubenswrapper[4732]: I0402 13:51:42.969573 4732 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 02 13:51:43 crc kubenswrapper[4732]: I0402 13:51:43.288711 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-xzg9j"] Apr 02 13:51:43 crc kubenswrapper[4732]: I0402 13:51:43.289005 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" podUID="ea530987-3884-4994-9574-b73fc76fcdde" containerName="openshift-apiserver" containerID="cri-o://97d6c3ee10c0b8caf6d9b9655805d29e3f2e2a039f4b7e5e1c23bc1c3e534800" gracePeriod=120 Apr 02 13:51:43 crc kubenswrapper[4732]: I0402 13:51:43.289142 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" podUID="ea530987-3884-4994-9574-b73fc76fcdde" containerName="openshift-apiserver-check-endpoints" containerID="cri-o://ccff9c2b817177ce6f647ac40187504767ebaef3aa02d263c169e6f13c255e1c" gracePeriod=120 Apr 02 13:51:44 crc kubenswrapper[4732]: I0402 13:51:44.010411 4732 generic.go:334] "Generic (PLEG): container finished" podID="ea530987-3884-4994-9574-b73fc76fcdde" containerID="ccff9c2b817177ce6f647ac40187504767ebaef3aa02d263c169e6f13c255e1c" exitCode=0 Apr 02 13:51:44 crc kubenswrapper[4732]: I0402 13:51:44.010515 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" 
event={"ID":"ea530987-3884-4994-9574-b73fc76fcdde","Type":"ContainerDied","Data":"ccff9c2b817177ce6f647ac40187504767ebaef3aa02d263c169e6f13c255e1c"} Apr 02 13:51:45 crc kubenswrapper[4732]: I0402 13:51:45.521372 4732 scope.go:117] "RemoveContainer" containerID="146571f443aca9aede42e319eb68c0aad3e5dca2be8b8406fde6882547e09f0a" Apr 02 13:51:46 crc kubenswrapper[4732]: I0402 13:51:46.901179 4732 patch_prober.go:28] interesting pod/apiserver-76f77b778f-xzg9j container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Apr 02 13:51:46 crc kubenswrapper[4732]: [+]log ok Apr 02 13:51:46 crc kubenswrapper[4732]: [+]etcd excluded: ok Apr 02 13:51:46 crc kubenswrapper[4732]: [+]etcd-readiness excluded: ok Apr 02 13:51:46 crc kubenswrapper[4732]: [+]poststarthook/start-apiserver-admission-initializer ok Apr 02 13:51:46 crc kubenswrapper[4732]: [+]informer-sync ok Apr 02 13:51:46 crc kubenswrapper[4732]: [+]poststarthook/generic-apiserver-start-informers ok Apr 02 13:51:46 crc kubenswrapper[4732]: [+]poststarthook/max-in-flight-filter ok Apr 02 13:51:46 crc kubenswrapper[4732]: [+]poststarthook/storage-object-count-tracker-hook ok Apr 02 13:51:46 crc kubenswrapper[4732]: [+]poststarthook/image.openshift.io-apiserver-caches ok Apr 02 13:51:46 crc kubenswrapper[4732]: [+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok Apr 02 13:51:46 crc kubenswrapper[4732]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Apr 02 13:51:46 crc kubenswrapper[4732]: [+]poststarthook/project.openshift.io-projectcache ok Apr 02 13:51:46 crc kubenswrapper[4732]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Apr 02 13:51:46 crc kubenswrapper[4732]: [+]poststarthook/openshift.io-startinformers ok Apr 02 13:51:46 crc kubenswrapper[4732]: [+]poststarthook/openshift.io-restmapperupdater ok Apr 02 13:51:46 crc kubenswrapper[4732]: 
[+]poststarthook/quota.openshift.io-clusterquotamapping ok Apr 02 13:51:46 crc kubenswrapper[4732]: [-]shutdown failed: reason withheld Apr 02 13:51:46 crc kubenswrapper[4732]: readyz check failed Apr 02 13:51:46 crc kubenswrapper[4732]: I0402 13:51:46.901240 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" podUID="ea530987-3884-4994-9574-b73fc76fcdde" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 02 13:51:51 crc kubenswrapper[4732]: I0402 13:51:51.901526 4732 patch_prober.go:28] interesting pod/apiserver-76f77b778f-xzg9j container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Apr 02 13:51:51 crc kubenswrapper[4732]: [+]log ok Apr 02 13:51:51 crc kubenswrapper[4732]: [+]etcd excluded: ok Apr 02 13:51:51 crc kubenswrapper[4732]: [+]etcd-readiness excluded: ok Apr 02 13:51:51 crc kubenswrapper[4732]: [+]poststarthook/start-apiserver-admission-initializer ok Apr 02 13:51:51 crc kubenswrapper[4732]: [+]informer-sync ok Apr 02 13:51:51 crc kubenswrapper[4732]: [+]poststarthook/generic-apiserver-start-informers ok Apr 02 13:51:51 crc kubenswrapper[4732]: [+]poststarthook/max-in-flight-filter ok Apr 02 13:51:51 crc kubenswrapper[4732]: [+]poststarthook/storage-object-count-tracker-hook ok Apr 02 13:51:51 crc kubenswrapper[4732]: [+]poststarthook/image.openshift.io-apiserver-caches ok Apr 02 13:51:51 crc kubenswrapper[4732]: [+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok Apr 02 13:51:51 crc kubenswrapper[4732]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Apr 02 13:51:51 crc kubenswrapper[4732]: [+]poststarthook/project.openshift.io-projectcache ok Apr 02 13:51:51 crc kubenswrapper[4732]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Apr 02 13:51:51 crc kubenswrapper[4732]: 
[+]poststarthook/openshift.io-startinformers ok Apr 02 13:51:51 crc kubenswrapper[4732]: [+]poststarthook/openshift.io-restmapperupdater ok Apr 02 13:51:51 crc kubenswrapper[4732]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Apr 02 13:51:51 crc kubenswrapper[4732]: [-]shutdown failed: reason withheld Apr 02 13:51:51 crc kubenswrapper[4732]: readyz check failed Apr 02 13:51:51 crc kubenswrapper[4732]: I0402 13:51:51.902750 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" podUID="ea530987-3884-4994-9574-b73fc76fcdde" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 02 13:51:56 crc kubenswrapper[4732]: I0402 13:51:56.904131 4732 patch_prober.go:28] interesting pod/apiserver-76f77b778f-xzg9j container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Apr 02 13:51:56 crc kubenswrapper[4732]: [+]log ok Apr 02 13:51:56 crc kubenswrapper[4732]: [+]etcd excluded: ok Apr 02 13:51:56 crc kubenswrapper[4732]: [+]etcd-readiness excluded: ok Apr 02 13:51:56 crc kubenswrapper[4732]: [+]poststarthook/start-apiserver-admission-initializer ok Apr 02 13:51:56 crc kubenswrapper[4732]: [+]informer-sync ok Apr 02 13:51:56 crc kubenswrapper[4732]: [+]poststarthook/generic-apiserver-start-informers ok Apr 02 13:51:56 crc kubenswrapper[4732]: [+]poststarthook/max-in-flight-filter ok Apr 02 13:51:56 crc kubenswrapper[4732]: [+]poststarthook/storage-object-count-tracker-hook ok Apr 02 13:51:56 crc kubenswrapper[4732]: [+]poststarthook/image.openshift.io-apiserver-caches ok Apr 02 13:51:56 crc kubenswrapper[4732]: [+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok Apr 02 13:51:56 crc kubenswrapper[4732]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Apr 02 13:51:56 crc kubenswrapper[4732]: 
[+]poststarthook/project.openshift.io-projectcache ok Apr 02 13:51:56 crc kubenswrapper[4732]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Apr 02 13:51:56 crc kubenswrapper[4732]: [+]poststarthook/openshift.io-startinformers ok Apr 02 13:51:56 crc kubenswrapper[4732]: [+]poststarthook/openshift.io-restmapperupdater ok Apr 02 13:51:56 crc kubenswrapper[4732]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Apr 02 13:51:56 crc kubenswrapper[4732]: [-]shutdown failed: reason withheld Apr 02 13:51:56 crc kubenswrapper[4732]: readyz check failed Apr 02 13:51:56 crc kubenswrapper[4732]: I0402 13:51:56.906958 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" podUID="ea530987-3884-4994-9574-b73fc76fcdde" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 02 13:51:56 crc kubenswrapper[4732]: I0402 13:51:56.907322 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" Apr 02 13:52:00 crc kubenswrapper[4732]: I0402 13:52:00.141442 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29585632-sh6n2"] Apr 02 13:52:00 crc kubenswrapper[4732]: E0402 13:52:00.142247 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c553e4fa-2f84-4e9c-be8f-bd59677b63b5" containerName="oc" Apr 02 13:52:00 crc kubenswrapper[4732]: I0402 13:52:00.142267 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="c553e4fa-2f84-4e9c-be8f-bd59677b63b5" containerName="oc" Apr 02 13:52:00 crc kubenswrapper[4732]: E0402 13:52:00.142287 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aaec8d0ffd277c0e93001246672220ba" containerName="startup-monitor" Apr 02 13:52:00 crc kubenswrapper[4732]: I0402 13:52:00.142295 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaec8d0ffd277c0e93001246672220ba" 
containerName="startup-monitor" Apr 02 13:52:00 crc kubenswrapper[4732]: I0402 13:52:00.142414 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="aaec8d0ffd277c0e93001246672220ba" containerName="startup-monitor" Apr 02 13:52:00 crc kubenswrapper[4732]: I0402 13:52:00.142432 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="c553e4fa-2f84-4e9c-be8f-bd59677b63b5" containerName="oc" Apr 02 13:52:00 crc kubenswrapper[4732]: I0402 13:52:00.143506 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585632-sh6n2" Apr 02 13:52:00 crc kubenswrapper[4732]: I0402 13:52:00.149695 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-69g42" Apr 02 13:52:00 crc kubenswrapper[4732]: I0402 13:52:00.149835 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 02 13:52:00 crc kubenswrapper[4732]: I0402 13:52:00.149866 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 02 13:52:00 crc kubenswrapper[4732]: I0402 13:52:00.154282 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585632-sh6n2"] Apr 02 13:52:00 crc kubenswrapper[4732]: I0402 13:52:00.204415 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pr8h\" (UniqueName: \"kubernetes.io/projected/bd8b784e-9585-4fa8-b133-7c9b77ff167c-kube-api-access-5pr8h\") pod \"auto-csr-approver-29585632-sh6n2\" (UID: \"bd8b784e-9585-4fa8-b133-7c9b77ff167c\") " pod="openshift-infra/auto-csr-approver-29585632-sh6n2" Apr 02 13:52:00 crc kubenswrapper[4732]: I0402 13:52:00.305364 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pr8h\" (UniqueName: 
\"kubernetes.io/projected/bd8b784e-9585-4fa8-b133-7c9b77ff167c-kube-api-access-5pr8h\") pod \"auto-csr-approver-29585632-sh6n2\" (UID: \"bd8b784e-9585-4fa8-b133-7c9b77ff167c\") " pod="openshift-infra/auto-csr-approver-29585632-sh6n2" Apr 02 13:52:00 crc kubenswrapper[4732]: I0402 13:52:00.324884 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pr8h\" (UniqueName: \"kubernetes.io/projected/bd8b784e-9585-4fa8-b133-7c9b77ff167c-kube-api-access-5pr8h\") pod \"auto-csr-approver-29585632-sh6n2\" (UID: \"bd8b784e-9585-4fa8-b133-7c9b77ff167c\") " pod="openshift-infra/auto-csr-approver-29585632-sh6n2" Apr 02 13:52:00 crc kubenswrapper[4732]: I0402 13:52:00.463884 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585632-sh6n2" Apr 02 13:52:00 crc kubenswrapper[4732]: I0402 13:52:00.636290 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585632-sh6n2"] Apr 02 13:52:00 crc kubenswrapper[4732]: I0402 13:52:00.640945 4732 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 02 13:52:01 crc kubenswrapper[4732]: I0402 13:52:01.117356 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585632-sh6n2" event={"ID":"bd8b784e-9585-4fa8-b133-7c9b77ff167c","Type":"ContainerStarted","Data":"f8446e22a3df907cb82cd3aaad953c2d3dff04c9c42be9e340557be352cf1739"} Apr 02 13:52:01 crc kubenswrapper[4732]: I0402 13:52:01.902730 4732 patch_prober.go:28] interesting pod/apiserver-76f77b778f-xzg9j container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Apr 02 13:52:01 crc kubenswrapper[4732]: [+]log ok Apr 02 13:52:01 crc kubenswrapper[4732]: [+]etcd excluded: ok Apr 02 13:52:01 crc kubenswrapper[4732]: [+]etcd-readiness excluded: ok Apr 02 13:52:01 
crc kubenswrapper[4732]: [+]poststarthook/start-apiserver-admission-initializer ok Apr 02 13:52:01 crc kubenswrapper[4732]: [+]informer-sync ok Apr 02 13:52:01 crc kubenswrapper[4732]: [+]poststarthook/generic-apiserver-start-informers ok Apr 02 13:52:01 crc kubenswrapper[4732]: [+]poststarthook/max-in-flight-filter ok Apr 02 13:52:01 crc kubenswrapper[4732]: [+]poststarthook/storage-object-count-tracker-hook ok Apr 02 13:52:01 crc kubenswrapper[4732]: [+]poststarthook/image.openshift.io-apiserver-caches ok Apr 02 13:52:01 crc kubenswrapper[4732]: [+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok Apr 02 13:52:01 crc kubenswrapper[4732]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Apr 02 13:52:01 crc kubenswrapper[4732]: [+]poststarthook/project.openshift.io-projectcache ok Apr 02 13:52:01 crc kubenswrapper[4732]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Apr 02 13:52:01 crc kubenswrapper[4732]: [+]poststarthook/openshift.io-startinformers ok Apr 02 13:52:01 crc kubenswrapper[4732]: [+]poststarthook/openshift.io-restmapperupdater ok Apr 02 13:52:01 crc kubenswrapper[4732]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Apr 02 13:52:01 crc kubenswrapper[4732]: [-]shutdown failed: reason withheld Apr 02 13:52:01 crc kubenswrapper[4732]: readyz check failed Apr 02 13:52:01 crc kubenswrapper[4732]: I0402 13:52:01.902788 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" podUID="ea530987-3884-4994-9574-b73fc76fcdde" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 02 13:52:02 crc kubenswrapper[4732]: I0402 13:52:02.124803 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585632-sh6n2" 
event={"ID":"bd8b784e-9585-4fa8-b133-7c9b77ff167c","Type":"ContainerStarted","Data":"0e8e27dfabd47f779aa28ec5352a03bfe37b30c693763287ea923bfc71287f19"} Apr 02 13:52:02 crc kubenswrapper[4732]: I0402 13:52:02.137587 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29585632-sh6n2" podStartSLOduration=0.995626878 podStartE2EDuration="2.137570533s" podCreationTimestamp="2026-04-02 13:52:00 +0000 UTC" firstStartedPulling="2026-04-02 13:52:00.640657941 +0000 UTC m=+877.545065494" lastFinishedPulling="2026-04-02 13:52:01.782601596 +0000 UTC m=+878.687009149" observedRunningTime="2026-04-02 13:52:02.136052882 +0000 UTC m=+879.040460435" watchObservedRunningTime="2026-04-02 13:52:02.137570533 +0000 UTC m=+879.041978086" Apr 02 13:52:03 crc kubenswrapper[4732]: I0402 13:52:03.131591 4732 generic.go:334] "Generic (PLEG): container finished" podID="bd8b784e-9585-4fa8-b133-7c9b77ff167c" containerID="0e8e27dfabd47f779aa28ec5352a03bfe37b30c693763287ea923bfc71287f19" exitCode=0 Apr 02 13:52:03 crc kubenswrapper[4732]: I0402 13:52:03.131652 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585632-sh6n2" event={"ID":"bd8b784e-9585-4fa8-b133-7c9b77ff167c","Type":"ContainerDied","Data":"0e8e27dfabd47f779aa28ec5352a03bfe37b30c693763287ea923bfc71287f19"} Apr 02 13:52:04 crc kubenswrapper[4732]: I0402 13:52:04.353093 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29585632-sh6n2"
Apr 02 13:52:04 crc kubenswrapper[4732]: I0402 13:52:04.460761 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5pr8h\" (UniqueName: \"kubernetes.io/projected/bd8b784e-9585-4fa8-b133-7c9b77ff167c-kube-api-access-5pr8h\") pod \"bd8b784e-9585-4fa8-b133-7c9b77ff167c\" (UID: \"bd8b784e-9585-4fa8-b133-7c9b77ff167c\") "
Apr 02 13:52:04 crc kubenswrapper[4732]: I0402 13:52:04.466976 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd8b784e-9585-4fa8-b133-7c9b77ff167c-kube-api-access-5pr8h" (OuterVolumeSpecName: "kube-api-access-5pr8h") pod "bd8b784e-9585-4fa8-b133-7c9b77ff167c" (UID: "bd8b784e-9585-4fa8-b133-7c9b77ff167c"). InnerVolumeSpecName "kube-api-access-5pr8h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 13:52:04 crc kubenswrapper[4732]: I0402 13:52:04.561966 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5pr8h\" (UniqueName: \"kubernetes.io/projected/bd8b784e-9585-4fa8-b133-7c9b77ff167c-kube-api-access-5pr8h\") on node \"crc\" DevicePath \"\""
Apr 02 13:52:05 crc kubenswrapper[4732]: I0402 13:52:05.166873 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585632-sh6n2" event={"ID":"bd8b784e-9585-4fa8-b133-7c9b77ff167c","Type":"ContainerDied","Data":"f8446e22a3df907cb82cd3aaad953c2d3dff04c9c42be9e340557be352cf1739"}
Apr 02 13:52:05 crc kubenswrapper[4732]: I0402 13:52:05.167238 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f8446e22a3df907cb82cd3aaad953c2d3dff04c9c42be9e340557be352cf1739"
Apr 02 13:52:05 crc kubenswrapper[4732]: I0402 13:52:05.166963 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585632-sh6n2"
Apr 02 13:52:05 crc kubenswrapper[4732]: I0402 13:52:05.193812 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29585626-cqxvf"]
Apr 02 13:52:05 crc kubenswrapper[4732]: I0402 13:52:05.196679 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29585626-cqxvf"]
Apr 02 13:52:06 crc kubenswrapper[4732]: I0402 13:52:06.690073 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="068f0778-94e5-4eca-b150-9ba914b8b879" path="/var/lib/kubelet/pods/068f0778-94e5-4eca-b150-9ba914b8b879/volumes"
Apr 02 13:52:06 crc kubenswrapper[4732]: I0402 13:52:06.903788 4732 patch_prober.go:28] interesting pod/apiserver-76f77b778f-xzg9j container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Apr 02 13:52:06 crc kubenswrapper[4732]: [+]log ok
Apr 02 13:52:06 crc kubenswrapper[4732]: [+]etcd excluded: ok
Apr 02 13:52:06 crc kubenswrapper[4732]: [+]etcd-readiness excluded: ok
Apr 02 13:52:06 crc kubenswrapper[4732]: [+]poststarthook/start-apiserver-admission-initializer ok
Apr 02 13:52:06 crc kubenswrapper[4732]: [+]informer-sync ok
Apr 02 13:52:06 crc kubenswrapper[4732]: [+]poststarthook/generic-apiserver-start-informers ok
Apr 02 13:52:06 crc kubenswrapper[4732]: [+]poststarthook/max-in-flight-filter ok
Apr 02 13:52:06 crc kubenswrapper[4732]: [+]poststarthook/storage-object-count-tracker-hook ok
Apr 02 13:52:06 crc kubenswrapper[4732]: [+]poststarthook/image.openshift.io-apiserver-caches ok
Apr 02 13:52:06 crc kubenswrapper[4732]: [+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok
Apr 02 13:52:06 crc kubenswrapper[4732]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok
Apr 02 13:52:06 crc kubenswrapper[4732]: [+]poststarthook/project.openshift.io-projectcache ok
Apr 02 13:52:06 crc kubenswrapper[4732]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok
Apr 02 13:52:06 crc kubenswrapper[4732]: [+]poststarthook/openshift.io-startinformers ok
Apr 02 13:52:06 crc kubenswrapper[4732]: [+]poststarthook/openshift.io-restmapperupdater ok
Apr 02 13:52:06 crc kubenswrapper[4732]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Apr 02 13:52:06 crc kubenswrapper[4732]: [-]shutdown failed: reason withheld
Apr 02 13:52:06 crc kubenswrapper[4732]: readyz check failed
Apr 02 13:52:06 crc kubenswrapper[4732]: I0402 13:52:06.903844 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" podUID="ea530987-3884-4994-9574-b73fc76fcdde" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 02 13:52:11 crc kubenswrapper[4732]: I0402 13:52:11.905167 4732 patch_prober.go:28] interesting pod/apiserver-76f77b778f-xzg9j container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Apr 02 13:52:11 crc kubenswrapper[4732]: [+]log ok
Apr 02 13:52:11 crc kubenswrapper[4732]: [+]etcd excluded: ok
Apr 02 13:52:11 crc kubenswrapper[4732]: [+]etcd-readiness excluded: ok
Apr 02 13:52:11 crc kubenswrapper[4732]: [+]poststarthook/start-apiserver-admission-initializer ok
Apr 02 13:52:11 crc kubenswrapper[4732]: [+]informer-sync ok
Apr 02 13:52:11 crc kubenswrapper[4732]: [+]poststarthook/generic-apiserver-start-informers ok
Apr 02 13:52:11 crc kubenswrapper[4732]: [+]poststarthook/max-in-flight-filter ok
Apr 02 13:52:11 crc kubenswrapper[4732]: [+]poststarthook/storage-object-count-tracker-hook ok
Apr 02 13:52:11 crc kubenswrapper[4732]: [+]poststarthook/image.openshift.io-apiserver-caches ok
Apr 02 13:52:11 crc kubenswrapper[4732]: [+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok
Apr 02 13:52:11 crc kubenswrapper[4732]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok
Apr 02 13:52:11 crc kubenswrapper[4732]: [+]poststarthook/project.openshift.io-projectcache ok
Apr 02 13:52:11 crc kubenswrapper[4732]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok
Apr 02 13:52:11 crc kubenswrapper[4732]: [+]poststarthook/openshift.io-startinformers ok
Apr 02 13:52:11 crc kubenswrapper[4732]: [+]poststarthook/openshift.io-restmapperupdater ok
Apr 02 13:52:11 crc kubenswrapper[4732]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Apr 02 13:52:11 crc kubenswrapper[4732]: [-]shutdown failed: reason withheld
Apr 02 13:52:11 crc kubenswrapper[4732]: readyz check failed
Apr 02 13:52:11 crc kubenswrapper[4732]: I0402 13:52:11.905588 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" podUID="ea530987-3884-4994-9574-b73fc76fcdde" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 02 13:52:16 crc kubenswrapper[4732]: I0402 13:52:16.902425 4732 patch_prober.go:28] interesting pod/apiserver-76f77b778f-xzg9j container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Apr 02 13:52:16 crc kubenswrapper[4732]: [+]log ok
Apr 02 13:52:16 crc kubenswrapper[4732]: [+]etcd excluded: ok
Apr 02 13:52:16 crc kubenswrapper[4732]: [+]etcd-readiness excluded: ok
Apr 02 13:52:16 crc kubenswrapper[4732]: [+]poststarthook/start-apiserver-admission-initializer ok
Apr 02 13:52:16 crc kubenswrapper[4732]: [+]informer-sync ok
Apr 02 13:52:16 crc kubenswrapper[4732]: [+]poststarthook/generic-apiserver-start-informers ok
Apr 02 13:52:16 crc kubenswrapper[4732]: [+]poststarthook/max-in-flight-filter ok
Apr 02 13:52:16 crc kubenswrapper[4732]: [+]poststarthook/storage-object-count-tracker-hook ok
Apr 02 13:52:16 crc kubenswrapper[4732]: [+]poststarthook/image.openshift.io-apiserver-caches ok
Apr 02 13:52:16 crc kubenswrapper[4732]: [+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok
Apr 02 13:52:16 crc kubenswrapper[4732]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok
Apr 02 13:52:16 crc kubenswrapper[4732]: [+]poststarthook/project.openshift.io-projectcache ok
Apr 02 13:52:16 crc kubenswrapper[4732]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok
Apr 02 13:52:16 crc kubenswrapper[4732]: [+]poststarthook/openshift.io-startinformers ok
Apr 02 13:52:16 crc kubenswrapper[4732]: [+]poststarthook/openshift.io-restmapperupdater ok
Apr 02 13:52:16 crc kubenswrapper[4732]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Apr 02 13:52:16 crc kubenswrapper[4732]: [-]shutdown failed: reason withheld
Apr 02 13:52:16 crc kubenswrapper[4732]: readyz check failed
Apr 02 13:52:16 crc kubenswrapper[4732]: I0402 13:52:16.902801 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" podUID="ea530987-3884-4994-9574-b73fc76fcdde" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 02 13:52:21 crc kubenswrapper[4732]: I0402 13:52:21.902196 4732 patch_prober.go:28] interesting pod/apiserver-76f77b778f-xzg9j container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Apr 02 13:52:21 crc kubenswrapper[4732]: [+]log ok
Apr 02 13:52:21 crc kubenswrapper[4732]: [+]etcd excluded: ok
Apr 02 13:52:21 crc kubenswrapper[4732]: [+]etcd-readiness excluded: ok
Apr 02 13:52:21 crc kubenswrapper[4732]: [+]poststarthook/start-apiserver-admission-initializer ok
Apr 02 13:52:21 crc kubenswrapper[4732]: [+]informer-sync ok
Apr 02 13:52:21 crc kubenswrapper[4732]: [+]poststarthook/generic-apiserver-start-informers ok
Apr 02 13:52:21 crc kubenswrapper[4732]: [+]poststarthook/max-in-flight-filter ok
Apr 02 13:52:21 crc kubenswrapper[4732]: [+]poststarthook/storage-object-count-tracker-hook ok
Apr 02 13:52:21 crc kubenswrapper[4732]: [+]poststarthook/image.openshift.io-apiserver-caches ok
Apr 02 13:52:21 crc kubenswrapper[4732]: [+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok
Apr 02 13:52:21 crc kubenswrapper[4732]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok
Apr 02 13:52:21 crc kubenswrapper[4732]: [+]poststarthook/project.openshift.io-projectcache ok
Apr 02 13:52:21 crc kubenswrapper[4732]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok
Apr 02 13:52:21 crc kubenswrapper[4732]: [+]poststarthook/openshift.io-startinformers ok
Apr 02 13:52:21 crc kubenswrapper[4732]: [+]poststarthook/openshift.io-restmapperupdater ok
Apr 02 13:52:21 crc kubenswrapper[4732]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Apr 02 13:52:21 crc kubenswrapper[4732]: [-]shutdown failed: reason withheld
Apr 02 13:52:21 crc kubenswrapper[4732]: readyz check failed
Apr 02 13:52:21 crc kubenswrapper[4732]: I0402 13:52:21.902572 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" podUID="ea530987-3884-4994-9574-b73fc76fcdde" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 02 13:52:26 crc kubenswrapper[4732]: I0402 13:52:26.903437 4732 patch_prober.go:28] interesting pod/apiserver-76f77b778f-xzg9j container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Apr 02 13:52:26 crc kubenswrapper[4732]: [+]log ok
Apr 02 13:52:26 crc kubenswrapper[4732]: [+]etcd excluded: ok
Apr 02 13:52:26 crc kubenswrapper[4732]: [+]etcd-readiness excluded: ok
Apr 02 13:52:26 crc kubenswrapper[4732]: [+]poststarthook/start-apiserver-admission-initializer ok
Apr 02 13:52:26 crc kubenswrapper[4732]: [+]informer-sync ok
Apr 02 13:52:26 crc kubenswrapper[4732]: [+]poststarthook/generic-apiserver-start-informers ok
Apr 02 13:52:26 crc kubenswrapper[4732]: [+]poststarthook/max-in-flight-filter ok
Apr 02 13:52:26 crc kubenswrapper[4732]: [+]poststarthook/storage-object-count-tracker-hook ok
Apr 02 13:52:26 crc kubenswrapper[4732]: [+]poststarthook/image.openshift.io-apiserver-caches ok
Apr 02 13:52:26 crc kubenswrapper[4732]: [+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok
Apr 02 13:52:26 crc kubenswrapper[4732]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok
Apr 02 13:52:26 crc kubenswrapper[4732]: [+]poststarthook/project.openshift.io-projectcache ok
Apr 02 13:52:26 crc kubenswrapper[4732]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok
Apr 02 13:52:26 crc kubenswrapper[4732]: [+]poststarthook/openshift.io-startinformers ok
Apr 02 13:52:26 crc kubenswrapper[4732]: [+]poststarthook/openshift.io-restmapperupdater ok
Apr 02 13:52:26 crc kubenswrapper[4732]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Apr 02 13:52:26 crc kubenswrapper[4732]: [-]shutdown failed: reason withheld
Apr 02 13:52:26 crc kubenswrapper[4732]: readyz check failed
Apr 02 13:52:26 crc kubenswrapper[4732]: I0402 13:52:26.903833 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" podUID="ea530987-3884-4994-9574-b73fc76fcdde" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 02 13:52:31 crc kubenswrapper[4732]: I0402 13:52:31.901494 4732 patch_prober.go:28] interesting pod/apiserver-76f77b778f-xzg9j container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Apr 02 13:52:31 crc kubenswrapper[4732]: [+]log ok
Apr 02 13:52:31 crc kubenswrapper[4732]: [+]etcd excluded: ok
Apr 02 13:52:31 crc kubenswrapper[4732]: [+]etcd-readiness excluded: ok
Apr 02 13:52:31 crc kubenswrapper[4732]: [+]poststarthook/start-apiserver-admission-initializer ok
Apr 02 13:52:31 crc kubenswrapper[4732]: [+]informer-sync ok
Apr 02 13:52:31 crc kubenswrapper[4732]: [+]poststarthook/generic-apiserver-start-informers ok
Apr 02 13:52:31 crc kubenswrapper[4732]: [+]poststarthook/max-in-flight-filter ok
Apr 02 13:52:31 crc kubenswrapper[4732]: [+]poststarthook/storage-object-count-tracker-hook ok
Apr 02 13:52:31 crc kubenswrapper[4732]: [+]poststarthook/image.openshift.io-apiserver-caches ok
Apr 02 13:52:31 crc kubenswrapper[4732]: [+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok
Apr 02 13:52:31 crc kubenswrapper[4732]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok
Apr 02 13:52:31 crc kubenswrapper[4732]: [+]poststarthook/project.openshift.io-projectcache ok
Apr 02 13:52:31 crc kubenswrapper[4732]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok
Apr 02 13:52:31 crc kubenswrapper[4732]: [+]poststarthook/openshift.io-startinformers ok
Apr 02 13:52:31 crc kubenswrapper[4732]: [+]poststarthook/openshift.io-restmapperupdater ok
Apr 02 13:52:31 crc kubenswrapper[4732]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Apr 02 13:52:31 crc kubenswrapper[4732]: [-]shutdown failed: reason withheld
Apr 02 13:52:31 crc kubenswrapper[4732]: readyz check failed
Apr 02 13:52:31 crc kubenswrapper[4732]: I0402 13:52:31.901935 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" podUID="ea530987-3884-4994-9574-b73fc76fcdde" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 02 13:52:31 crc kubenswrapper[4732]: I0402 13:52:31.925011 4732 patch_prober.go:28] interesting pod/machine-config-daemon-6vtmw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Apr 02 13:52:31 crc kubenswrapper[4732]: I0402 13:52:31.925081 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Apr 02 13:52:36 crc kubenswrapper[4732]: I0402 13:52:36.897873 4732 patch_prober.go:28] interesting pod/apiserver-76f77b778f-xzg9j container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body=
Apr 02 13:52:36 crc kubenswrapper[4732]: I0402 13:52:36.898423 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" podUID="ea530987-3884-4994-9574-b73fc76fcdde" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.34:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.217.0.34:8443: connect: connection refused"
Apr 02 13:52:41 crc kubenswrapper[4732]: I0402 13:52:41.897305 4732 patch_prober.go:28] interesting pod/apiserver-76f77b778f-xzg9j container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body=
Apr 02 13:52:41 crc kubenswrapper[4732]: I0402 13:52:41.897366 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" podUID="ea530987-3884-4994-9574-b73fc76fcdde" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.34:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.217.0.34:8443: connect: connection refused"
Apr 02 13:52:45 crc kubenswrapper[4732]: I0402 13:52:45.594850 4732 scope.go:117] "RemoveContainer" containerID="e42ab516580b36a04af43f8a58d74b6d65395a948ab9bd93f4a6d73b00933406"
Apr 02 13:52:45 crc kubenswrapper[4732]: I0402 13:52:45.636772 4732 scope.go:117] "RemoveContainer" containerID="a14d376cb655f7984c3fbe269826d57ea3e936ded391d3fbb9394a6b7960dad5"
Apr 02 13:52:46 crc kubenswrapper[4732]: I0402 13:52:46.897108 4732 patch_prober.go:28] interesting pod/apiserver-76f77b778f-xzg9j container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body=
Apr 02 13:52:46 crc kubenswrapper[4732]: I0402 13:52:46.897183 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" podUID="ea530987-3884-4994-9574-b73fc76fcdde" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.34:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.217.0.34:8443: connect: connection refused"
Apr 02 13:52:51 crc kubenswrapper[4732]: I0402 13:52:51.898045 4732 patch_prober.go:28] interesting pod/apiserver-76f77b778f-xzg9j container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body=
Apr 02 13:52:51 crc kubenswrapper[4732]: I0402 13:52:51.898273 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" podUID="ea530987-3884-4994-9574-b73fc76fcdde" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.34:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.217.0.34:8443: connect: connection refused"
Apr 02 13:52:56 crc kubenswrapper[4732]: I0402 13:52:56.898039 4732 patch_prober.go:28] interesting pod/apiserver-76f77b778f-xzg9j container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body=
Apr 02 13:52:56 crc kubenswrapper[4732]: I0402 13:52:56.898566 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" podUID="ea530987-3884-4994-9574-b73fc76fcdde" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.34:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.217.0.34:8443: connect: connection refused"
Apr 02 13:52:58 crc kubenswrapper[4732]: I0402 13:52:58.354678 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-v9s5t"]
Apr 02 13:52:58 crc kubenswrapper[4732]: E0402 13:52:58.355159 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd8b784e-9585-4fa8-b133-7c9b77ff167c" containerName="oc"
Apr 02 13:52:58 crc kubenswrapper[4732]: I0402 13:52:58.355213 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd8b784e-9585-4fa8-b133-7c9b77ff167c" containerName="oc"
Apr 02 13:52:58 crc kubenswrapper[4732]: I0402 13:52:58.355323 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd8b784e-9585-4fa8-b133-7c9b77ff167c" containerName="oc"
Apr 02 13:52:58 crc kubenswrapper[4732]: I0402 13:52:58.355704 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-v9s5t"
Apr 02 13:52:58 crc kubenswrapper[4732]: I0402 13:52:58.358853 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Apr 02 13:52:58 crc kubenswrapper[4732]: I0402 13:52:58.359158 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Apr 02 13:52:58 crc kubenswrapper[4732]: I0402 13:52:58.359391 4732 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-dhvw9"
Apr 02 13:52:58 crc kubenswrapper[4732]: I0402 13:52:58.363142 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-pnknx"]
Apr 02 13:52:58 crc kubenswrapper[4732]: I0402 13:52:58.364096 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-pnknx"
Apr 02 13:52:58 crc kubenswrapper[4732]: I0402 13:52:58.367286 4732 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-b5xn8"
Apr 02 13:52:58 crc kubenswrapper[4732]: I0402 13:52:58.367752 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-v9s5t"]
Apr 02 13:52:58 crc kubenswrapper[4732]: I0402 13:52:58.390417 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-fhv6z"]
Apr 02 13:52:58 crc kubenswrapper[4732]: I0402 13:52:58.391230 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-fhv6z"
Apr 02 13:52:58 crc kubenswrapper[4732]: I0402 13:52:58.394209 4732 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-pqmp8"
Apr 02 13:52:58 crc kubenswrapper[4732]: I0402 13:52:58.407815 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-pnknx"]
Apr 02 13:52:58 crc kubenswrapper[4732]: I0402 13:52:58.410689 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-fhv6z"]
Apr 02 13:52:58 crc kubenswrapper[4732]: I0402 13:52:58.522801 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nj7fv\" (UniqueName: \"kubernetes.io/projected/4bbf6f84-13bb-4562-af35-10ab372d6580-kube-api-access-nj7fv\") pod \"cert-manager-webhook-687f57d79b-fhv6z\" (UID: \"4bbf6f84-13bb-4562-af35-10ab372d6580\") " pod="cert-manager/cert-manager-webhook-687f57d79b-fhv6z"
Apr 02 13:52:58 crc kubenswrapper[4732]: I0402 13:52:58.522939 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrwsq\" (UniqueName: \"kubernetes.io/projected/11b5be71-32b7-43a3-bf27-d6b1c73844c6-kube-api-access-jrwsq\") pod \"cert-manager-cainjector-cf98fcc89-v9s5t\" (UID: \"11b5be71-32b7-43a3-bf27-d6b1c73844c6\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-v9s5t"
Apr 02 13:52:58 crc kubenswrapper[4732]: I0402 13:52:58.523092 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gj7dw\" (UniqueName: \"kubernetes.io/projected/fc67973f-1e3f-4aea-bf48-5914e7c8ddbb-kube-api-access-gj7dw\") pod \"cert-manager-858654f9db-pnknx\" (UID: \"fc67973f-1e3f-4aea-bf48-5914e7c8ddbb\") " pod="cert-manager/cert-manager-858654f9db-pnknx"
Apr 02 13:52:58 crc kubenswrapper[4732]: I0402 13:52:58.624183 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nj7fv\" (UniqueName: \"kubernetes.io/projected/4bbf6f84-13bb-4562-af35-10ab372d6580-kube-api-access-nj7fv\") pod \"cert-manager-webhook-687f57d79b-fhv6z\" (UID: \"4bbf6f84-13bb-4562-af35-10ab372d6580\") " pod="cert-manager/cert-manager-webhook-687f57d79b-fhv6z"
Apr 02 13:52:58 crc kubenswrapper[4732]: I0402 13:52:58.624244 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrwsq\" (UniqueName: \"kubernetes.io/projected/11b5be71-32b7-43a3-bf27-d6b1c73844c6-kube-api-access-jrwsq\") pod \"cert-manager-cainjector-cf98fcc89-v9s5t\" (UID: \"11b5be71-32b7-43a3-bf27-d6b1c73844c6\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-v9s5t"
Apr 02 13:52:58 crc kubenswrapper[4732]: I0402 13:52:58.624292 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gj7dw\" (UniqueName: \"kubernetes.io/projected/fc67973f-1e3f-4aea-bf48-5914e7c8ddbb-kube-api-access-gj7dw\") pod \"cert-manager-858654f9db-pnknx\" (UID: \"fc67973f-1e3f-4aea-bf48-5914e7c8ddbb\") " pod="cert-manager/cert-manager-858654f9db-pnknx"
Apr 02 13:52:58 crc kubenswrapper[4732]: I0402 13:52:58.647010 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gj7dw\" (UniqueName: \"kubernetes.io/projected/fc67973f-1e3f-4aea-bf48-5914e7c8ddbb-kube-api-access-gj7dw\") pod \"cert-manager-858654f9db-pnknx\" (UID: \"fc67973f-1e3f-4aea-bf48-5914e7c8ddbb\") " pod="cert-manager/cert-manager-858654f9db-pnknx"
Apr 02 13:52:58 crc kubenswrapper[4732]: I0402 13:52:58.647637 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nj7fv\" (UniqueName: \"kubernetes.io/projected/4bbf6f84-13bb-4562-af35-10ab372d6580-kube-api-access-nj7fv\") pod \"cert-manager-webhook-687f57d79b-fhv6z\" (UID: \"4bbf6f84-13bb-4562-af35-10ab372d6580\") " pod="cert-manager/cert-manager-webhook-687f57d79b-fhv6z"
Apr 02 13:52:58 crc kubenswrapper[4732]: I0402 13:52:58.648566 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrwsq\" (UniqueName: \"kubernetes.io/projected/11b5be71-32b7-43a3-bf27-d6b1c73844c6-kube-api-access-jrwsq\") pod \"cert-manager-cainjector-cf98fcc89-v9s5t\" (UID: \"11b5be71-32b7-43a3-bf27-d6b1c73844c6\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-v9s5t"
Apr 02 13:52:58 crc kubenswrapper[4732]: I0402 13:52:58.680661 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-v9s5t"
Apr 02 13:52:58 crc kubenswrapper[4732]: I0402 13:52:58.690307 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-pnknx"
Apr 02 13:52:58 crc kubenswrapper[4732]: I0402 13:52:58.708477 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-fhv6z"
Apr 02 13:52:58 crc kubenswrapper[4732]: I0402 13:52:58.878402 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-v9s5t"]
Apr 02 13:52:58 crc kubenswrapper[4732]: I0402 13:52:58.907774 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-fhv6z"]
Apr 02 13:52:58 crc kubenswrapper[4732]: I0402 13:52:58.942442 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-pnknx"]
Apr 02 13:52:58 crc kubenswrapper[4732]: W0402 13:52:58.943422 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc67973f_1e3f_4aea_bf48_5914e7c8ddbb.slice/crio-2b0ed8cc771cba06762557b03bb09851e73d532063bac0e25d6fc4473070c9a1 WatchSource:0}: Error finding container 2b0ed8cc771cba06762557b03bb09851e73d532063bac0e25d6fc4473070c9a1: Status 404 returned error can't find the container with id 2b0ed8cc771cba06762557b03bb09851e73d532063bac0e25d6fc4473070c9a1
Apr 02 13:52:59 crc kubenswrapper[4732]: I0402 13:52:59.499304 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-v9s5t" event={"ID":"11b5be71-32b7-43a3-bf27-d6b1c73844c6","Type":"ContainerStarted","Data":"629b188332017e8934e3ad359c63d4612c60ecea6456a172865af2a8351cf665"}
Apr 02 13:52:59 crc kubenswrapper[4732]: I0402 13:52:59.501460 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-fhv6z" event={"ID":"4bbf6f84-13bb-4562-af35-10ab372d6580","Type":"ContainerStarted","Data":"011ed5ae71cc9176a041890698db0a01a93155f59a9386e092e4fb0f01b72aaa"}
Apr 02 13:52:59 crc kubenswrapper[4732]: I0402 13:52:59.502722 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-pnknx" event={"ID":"fc67973f-1e3f-4aea-bf48-5914e7c8ddbb","Type":"ContainerStarted","Data":"2b0ed8cc771cba06762557b03bb09851e73d532063bac0e25d6fc4473070c9a1"}
Apr 02 13:53:01 crc kubenswrapper[4732]: I0402 13:53:01.897206 4732 patch_prober.go:28] interesting pod/apiserver-76f77b778f-xzg9j container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body=
Apr 02 13:53:01 crc kubenswrapper[4732]: I0402 13:53:01.897270 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" podUID="ea530987-3884-4994-9574-b73fc76fcdde" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.34:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.217.0.34:8443: connect: connection refused"
Apr 02 13:53:01 crc kubenswrapper[4732]: I0402 13:53:01.925008 4732 patch_prober.go:28] interesting pod/machine-config-daemon-6vtmw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Apr 02 13:53:01 crc kubenswrapper[4732]: I0402 13:53:01.925072 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Apr 02 13:53:03 crc kubenswrapper[4732]: I0402 13:53:03.531007 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-v9s5t" event={"ID":"11b5be71-32b7-43a3-bf27-d6b1c73844c6","Type":"ContainerStarted","Data":"e8591e623eaa2a62222ed5ab10cf0ce4b84eaccbb0f235325e54096cbbec3551"}
Apr 02 13:53:03 crc kubenswrapper[4732]: I0402 13:53:03.532166 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-fhv6z" event={"ID":"4bbf6f84-13bb-4562-af35-10ab372d6580","Type":"ContainerStarted","Data":"78f499db97f362f45bca98f58f5600df07584af01c54a8bd1e81f8b215d64cde"}
Apr 02 13:53:03 crc kubenswrapper[4732]: I0402 13:53:03.532264 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-fhv6z"
Apr 02 13:53:03 crc kubenswrapper[4732]: I0402 13:53:03.533439 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-pnknx" event={"ID":"fc67973f-1e3f-4aea-bf48-5914e7c8ddbb","Type":"ContainerStarted","Data":"324f1e9735a7d0c750c3c94440499c4dc2cb2a53f04aaaabc425e7f70a785711"}
Apr 02 13:53:03 crc kubenswrapper[4732]: I0402 13:53:03.549983 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-v9s5t" podStartSLOduration=1.455242057 podStartE2EDuration="5.549956915s" podCreationTimestamp="2026-04-02 13:52:58 +0000 UTC" firstStartedPulling="2026-04-02 13:52:58.884535427 +0000 UTC m=+935.788942980" lastFinishedPulling="2026-04-02 13:53:02.979250285 +0000 UTC m=+939.883657838" observedRunningTime="2026-04-02 13:53:03.543725496 +0000 UTC m=+940.448133049" watchObservedRunningTime="2026-04-02 13:53:03.549956915 +0000 UTC m=+940.454364508"
Apr 02 13:53:03 crc kubenswrapper[4732]: I0402 13:53:03.575717 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-fhv6z" podStartSLOduration=1.5089487400000001 podStartE2EDuration="5.575695171s" podCreationTimestamp="2026-04-02 13:52:58 +0000 UTC" firstStartedPulling="2026-04-02 13:52:58.912948446 +0000 UTC m=+935.817355999" lastFinishedPulling="2026-04-02 13:53:02.979694877 +0000 UTC m=+939.884102430" observedRunningTime="2026-04-02 13:53:03.566710218 +0000 UTC m=+940.471117881" watchObservedRunningTime="2026-04-02 13:53:03.575695171 +0000 UTC m=+940.480102734"
Apr 02 13:53:03 crc kubenswrapper[4732]: I0402 13:53:03.588039 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-pnknx" podStartSLOduration=1.537854802 podStartE2EDuration="5.588017824s" podCreationTimestamp="2026-04-02 13:52:58 +0000 UTC" firstStartedPulling="2026-04-02 13:52:58.945127926 +0000 UTC m=+935.849535479" lastFinishedPulling="2026-04-02 13:53:02.995290938 +0000 UTC m=+939.899698501" observedRunningTime="2026-04-02 13:53:03.583357638 +0000 UTC m=+940.487765201" watchObservedRunningTime="2026-04-02 13:53:03.588017824 +0000 UTC m=+940.492425377"
Apr 02 13:53:06 crc kubenswrapper[4732]: I0402 13:53:06.897403 4732 patch_prober.go:28] interesting pod/apiserver-76f77b778f-xzg9j container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body=
Apr 02 13:53:06 crc kubenswrapper[4732]: I0402 13:53:06.897953 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" podUID="ea530987-3884-4994-9574-b73fc76fcdde" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.34:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.217.0.34:8443: connect: connection refused"
Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.038382 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-8qmgp"]
Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.039358 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" podUID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" containerName="ovn-controller" containerID="cri-o://4e0b635201003f7f1a0fc544667e8376b4283c202dde45398518f99b96ce2c4f" gracePeriod=30
Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.039456 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" podUID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" containerName="nbdb" containerID="cri-o://9f988d8b8bd1da89ec27da36eafa964c8d1b06874110b4791aeadc05cb297b04" gracePeriod=30
Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.039540 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" podUID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" containerName="kube-rbac-proxy-node" containerID="cri-o://dde53c6f70593ab376e01989eadf8b52b7010c91ca4be564e5e36a781f1333f7" gracePeriod=30
Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.039545 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" podUID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" containerName="northd" containerID="cri-o://09f4ea085c7a422d9d9378925b89d18cd60661c399eb45f69d1e2d39c3aa2191" gracePeriod=30
Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.039605 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" podUID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" containerName="sbdb" containerID="cri-o://2b9f2f9d419baeac3be35c4c0c40c012b721c99c2f89fb2638c635c41c5e93d7" gracePeriod=30
Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.039646 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" podUID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" containerName="ovn-acl-logging" containerID="cri-o://39950e10da9295ee76088f08ae9d4977f5ff9f8d8440188f777855f40fb8049e" gracePeriod=30
Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.039503 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" podUID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://975b73da57fe61d6945d87d2a8cb9502c67a6d8fe43a80722342271e2a251f39" gracePeriod=30
Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.079838 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" podUID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" containerName="ovnkube-controller" containerID="cri-o://3e2ff95a85f25b815a62b7e159ba1b483e166fb7951d270c49c1d1cd5ba86e09" gracePeriod=30
Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.569032 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-s52gj_ad206957-df5c-4b3e-bd35-e798a07d2f4e/kube-multus/2.log"
Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.569770 4732 log.go:25] "Finished
parsing log file" path="/var/log/pods/openshift-multus_multus-s52gj_ad206957-df5c-4b3e-bd35-e798a07d2f4e/kube-multus/1.log" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.569937 4732 generic.go:334] "Generic (PLEG): container finished" podID="ad206957-df5c-4b3e-bd35-e798a07d2f4e" containerID="820f77a95a045dd94c63cb41cc026f01d115283a3748994d26bdeafeace13fd8" exitCode=2 Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.570002 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-s52gj" event={"ID":"ad206957-df5c-4b3e-bd35-e798a07d2f4e","Type":"ContainerDied","Data":"820f77a95a045dd94c63cb41cc026f01d115283a3748994d26bdeafeace13fd8"} Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.570413 4732 scope.go:117] "RemoveContainer" containerID="59880ea9acc1fa69e9974f404371d2876c60b9ad12942fb9c6a0aa01b0632050" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.570899 4732 scope.go:117] "RemoveContainer" containerID="820f77a95a045dd94c63cb41cc026f01d115283a3748994d26bdeafeace13fd8" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.576823 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8qmgp_a6f5e483-7d6b-4d6d-be84-303d8f07643e/ovnkube-controller/3.log" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.580564 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8qmgp_a6f5e483-7d6b-4d6d-be84-303d8f07643e/ovn-acl-logging/0.log" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.581405 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8qmgp_a6f5e483-7d6b-4d6d-be84-303d8f07643e/ovn-controller/0.log" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.581935 4732 generic.go:334] "Generic (PLEG): container finished" podID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" containerID="3e2ff95a85f25b815a62b7e159ba1b483e166fb7951d270c49c1d1cd5ba86e09" exitCode=0 Apr 02 
13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.582024 4732 generic.go:334] "Generic (PLEG): container finished" podID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" containerID="2b9f2f9d419baeac3be35c4c0c40c012b721c99c2f89fb2638c635c41c5e93d7" exitCode=0 Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.582064 4732 generic.go:334] "Generic (PLEG): container finished" podID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" containerID="9f988d8b8bd1da89ec27da36eafa964c8d1b06874110b4791aeadc05cb297b04" exitCode=0 Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.582072 4732 generic.go:334] "Generic (PLEG): container finished" podID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" containerID="09f4ea085c7a422d9d9378925b89d18cd60661c399eb45f69d1e2d39c3aa2191" exitCode=0 Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.582080 4732 generic.go:334] "Generic (PLEG): container finished" podID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" containerID="975b73da57fe61d6945d87d2a8cb9502c67a6d8fe43a80722342271e2a251f39" exitCode=0 Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.582090 4732 generic.go:334] "Generic (PLEG): container finished" podID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" containerID="dde53c6f70593ab376e01989eadf8b52b7010c91ca4be564e5e36a781f1333f7" exitCode=0 Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.582102 4732 generic.go:334] "Generic (PLEG): container finished" podID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" containerID="39950e10da9295ee76088f08ae9d4977f5ff9f8d8440188f777855f40fb8049e" exitCode=143 Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.582134 4732 generic.go:334] "Generic (PLEG): container finished" podID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" containerID="4e0b635201003f7f1a0fc544667e8376b4283c202dde45398518f99b96ce2c4f" exitCode=143 Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.582159 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" 
event={"ID":"a6f5e483-7d6b-4d6d-be84-303d8f07643e","Type":"ContainerDied","Data":"3e2ff95a85f25b815a62b7e159ba1b483e166fb7951d270c49c1d1cd5ba86e09"} Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.582189 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" event={"ID":"a6f5e483-7d6b-4d6d-be84-303d8f07643e","Type":"ContainerDied","Data":"2b9f2f9d419baeac3be35c4c0c40c012b721c99c2f89fb2638c635c41c5e93d7"} Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.582232 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" event={"ID":"a6f5e483-7d6b-4d6d-be84-303d8f07643e","Type":"ContainerDied","Data":"9f988d8b8bd1da89ec27da36eafa964c8d1b06874110b4791aeadc05cb297b04"} Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.582244 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" event={"ID":"a6f5e483-7d6b-4d6d-be84-303d8f07643e","Type":"ContainerDied","Data":"09f4ea085c7a422d9d9378925b89d18cd60661c399eb45f69d1e2d39c3aa2191"} Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.582255 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" event={"ID":"a6f5e483-7d6b-4d6d-be84-303d8f07643e","Type":"ContainerDied","Data":"975b73da57fe61d6945d87d2a8cb9502c67a6d8fe43a80722342271e2a251f39"} Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.582266 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" event={"ID":"a6f5e483-7d6b-4d6d-be84-303d8f07643e","Type":"ContainerDied","Data":"dde53c6f70593ab376e01989eadf8b52b7010c91ca4be564e5e36a781f1333f7"} Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.582304 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" 
event={"ID":"a6f5e483-7d6b-4d6d-be84-303d8f07643e","Type":"ContainerDied","Data":"39950e10da9295ee76088f08ae9d4977f5ff9f8d8440188f777855f40fb8049e"} Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.582318 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" event={"ID":"a6f5e483-7d6b-4d6d-be84-303d8f07643e","Type":"ContainerDied","Data":"4e0b635201003f7f1a0fc544667e8376b4283c202dde45398518f99b96ce2c4f"} Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.644229 4732 scope.go:117] "RemoveContainer" containerID="8438ae7262b1ff33b5168b83661bede090ebb04853882f438b8fad8650d131d1" Apr 02 13:53:08 crc kubenswrapper[4732]: E0402 13:53:08.688870 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8438ae7262b1ff33b5168b83661bede090ebb04853882f438b8fad8650d131d1\": container with ID starting with 8438ae7262b1ff33b5168b83661bede090ebb04853882f438b8fad8650d131d1 not found: ID does not exist" containerID="8438ae7262b1ff33b5168b83661bede090ebb04853882f438b8fad8650d131d1" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.691704 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8qmgp_a6f5e483-7d6b-4d6d-be84-303d8f07643e/ovn-acl-logging/0.log" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.692284 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8qmgp_a6f5e483-7d6b-4d6d-be84-303d8f07643e/ovn-controller/0.log" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.692736 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.711978 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-fhv6z" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.745052 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-ztk25"] Apr 02 13:53:08 crc kubenswrapper[4732]: E0402 13:53:08.745308 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" containerName="ovnkube-controller" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.745325 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" containerName="ovnkube-controller" Apr 02 13:53:08 crc kubenswrapper[4732]: E0402 13:53:08.745334 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" containerName="kubecfg-setup" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.745341 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" containerName="kubecfg-setup" Apr 02 13:53:08 crc kubenswrapper[4732]: E0402 13:53:08.745355 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" containerName="ovnkube-controller" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.745362 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" containerName="ovnkube-controller" Apr 02 13:53:08 crc kubenswrapper[4732]: E0402 13:53:08.745369 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" containerName="nbdb" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.745375 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" 
containerName="nbdb" Apr 02 13:53:08 crc kubenswrapper[4732]: E0402 13:53:08.745385 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" containerName="kube-rbac-proxy-ovn-metrics" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.745391 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" containerName="kube-rbac-proxy-ovn-metrics" Apr 02 13:53:08 crc kubenswrapper[4732]: E0402 13:53:08.745398 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" containerName="sbdb" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.745404 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" containerName="sbdb" Apr 02 13:53:08 crc kubenswrapper[4732]: E0402 13:53:08.745416 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" containerName="ovn-controller" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.745422 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" containerName="ovn-controller" Apr 02 13:53:08 crc kubenswrapper[4732]: E0402 13:53:08.745431 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" containerName="ovnkube-controller" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.745437 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" containerName="ovnkube-controller" Apr 02 13:53:08 crc kubenswrapper[4732]: E0402 13:53:08.745443 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" containerName="ovn-acl-logging" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.745448 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" 
containerName="ovn-acl-logging" Apr 02 13:53:08 crc kubenswrapper[4732]: E0402 13:53:08.745460 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" containerName="kube-rbac-proxy-node" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.745481 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" containerName="kube-rbac-proxy-node" Apr 02 13:53:08 crc kubenswrapper[4732]: E0402 13:53:08.745492 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" containerName="northd" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.745498 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" containerName="northd" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.745590 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" containerName="kube-rbac-proxy-ovn-metrics" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.745603 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" containerName="northd" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.745630 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" containerName="ovnkube-controller" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.745642 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" containerName="sbdb" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.745648 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" containerName="ovn-acl-logging" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.745656 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" 
containerName="ovn-controller" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.745661 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" containerName="ovnkube-controller" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.745668 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" containerName="kube-rbac-proxy-node" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.745676 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" containerName="ovnkube-controller" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.745682 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" containerName="ovnkube-controller" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.745692 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" containerName="nbdb" Apr 02 13:53:08 crc kubenswrapper[4732]: E0402 13:53:08.745773 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" containerName="ovnkube-controller" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.745780 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" containerName="ovnkube-controller" Apr 02 13:53:08 crc kubenswrapper[4732]: E0402 13:53:08.745789 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" containerName="ovnkube-controller" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.745795 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" containerName="ovnkube-controller" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.745895 4732 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" containerName="ovnkube-controller" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.747781 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.847269 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-etc-openvswitch\") pod \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.847361 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "a6f5e483-7d6b-4d6d-be84-303d8f07643e" (UID: "a6f5e483-7d6b-4d6d-be84-303d8f07643e"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.847459 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jljzm\" (UniqueName: \"kubernetes.io/projected/a6f5e483-7d6b-4d6d-be84-303d8f07643e-kube-api-access-jljzm\") pod \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.847493 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-run-systemd\") pod \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.848301 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-systemd-units\") pod \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.848328 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-var-lib-openvswitch\") pod \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.848355 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-host-cni-netd\") pod \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.848376 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a6f5e483-7d6b-4d6d-be84-303d8f07643e-ovn-node-metrics-cert\") pod \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.848393 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-log-socket\") pod \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.848434 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a6f5e483-7d6b-4d6d-be84-303d8f07643e-env-overrides\") pod \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.848452 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "a6f5e483-7d6b-4d6d-be84-303d8f07643e" (UID: "a6f5e483-7d6b-4d6d-be84-303d8f07643e"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.848456 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-host-cni-bin\") pod \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.848453 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "a6f5e483-7d6b-4d6d-be84-303d8f07643e" (UID: "a6f5e483-7d6b-4d6d-be84-303d8f07643e"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.848505 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-host-run-netns\") pod \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.848480 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "a6f5e483-7d6b-4d6d-be84-303d8f07643e" (UID: "a6f5e483-7d6b-4d6d-be84-303d8f07643e"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.848529 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-run-openvswitch\") pod \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.848488 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "a6f5e483-7d6b-4d6d-be84-303d8f07643e" (UID: "a6f5e483-7d6b-4d6d-be84-303d8f07643e"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.848501 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-log-socket" (OuterVolumeSpecName: "log-socket") pod "a6f5e483-7d6b-4d6d-be84-303d8f07643e" (UID: "a6f5e483-7d6b-4d6d-be84-303d8f07643e"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.848547 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "a6f5e483-7d6b-4d6d-be84-303d8f07643e" (UID: "a6f5e483-7d6b-4d6d-be84-303d8f07643e"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.848567 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "a6f5e483-7d6b-4d6d-be84-303d8f07643e" (UID: "a6f5e483-7d6b-4d6d-be84-303d8f07643e"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.848683 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a6f5e483-7d6b-4d6d-be84-303d8f07643e-ovnkube-config\") pod \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.848720 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-run-ovn\") pod \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.848755 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a6f5e483-7d6b-4d6d-be84-303d8f07643e-ovnkube-script-lib\") pod \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.848774 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-node-log\") pod \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.848783 4732 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "a6f5e483-7d6b-4d6d-be84-303d8f07643e" (UID: "a6f5e483-7d6b-4d6d-be84-303d8f07643e"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.848795 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-host-slash\") pod \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.848817 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-host-kubelet\") pod \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.848845 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.848869 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-host-run-ovn-kubernetes\") pod \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\" (UID: \"a6f5e483-7d6b-4d6d-be84-303d8f07643e\") " Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.849010 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/54d324c3-06b6-4fb8-9ed9-73691455d852-host-cni-bin\") pod \"ovnkube-node-ztk25\" (UID: \"54d324c3-06b6-4fb8-9ed9-73691455d852\") " pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.849043 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/54d324c3-06b6-4fb8-9ed9-73691455d852-host-run-netns\") pod \"ovnkube-node-ztk25\" (UID: \"54d324c3-06b6-4fb8-9ed9-73691455d852\") " pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.849070 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/54d324c3-06b6-4fb8-9ed9-73691455d852-systemd-units\") pod \"ovnkube-node-ztk25\" (UID: \"54d324c3-06b6-4fb8-9ed9-73691455d852\") " pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.849093 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/54d324c3-06b6-4fb8-9ed9-73691455d852-ovnkube-config\") pod \"ovnkube-node-ztk25\" (UID: \"54d324c3-06b6-4fb8-9ed9-73691455d852\") " pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.849122 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/54d324c3-06b6-4fb8-9ed9-73691455d852-etc-openvswitch\") pod \"ovnkube-node-ztk25\" (UID: \"54d324c3-06b6-4fb8-9ed9-73691455d852\") " pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.849150 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" 
(UniqueName: \"kubernetes.io/host-path/54d324c3-06b6-4fb8-9ed9-73691455d852-run-systemd\") pod \"ovnkube-node-ztk25\" (UID: \"54d324c3-06b6-4fb8-9ed9-73691455d852\") " pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.849183 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/54d324c3-06b6-4fb8-9ed9-73691455d852-host-slash\") pod \"ovnkube-node-ztk25\" (UID: \"54d324c3-06b6-4fb8-9ed9-73691455d852\") " pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.849043 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6f5e483-7d6b-4d6d-be84-303d8f07643e-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "a6f5e483-7d6b-4d6d-be84-303d8f07643e" (UID: "a6f5e483-7d6b-4d6d-be84-303d8f07643e"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.849193 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6f5e483-7d6b-4d6d-be84-303d8f07643e-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "a6f5e483-7d6b-4d6d-be84-303d8f07643e" (UID: "a6f5e483-7d6b-4d6d-be84-303d8f07643e"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.849205 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/54d324c3-06b6-4fb8-9ed9-73691455d852-host-run-ovn-kubernetes\") pod \"ovnkube-node-ztk25\" (UID: \"54d324c3-06b6-4fb8-9ed9-73691455d852\") " pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.849072 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "a6f5e483-7d6b-4d6d-be84-303d8f07643e" (UID: "a6f5e483-7d6b-4d6d-be84-303d8f07643e"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.849093 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-node-log" (OuterVolumeSpecName: "node-log") pod "a6f5e483-7d6b-4d6d-be84-303d8f07643e" (UID: "a6f5e483-7d6b-4d6d-be84-303d8f07643e"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.849113 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-host-slash" (OuterVolumeSpecName: "host-slash") pod "a6f5e483-7d6b-4d6d-be84-303d8f07643e" (UID: "a6f5e483-7d6b-4d6d-be84-303d8f07643e"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.849131 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "a6f5e483-7d6b-4d6d-be84-303d8f07643e" (UID: "a6f5e483-7d6b-4d6d-be84-303d8f07643e"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.849170 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "a6f5e483-7d6b-4d6d-be84-303d8f07643e" (UID: "a6f5e483-7d6b-4d6d-be84-303d8f07643e"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.849181 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6f5e483-7d6b-4d6d-be84-303d8f07643e-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "a6f5e483-7d6b-4d6d-be84-303d8f07643e" (UID: "a6f5e483-7d6b-4d6d-be84-303d8f07643e"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.849412 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/54d324c3-06b6-4fb8-9ed9-73691455d852-host-kubelet\") pod \"ovnkube-node-ztk25\" (UID: \"54d324c3-06b6-4fb8-9ed9-73691455d852\") " pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.849478 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/54d324c3-06b6-4fb8-9ed9-73691455d852-ovnkube-script-lib\") pod \"ovnkube-node-ztk25\" (UID: \"54d324c3-06b6-4fb8-9ed9-73691455d852\") " pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.849539 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/54d324c3-06b6-4fb8-9ed9-73691455d852-run-ovn\") pod \"ovnkube-node-ztk25\" (UID: \"54d324c3-06b6-4fb8-9ed9-73691455d852\") " pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.849577 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/54d324c3-06b6-4fb8-9ed9-73691455d852-log-socket\") pod \"ovnkube-node-ztk25\" (UID: \"54d324c3-06b6-4fb8-9ed9-73691455d852\") " pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.849640 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/54d324c3-06b6-4fb8-9ed9-73691455d852-ovn-node-metrics-cert\") pod \"ovnkube-node-ztk25\" (UID: 
\"54d324c3-06b6-4fb8-9ed9-73691455d852\") " pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.849657 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/54d324c3-06b6-4fb8-9ed9-73691455d852-host-cni-netd\") pod \"ovnkube-node-ztk25\" (UID: \"54d324c3-06b6-4fb8-9ed9-73691455d852\") " pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.849717 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/54d324c3-06b6-4fb8-9ed9-73691455d852-node-log\") pod \"ovnkube-node-ztk25\" (UID: \"54d324c3-06b6-4fb8-9ed9-73691455d852\") " pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.849743 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cw4j5\" (UniqueName: \"kubernetes.io/projected/54d324c3-06b6-4fb8-9ed9-73691455d852-kube-api-access-cw4j5\") pod \"ovnkube-node-ztk25\" (UID: \"54d324c3-06b6-4fb8-9ed9-73691455d852\") " pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.849804 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/54d324c3-06b6-4fb8-9ed9-73691455d852-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ztk25\" (UID: \"54d324c3-06b6-4fb8-9ed9-73691455d852\") " pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.849847 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/54d324c3-06b6-4fb8-9ed9-73691455d852-env-overrides\") pod \"ovnkube-node-ztk25\" (UID: \"54d324c3-06b6-4fb8-9ed9-73691455d852\") " pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.849881 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/54d324c3-06b6-4fb8-9ed9-73691455d852-var-lib-openvswitch\") pod \"ovnkube-node-ztk25\" (UID: \"54d324c3-06b6-4fb8-9ed9-73691455d852\") " pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.849899 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/54d324c3-06b6-4fb8-9ed9-73691455d852-run-openvswitch\") pod \"ovnkube-node-ztk25\" (UID: \"54d324c3-06b6-4fb8-9ed9-73691455d852\") " pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.850141 4732 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a6f5e483-7d6b-4d6d-be84-303d8f07643e-ovnkube-config\") on node \"crc\" DevicePath \"\"" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.850163 4732 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-run-ovn\") on node \"crc\" DevicePath \"\"" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.850172 4732 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a6f5e483-7d6b-4d6d-be84-303d8f07643e-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.850181 4732 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-node-log\") on node \"crc\" DevicePath \"\"" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.850190 4732 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-host-kubelet\") on node \"crc\" DevicePath \"\"" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.850198 4732 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-host-slash\") on node \"crc\" DevicePath \"\"" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.850208 4732 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.850219 4732 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.850228 4732 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.850237 4732 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-systemd-units\") on node \"crc\" DevicePath \"\"" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.850245 4732 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.850254 4732 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-host-cni-netd\") on node \"crc\" DevicePath \"\"" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.850262 4732 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-log-socket\") on node \"crc\" DevicePath \"\"" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.850271 4732 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a6f5e483-7d6b-4d6d-be84-303d8f07643e-env-overrides\") on node \"crc\" DevicePath \"\"" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.850280 4732 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-host-cni-bin\") on node \"crc\" DevicePath \"\"" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.850288 4732 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-host-run-netns\") on node \"crc\" DevicePath \"\"" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.850297 4732 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-run-openvswitch\") on node \"crc\" DevicePath \"\"" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.852740 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6f5e483-7d6b-4d6d-be84-303d8f07643e-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod 
"a6f5e483-7d6b-4d6d-be84-303d8f07643e" (UID: "a6f5e483-7d6b-4d6d-be84-303d8f07643e"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.852871 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6f5e483-7d6b-4d6d-be84-303d8f07643e-kube-api-access-jljzm" (OuterVolumeSpecName: "kube-api-access-jljzm") pod "a6f5e483-7d6b-4d6d-be84-303d8f07643e" (UID: "a6f5e483-7d6b-4d6d-be84-303d8f07643e"). InnerVolumeSpecName "kube-api-access-jljzm". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.860571 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "a6f5e483-7d6b-4d6d-be84-303d8f07643e" (UID: "a6f5e483-7d6b-4d6d-be84-303d8f07643e"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.951421 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/54d324c3-06b6-4fb8-9ed9-73691455d852-host-kubelet\") pod \"ovnkube-node-ztk25\" (UID: \"54d324c3-06b6-4fb8-9ed9-73691455d852\") " pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.951489 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/54d324c3-06b6-4fb8-9ed9-73691455d852-ovnkube-script-lib\") pod \"ovnkube-node-ztk25\" (UID: \"54d324c3-06b6-4fb8-9ed9-73691455d852\") " pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.951514 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/54d324c3-06b6-4fb8-9ed9-73691455d852-run-ovn\") pod \"ovnkube-node-ztk25\" (UID: \"54d324c3-06b6-4fb8-9ed9-73691455d852\") " pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.951544 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/54d324c3-06b6-4fb8-9ed9-73691455d852-log-socket\") pod \"ovnkube-node-ztk25\" (UID: \"54d324c3-06b6-4fb8-9ed9-73691455d852\") " pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.951567 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/54d324c3-06b6-4fb8-9ed9-73691455d852-ovn-node-metrics-cert\") pod \"ovnkube-node-ztk25\" (UID: \"54d324c3-06b6-4fb8-9ed9-73691455d852\") " pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" Apr 02 13:53:08 crc 
kubenswrapper[4732]: I0402 13:53:08.951584 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/54d324c3-06b6-4fb8-9ed9-73691455d852-host-cni-netd\") pod \"ovnkube-node-ztk25\" (UID: \"54d324c3-06b6-4fb8-9ed9-73691455d852\") " pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.951603 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/54d324c3-06b6-4fb8-9ed9-73691455d852-node-log\") pod \"ovnkube-node-ztk25\" (UID: \"54d324c3-06b6-4fb8-9ed9-73691455d852\") " pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.951658 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cw4j5\" (UniqueName: \"kubernetes.io/projected/54d324c3-06b6-4fb8-9ed9-73691455d852-kube-api-access-cw4j5\") pod \"ovnkube-node-ztk25\" (UID: \"54d324c3-06b6-4fb8-9ed9-73691455d852\") " pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.951684 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/54d324c3-06b6-4fb8-9ed9-73691455d852-env-overrides\") pod \"ovnkube-node-ztk25\" (UID: \"54d324c3-06b6-4fb8-9ed9-73691455d852\") " pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.951682 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/54d324c3-06b6-4fb8-9ed9-73691455d852-log-socket\") pod \"ovnkube-node-ztk25\" (UID: \"54d324c3-06b6-4fb8-9ed9-73691455d852\") " pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.951757 4732 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/54d324c3-06b6-4fb8-9ed9-73691455d852-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ztk25\" (UID: \"54d324c3-06b6-4fb8-9ed9-73691455d852\") " pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.951804 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/54d324c3-06b6-4fb8-9ed9-73691455d852-run-ovn\") pod \"ovnkube-node-ztk25\" (UID: \"54d324c3-06b6-4fb8-9ed9-73691455d852\") " pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.951701 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/54d324c3-06b6-4fb8-9ed9-73691455d852-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ztk25\" (UID: \"54d324c3-06b6-4fb8-9ed9-73691455d852\") " pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.951870 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/54d324c3-06b6-4fb8-9ed9-73691455d852-node-log\") pod \"ovnkube-node-ztk25\" (UID: \"54d324c3-06b6-4fb8-9ed9-73691455d852\") " pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.951875 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/54d324c3-06b6-4fb8-9ed9-73691455d852-var-lib-openvswitch\") pod \"ovnkube-node-ztk25\" (UID: \"54d324c3-06b6-4fb8-9ed9-73691455d852\") " pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.951900 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/54d324c3-06b6-4fb8-9ed9-73691455d852-var-lib-openvswitch\") pod \"ovnkube-node-ztk25\" (UID: \"54d324c3-06b6-4fb8-9ed9-73691455d852\") " pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.951924 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/54d324c3-06b6-4fb8-9ed9-73691455d852-run-openvswitch\") pod \"ovnkube-node-ztk25\" (UID: \"54d324c3-06b6-4fb8-9ed9-73691455d852\") " pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.951937 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/54d324c3-06b6-4fb8-9ed9-73691455d852-host-cni-netd\") pod \"ovnkube-node-ztk25\" (UID: \"54d324c3-06b6-4fb8-9ed9-73691455d852\") " pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.951955 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/54d324c3-06b6-4fb8-9ed9-73691455d852-host-cni-bin\") pod \"ovnkube-node-ztk25\" (UID: \"54d324c3-06b6-4fb8-9ed9-73691455d852\") " pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.951996 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/54d324c3-06b6-4fb8-9ed9-73691455d852-host-run-netns\") pod \"ovnkube-node-ztk25\" (UID: \"54d324c3-06b6-4fb8-9ed9-73691455d852\") " pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.952026 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/54d324c3-06b6-4fb8-9ed9-73691455d852-systemd-units\") pod \"ovnkube-node-ztk25\" (UID: \"54d324c3-06b6-4fb8-9ed9-73691455d852\") " pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.952049 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/54d324c3-06b6-4fb8-9ed9-73691455d852-ovnkube-config\") pod \"ovnkube-node-ztk25\" (UID: \"54d324c3-06b6-4fb8-9ed9-73691455d852\") " pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.952081 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/54d324c3-06b6-4fb8-9ed9-73691455d852-etc-openvswitch\") pod \"ovnkube-node-ztk25\" (UID: \"54d324c3-06b6-4fb8-9ed9-73691455d852\") " pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.952114 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/54d324c3-06b6-4fb8-9ed9-73691455d852-run-systemd\") pod \"ovnkube-node-ztk25\" (UID: \"54d324c3-06b6-4fb8-9ed9-73691455d852\") " pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.952163 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/54d324c3-06b6-4fb8-9ed9-73691455d852-host-slash\") pod \"ovnkube-node-ztk25\" (UID: \"54d324c3-06b6-4fb8-9ed9-73691455d852\") " pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.952184 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/54d324c3-06b6-4fb8-9ed9-73691455d852-host-run-ovn-kubernetes\") pod \"ovnkube-node-ztk25\" (UID: \"54d324c3-06b6-4fb8-9ed9-73691455d852\") " pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.952256 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/54d324c3-06b6-4fb8-9ed9-73691455d852-ovnkube-script-lib\") pod \"ovnkube-node-ztk25\" (UID: \"54d324c3-06b6-4fb8-9ed9-73691455d852\") " pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.952277 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jljzm\" (UniqueName: \"kubernetes.io/projected/a6f5e483-7d6b-4d6d-be84-303d8f07643e-kube-api-access-jljzm\") on node \"crc\" DevicePath \"\"" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.952270 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/54d324c3-06b6-4fb8-9ed9-73691455d852-systemd-units\") pod \"ovnkube-node-ztk25\" (UID: \"54d324c3-06b6-4fb8-9ed9-73691455d852\") " pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.952297 4732 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a6f5e483-7d6b-4d6d-be84-303d8f07643e-run-systemd\") on node \"crc\" DevicePath \"\"" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.952311 4732 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a6f5e483-7d6b-4d6d-be84-303d8f07643e-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.952313 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/54d324c3-06b6-4fb8-9ed9-73691455d852-host-cni-bin\") pod \"ovnkube-node-ztk25\" (UID: \"54d324c3-06b6-4fb8-9ed9-73691455d852\") " pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.952291 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/54d324c3-06b6-4fb8-9ed9-73691455d852-run-openvswitch\") pod \"ovnkube-node-ztk25\" (UID: \"54d324c3-06b6-4fb8-9ed9-73691455d852\") " pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.952360 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/54d324c3-06b6-4fb8-9ed9-73691455d852-host-run-netns\") pod \"ovnkube-node-ztk25\" (UID: \"54d324c3-06b6-4fb8-9ed9-73691455d852\") " pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.952391 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/54d324c3-06b6-4fb8-9ed9-73691455d852-run-systemd\") pod \"ovnkube-node-ztk25\" (UID: \"54d324c3-06b6-4fb8-9ed9-73691455d852\") " pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.952423 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/54d324c3-06b6-4fb8-9ed9-73691455d852-etc-openvswitch\") pod \"ovnkube-node-ztk25\" (UID: \"54d324c3-06b6-4fb8-9ed9-73691455d852\") " pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.952467 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/54d324c3-06b6-4fb8-9ed9-73691455d852-host-slash\") pod \"ovnkube-node-ztk25\" (UID: 
\"54d324c3-06b6-4fb8-9ed9-73691455d852\") " pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.952494 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/54d324c3-06b6-4fb8-9ed9-73691455d852-host-run-ovn-kubernetes\") pod \"ovnkube-node-ztk25\" (UID: \"54d324c3-06b6-4fb8-9ed9-73691455d852\") " pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.952632 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/54d324c3-06b6-4fb8-9ed9-73691455d852-env-overrides\") pod \"ovnkube-node-ztk25\" (UID: \"54d324c3-06b6-4fb8-9ed9-73691455d852\") " pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.952878 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/54d324c3-06b6-4fb8-9ed9-73691455d852-ovnkube-config\") pod \"ovnkube-node-ztk25\" (UID: \"54d324c3-06b6-4fb8-9ed9-73691455d852\") " pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.952932 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/54d324c3-06b6-4fb8-9ed9-73691455d852-host-kubelet\") pod \"ovnkube-node-ztk25\" (UID: \"54d324c3-06b6-4fb8-9ed9-73691455d852\") " pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.955386 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/54d324c3-06b6-4fb8-9ed9-73691455d852-ovn-node-metrics-cert\") pod \"ovnkube-node-ztk25\" (UID: \"54d324c3-06b6-4fb8-9ed9-73691455d852\") " pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" 
Apr 02 13:53:08 crc kubenswrapper[4732]: I0402 13:53:08.968546 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cw4j5\" (UniqueName: \"kubernetes.io/projected/54d324c3-06b6-4fb8-9ed9-73691455d852-kube-api-access-cw4j5\") pod \"ovnkube-node-ztk25\" (UID: \"54d324c3-06b6-4fb8-9ed9-73691455d852\") " pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" Apr 02 13:53:09 crc kubenswrapper[4732]: I0402 13:53:09.060872 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" Apr 02 13:53:09 crc kubenswrapper[4732]: W0402 13:53:09.077535 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54d324c3_06b6_4fb8_9ed9_73691455d852.slice/crio-c4462f901358fb687dedce9713842701254d4068fdd4d4dd8a386f45d8bb6524 WatchSource:0}: Error finding container c4462f901358fb687dedce9713842701254d4068fdd4d4dd8a386f45d8bb6524: Status 404 returned error can't find the container with id c4462f901358fb687dedce9713842701254d4068fdd4d4dd8a386f45d8bb6524 Apr 02 13:53:09 crc kubenswrapper[4732]: I0402 13:53:09.589247 4732 generic.go:334] "Generic (PLEG): container finished" podID="54d324c3-06b6-4fb8-9ed9-73691455d852" containerID="c83c340447b35c13baf49f1d58a15d75d7a5601032165e0ed3da6cf398093571" exitCode=0 Apr 02 13:53:09 crc kubenswrapper[4732]: I0402 13:53:09.589347 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" event={"ID":"54d324c3-06b6-4fb8-9ed9-73691455d852","Type":"ContainerDied","Data":"c83c340447b35c13baf49f1d58a15d75d7a5601032165e0ed3da6cf398093571"} Apr 02 13:53:09 crc kubenswrapper[4732]: I0402 13:53:09.589668 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" 
event={"ID":"54d324c3-06b6-4fb8-9ed9-73691455d852","Type":"ContainerStarted","Data":"c4462f901358fb687dedce9713842701254d4068fdd4d4dd8a386f45d8bb6524"} Apr 02 13:53:09 crc kubenswrapper[4732]: I0402 13:53:09.591906 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-s52gj_ad206957-df5c-4b3e-bd35-e798a07d2f4e/kube-multus/2.log" Apr 02 13:53:09 crc kubenswrapper[4732]: I0402 13:53:09.592027 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-s52gj" event={"ID":"ad206957-df5c-4b3e-bd35-e798a07d2f4e","Type":"ContainerStarted","Data":"70d0783f96c2ee9f2a3b29f9af4eeeba05abc5b8805d0f88803bc94e5efa3fce"} Apr 02 13:53:09 crc kubenswrapper[4732]: I0402 13:53:09.597069 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8qmgp_a6f5e483-7d6b-4d6d-be84-303d8f07643e/ovn-acl-logging/0.log" Apr 02 13:53:09 crc kubenswrapper[4732]: I0402 13:53:09.597478 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8qmgp_a6f5e483-7d6b-4d6d-be84-303d8f07643e/ovn-controller/0.log" Apr 02 13:53:09 crc kubenswrapper[4732]: I0402 13:53:09.597807 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" event={"ID":"a6f5e483-7d6b-4d6d-be84-303d8f07643e","Type":"ContainerDied","Data":"e6254ca05250a47e6db0dc5189460fd0b2288dbb8109713f6c0c6d1c1d07b06a"} Apr 02 13:53:09 crc kubenswrapper[4732]: I0402 13:53:09.597853 4732 scope.go:117] "RemoveContainer" containerID="3e2ff95a85f25b815a62b7e159ba1b483e166fb7951d270c49c1d1cd5ba86e09" Apr 02 13:53:09 crc kubenswrapper[4732]: I0402 13:53:09.597977 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8qmgp" Apr 02 13:53:09 crc kubenswrapper[4732]: I0402 13:53:09.619868 4732 scope.go:117] "RemoveContainer" containerID="2b9f2f9d419baeac3be35c4c0c40c012b721c99c2f89fb2638c635c41c5e93d7" Apr 02 13:53:09 crc kubenswrapper[4732]: I0402 13:53:09.663118 4732 scope.go:117] "RemoveContainer" containerID="9f988d8b8bd1da89ec27da36eafa964c8d1b06874110b4791aeadc05cb297b04" Apr 02 13:53:09 crc kubenswrapper[4732]: I0402 13:53:09.667036 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-8qmgp"] Apr 02 13:53:09 crc kubenswrapper[4732]: I0402 13:53:09.672910 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-8qmgp"] Apr 02 13:53:09 crc kubenswrapper[4732]: I0402 13:53:09.678386 4732 scope.go:117] "RemoveContainer" containerID="09f4ea085c7a422d9d9378925b89d18cd60661c399eb45f69d1e2d39c3aa2191" Apr 02 13:53:09 crc kubenswrapper[4732]: I0402 13:53:09.691947 4732 scope.go:117] "RemoveContainer" containerID="975b73da57fe61d6945d87d2a8cb9502c67a6d8fe43a80722342271e2a251f39" Apr 02 13:53:09 crc kubenswrapper[4732]: I0402 13:53:09.707225 4732 scope.go:117] "RemoveContainer" containerID="dde53c6f70593ab376e01989eadf8b52b7010c91ca4be564e5e36a781f1333f7" Apr 02 13:53:09 crc kubenswrapper[4732]: I0402 13:53:09.719127 4732 scope.go:117] "RemoveContainer" containerID="39950e10da9295ee76088f08ae9d4977f5ff9f8d8440188f777855f40fb8049e" Apr 02 13:53:09 crc kubenswrapper[4732]: I0402 13:53:09.735315 4732 scope.go:117] "RemoveContainer" containerID="4e0b635201003f7f1a0fc544667e8376b4283c202dde45398518f99b96ce2c4f" Apr 02 13:53:09 crc kubenswrapper[4732]: I0402 13:53:09.753869 4732 scope.go:117] "RemoveContainer" containerID="b90e2dbff4d4b955f0d12a454b3463c55541ffa21937cda423517d31435683bd" Apr 02 13:53:10 crc kubenswrapper[4732]: I0402 13:53:10.609300 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" event={"ID":"54d324c3-06b6-4fb8-9ed9-73691455d852","Type":"ContainerStarted","Data":"70e4f1d410a6ebef8d79ac3f2664cdd1ed05f3db9459ff76a656c0926430784c"} Apr 02 13:53:10 crc kubenswrapper[4732]: I0402 13:53:10.610578 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" event={"ID":"54d324c3-06b6-4fb8-9ed9-73691455d852","Type":"ContainerStarted","Data":"92a6aaad2d68aa13ac8c36533f8c041fc3e373f446666a6867636517bef1ca50"} Apr 02 13:53:10 crc kubenswrapper[4732]: I0402 13:53:10.610700 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" event={"ID":"54d324c3-06b6-4fb8-9ed9-73691455d852","Type":"ContainerStarted","Data":"9dac11712a477edb41ecb49f4d9caf55b57ed7dad8643b3c1b92494a24a94881"} Apr 02 13:53:10 crc kubenswrapper[4732]: I0402 13:53:10.610779 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" event={"ID":"54d324c3-06b6-4fb8-9ed9-73691455d852","Type":"ContainerStarted","Data":"7fec55d0d386c21260bb9f67d67777d9ebdff143eca09f912c882b02a2148bf0"} Apr 02 13:53:10 crc kubenswrapper[4732]: I0402 13:53:10.610857 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" event={"ID":"54d324c3-06b6-4fb8-9ed9-73691455d852","Type":"ContainerStarted","Data":"34492164d185221a832391e1aebb2ee91ee9aa97d8ab9efc16642d300ee8de33"} Apr 02 13:53:10 crc kubenswrapper[4732]: I0402 13:53:10.610916 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" event={"ID":"54d324c3-06b6-4fb8-9ed9-73691455d852","Type":"ContainerStarted","Data":"93093588179d7fa2ae26b94f84402c20c503aeadcb7cd9b33e425c732f410d62"} Apr 02 13:53:10 crc kubenswrapper[4732]: I0402 13:53:10.690488 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6f5e483-7d6b-4d6d-be84-303d8f07643e" 
path="/var/lib/kubelet/pods/a6f5e483-7d6b-4d6d-be84-303d8f07643e/volumes" Apr 02 13:53:11 crc kubenswrapper[4732]: I0402 13:53:11.897549 4732 patch_prober.go:28] interesting pod/apiserver-76f77b778f-xzg9j container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body= Apr 02 13:53:11 crc kubenswrapper[4732]: I0402 13:53:11.898017 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" podUID="ea530987-3884-4994-9574-b73fc76fcdde" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.34:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.217.0.34:8443: connect: connection refused" Apr 02 13:53:13 crc kubenswrapper[4732]: I0402 13:53:13.628302 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" event={"ID":"54d324c3-06b6-4fb8-9ed9-73691455d852","Type":"ContainerStarted","Data":"7e8760e499ba6724d52a24f87162ff0bf239cf80dbedb1c27d39f80d44dce421"} Apr 02 13:53:15 crc kubenswrapper[4732]: I0402 13:53:15.648260 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" event={"ID":"54d324c3-06b6-4fb8-9ed9-73691455d852","Type":"ContainerStarted","Data":"abb401bbe420827463c06474f88ff56a9341a13b4960c741c35d54d167fb5cb9"} Apr 02 13:53:15 crc kubenswrapper[4732]: I0402 13:53:15.648998 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" Apr 02 13:53:15 crc kubenswrapper[4732]: I0402 13:53:15.649018 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" Apr 02 13:53:15 crc kubenswrapper[4732]: I0402 13:53:15.681383 4732 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" podStartSLOduration=7.681359452 podStartE2EDuration="7.681359452s" podCreationTimestamp="2026-04-02 13:53:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:53:15.675921005 +0000 UTC m=+952.580328578" watchObservedRunningTime="2026-04-02 13:53:15.681359452 +0000 UTC m=+952.585767005" Apr 02 13:53:15 crc kubenswrapper[4732]: I0402 13:53:15.682333 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" Apr 02 13:53:16 crc kubenswrapper[4732]: I0402 13:53:16.654018 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" Apr 02 13:53:16 crc kubenswrapper[4732]: I0402 13:53:16.686001 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" Apr 02 13:53:16 crc kubenswrapper[4732]: I0402 13:53:16.897626 4732 patch_prober.go:28] interesting pod/apiserver-76f77b778f-xzg9j container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body= Apr 02 13:53:16 crc kubenswrapper[4732]: I0402 13:53:16.897681 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" podUID="ea530987-3884-4994-9574-b73fc76fcdde" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.34:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.217.0.34:8443: connect: connection refused" Apr 02 13:53:21 crc kubenswrapper[4732]: I0402 13:53:21.897494 4732 patch_prober.go:28] interesting pod/apiserver-76f77b778f-xzg9j container/openshift-apiserver namespace/openshift-apiserver: 
Readiness probe status=failure output="Get \"https://10.217.0.34:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body= Apr 02 13:53:21 crc kubenswrapper[4732]: I0402 13:53:21.898068 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" podUID="ea530987-3884-4994-9574-b73fc76fcdde" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.34:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.217.0.34:8443: connect: connection refused" Apr 02 13:53:26 crc kubenswrapper[4732]: I0402 13:53:26.897699 4732 patch_prober.go:28] interesting pod/apiserver-76f77b778f-xzg9j container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body= Apr 02 13:53:26 crc kubenswrapper[4732]: I0402 13:53:26.898188 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" podUID="ea530987-3884-4994-9574-b73fc76fcdde" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.34:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.217.0.34:8443: connect: connection refused" Apr 02 13:53:31 crc kubenswrapper[4732]: I0402 13:53:31.897377 4732 patch_prober.go:28] interesting pod/apiserver-76f77b778f-xzg9j container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body= Apr 02 13:53:31 crc kubenswrapper[4732]: I0402 13:53:31.897723 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" podUID="ea530987-3884-4994-9574-b73fc76fcdde" 
containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.34:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.217.0.34:8443: connect: connection refused" Apr 02 13:53:31 crc kubenswrapper[4732]: I0402 13:53:31.924741 4732 patch_prober.go:28] interesting pod/machine-config-daemon-6vtmw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 02 13:53:31 crc kubenswrapper[4732]: I0402 13:53:31.924813 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 02 13:53:31 crc kubenswrapper[4732]: I0402 13:53:31.924862 4732 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" Apr 02 13:53:31 crc kubenswrapper[4732]: I0402 13:53:31.925459 4732 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7878e048a4d64163e8a31b7ae9f684fbec512dadbd638965377c01f618d3ee60"} pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Apr 02 13:53:31 crc kubenswrapper[4732]: I0402 13:53:31.925539 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" containerName="machine-config-daemon" containerID="cri-o://7878e048a4d64163e8a31b7ae9f684fbec512dadbd638965377c01f618d3ee60" gracePeriod=600 Apr 02 13:53:32 crc 
kubenswrapper[4732]: I0402 13:53:32.749505 4732 generic.go:334] "Generic (PLEG): container finished" podID="38409e5e-4545-49da-8f6c-4bfb30582878" containerID="7878e048a4d64163e8a31b7ae9f684fbec512dadbd638965377c01f618d3ee60" exitCode=0 Apr 02 13:53:32 crc kubenswrapper[4732]: I0402 13:53:32.749623 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" event={"ID":"38409e5e-4545-49da-8f6c-4bfb30582878","Type":"ContainerDied","Data":"7878e048a4d64163e8a31b7ae9f684fbec512dadbd638965377c01f618d3ee60"} Apr 02 13:53:32 crc kubenswrapper[4732]: I0402 13:53:32.749891 4732 scope.go:117] "RemoveContainer" containerID="af70e3cad0ce1292e5dbf16bae2fa3fb252384621cc6f2a55ab8798328ebbc50" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.712966 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.747242 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-785476f7d-zh79j"] Apr 02 13:53:33 crc kubenswrapper[4732]: E0402 13:53:33.747459 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea530987-3884-4994-9574-b73fc76fcdde" containerName="fix-audit-permissions" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.747472 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea530987-3884-4994-9574-b73fc76fcdde" containerName="fix-audit-permissions" Apr 02 13:53:33 crc kubenswrapper[4732]: E0402 13:53:33.747486 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea530987-3884-4994-9574-b73fc76fcdde" containerName="openshift-apiserver-check-endpoints" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.747492 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea530987-3884-4994-9574-b73fc76fcdde" containerName="openshift-apiserver-check-endpoints" Apr 02 13:53:33 crc 
kubenswrapper[4732]: E0402 13:53:33.747511 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea530987-3884-4994-9574-b73fc76fcdde" containerName="openshift-apiserver" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.747518 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea530987-3884-4994-9574-b73fc76fcdde" containerName="openshift-apiserver" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.747603 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea530987-3884-4994-9574-b73fc76fcdde" containerName="openshift-apiserver" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.747636 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea530987-3884-4994-9574-b73fc76fcdde" containerName="openshift-apiserver-check-endpoints" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.748329 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-785476f7d-zh79j" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.761143 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-785476f7d-zh79j"] Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.782459 4732 generic.go:334] "Generic (PLEG): container finished" podID="ea530987-3884-4994-9574-b73fc76fcdde" containerID="97d6c3ee10c0b8caf6d9b9655805d29e3f2e2a039f4b7e5e1c23bc1c3e534800" exitCode=0 Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.782537 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" event={"ID":"ea530987-3884-4994-9574-b73fc76fcdde","Type":"ContainerDied","Data":"97d6c3ee10c0b8caf6d9b9655805d29e3f2e2a039f4b7e5e1c23bc1c3e534800"} Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.782570 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" 
event={"ID":"ea530987-3884-4994-9574-b73fc76fcdde","Type":"ContainerDied","Data":"98d365218263d02c3a66886f2016109cec8a6fc84a5db66cb0647bd906d66fd7"} Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.782596 4732 scope.go:117] "RemoveContainer" containerID="ccff9c2b817177ce6f647ac40187504767ebaef3aa02d263c169e6f13c255e1c" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.782820 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-xzg9j" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.787058 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" event={"ID":"38409e5e-4545-49da-8f6c-4bfb30582878","Type":"ContainerStarted","Data":"6beef2fa99836ab6f985ec458e30c6e22b8f1d0b42722462a9fe13d02e226853"} Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.803450 4732 scope.go:117] "RemoveContainer" containerID="97d6c3ee10c0b8caf6d9b9655805d29e3f2e2a039f4b7e5e1c23bc1c3e534800" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.821945 4732 scope.go:117] "RemoveContainer" containerID="d0aa32342845cdf2f26edec7f7697e0928ef564195a539e052a57ddd8339f80f" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.839479 4732 scope.go:117] "RemoveContainer" containerID="ccff9c2b817177ce6f647ac40187504767ebaef3aa02d263c169e6f13c255e1c" Apr 02 13:53:33 crc kubenswrapper[4732]: E0402 13:53:33.839914 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ccff9c2b817177ce6f647ac40187504767ebaef3aa02d263c169e6f13c255e1c\": container with ID starting with ccff9c2b817177ce6f647ac40187504767ebaef3aa02d263c169e6f13c255e1c not found: ID does not exist" containerID="ccff9c2b817177ce6f647ac40187504767ebaef3aa02d263c169e6f13c255e1c" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.839941 4732 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ccff9c2b817177ce6f647ac40187504767ebaef3aa02d263c169e6f13c255e1c"} err="failed to get container status \"ccff9c2b817177ce6f647ac40187504767ebaef3aa02d263c169e6f13c255e1c\": rpc error: code = NotFound desc = could not find container \"ccff9c2b817177ce6f647ac40187504767ebaef3aa02d263c169e6f13c255e1c\": container with ID starting with ccff9c2b817177ce6f647ac40187504767ebaef3aa02d263c169e6f13c255e1c not found: ID does not exist" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.839965 4732 scope.go:117] "RemoveContainer" containerID="97d6c3ee10c0b8caf6d9b9655805d29e3f2e2a039f4b7e5e1c23bc1c3e534800" Apr 02 13:53:33 crc kubenswrapper[4732]: E0402 13:53:33.840293 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97d6c3ee10c0b8caf6d9b9655805d29e3f2e2a039f4b7e5e1c23bc1c3e534800\": container with ID starting with 97d6c3ee10c0b8caf6d9b9655805d29e3f2e2a039f4b7e5e1c23bc1c3e534800 not found: ID does not exist" containerID="97d6c3ee10c0b8caf6d9b9655805d29e3f2e2a039f4b7e5e1c23bc1c3e534800" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.840335 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97d6c3ee10c0b8caf6d9b9655805d29e3f2e2a039f4b7e5e1c23bc1c3e534800"} err="failed to get container status \"97d6c3ee10c0b8caf6d9b9655805d29e3f2e2a039f4b7e5e1c23bc1c3e534800\": rpc error: code = NotFound desc = could not find container \"97d6c3ee10c0b8caf6d9b9655805d29e3f2e2a039f4b7e5e1c23bc1c3e534800\": container with ID starting with 97d6c3ee10c0b8caf6d9b9655805d29e3f2e2a039f4b7e5e1c23bc1c3e534800 not found: ID does not exist" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.840360 4732 scope.go:117] "RemoveContainer" containerID="d0aa32342845cdf2f26edec7f7697e0928ef564195a539e052a57ddd8339f80f" Apr 02 13:53:33 crc kubenswrapper[4732]: E0402 13:53:33.840750 4732 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"d0aa32342845cdf2f26edec7f7697e0928ef564195a539e052a57ddd8339f80f\": container with ID starting with d0aa32342845cdf2f26edec7f7697e0928ef564195a539e052a57ddd8339f80f not found: ID does not exist" containerID="d0aa32342845cdf2f26edec7f7697e0928ef564195a539e052a57ddd8339f80f" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.840783 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0aa32342845cdf2f26edec7f7697e0928ef564195a539e052a57ddd8339f80f"} err="failed to get container status \"d0aa32342845cdf2f26edec7f7697e0928ef564195a539e052a57ddd8339f80f\": rpc error: code = NotFound desc = could not find container \"d0aa32342845cdf2f26edec7f7697e0928ef564195a539e052a57ddd8339f80f\": container with ID starting with d0aa32342845cdf2f26edec7f7697e0928ef564195a539e052a57ddd8339f80f not found: ID does not exist" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.876531 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ea530987-3884-4994-9574-b73fc76fcdde-etcd-client\") pod \"ea530987-3884-4994-9574-b73fc76fcdde\" (UID: \"ea530987-3884-4994-9574-b73fc76fcdde\") " Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.876579 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ea530987-3884-4994-9574-b73fc76fcdde-node-pullsecrets\") pod \"ea530987-3884-4994-9574-b73fc76fcdde\" (UID: \"ea530987-3884-4994-9574-b73fc76fcdde\") " Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.876632 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea530987-3884-4994-9574-b73fc76fcdde-trusted-ca-bundle\") pod \"ea530987-3884-4994-9574-b73fc76fcdde\" (UID: 
\"ea530987-3884-4994-9574-b73fc76fcdde\") " Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.876656 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ea530987-3884-4994-9574-b73fc76fcdde-audit-dir\") pod \"ea530987-3884-4994-9574-b73fc76fcdde\" (UID: \"ea530987-3884-4994-9574-b73fc76fcdde\") " Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.876671 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea530987-3884-4994-9574-b73fc76fcdde-config\") pod \"ea530987-3884-4994-9574-b73fc76fcdde\" (UID: \"ea530987-3884-4994-9574-b73fc76fcdde\") " Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.876694 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ea530987-3884-4994-9574-b73fc76fcdde-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "ea530987-3884-4994-9574-b73fc76fcdde" (UID: "ea530987-3884-4994-9574-b73fc76fcdde"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.876708 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ea530987-3884-4994-9574-b73fc76fcdde-etcd-serving-ca\") pod \"ea530987-3884-4994-9574-b73fc76fcdde\" (UID: \"ea530987-3884-4994-9574-b73fc76fcdde\") " Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.876742 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ea530987-3884-4994-9574-b73fc76fcdde-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "ea530987-3884-4994-9574-b73fc76fcdde" (UID: "ea530987-3884-4994-9574-b73fc76fcdde"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.876768 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea530987-3884-4994-9574-b73fc76fcdde-serving-cert\") pod \"ea530987-3884-4994-9574-b73fc76fcdde\" (UID: \"ea530987-3884-4994-9574-b73fc76fcdde\") " Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.876792 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ea530987-3884-4994-9574-b73fc76fcdde-encryption-config\") pod \"ea530987-3884-4994-9574-b73fc76fcdde\" (UID: \"ea530987-3884-4994-9574-b73fc76fcdde\") " Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.876824 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gjnv\" (UniqueName: \"kubernetes.io/projected/ea530987-3884-4994-9574-b73fc76fcdde-kube-api-access-5gjnv\") pod \"ea530987-3884-4994-9574-b73fc76fcdde\" (UID: \"ea530987-3884-4994-9574-b73fc76fcdde\") " Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.876846 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/ea530987-3884-4994-9574-b73fc76fcdde-image-import-ca\") pod \"ea530987-3884-4994-9574-b73fc76fcdde\" (UID: \"ea530987-3884-4994-9574-b73fc76fcdde\") " Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.876878 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/ea530987-3884-4994-9574-b73fc76fcdde-audit\") pod \"ea530987-3884-4994-9574-b73fc76fcdde\" (UID: \"ea530987-3884-4994-9574-b73fc76fcdde\") " Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.877004 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/88263904-09c5-4053-82ec-4ae1e2feb94c-serving-cert\") pod \"apiserver-785476f7d-zh79j\" (UID: \"88263904-09c5-4053-82ec-4ae1e2feb94c\") " pod="openshift-apiserver/apiserver-785476f7d-zh79j" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.877038 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/88263904-09c5-4053-82ec-4ae1e2feb94c-node-pullsecrets\") pod \"apiserver-785476f7d-zh79j\" (UID: \"88263904-09c5-4053-82ec-4ae1e2feb94c\") " pod="openshift-apiserver/apiserver-785476f7d-zh79j" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.877056 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88263904-09c5-4053-82ec-4ae1e2feb94c-config\") pod \"apiserver-785476f7d-zh79j\" (UID: \"88263904-09c5-4053-82ec-4ae1e2feb94c\") " pod="openshift-apiserver/apiserver-785476f7d-zh79j" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.877100 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88263904-09c5-4053-82ec-4ae1e2feb94c-trusted-ca-bundle\") pod \"apiserver-785476f7d-zh79j\" (UID: \"88263904-09c5-4053-82ec-4ae1e2feb94c\") " pod="openshift-apiserver/apiserver-785476f7d-zh79j" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.877324 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea530987-3884-4994-9574-b73fc76fcdde-audit" (OuterVolumeSpecName: "audit") pod "ea530987-3884-4994-9574-b73fc76fcdde" (UID: "ea530987-3884-4994-9574-b73fc76fcdde"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.877362 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea530987-3884-4994-9574-b73fc76fcdde-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "ea530987-3884-4994-9574-b73fc76fcdde" (UID: "ea530987-3884-4994-9574-b73fc76fcdde"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.877427 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea530987-3884-4994-9574-b73fc76fcdde-config" (OuterVolumeSpecName: "config") pod "ea530987-3884-4994-9574-b73fc76fcdde" (UID: "ea530987-3884-4994-9574-b73fc76fcdde"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.877437 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/88263904-09c5-4053-82ec-4ae1e2feb94c-etcd-serving-ca\") pod \"apiserver-785476f7d-zh79j\" (UID: \"88263904-09c5-4053-82ec-4ae1e2feb94c\") " pod="openshift-apiserver/apiserver-785476f7d-zh79j" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.877478 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/88263904-09c5-4053-82ec-4ae1e2feb94c-image-import-ca\") pod \"apiserver-785476f7d-zh79j\" (UID: \"88263904-09c5-4053-82ec-4ae1e2feb94c\") " pod="openshift-apiserver/apiserver-785476f7d-zh79j" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.877501 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: 
\"kubernetes.io/configmap/88263904-09c5-4053-82ec-4ae1e2feb94c-audit\") pod \"apiserver-785476f7d-zh79j\" (UID: \"88263904-09c5-4053-82ec-4ae1e2feb94c\") " pod="openshift-apiserver/apiserver-785476f7d-zh79j" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.877587 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/88263904-09c5-4053-82ec-4ae1e2feb94c-audit-dir\") pod \"apiserver-785476f7d-zh79j\" (UID: \"88263904-09c5-4053-82ec-4ae1e2feb94c\") " pod="openshift-apiserver/apiserver-785476f7d-zh79j" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.877646 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/88263904-09c5-4053-82ec-4ae1e2feb94c-etcd-client\") pod \"apiserver-785476f7d-zh79j\" (UID: \"88263904-09c5-4053-82ec-4ae1e2feb94c\") " pod="openshift-apiserver/apiserver-785476f7d-zh79j" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.877666 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/88263904-09c5-4053-82ec-4ae1e2feb94c-encryption-config\") pod \"apiserver-785476f7d-zh79j\" (UID: \"88263904-09c5-4053-82ec-4ae1e2feb94c\") " pod="openshift-apiserver/apiserver-785476f7d-zh79j" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.877697 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjjpk\" (UniqueName: \"kubernetes.io/projected/88263904-09c5-4053-82ec-4ae1e2feb94c-kube-api-access-wjjpk\") pod \"apiserver-785476f7d-zh79j\" (UID: \"88263904-09c5-4053-82ec-4ae1e2feb94c\") " pod="openshift-apiserver/apiserver-785476f7d-zh79j" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.877745 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/ea530987-3884-4994-9574-b73fc76fcdde-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "ea530987-3884-4994-9574-b73fc76fcdde" (UID: "ea530987-3884-4994-9574-b73fc76fcdde"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.877736 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea530987-3884-4994-9574-b73fc76fcdde-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "ea530987-3884-4994-9574-b73fc76fcdde" (UID: "ea530987-3884-4994-9574-b73fc76fcdde"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.877857 4732 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/ea530987-3884-4994-9574-b73fc76fcdde-audit\") on node \"crc\" DevicePath \"\"" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.877880 4732 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ea530987-3884-4994-9574-b73fc76fcdde-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.877892 4732 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea530987-3884-4994-9574-b73fc76fcdde-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.877901 4732 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ea530987-3884-4994-9574-b73fc76fcdde-audit-dir\") on node \"crc\" DevicePath \"\"" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.877911 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ea530987-3884-4994-9574-b73fc76fcdde-config\") on node \"crc\" DevicePath \"\"" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.877919 4732 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ea530987-3884-4994-9574-b73fc76fcdde-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.881562 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea530987-3884-4994-9574-b73fc76fcdde-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ea530987-3884-4994-9574-b73fc76fcdde" (UID: "ea530987-3884-4994-9574-b73fc76fcdde"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.881650 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea530987-3884-4994-9574-b73fc76fcdde-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "ea530987-3884-4994-9574-b73fc76fcdde" (UID: "ea530987-3884-4994-9574-b73fc76fcdde"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.881737 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea530987-3884-4994-9574-b73fc76fcdde-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "ea530987-3884-4994-9574-b73fc76fcdde" (UID: "ea530987-3884-4994-9574-b73fc76fcdde"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.881987 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea530987-3884-4994-9574-b73fc76fcdde-kube-api-access-5gjnv" (OuterVolumeSpecName: "kube-api-access-5gjnv") pod "ea530987-3884-4994-9574-b73fc76fcdde" (UID: "ea530987-3884-4994-9574-b73fc76fcdde"). InnerVolumeSpecName "kube-api-access-5gjnv". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.978995 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/88263904-09c5-4053-82ec-4ae1e2feb94c-serving-cert\") pod \"apiserver-785476f7d-zh79j\" (UID: \"88263904-09c5-4053-82ec-4ae1e2feb94c\") " pod="openshift-apiserver/apiserver-785476f7d-zh79j" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.979055 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/88263904-09c5-4053-82ec-4ae1e2feb94c-node-pullsecrets\") pod \"apiserver-785476f7d-zh79j\" (UID: \"88263904-09c5-4053-82ec-4ae1e2feb94c\") " pod="openshift-apiserver/apiserver-785476f7d-zh79j" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.979076 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88263904-09c5-4053-82ec-4ae1e2feb94c-config\") pod \"apiserver-785476f7d-zh79j\" (UID: \"88263904-09c5-4053-82ec-4ae1e2feb94c\") " pod="openshift-apiserver/apiserver-785476f7d-zh79j" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.979108 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88263904-09c5-4053-82ec-4ae1e2feb94c-trusted-ca-bundle\") pod \"apiserver-785476f7d-zh79j\" (UID: 
\"88263904-09c5-4053-82ec-4ae1e2feb94c\") " pod="openshift-apiserver/apiserver-785476f7d-zh79j" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.979126 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/88263904-09c5-4053-82ec-4ae1e2feb94c-etcd-serving-ca\") pod \"apiserver-785476f7d-zh79j\" (UID: \"88263904-09c5-4053-82ec-4ae1e2feb94c\") " pod="openshift-apiserver/apiserver-785476f7d-zh79j" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.979149 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/88263904-09c5-4053-82ec-4ae1e2feb94c-image-import-ca\") pod \"apiserver-785476f7d-zh79j\" (UID: \"88263904-09c5-4053-82ec-4ae1e2feb94c\") " pod="openshift-apiserver/apiserver-785476f7d-zh79j" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.979202 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/88263904-09c5-4053-82ec-4ae1e2feb94c-audit\") pod \"apiserver-785476f7d-zh79j\" (UID: \"88263904-09c5-4053-82ec-4ae1e2feb94c\") " pod="openshift-apiserver/apiserver-785476f7d-zh79j" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.979221 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/88263904-09c5-4053-82ec-4ae1e2feb94c-audit-dir\") pod \"apiserver-785476f7d-zh79j\" (UID: \"88263904-09c5-4053-82ec-4ae1e2feb94c\") " pod="openshift-apiserver/apiserver-785476f7d-zh79j" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.979256 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/88263904-09c5-4053-82ec-4ae1e2feb94c-etcd-client\") pod \"apiserver-785476f7d-zh79j\" (UID: \"88263904-09c5-4053-82ec-4ae1e2feb94c\") " 
pod="openshift-apiserver/apiserver-785476f7d-zh79j" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.979282 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/88263904-09c5-4053-82ec-4ae1e2feb94c-encryption-config\") pod \"apiserver-785476f7d-zh79j\" (UID: \"88263904-09c5-4053-82ec-4ae1e2feb94c\") " pod="openshift-apiserver/apiserver-785476f7d-zh79j" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.979294 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/88263904-09c5-4053-82ec-4ae1e2feb94c-node-pullsecrets\") pod \"apiserver-785476f7d-zh79j\" (UID: \"88263904-09c5-4053-82ec-4ae1e2feb94c\") " pod="openshift-apiserver/apiserver-785476f7d-zh79j" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.979366 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/88263904-09c5-4053-82ec-4ae1e2feb94c-audit-dir\") pod \"apiserver-785476f7d-zh79j\" (UID: \"88263904-09c5-4053-82ec-4ae1e2feb94c\") " pod="openshift-apiserver/apiserver-785476f7d-zh79j" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.979318 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjjpk\" (UniqueName: \"kubernetes.io/projected/88263904-09c5-4053-82ec-4ae1e2feb94c-kube-api-access-wjjpk\") pod \"apiserver-785476f7d-zh79j\" (UID: \"88263904-09c5-4053-82ec-4ae1e2feb94c\") " pod="openshift-apiserver/apiserver-785476f7d-zh79j" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.980141 4732 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ea530987-3884-4994-9574-b73fc76fcdde-etcd-client\") on node \"crc\" DevicePath \"\"" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.980230 4732 reconciler_common.go:293] "Volume detached for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea530987-3884-4994-9574-b73fc76fcdde-serving-cert\") on node \"crc\" DevicePath \"\"" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.980298 4732 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ea530987-3884-4994-9574-b73fc76fcdde-encryption-config\") on node \"crc\" DevicePath \"\"" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.980363 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gjnv\" (UniqueName: \"kubernetes.io/projected/ea530987-3884-4994-9574-b73fc76fcdde-kube-api-access-5gjnv\") on node \"crc\" DevicePath \"\"" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.980442 4732 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/ea530987-3884-4994-9574-b73fc76fcdde-image-import-ca\") on node \"crc\" DevicePath \"\"" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.980376 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88263904-09c5-4053-82ec-4ae1e2feb94c-config\") pod \"apiserver-785476f7d-zh79j\" (UID: \"88263904-09c5-4053-82ec-4ae1e2feb94c\") " pod="openshift-apiserver/apiserver-785476f7d-zh79j" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.980366 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/88263904-09c5-4053-82ec-4ae1e2feb94c-audit\") pod \"apiserver-785476f7d-zh79j\" (UID: \"88263904-09c5-4053-82ec-4ae1e2feb94c\") " pod="openshift-apiserver/apiserver-785476f7d-zh79j" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.980836 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/88263904-09c5-4053-82ec-4ae1e2feb94c-etcd-serving-ca\") pod \"apiserver-785476f7d-zh79j\" (UID: 
\"88263904-09c5-4053-82ec-4ae1e2feb94c\") " pod="openshift-apiserver/apiserver-785476f7d-zh79j" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.980907 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88263904-09c5-4053-82ec-4ae1e2feb94c-trusted-ca-bundle\") pod \"apiserver-785476f7d-zh79j\" (UID: \"88263904-09c5-4053-82ec-4ae1e2feb94c\") " pod="openshift-apiserver/apiserver-785476f7d-zh79j" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.981074 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/88263904-09c5-4053-82ec-4ae1e2feb94c-image-import-ca\") pod \"apiserver-785476f7d-zh79j\" (UID: \"88263904-09c5-4053-82ec-4ae1e2feb94c\") " pod="openshift-apiserver/apiserver-785476f7d-zh79j" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.982097 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/88263904-09c5-4053-82ec-4ae1e2feb94c-serving-cert\") pod \"apiserver-785476f7d-zh79j\" (UID: \"88263904-09c5-4053-82ec-4ae1e2feb94c\") " pod="openshift-apiserver/apiserver-785476f7d-zh79j" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.984068 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/88263904-09c5-4053-82ec-4ae1e2feb94c-etcd-client\") pod \"apiserver-785476f7d-zh79j\" (UID: \"88263904-09c5-4053-82ec-4ae1e2feb94c\") " pod="openshift-apiserver/apiserver-785476f7d-zh79j" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.984186 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/88263904-09c5-4053-82ec-4ae1e2feb94c-encryption-config\") pod \"apiserver-785476f7d-zh79j\" (UID: \"88263904-09c5-4053-82ec-4ae1e2feb94c\") " 
pod="openshift-apiserver/apiserver-785476f7d-zh79j" Apr 02 13:53:33 crc kubenswrapper[4732]: I0402 13:53:33.995199 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjjpk\" (UniqueName: \"kubernetes.io/projected/88263904-09c5-4053-82ec-4ae1e2feb94c-kube-api-access-wjjpk\") pod \"apiserver-785476f7d-zh79j\" (UID: \"88263904-09c5-4053-82ec-4ae1e2feb94c\") " pod="openshift-apiserver/apiserver-785476f7d-zh79j" Apr 02 13:53:34 crc kubenswrapper[4732]: I0402 13:53:34.111722 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-785476f7d-zh79j" Apr 02 13:53:34 crc kubenswrapper[4732]: I0402 13:53:34.114116 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-xzg9j"] Apr 02 13:53:34 crc kubenswrapper[4732]: I0402 13:53:34.117988 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-xzg9j"] Apr 02 13:53:34 crc kubenswrapper[4732]: I0402 13:53:34.280431 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-785476f7d-zh79j"] Apr 02 13:53:34 crc kubenswrapper[4732]: W0402 13:53:34.286925 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88263904_09c5_4053_82ec_4ae1e2feb94c.slice/crio-40313759d64d741c6bc1dbe16abdf74eb73424fc7e4aa713676af0d8259ed9d1 WatchSource:0}: Error finding container 40313759d64d741c6bc1dbe16abdf74eb73424fc7e4aa713676af0d8259ed9d1: Status 404 returned error can't find the container with id 40313759d64d741c6bc1dbe16abdf74eb73424fc7e4aa713676af0d8259ed9d1 Apr 02 13:53:34 crc kubenswrapper[4732]: I0402 13:53:34.687362 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea530987-3884-4994-9574-b73fc76fcdde" path="/var/lib/kubelet/pods/ea530987-3884-4994-9574-b73fc76fcdde/volumes" Apr 02 13:53:34 crc kubenswrapper[4732]: I0402 
13:53:34.795756 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-785476f7d-zh79j" event={"ID":"88263904-09c5-4053-82ec-4ae1e2feb94c","Type":"ContainerStarted","Data":"40313759d64d741c6bc1dbe16abdf74eb73424fc7e4aa713676af0d8259ed9d1"} Apr 02 13:53:35 crc kubenswrapper[4732]: I0402 13:53:35.805000 4732 generic.go:334] "Generic (PLEG): container finished" podID="88263904-09c5-4053-82ec-4ae1e2feb94c" containerID="1d2334935ead1a0108c0f5d65bb4a2853712cfd15214a40c2605c9051452a9e2" exitCode=0 Apr 02 13:53:35 crc kubenswrapper[4732]: I0402 13:53:35.805067 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-785476f7d-zh79j" event={"ID":"88263904-09c5-4053-82ec-4ae1e2feb94c","Type":"ContainerDied","Data":"1d2334935ead1a0108c0f5d65bb4a2853712cfd15214a40c2605c9051452a9e2"} Apr 02 13:53:36 crc kubenswrapper[4732]: I0402 13:53:36.814998 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-785476f7d-zh79j" event={"ID":"88263904-09c5-4053-82ec-4ae1e2feb94c","Type":"ContainerStarted","Data":"18a6202bb92649cbdb77c73289b7734e57aea27a2646030731200be336268c47"} Apr 02 13:53:37 crc kubenswrapper[4732]: I0402 13:53:37.824486 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-785476f7d-zh79j" event={"ID":"88263904-09c5-4053-82ec-4ae1e2feb94c","Type":"ContainerStarted","Data":"51b7dfe8429c84c23d03ed53ae839ea24a5d790825e0853ea1d4df7e89e11562"} Apr 02 13:53:37 crc kubenswrapper[4732]: I0402 13:53:37.858143 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-785476f7d-zh79j" podStartSLOduration=114.858115676 podStartE2EDuration="1m54.858115676s" podCreationTimestamp="2026-04-02 13:51:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:53:37.855311491 +0000 UTC m=+974.759719064" 
watchObservedRunningTime="2026-04-02 13:53:37.858115676 +0000 UTC m=+974.762523249" Apr 02 13:53:39 crc kubenswrapper[4732]: I0402 13:53:39.087725 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ztk25" Apr 02 13:53:39 crc kubenswrapper[4732]: I0402 13:53:39.112024 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-785476f7d-zh79j" Apr 02 13:53:39 crc kubenswrapper[4732]: I0402 13:53:39.112080 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-785476f7d-zh79j" Apr 02 13:53:39 crc kubenswrapper[4732]: I0402 13:53:39.124208 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-785476f7d-zh79j" Apr 02 13:53:39 crc kubenswrapper[4732]: I0402 13:53:39.842140 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-785476f7d-zh79j" Apr 02 13:53:47 crc kubenswrapper[4732]: I0402 13:53:47.634048 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645jjqvg"] Apr 02 13:53:47 crc kubenswrapper[4732]: I0402 13:53:47.635470 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645jjqvg" Apr 02 13:53:47 crc kubenswrapper[4732]: I0402 13:53:47.637043 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Apr 02 13:53:47 crc kubenswrapper[4732]: I0402 13:53:47.644890 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645jjqvg"] Apr 02 13:53:47 crc kubenswrapper[4732]: I0402 13:53:47.657989 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6c632108-b88a-4b6c-9368-49aacc9c04ec-util\") pod \"5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645jjqvg\" (UID: \"6c632108-b88a-4b6c-9368-49aacc9c04ec\") " pod="openshift-marketplace/5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645jjqvg" Apr 02 13:53:47 crc kubenswrapper[4732]: I0402 13:53:47.658065 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvggx\" (UniqueName: \"kubernetes.io/projected/6c632108-b88a-4b6c-9368-49aacc9c04ec-kube-api-access-fvggx\") pod \"5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645jjqvg\" (UID: \"6c632108-b88a-4b6c-9368-49aacc9c04ec\") " pod="openshift-marketplace/5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645jjqvg" Apr 02 13:53:47 crc kubenswrapper[4732]: I0402 13:53:47.658121 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6c632108-b88a-4b6c-9368-49aacc9c04ec-bundle\") pod \"5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645jjqvg\" (UID: \"6c632108-b88a-4b6c-9368-49aacc9c04ec\") " pod="openshift-marketplace/5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645jjqvg" Apr 02 13:53:47 crc kubenswrapper[4732]: 
I0402 13:53:47.759143 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6c632108-b88a-4b6c-9368-49aacc9c04ec-bundle\") pod \"5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645jjqvg\" (UID: \"6c632108-b88a-4b6c-9368-49aacc9c04ec\") " pod="openshift-marketplace/5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645jjqvg" Apr 02 13:53:47 crc kubenswrapper[4732]: I0402 13:53:47.759271 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6c632108-b88a-4b6c-9368-49aacc9c04ec-util\") pod \"5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645jjqvg\" (UID: \"6c632108-b88a-4b6c-9368-49aacc9c04ec\") " pod="openshift-marketplace/5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645jjqvg" Apr 02 13:53:47 crc kubenswrapper[4732]: I0402 13:53:47.759315 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvggx\" (UniqueName: \"kubernetes.io/projected/6c632108-b88a-4b6c-9368-49aacc9c04ec-kube-api-access-fvggx\") pod \"5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645jjqvg\" (UID: \"6c632108-b88a-4b6c-9368-49aacc9c04ec\") " pod="openshift-marketplace/5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645jjqvg" Apr 02 13:53:47 crc kubenswrapper[4732]: I0402 13:53:47.759588 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6c632108-b88a-4b6c-9368-49aacc9c04ec-bundle\") pod \"5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645jjqvg\" (UID: \"6c632108-b88a-4b6c-9368-49aacc9c04ec\") " pod="openshift-marketplace/5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645jjqvg" Apr 02 13:53:47 crc kubenswrapper[4732]: I0402 13:53:47.759781 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/6c632108-b88a-4b6c-9368-49aacc9c04ec-util\") pod \"5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645jjqvg\" (UID: \"6c632108-b88a-4b6c-9368-49aacc9c04ec\") " pod="openshift-marketplace/5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645jjqvg" Apr 02 13:53:47 crc kubenswrapper[4732]: I0402 13:53:47.780835 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvggx\" (UniqueName: \"kubernetes.io/projected/6c632108-b88a-4b6c-9368-49aacc9c04ec-kube-api-access-fvggx\") pod \"5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645jjqvg\" (UID: \"6c632108-b88a-4b6c-9368-49aacc9c04ec\") " pod="openshift-marketplace/5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645jjqvg" Apr 02 13:53:47 crc kubenswrapper[4732]: I0402 13:53:47.956650 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645jjqvg" Apr 02 13:53:48 crc kubenswrapper[4732]: I0402 13:53:48.347276 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645jjqvg"] Apr 02 13:53:48 crc kubenswrapper[4732]: I0402 13:53:48.887426 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645jjqvg" event={"ID":"6c632108-b88a-4b6c-9368-49aacc9c04ec","Type":"ContainerStarted","Data":"06a9de45282312c2f544931f9d76ace30f0852445e7d3e2ee996bc75c519ab86"} Apr 02 13:53:49 crc kubenswrapper[4732]: I0402 13:53:49.793174 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hdfsn"] Apr 02 13:53:49 crc kubenswrapper[4732]: I0402 13:53:49.794311 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hdfsn" Apr 02 13:53:49 crc kubenswrapper[4732]: I0402 13:53:49.817698 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hdfsn"] Apr 02 13:53:49 crc kubenswrapper[4732]: I0402 13:53:49.885431 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e482963-6625-4a6a-a614-c82a1b90d4b3-catalog-content\") pod \"redhat-operators-hdfsn\" (UID: \"3e482963-6625-4a6a-a614-c82a1b90d4b3\") " pod="openshift-marketplace/redhat-operators-hdfsn" Apr 02 13:53:49 crc kubenswrapper[4732]: I0402 13:53:49.885496 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwttq\" (UniqueName: \"kubernetes.io/projected/3e482963-6625-4a6a-a614-c82a1b90d4b3-kube-api-access-dwttq\") pod \"redhat-operators-hdfsn\" (UID: \"3e482963-6625-4a6a-a614-c82a1b90d4b3\") " pod="openshift-marketplace/redhat-operators-hdfsn" Apr 02 13:53:49 crc kubenswrapper[4732]: I0402 13:53:49.885588 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e482963-6625-4a6a-a614-c82a1b90d4b3-utilities\") pod \"redhat-operators-hdfsn\" (UID: \"3e482963-6625-4a6a-a614-c82a1b90d4b3\") " pod="openshift-marketplace/redhat-operators-hdfsn" Apr 02 13:53:49 crc kubenswrapper[4732]: I0402 13:53:49.895552 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645jjqvg" event={"ID":"6c632108-b88a-4b6c-9368-49aacc9c04ec","Type":"ContainerStarted","Data":"d243c9ab70ad32f69943686d5ddfa9851fedeacd9b7cdffb56f650a787678667"} Apr 02 13:53:49 crc kubenswrapper[4732]: I0402 13:53:49.986108 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/3e482963-6625-4a6a-a614-c82a1b90d4b3-utilities\") pod \"redhat-operators-hdfsn\" (UID: \"3e482963-6625-4a6a-a614-c82a1b90d4b3\") " pod="openshift-marketplace/redhat-operators-hdfsn" Apr 02 13:53:49 crc kubenswrapper[4732]: I0402 13:53:49.986196 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e482963-6625-4a6a-a614-c82a1b90d4b3-catalog-content\") pod \"redhat-operators-hdfsn\" (UID: \"3e482963-6625-4a6a-a614-c82a1b90d4b3\") " pod="openshift-marketplace/redhat-operators-hdfsn" Apr 02 13:53:49 crc kubenswrapper[4732]: I0402 13:53:49.986234 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwttq\" (UniqueName: \"kubernetes.io/projected/3e482963-6625-4a6a-a614-c82a1b90d4b3-kube-api-access-dwttq\") pod \"redhat-operators-hdfsn\" (UID: \"3e482963-6625-4a6a-a614-c82a1b90d4b3\") " pod="openshift-marketplace/redhat-operators-hdfsn" Apr 02 13:53:49 crc kubenswrapper[4732]: I0402 13:53:49.986717 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e482963-6625-4a6a-a614-c82a1b90d4b3-utilities\") pod \"redhat-operators-hdfsn\" (UID: \"3e482963-6625-4a6a-a614-c82a1b90d4b3\") " pod="openshift-marketplace/redhat-operators-hdfsn" Apr 02 13:53:49 crc kubenswrapper[4732]: I0402 13:53:49.986849 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e482963-6625-4a6a-a614-c82a1b90d4b3-catalog-content\") pod \"redhat-operators-hdfsn\" (UID: \"3e482963-6625-4a6a-a614-c82a1b90d4b3\") " pod="openshift-marketplace/redhat-operators-hdfsn" Apr 02 13:53:50 crc kubenswrapper[4732]: I0402 13:53:50.013530 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwttq\" (UniqueName: 
\"kubernetes.io/projected/3e482963-6625-4a6a-a614-c82a1b90d4b3-kube-api-access-dwttq\") pod \"redhat-operators-hdfsn\" (UID: \"3e482963-6625-4a6a-a614-c82a1b90d4b3\") " pod="openshift-marketplace/redhat-operators-hdfsn" Apr 02 13:53:50 crc kubenswrapper[4732]: I0402 13:53:50.123896 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hdfsn" Apr 02 13:53:50 crc kubenswrapper[4732]: I0402 13:53:50.556362 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hdfsn"] Apr 02 13:53:50 crc kubenswrapper[4732]: I0402 13:53:50.902643 4732 generic.go:334] "Generic (PLEG): container finished" podID="6c632108-b88a-4b6c-9368-49aacc9c04ec" containerID="d243c9ab70ad32f69943686d5ddfa9851fedeacd9b7cdffb56f650a787678667" exitCode=0 Apr 02 13:53:50 crc kubenswrapper[4732]: I0402 13:53:50.902820 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645jjqvg" event={"ID":"6c632108-b88a-4b6c-9368-49aacc9c04ec","Type":"ContainerDied","Data":"d243c9ab70ad32f69943686d5ddfa9851fedeacd9b7cdffb56f650a787678667"} Apr 02 13:53:50 crc kubenswrapper[4732]: I0402 13:53:50.905539 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hdfsn" event={"ID":"3e482963-6625-4a6a-a614-c82a1b90d4b3","Type":"ContainerStarted","Data":"eac3c14b9e89739da94f57db05b3af41eb867c8117cf7d070df0f31fa869521b"} Apr 02 13:53:51 crc kubenswrapper[4732]: I0402 13:53:51.913422 4732 generic.go:334] "Generic (PLEG): container finished" podID="3e482963-6625-4a6a-a614-c82a1b90d4b3" containerID="6e2e00128778abfb9ad77d5d16293f4bf43c21e4b816bb6c3828d25a683157ff" exitCode=0 Apr 02 13:53:51 crc kubenswrapper[4732]: I0402 13:53:51.913557 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hdfsn" 
event={"ID":"3e482963-6625-4a6a-a614-c82a1b90d4b3","Type":"ContainerDied","Data":"6e2e00128778abfb9ad77d5d16293f4bf43c21e4b816bb6c3828d25a683157ff"} Apr 02 13:53:54 crc kubenswrapper[4732]: I0402 13:53:54.937596 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hdfsn" event={"ID":"3e482963-6625-4a6a-a614-c82a1b90d4b3","Type":"ContainerStarted","Data":"0b514cffdc73c01da66ab5d28ad8c9c13dd91c6a1a2cf2703cd5d27fdaa440e7"} Apr 02 13:53:55 crc kubenswrapper[4732]: E0402 13:53:55.489045 4732 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e482963_6625_4a6a_a614_c82a1b90d4b3.slice/crio-conmon-0b514cffdc73c01da66ab5d28ad8c9c13dd91c6a1a2cf2703cd5d27fdaa440e7.scope\": RecentStats: unable to find data in memory cache]" Apr 02 13:53:55 crc kubenswrapper[4732]: I0402 13:53:55.945766 4732 generic.go:334] "Generic (PLEG): container finished" podID="3e482963-6625-4a6a-a614-c82a1b90d4b3" containerID="0b514cffdc73c01da66ab5d28ad8c9c13dd91c6a1a2cf2703cd5d27fdaa440e7" exitCode=0 Apr 02 13:53:55 crc kubenswrapper[4732]: I0402 13:53:55.945803 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hdfsn" event={"ID":"3e482963-6625-4a6a-a614-c82a1b90d4b3","Type":"ContainerDied","Data":"0b514cffdc73c01da66ab5d28ad8c9c13dd91c6a1a2cf2703cd5d27fdaa440e7"} Apr 02 13:53:56 crc kubenswrapper[4732]: I0402 13:53:56.955225 4732 generic.go:334] "Generic (PLEG): container finished" podID="6c632108-b88a-4b6c-9368-49aacc9c04ec" containerID="883c0b267db63bbfc2c34d32b595e44aac300eff68c5605fc763f1ed6684c2ce" exitCode=0 Apr 02 13:53:56 crc kubenswrapper[4732]: I0402 13:53:56.955288 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645jjqvg" 
event={"ID":"6c632108-b88a-4b6c-9368-49aacc9c04ec","Type":"ContainerDied","Data":"883c0b267db63bbfc2c34d32b595e44aac300eff68c5605fc763f1ed6684c2ce"} Apr 02 13:53:56 crc kubenswrapper[4732]: I0402 13:53:56.958381 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hdfsn" event={"ID":"3e482963-6625-4a6a-a614-c82a1b90d4b3","Type":"ContainerStarted","Data":"050acba982003019261fb4339f804b98163a21adb4ebff3c9e664d333af6a026"} Apr 02 13:53:56 crc kubenswrapper[4732]: I0402 13:53:56.996296 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hdfsn" podStartSLOduration=3.351823931 podStartE2EDuration="7.996278334s" podCreationTimestamp="2026-04-02 13:53:49 +0000 UTC" firstStartedPulling="2026-04-02 13:53:51.915228485 +0000 UTC m=+988.819636058" lastFinishedPulling="2026-04-02 13:53:56.559682908 +0000 UTC m=+993.464090461" observedRunningTime="2026-04-02 13:53:56.994776484 +0000 UTC m=+993.899184057" watchObservedRunningTime="2026-04-02 13:53:56.996278334 +0000 UTC m=+993.900685887" Apr 02 13:53:57 crc kubenswrapper[4732]: I0402 13:53:57.965151 4732 generic.go:334] "Generic (PLEG): container finished" podID="6c632108-b88a-4b6c-9368-49aacc9c04ec" containerID="b7fafdc7628ea78062898ddbb9e81267b9f1e483e833c65647b6d02407f1ea8a" exitCode=0 Apr 02 13:53:57 crc kubenswrapper[4732]: I0402 13:53:57.965247 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645jjqvg" event={"ID":"6c632108-b88a-4b6c-9368-49aacc9c04ec","Type":"ContainerDied","Data":"b7fafdc7628ea78062898ddbb9e81267b9f1e483e833c65647b6d02407f1ea8a"} Apr 02 13:53:59 crc kubenswrapper[4732]: I0402 13:53:59.285437 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645jjqvg" Apr 02 13:53:59 crc kubenswrapper[4732]: I0402 13:53:59.402096 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6c632108-b88a-4b6c-9368-49aacc9c04ec-util\") pod \"6c632108-b88a-4b6c-9368-49aacc9c04ec\" (UID: \"6c632108-b88a-4b6c-9368-49aacc9c04ec\") " Apr 02 13:53:59 crc kubenswrapper[4732]: I0402 13:53:59.402441 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvggx\" (UniqueName: \"kubernetes.io/projected/6c632108-b88a-4b6c-9368-49aacc9c04ec-kube-api-access-fvggx\") pod \"6c632108-b88a-4b6c-9368-49aacc9c04ec\" (UID: \"6c632108-b88a-4b6c-9368-49aacc9c04ec\") " Apr 02 13:53:59 crc kubenswrapper[4732]: I0402 13:53:59.402599 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6c632108-b88a-4b6c-9368-49aacc9c04ec-bundle\") pod \"6c632108-b88a-4b6c-9368-49aacc9c04ec\" (UID: \"6c632108-b88a-4b6c-9368-49aacc9c04ec\") " Apr 02 13:53:59 crc kubenswrapper[4732]: I0402 13:53:59.403829 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c632108-b88a-4b6c-9368-49aacc9c04ec-bundle" (OuterVolumeSpecName: "bundle") pod "6c632108-b88a-4b6c-9368-49aacc9c04ec" (UID: "6c632108-b88a-4b6c-9368-49aacc9c04ec"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 13:53:59 crc kubenswrapper[4732]: I0402 13:53:59.410192 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c632108-b88a-4b6c-9368-49aacc9c04ec-kube-api-access-fvggx" (OuterVolumeSpecName: "kube-api-access-fvggx") pod "6c632108-b88a-4b6c-9368-49aacc9c04ec" (UID: "6c632108-b88a-4b6c-9368-49aacc9c04ec"). InnerVolumeSpecName "kube-api-access-fvggx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:53:59 crc kubenswrapper[4732]: I0402 13:53:59.423907 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c632108-b88a-4b6c-9368-49aacc9c04ec-util" (OuterVolumeSpecName: "util") pod "6c632108-b88a-4b6c-9368-49aacc9c04ec" (UID: "6c632108-b88a-4b6c-9368-49aacc9c04ec"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 13:53:59 crc kubenswrapper[4732]: I0402 13:53:59.503655 4732 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6c632108-b88a-4b6c-9368-49aacc9c04ec-util\") on node \"crc\" DevicePath \"\"" Apr 02 13:53:59 crc kubenswrapper[4732]: I0402 13:53:59.503701 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvggx\" (UniqueName: \"kubernetes.io/projected/6c632108-b88a-4b6c-9368-49aacc9c04ec-kube-api-access-fvggx\") on node \"crc\" DevicePath \"\"" Apr 02 13:53:59 crc kubenswrapper[4732]: I0402 13:53:59.503713 4732 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6c632108-b88a-4b6c-9368-49aacc9c04ec-bundle\") on node \"crc\" DevicePath \"\"" Apr 02 13:53:59 crc kubenswrapper[4732]: I0402 13:53:59.984730 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645jjqvg" event={"ID":"6c632108-b88a-4b6c-9368-49aacc9c04ec","Type":"ContainerDied","Data":"06a9de45282312c2f544931f9d76ace30f0852445e7d3e2ee996bc75c519ab86"} Apr 02 13:53:59 crc kubenswrapper[4732]: I0402 13:53:59.984798 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06a9de45282312c2f544931f9d76ace30f0852445e7d3e2ee996bc75c519ab86" Apr 02 13:53:59 crc kubenswrapper[4732]: I0402 13:53:59.984854 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645jjqvg" Apr 02 13:54:00 crc kubenswrapper[4732]: I0402 13:54:00.124694 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hdfsn" Apr 02 13:54:00 crc kubenswrapper[4732]: I0402 13:54:00.125013 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hdfsn" Apr 02 13:54:00 crc kubenswrapper[4732]: I0402 13:54:00.132428 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29585634-7tqn2"] Apr 02 13:54:00 crc kubenswrapper[4732]: E0402 13:54:00.132905 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c632108-b88a-4b6c-9368-49aacc9c04ec" containerName="pull" Apr 02 13:54:00 crc kubenswrapper[4732]: I0402 13:54:00.133018 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c632108-b88a-4b6c-9368-49aacc9c04ec" containerName="pull" Apr 02 13:54:00 crc kubenswrapper[4732]: E0402 13:54:00.133135 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c632108-b88a-4b6c-9368-49aacc9c04ec" containerName="util" Apr 02 13:54:00 crc kubenswrapper[4732]: I0402 13:54:00.133216 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c632108-b88a-4b6c-9368-49aacc9c04ec" containerName="util" Apr 02 13:54:00 crc kubenswrapper[4732]: E0402 13:54:00.133321 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c632108-b88a-4b6c-9368-49aacc9c04ec" containerName="extract" Apr 02 13:54:00 crc kubenswrapper[4732]: I0402 13:54:00.133418 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c632108-b88a-4b6c-9368-49aacc9c04ec" containerName="extract" Apr 02 13:54:00 crc kubenswrapper[4732]: I0402 13:54:00.133640 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c632108-b88a-4b6c-9368-49aacc9c04ec" containerName="extract" Apr 02 
13:54:00 crc kubenswrapper[4732]: I0402 13:54:00.134227 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585634-7tqn2" Apr 02 13:54:00 crc kubenswrapper[4732]: I0402 13:54:00.139065 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-69g42" Apr 02 13:54:00 crc kubenswrapper[4732]: I0402 13:54:00.139819 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 02 13:54:00 crc kubenswrapper[4732]: I0402 13:54:00.139877 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 02 13:54:00 crc kubenswrapper[4732]: I0402 13:54:00.142717 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585634-7tqn2"] Apr 02 13:54:00 crc kubenswrapper[4732]: I0402 13:54:00.214314 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrkv5\" (UniqueName: \"kubernetes.io/projected/761913ea-6b4c-408f-a0ec-8d9a0179832a-kube-api-access-jrkv5\") pod \"auto-csr-approver-29585634-7tqn2\" (UID: \"761913ea-6b4c-408f-a0ec-8d9a0179832a\") " pod="openshift-infra/auto-csr-approver-29585634-7tqn2" Apr 02 13:54:00 crc kubenswrapper[4732]: I0402 13:54:00.315281 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrkv5\" (UniqueName: \"kubernetes.io/projected/761913ea-6b4c-408f-a0ec-8d9a0179832a-kube-api-access-jrkv5\") pod \"auto-csr-approver-29585634-7tqn2\" (UID: \"761913ea-6b4c-408f-a0ec-8d9a0179832a\") " pod="openshift-infra/auto-csr-approver-29585634-7tqn2" Apr 02 13:54:00 crc kubenswrapper[4732]: I0402 13:54:00.343883 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrkv5\" (UniqueName: 
\"kubernetes.io/projected/761913ea-6b4c-408f-a0ec-8d9a0179832a-kube-api-access-jrkv5\") pod \"auto-csr-approver-29585634-7tqn2\" (UID: \"761913ea-6b4c-408f-a0ec-8d9a0179832a\") " pod="openshift-infra/auto-csr-approver-29585634-7tqn2" Apr 02 13:54:00 crc kubenswrapper[4732]: I0402 13:54:00.459659 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585634-7tqn2" Apr 02 13:54:00 crc kubenswrapper[4732]: I0402 13:54:00.661398 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585634-7tqn2"] Apr 02 13:54:00 crc kubenswrapper[4732]: W0402 13:54:00.664662 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod761913ea_6b4c_408f_a0ec_8d9a0179832a.slice/crio-46ebcdea3c7bb38e92a1199ea612b5a447dbe5613469d55d5f90575afdbb6ab9 WatchSource:0}: Error finding container 46ebcdea3c7bb38e92a1199ea612b5a447dbe5613469d55d5f90575afdbb6ab9: Status 404 returned error can't find the container with id 46ebcdea3c7bb38e92a1199ea612b5a447dbe5613469d55d5f90575afdbb6ab9 Apr 02 13:54:00 crc kubenswrapper[4732]: I0402 13:54:00.991905 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585634-7tqn2" event={"ID":"761913ea-6b4c-408f-a0ec-8d9a0179832a","Type":"ContainerStarted","Data":"46ebcdea3c7bb38e92a1199ea612b5a447dbe5613469d55d5f90575afdbb6ab9"} Apr 02 13:54:01 crc kubenswrapper[4732]: I0402 13:54:01.164124 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hdfsn" podUID="3e482963-6625-4a6a-a614-c82a1b90d4b3" containerName="registry-server" probeResult="failure" output=< Apr 02 13:54:01 crc kubenswrapper[4732]: timeout: failed to connect service ":50051" within 1s Apr 02 13:54:01 crc kubenswrapper[4732]: > Apr 02 13:54:03 crc kubenswrapper[4732]: I0402 13:54:03.005219 4732 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-infra/auto-csr-approver-29585634-7tqn2" event={"ID":"761913ea-6b4c-408f-a0ec-8d9a0179832a","Type":"ContainerStarted","Data":"654a1e24249d45635d250229ae3e0174a0763ce4c01ca462e084011d1cce4241"} Apr 02 13:54:04 crc kubenswrapper[4732]: I0402 13:54:04.012926 4732 generic.go:334] "Generic (PLEG): container finished" podID="761913ea-6b4c-408f-a0ec-8d9a0179832a" containerID="654a1e24249d45635d250229ae3e0174a0763ce4c01ca462e084011d1cce4241" exitCode=0 Apr 02 13:54:04 crc kubenswrapper[4732]: I0402 13:54:04.012995 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585634-7tqn2" event={"ID":"761913ea-6b4c-408f-a0ec-8d9a0179832a","Type":"ContainerDied","Data":"654a1e24249d45635d250229ae3e0174a0763ce4c01ca462e084011d1cce4241"} Apr 02 13:54:04 crc kubenswrapper[4732]: I0402 13:54:04.246687 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-6b8c6447b-p77cg"] Apr 02 13:54:04 crc kubenswrapper[4732]: I0402 13:54:04.248464 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-6b8c6447b-p77cg" Apr 02 13:54:04 crc kubenswrapper[4732]: I0402 13:54:04.251232 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-wzxgc" Apr 02 13:54:04 crc kubenswrapper[4732]: I0402 13:54:04.251774 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Apr 02 13:54:04 crc kubenswrapper[4732]: I0402 13:54:04.252023 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Apr 02 13:54:04 crc kubenswrapper[4732]: I0402 13:54:04.254845 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-6b8c6447b-p77cg"] Apr 02 13:54:04 crc kubenswrapper[4732]: I0402 13:54:04.291681 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttrv6\" (UniqueName: \"kubernetes.io/projected/59b4decd-a99c-4637-bc8e-2a95d017696d-kube-api-access-ttrv6\") pod \"nmstate-operator-6b8c6447b-p77cg\" (UID: \"59b4decd-a99c-4637-bc8e-2a95d017696d\") " pod="openshift-nmstate/nmstate-operator-6b8c6447b-p77cg" Apr 02 13:54:04 crc kubenswrapper[4732]: I0402 13:54:04.392784 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttrv6\" (UniqueName: \"kubernetes.io/projected/59b4decd-a99c-4637-bc8e-2a95d017696d-kube-api-access-ttrv6\") pod \"nmstate-operator-6b8c6447b-p77cg\" (UID: \"59b4decd-a99c-4637-bc8e-2a95d017696d\") " pod="openshift-nmstate/nmstate-operator-6b8c6447b-p77cg" Apr 02 13:54:04 crc kubenswrapper[4732]: I0402 13:54:04.416562 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttrv6\" (UniqueName: \"kubernetes.io/projected/59b4decd-a99c-4637-bc8e-2a95d017696d-kube-api-access-ttrv6\") pod \"nmstate-operator-6b8c6447b-p77cg\" (UID: 
\"59b4decd-a99c-4637-bc8e-2a95d017696d\") " pod="openshift-nmstate/nmstate-operator-6b8c6447b-p77cg" Apr 02 13:54:04 crc kubenswrapper[4732]: I0402 13:54:04.565262 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-6b8c6447b-p77cg" Apr 02 13:54:05 crc kubenswrapper[4732]: I0402 13:54:04.998607 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-6b8c6447b-p77cg"] Apr 02 13:54:05 crc kubenswrapper[4732]: I0402 13:54:05.026749 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-6b8c6447b-p77cg" event={"ID":"59b4decd-a99c-4637-bc8e-2a95d017696d","Type":"ContainerStarted","Data":"007d018681143194bfce3624d2ae74983af262942f43eb9afcd159f36a77c72b"} Apr 02 13:54:05 crc kubenswrapper[4732]: I0402 13:54:05.301313 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585634-7tqn2" Apr 02 13:54:05 crc kubenswrapper[4732]: I0402 13:54:05.444345 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrkv5\" (UniqueName: \"kubernetes.io/projected/761913ea-6b4c-408f-a0ec-8d9a0179832a-kube-api-access-jrkv5\") pod \"761913ea-6b4c-408f-a0ec-8d9a0179832a\" (UID: \"761913ea-6b4c-408f-a0ec-8d9a0179832a\") " Apr 02 13:54:05 crc kubenswrapper[4732]: I0402 13:54:05.449698 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/761913ea-6b4c-408f-a0ec-8d9a0179832a-kube-api-access-jrkv5" (OuterVolumeSpecName: "kube-api-access-jrkv5") pod "761913ea-6b4c-408f-a0ec-8d9a0179832a" (UID: "761913ea-6b4c-408f-a0ec-8d9a0179832a"). InnerVolumeSpecName "kube-api-access-jrkv5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:54:05 crc kubenswrapper[4732]: I0402 13:54:05.545318 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrkv5\" (UniqueName: \"kubernetes.io/projected/761913ea-6b4c-408f-a0ec-8d9a0179832a-kube-api-access-jrkv5\") on node \"crc\" DevicePath \"\"" Apr 02 13:54:06 crc kubenswrapper[4732]: I0402 13:54:06.034845 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585634-7tqn2" event={"ID":"761913ea-6b4c-408f-a0ec-8d9a0179832a","Type":"ContainerDied","Data":"46ebcdea3c7bb38e92a1199ea612b5a447dbe5613469d55d5f90575afdbb6ab9"} Apr 02 13:54:06 crc kubenswrapper[4732]: I0402 13:54:06.035126 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46ebcdea3c7bb38e92a1199ea612b5a447dbe5613469d55d5f90575afdbb6ab9" Apr 02 13:54:06 crc kubenswrapper[4732]: I0402 13:54:06.034898 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585634-7tqn2" Apr 02 13:54:06 crc kubenswrapper[4732]: I0402 13:54:06.348997 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29585628-h96tl"] Apr 02 13:54:06 crc kubenswrapper[4732]: I0402 13:54:06.353599 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29585628-h96tl"] Apr 02 13:54:06 crc kubenswrapper[4732]: I0402 13:54:06.687345 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cde86b49-a33e-43f2-95ec-2b2864d315d5" path="/var/lib/kubelet/pods/cde86b49-a33e-43f2-95ec-2b2864d315d5/volumes" Apr 02 13:54:10 crc kubenswrapper[4732]: I0402 13:54:10.192449 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hdfsn" Apr 02 13:54:10 crc kubenswrapper[4732]: I0402 13:54:10.247397 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/redhat-operators-hdfsn" Apr 02 13:54:12 crc kubenswrapper[4732]: I0402 13:54:12.400525 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hdfsn"] Apr 02 13:54:12 crc kubenswrapper[4732]: I0402 13:54:12.400814 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hdfsn" podUID="3e482963-6625-4a6a-a614-c82a1b90d4b3" containerName="registry-server" containerID="cri-o://050acba982003019261fb4339f804b98163a21adb4ebff3c9e664d333af6a026" gracePeriod=2 Apr 02 13:54:13 crc kubenswrapper[4732]: I0402 13:54:13.084973 4732 generic.go:334] "Generic (PLEG): container finished" podID="3e482963-6625-4a6a-a614-c82a1b90d4b3" containerID="050acba982003019261fb4339f804b98163a21adb4ebff3c9e664d333af6a026" exitCode=0 Apr 02 13:54:13 crc kubenswrapper[4732]: I0402 13:54:13.085067 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hdfsn" event={"ID":"3e482963-6625-4a6a-a614-c82a1b90d4b3","Type":"ContainerDied","Data":"050acba982003019261fb4339f804b98163a21adb4ebff3c9e664d333af6a026"} Apr 02 13:54:13 crc kubenswrapper[4732]: I0402 13:54:13.713562 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hdfsn" Apr 02 13:54:13 crc kubenswrapper[4732]: I0402 13:54:13.857806 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwttq\" (UniqueName: \"kubernetes.io/projected/3e482963-6625-4a6a-a614-c82a1b90d4b3-kube-api-access-dwttq\") pod \"3e482963-6625-4a6a-a614-c82a1b90d4b3\" (UID: \"3e482963-6625-4a6a-a614-c82a1b90d4b3\") " Apr 02 13:54:13 crc kubenswrapper[4732]: I0402 13:54:13.857976 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e482963-6625-4a6a-a614-c82a1b90d4b3-catalog-content\") pod \"3e482963-6625-4a6a-a614-c82a1b90d4b3\" (UID: \"3e482963-6625-4a6a-a614-c82a1b90d4b3\") " Apr 02 13:54:13 crc kubenswrapper[4732]: I0402 13:54:13.858179 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e482963-6625-4a6a-a614-c82a1b90d4b3-utilities\") pod \"3e482963-6625-4a6a-a614-c82a1b90d4b3\" (UID: \"3e482963-6625-4a6a-a614-c82a1b90d4b3\") " Apr 02 13:54:13 crc kubenswrapper[4732]: I0402 13:54:13.858886 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e482963-6625-4a6a-a614-c82a1b90d4b3-utilities" (OuterVolumeSpecName: "utilities") pod "3e482963-6625-4a6a-a614-c82a1b90d4b3" (UID: "3e482963-6625-4a6a-a614-c82a1b90d4b3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 13:54:13 crc kubenswrapper[4732]: I0402 13:54:13.863737 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e482963-6625-4a6a-a614-c82a1b90d4b3-kube-api-access-dwttq" (OuterVolumeSpecName: "kube-api-access-dwttq") pod "3e482963-6625-4a6a-a614-c82a1b90d4b3" (UID: "3e482963-6625-4a6a-a614-c82a1b90d4b3"). InnerVolumeSpecName "kube-api-access-dwttq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:54:13 crc kubenswrapper[4732]: I0402 13:54:13.960278 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwttq\" (UniqueName: \"kubernetes.io/projected/3e482963-6625-4a6a-a614-c82a1b90d4b3-kube-api-access-dwttq\") on node \"crc\" DevicePath \"\"" Apr 02 13:54:13 crc kubenswrapper[4732]: I0402 13:54:13.960309 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e482963-6625-4a6a-a614-c82a1b90d4b3-utilities\") on node \"crc\" DevicePath \"\"" Apr 02 13:54:13 crc kubenswrapper[4732]: I0402 13:54:13.991427 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e482963-6625-4a6a-a614-c82a1b90d4b3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3e482963-6625-4a6a-a614-c82a1b90d4b3" (UID: "3e482963-6625-4a6a-a614-c82a1b90d4b3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 13:54:14 crc kubenswrapper[4732]: I0402 13:54:14.061594 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e482963-6625-4a6a-a614-c82a1b90d4b3-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 02 13:54:14 crc kubenswrapper[4732]: I0402 13:54:14.100829 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hdfsn" event={"ID":"3e482963-6625-4a6a-a614-c82a1b90d4b3","Type":"ContainerDied","Data":"eac3c14b9e89739da94f57db05b3af41eb867c8117cf7d070df0f31fa869521b"} Apr 02 13:54:14 crc kubenswrapper[4732]: I0402 13:54:14.100885 4732 scope.go:117] "RemoveContainer" containerID="050acba982003019261fb4339f804b98163a21adb4ebff3c9e664d333af6a026" Apr 02 13:54:14 crc kubenswrapper[4732]: I0402 13:54:14.101090 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hdfsn" Apr 02 13:54:14 crc kubenswrapper[4732]: I0402 13:54:14.138467 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hdfsn"] Apr 02 13:54:14 crc kubenswrapper[4732]: I0402 13:54:14.142503 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hdfsn"] Apr 02 13:54:14 crc kubenswrapper[4732]: I0402 13:54:14.340970 4732 scope.go:117] "RemoveContainer" containerID="0b514cffdc73c01da66ab5d28ad8c9c13dd91c6a1a2cf2703cd5d27fdaa440e7" Apr 02 13:54:14 crc kubenswrapper[4732]: I0402 13:54:14.392978 4732 scope.go:117] "RemoveContainer" containerID="6e2e00128778abfb9ad77d5d16293f4bf43c21e4b816bb6c3828d25a683157ff" Apr 02 13:54:14 crc kubenswrapper[4732]: I0402 13:54:14.686676 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e482963-6625-4a6a-a614-c82a1b90d4b3" path="/var/lib/kubelet/pods/3e482963-6625-4a6a-a614-c82a1b90d4b3/volumes" Apr 02 13:54:15 crc kubenswrapper[4732]: I0402 13:54:15.109878 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-6b8c6447b-p77cg" event={"ID":"59b4decd-a99c-4637-bc8e-2a95d017696d","Type":"ContainerStarted","Data":"b2e22501086c6c24b46747e7cf141bb19fdcf114e8f7f6daf4038b34c527cee4"} Apr 02 13:54:15 crc kubenswrapper[4732]: I0402 13:54:15.132089 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-6b8c6447b-p77cg" podStartSLOduration=1.748474605 podStartE2EDuration="11.132054532s" podCreationTimestamp="2026-04-02 13:54:04 +0000 UTC" firstStartedPulling="2026-04-02 13:54:05.010875482 +0000 UTC m=+1001.915283035" lastFinishedPulling="2026-04-02 13:54:14.394455409 +0000 UTC m=+1011.298862962" observedRunningTime="2026-04-02 13:54:15.129907725 +0000 UTC m=+1012.034315378" watchObservedRunningTime="2026-04-02 13:54:15.132054532 +0000 UTC m=+1012.036462125" Apr 02 
13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.035510 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-cq88s"] Apr 02 13:54:16 crc kubenswrapper[4732]: E0402 13:54:16.035945 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="761913ea-6b4c-408f-a0ec-8d9a0179832a" containerName="oc" Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.035989 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="761913ea-6b4c-408f-a0ec-8d9a0179832a" containerName="oc" Apr 02 13:54:16 crc kubenswrapper[4732]: E0402 13:54:16.036023 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e482963-6625-4a6a-a614-c82a1b90d4b3" containerName="extract-utilities" Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.036042 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e482963-6625-4a6a-a614-c82a1b90d4b3" containerName="extract-utilities" Apr 02 13:54:16 crc kubenswrapper[4732]: E0402 13:54:16.036096 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e482963-6625-4a6a-a614-c82a1b90d4b3" containerName="registry-server" Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.036115 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e482963-6625-4a6a-a614-c82a1b90d4b3" containerName="registry-server" Apr 02 13:54:16 crc kubenswrapper[4732]: E0402 13:54:16.036144 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e482963-6625-4a6a-a614-c82a1b90d4b3" containerName="extract-content" Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.036160 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e482963-6625-4a6a-a614-c82a1b90d4b3" containerName="extract-content" Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.036387 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="761913ea-6b4c-408f-a0ec-8d9a0179832a" containerName="oc" Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.036428 4732 
memory_manager.go:354] "RemoveStaleState removing state" podUID="3e482963-6625-4a6a-a614-c82a1b90d4b3" containerName="registry-server" Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.037582 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-cq88s" Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.039699 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-qqltx" Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.043493 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-pqqzz"] Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.044672 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-pqqzz" Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.048298 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-cq88s"] Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.049654 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.056772 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-pqqzz"] Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.082760 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-6mg2l"] Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.083396 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-6mg2l" Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.165953 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7b5ddc4dc7-rpk7t"] Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.166762 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7b5ddc4dc7-rpk7t" Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.179790 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.179872 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.179879 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-zpqd6" Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.184361 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7b5ddc4dc7-rpk7t"] Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.190350 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/55a456a8-9ff7-4d10-a126-d662f361b74d-nmstate-lock\") pod \"nmstate-handler-6mg2l\" (UID: \"55a456a8-9ff7-4d10-a126-d662f361b74d\") " pod="openshift-nmstate/nmstate-handler-6mg2l" Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.190392 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/55a456a8-9ff7-4d10-a126-d662f361b74d-ovs-socket\") pod \"nmstate-handler-6mg2l\" (UID: \"55a456a8-9ff7-4d10-a126-d662f361b74d\") " pod="openshift-nmstate/nmstate-handler-6mg2l" Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 
13:54:16.190444 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qthl\" (UniqueName: \"kubernetes.io/projected/fb4efdc1-4ea6-4068-bea0-8f961de0328b-kube-api-access-6qthl\") pod \"nmstate-metrics-9b8c8685d-cq88s\" (UID: \"fb4efdc1-4ea6-4068-bea0-8f961de0328b\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-cq88s"
Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.190464 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdl85\" (UniqueName: \"kubernetes.io/projected/2669e31a-19bb-42df-a5dd-5886b22e7674-kube-api-access-rdl85\") pod \"nmstate-webhook-5f558f5558-pqqzz\" (UID: \"2669e31a-19bb-42df-a5dd-5886b22e7674\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-pqqzz"
Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.190489 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmckq\" (UniqueName: \"kubernetes.io/projected/55a456a8-9ff7-4d10-a126-d662f361b74d-kube-api-access-wmckq\") pod \"nmstate-handler-6mg2l\" (UID: \"55a456a8-9ff7-4d10-a126-d662f361b74d\") " pod="openshift-nmstate/nmstate-handler-6mg2l"
Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.190509 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/55a456a8-9ff7-4d10-a126-d662f361b74d-dbus-socket\") pod \"nmstate-handler-6mg2l\" (UID: \"55a456a8-9ff7-4d10-a126-d662f361b74d\") " pod="openshift-nmstate/nmstate-handler-6mg2l"
Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.190531 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/2669e31a-19bb-42df-a5dd-5886b22e7674-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-pqqzz\" (UID: \"2669e31a-19bb-42df-a5dd-5886b22e7674\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-pqqzz"
Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.291447 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/eae3a07b-45a3-4f0e-8fd6-ac653ab24deb-plugin-serving-cert\") pod \"nmstate-console-plugin-7b5ddc4dc7-rpk7t\" (UID: \"eae3a07b-45a3-4f0e-8fd6-ac653ab24deb\") " pod="openshift-nmstate/nmstate-console-plugin-7b5ddc4dc7-rpk7t"
Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.291513 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvnkz\" (UniqueName: \"kubernetes.io/projected/eae3a07b-45a3-4f0e-8fd6-ac653ab24deb-kube-api-access-rvnkz\") pod \"nmstate-console-plugin-7b5ddc4dc7-rpk7t\" (UID: \"eae3a07b-45a3-4f0e-8fd6-ac653ab24deb\") " pod="openshift-nmstate/nmstate-console-plugin-7b5ddc4dc7-rpk7t"
Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.291550 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/eae3a07b-45a3-4f0e-8fd6-ac653ab24deb-nginx-conf\") pod \"nmstate-console-plugin-7b5ddc4dc7-rpk7t\" (UID: \"eae3a07b-45a3-4f0e-8fd6-ac653ab24deb\") " pod="openshift-nmstate/nmstate-console-plugin-7b5ddc4dc7-rpk7t"
Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.291641 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qthl\" (UniqueName: \"kubernetes.io/projected/fb4efdc1-4ea6-4068-bea0-8f961de0328b-kube-api-access-6qthl\") pod \"nmstate-metrics-9b8c8685d-cq88s\" (UID: \"fb4efdc1-4ea6-4068-bea0-8f961de0328b\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-cq88s"
Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.291669 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdl85\" (UniqueName: \"kubernetes.io/projected/2669e31a-19bb-42df-a5dd-5886b22e7674-kube-api-access-rdl85\") pod \"nmstate-webhook-5f558f5558-pqqzz\" (UID: \"2669e31a-19bb-42df-a5dd-5886b22e7674\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-pqqzz"
Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.291695 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmckq\" (UniqueName: \"kubernetes.io/projected/55a456a8-9ff7-4d10-a126-d662f361b74d-kube-api-access-wmckq\") pod \"nmstate-handler-6mg2l\" (UID: \"55a456a8-9ff7-4d10-a126-d662f361b74d\") " pod="openshift-nmstate/nmstate-handler-6mg2l"
Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.291733 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/55a456a8-9ff7-4d10-a126-d662f361b74d-dbus-socket\") pod \"nmstate-handler-6mg2l\" (UID: \"55a456a8-9ff7-4d10-a126-d662f361b74d\") " pod="openshift-nmstate/nmstate-handler-6mg2l"
Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.291765 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/2669e31a-19bb-42df-a5dd-5886b22e7674-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-pqqzz\" (UID: \"2669e31a-19bb-42df-a5dd-5886b22e7674\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-pqqzz"
Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.291815 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/55a456a8-9ff7-4d10-a126-d662f361b74d-nmstate-lock\") pod \"nmstate-handler-6mg2l\" (UID: \"55a456a8-9ff7-4d10-a126-d662f361b74d\") " pod="openshift-nmstate/nmstate-handler-6mg2l"
Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.291841 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/55a456a8-9ff7-4d10-a126-d662f361b74d-ovs-socket\") pod \"nmstate-handler-6mg2l\" (UID: \"55a456a8-9ff7-4d10-a126-d662f361b74d\") " pod="openshift-nmstate/nmstate-handler-6mg2l"
Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.291920 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/55a456a8-9ff7-4d10-a126-d662f361b74d-ovs-socket\") pod \"nmstate-handler-6mg2l\" (UID: \"55a456a8-9ff7-4d10-a126-d662f361b74d\") " pod="openshift-nmstate/nmstate-handler-6mg2l"
Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.292213 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/55a456a8-9ff7-4d10-a126-d662f361b74d-dbus-socket\") pod \"nmstate-handler-6mg2l\" (UID: \"55a456a8-9ff7-4d10-a126-d662f361b74d\") " pod="openshift-nmstate/nmstate-handler-6mg2l"
Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.292266 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/55a456a8-9ff7-4d10-a126-d662f361b74d-nmstate-lock\") pod \"nmstate-handler-6mg2l\" (UID: \"55a456a8-9ff7-4d10-a126-d662f361b74d\") " pod="openshift-nmstate/nmstate-handler-6mg2l"
Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.314654 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/2669e31a-19bb-42df-a5dd-5886b22e7674-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-pqqzz\" (UID: \"2669e31a-19bb-42df-a5dd-5886b22e7674\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-pqqzz"
Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.317504 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdl85\" (UniqueName: \"kubernetes.io/projected/2669e31a-19bb-42df-a5dd-5886b22e7674-kube-api-access-rdl85\") pod \"nmstate-webhook-5f558f5558-pqqzz\" (UID: \"2669e31a-19bb-42df-a5dd-5886b22e7674\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-pqqzz"
Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.321384 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qthl\" (UniqueName: \"kubernetes.io/projected/fb4efdc1-4ea6-4068-bea0-8f961de0328b-kube-api-access-6qthl\") pod \"nmstate-metrics-9b8c8685d-cq88s\" (UID: \"fb4efdc1-4ea6-4068-bea0-8f961de0328b\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-cq88s"
Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.326401 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmckq\" (UniqueName: \"kubernetes.io/projected/55a456a8-9ff7-4d10-a126-d662f361b74d-kube-api-access-wmckq\") pod \"nmstate-handler-6mg2l\" (UID: \"55a456a8-9ff7-4d10-a126-d662f361b74d\") " pod="openshift-nmstate/nmstate-handler-6mg2l"
Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.349825 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-8d499f5f5-dzmld"]
Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.350846 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-8d499f5f5-dzmld"
Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.362561 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-cq88s"
Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.371152 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-pqqzz"
Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.379709 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-8d499f5f5-dzmld"]
Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.393736 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/eae3a07b-45a3-4f0e-8fd6-ac653ab24deb-nginx-conf\") pod \"nmstate-console-plugin-7b5ddc4dc7-rpk7t\" (UID: \"eae3a07b-45a3-4f0e-8fd6-ac653ab24deb\") " pod="openshift-nmstate/nmstate-console-plugin-7b5ddc4dc7-rpk7t"
Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.393845 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/eae3a07b-45a3-4f0e-8fd6-ac653ab24deb-plugin-serving-cert\") pod \"nmstate-console-plugin-7b5ddc4dc7-rpk7t\" (UID: \"eae3a07b-45a3-4f0e-8fd6-ac653ab24deb\") " pod="openshift-nmstate/nmstate-console-plugin-7b5ddc4dc7-rpk7t"
Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.393868 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvnkz\" (UniqueName: \"kubernetes.io/projected/eae3a07b-45a3-4f0e-8fd6-ac653ab24deb-kube-api-access-rvnkz\") pod \"nmstate-console-plugin-7b5ddc4dc7-rpk7t\" (UID: \"eae3a07b-45a3-4f0e-8fd6-ac653ab24deb\") " pod="openshift-nmstate/nmstate-console-plugin-7b5ddc4dc7-rpk7t"
Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.394843 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/eae3a07b-45a3-4f0e-8fd6-ac653ab24deb-nginx-conf\") pod \"nmstate-console-plugin-7b5ddc4dc7-rpk7t\" (UID: \"eae3a07b-45a3-4f0e-8fd6-ac653ab24deb\") " pod="openshift-nmstate/nmstate-console-plugin-7b5ddc4dc7-rpk7t"
Apr 02 13:54:16 crc kubenswrapper[4732]: E0402 13:54:16.394914 4732 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found
Apr 02 13:54:16 crc kubenswrapper[4732]: E0402 13:54:16.394953 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eae3a07b-45a3-4f0e-8fd6-ac653ab24deb-plugin-serving-cert podName:eae3a07b-45a3-4f0e-8fd6-ac653ab24deb nodeName:}" failed. No retries permitted until 2026-04-02 13:54:16.894938491 +0000 UTC m=+1013.799346044 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/eae3a07b-45a3-4f0e-8fd6-ac653ab24deb-plugin-serving-cert") pod "nmstate-console-plugin-7b5ddc4dc7-rpk7t" (UID: "eae3a07b-45a3-4f0e-8fd6-ac653ab24deb") : secret "plugin-serving-cert" not found
Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.397712 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-6mg2l"
Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.434497 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvnkz\" (UniqueName: \"kubernetes.io/projected/eae3a07b-45a3-4f0e-8fd6-ac653ab24deb-kube-api-access-rvnkz\") pod \"nmstate-console-plugin-7b5ddc4dc7-rpk7t\" (UID: \"eae3a07b-45a3-4f0e-8fd6-ac653ab24deb\") " pod="openshift-nmstate/nmstate-console-plugin-7b5ddc4dc7-rpk7t"
Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.495290 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f4fa5023-6e70-49f8-926c-eacf37a57881-console-serving-cert\") pod \"console-8d499f5f5-dzmld\" (UID: \"f4fa5023-6e70-49f8-926c-eacf37a57881\") " pod="openshift-console/console-8d499f5f5-dzmld"
Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.495480 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f4fa5023-6e70-49f8-926c-eacf37a57881-trusted-ca-bundle\") pod \"console-8d499f5f5-dzmld\" (UID: \"f4fa5023-6e70-49f8-926c-eacf37a57881\") " pod="openshift-console/console-8d499f5f5-dzmld"
Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.495544 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f4fa5023-6e70-49f8-926c-eacf37a57881-console-config\") pod \"console-8d499f5f5-dzmld\" (UID: \"f4fa5023-6e70-49f8-926c-eacf37a57881\") " pod="openshift-console/console-8d499f5f5-dzmld"
Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.495634 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f4fa5023-6e70-49f8-926c-eacf37a57881-oauth-serving-cert\") pod \"console-8d499f5f5-dzmld\" (UID: \"f4fa5023-6e70-49f8-926c-eacf37a57881\") " pod="openshift-console/console-8d499f5f5-dzmld"
Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.495695 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v47c2\" (UniqueName: \"kubernetes.io/projected/f4fa5023-6e70-49f8-926c-eacf37a57881-kube-api-access-v47c2\") pod \"console-8d499f5f5-dzmld\" (UID: \"f4fa5023-6e70-49f8-926c-eacf37a57881\") " pod="openshift-console/console-8d499f5f5-dzmld"
Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.495784 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f4fa5023-6e70-49f8-926c-eacf37a57881-service-ca\") pod \"console-8d499f5f5-dzmld\" (UID: \"f4fa5023-6e70-49f8-926c-eacf37a57881\") " pod="openshift-console/console-8d499f5f5-dzmld"
Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.495855 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f4fa5023-6e70-49f8-926c-eacf37a57881-console-oauth-config\") pod \"console-8d499f5f5-dzmld\" (UID: \"f4fa5023-6e70-49f8-926c-eacf37a57881\") " pod="openshift-console/console-8d499f5f5-dzmld"
Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.598214 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f4fa5023-6e70-49f8-926c-eacf37a57881-service-ca\") pod \"console-8d499f5f5-dzmld\" (UID: \"f4fa5023-6e70-49f8-926c-eacf37a57881\") " pod="openshift-console/console-8d499f5f5-dzmld"
Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.598489 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f4fa5023-6e70-49f8-926c-eacf37a57881-console-oauth-config\") pod \"console-8d499f5f5-dzmld\" (UID: \"f4fa5023-6e70-49f8-926c-eacf37a57881\") " pod="openshift-console/console-8d499f5f5-dzmld"
Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.598548 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f4fa5023-6e70-49f8-926c-eacf37a57881-console-serving-cert\") pod \"console-8d499f5f5-dzmld\" (UID: \"f4fa5023-6e70-49f8-926c-eacf37a57881\") " pod="openshift-console/console-8d499f5f5-dzmld"
Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.598578 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f4fa5023-6e70-49f8-926c-eacf37a57881-trusted-ca-bundle\") pod \"console-8d499f5f5-dzmld\" (UID: \"f4fa5023-6e70-49f8-926c-eacf37a57881\") " pod="openshift-console/console-8d499f5f5-dzmld"
Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.598598 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f4fa5023-6e70-49f8-926c-eacf37a57881-console-config\") pod \"console-8d499f5f5-dzmld\" (UID: \"f4fa5023-6e70-49f8-926c-eacf37a57881\") " pod="openshift-console/console-8d499f5f5-dzmld"
Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.598630 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f4fa5023-6e70-49f8-926c-eacf37a57881-oauth-serving-cert\") pod \"console-8d499f5f5-dzmld\" (UID: \"f4fa5023-6e70-49f8-926c-eacf37a57881\") " pod="openshift-console/console-8d499f5f5-dzmld"
Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.598665 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v47c2\" (UniqueName: \"kubernetes.io/projected/f4fa5023-6e70-49f8-926c-eacf37a57881-kube-api-access-v47c2\") pod \"console-8d499f5f5-dzmld\" (UID: \"f4fa5023-6e70-49f8-926c-eacf37a57881\") " pod="openshift-console/console-8d499f5f5-dzmld"
Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.599243 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f4fa5023-6e70-49f8-926c-eacf37a57881-service-ca\") pod \"console-8d499f5f5-dzmld\" (UID: \"f4fa5023-6e70-49f8-926c-eacf37a57881\") " pod="openshift-console/console-8d499f5f5-dzmld"
Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.600091 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f4fa5023-6e70-49f8-926c-eacf37a57881-trusted-ca-bundle\") pod \"console-8d499f5f5-dzmld\" (UID: \"f4fa5023-6e70-49f8-926c-eacf37a57881\") " pod="openshift-console/console-8d499f5f5-dzmld"
Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.601084 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f4fa5023-6e70-49f8-926c-eacf37a57881-oauth-serving-cert\") pod \"console-8d499f5f5-dzmld\" (UID: \"f4fa5023-6e70-49f8-926c-eacf37a57881\") " pod="openshift-console/console-8d499f5f5-dzmld"
Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.601430 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f4fa5023-6e70-49f8-926c-eacf37a57881-console-config\") pod \"console-8d499f5f5-dzmld\" (UID: \"f4fa5023-6e70-49f8-926c-eacf37a57881\") " pod="openshift-console/console-8d499f5f5-dzmld"
Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.602971 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f4fa5023-6e70-49f8-926c-eacf37a57881-console-oauth-config\") pod \"console-8d499f5f5-dzmld\" (UID: \"f4fa5023-6e70-49f8-926c-eacf37a57881\") " pod="openshift-console/console-8d499f5f5-dzmld"
Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.603370 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f4fa5023-6e70-49f8-926c-eacf37a57881-console-serving-cert\") pod \"console-8d499f5f5-dzmld\" (UID: \"f4fa5023-6e70-49f8-926c-eacf37a57881\") " pod="openshift-console/console-8d499f5f5-dzmld"
Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.617634 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v47c2\" (UniqueName: \"kubernetes.io/projected/f4fa5023-6e70-49f8-926c-eacf37a57881-kube-api-access-v47c2\") pod \"console-8d499f5f5-dzmld\" (UID: \"f4fa5023-6e70-49f8-926c-eacf37a57881\") " pod="openshift-console/console-8d499f5f5-dzmld"
Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.675709 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-pqqzz"]
Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.676027 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-8d499f5f5-dzmld"
Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.846705 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-cq88s"]
Apr 02 13:54:16 crc kubenswrapper[4732]: W0402 13:54:16.855356 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb4efdc1_4ea6_4068_bea0_8f961de0328b.slice/crio-5f918584a18a0cad1323c5b591a972d3dad18b2176d29954073c63d714b62019 WatchSource:0}: Error finding container 5f918584a18a0cad1323c5b591a972d3dad18b2176d29954073c63d714b62019: Status 404 returned error can't find the container with id 5f918584a18a0cad1323c5b591a972d3dad18b2176d29954073c63d714b62019
Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.873785 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-8d499f5f5-dzmld"]
Apr 02 13:54:16 crc kubenswrapper[4732]: W0402 13:54:16.880043 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4fa5023_6e70_49f8_926c_eacf37a57881.slice/crio-77bd44f96247433a478bb8a00018a9c15f1eaa0300917719eb9746986ffe7ec2 WatchSource:0}: Error finding container 77bd44f96247433a478bb8a00018a9c15f1eaa0300917719eb9746986ffe7ec2: Status 404 returned error can't find the container with id 77bd44f96247433a478bb8a00018a9c15f1eaa0300917719eb9746986ffe7ec2
Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.902809 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/eae3a07b-45a3-4f0e-8fd6-ac653ab24deb-plugin-serving-cert\") pod \"nmstate-console-plugin-7b5ddc4dc7-rpk7t\" (UID: \"eae3a07b-45a3-4f0e-8fd6-ac653ab24deb\") " pod="openshift-nmstate/nmstate-console-plugin-7b5ddc4dc7-rpk7t"
Apr 02 13:54:16 crc kubenswrapper[4732]: I0402 13:54:16.908357 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/eae3a07b-45a3-4f0e-8fd6-ac653ab24deb-plugin-serving-cert\") pod \"nmstate-console-plugin-7b5ddc4dc7-rpk7t\" (UID: \"eae3a07b-45a3-4f0e-8fd6-ac653ab24deb\") " pod="openshift-nmstate/nmstate-console-plugin-7b5ddc4dc7-rpk7t"
Apr 02 13:54:17 crc kubenswrapper[4732]: I0402 13:54:17.090928 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7b5ddc4dc7-rpk7t"
Apr 02 13:54:17 crc kubenswrapper[4732]: I0402 13:54:17.123279 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-cq88s" event={"ID":"fb4efdc1-4ea6-4068-bea0-8f961de0328b","Type":"ContainerStarted","Data":"5f918584a18a0cad1323c5b591a972d3dad18b2176d29954073c63d714b62019"}
Apr 02 13:54:17 crc kubenswrapper[4732]: I0402 13:54:17.124525 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-6mg2l" event={"ID":"55a456a8-9ff7-4d10-a126-d662f361b74d","Type":"ContainerStarted","Data":"4955e9b86dd1c9b390d15c262f00a708096a14f5463739d9b607aa3c075545b2"}
Apr 02 13:54:17 crc kubenswrapper[4732]: I0402 13:54:17.126393 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8d499f5f5-dzmld" event={"ID":"f4fa5023-6e70-49f8-926c-eacf37a57881","Type":"ContainerStarted","Data":"3d07aa129483b0b77ec34a95839af43496b8e83578e1c2c606e0f9bf0f0f269d"}
Apr 02 13:54:17 crc kubenswrapper[4732]: I0402 13:54:17.126465 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8d499f5f5-dzmld" event={"ID":"f4fa5023-6e70-49f8-926c-eacf37a57881","Type":"ContainerStarted","Data":"77bd44f96247433a478bb8a00018a9c15f1eaa0300917719eb9746986ffe7ec2"}
Apr 02 13:54:17 crc kubenswrapper[4732]: I0402 13:54:17.128402 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-pqqzz" event={"ID":"2669e31a-19bb-42df-a5dd-5886b22e7674","Type":"ContainerStarted","Data":"f300f146be53069949de314882fb5c190f4b6cf85ddf476eb55a282c632e3bbe"}
Apr 02 13:54:17 crc kubenswrapper[4732]: I0402 13:54:17.295339 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7b5ddc4dc7-rpk7t"]
Apr 02 13:54:17 crc kubenswrapper[4732]: W0402 13:54:17.303638 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeae3a07b_45a3_4f0e_8fd6_ac653ab24deb.slice/crio-ed30b376fdd6664a74657144b0f05b5dbff5622623a28c1f7ec3f522ec7f6302 WatchSource:0}: Error finding container ed30b376fdd6664a74657144b0f05b5dbff5622623a28c1f7ec3f522ec7f6302: Status 404 returned error can't find the container with id ed30b376fdd6664a74657144b0f05b5dbff5622623a28c1f7ec3f522ec7f6302
Apr 02 13:54:18 crc kubenswrapper[4732]: I0402 13:54:18.136988 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7b5ddc4dc7-rpk7t" event={"ID":"eae3a07b-45a3-4f0e-8fd6-ac653ab24deb","Type":"ContainerStarted","Data":"ed30b376fdd6664a74657144b0f05b5dbff5622623a28c1f7ec3f522ec7f6302"}
Apr 02 13:54:18 crc kubenswrapper[4732]: I0402 13:54:18.166246 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-8d499f5f5-dzmld" podStartSLOduration=2.166221064 podStartE2EDuration="2.166221064s" podCreationTimestamp="2026-04-02 13:54:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:54:18.15975186 +0000 UTC m=+1015.064159443" watchObservedRunningTime="2026-04-02 13:54:18.166221064 +0000 UTC m=+1015.070628617"
Apr 02 13:54:24 crc kubenswrapper[4732]: I0402 13:54:24.187205 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-6mg2l" event={"ID":"55a456a8-9ff7-4d10-a126-d662f361b74d","Type":"ContainerStarted","Data":"444409bebc55078bce6c82b10a1ca433166051de642bf63ed8d53fd3951e5b6c"}
Apr 02 13:54:24 crc kubenswrapper[4732]: I0402 13:54:24.188040 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-6mg2l"
Apr 02 13:54:24 crc kubenswrapper[4732]: I0402 13:54:24.189843 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-pqqzz" event={"ID":"2669e31a-19bb-42df-a5dd-5886b22e7674","Type":"ContainerStarted","Data":"d48c0802d63c65684478bde3e5e9bcd2118756c113baa1965b41c59e35249ecb"}
Apr 02 13:54:24 crc kubenswrapper[4732]: I0402 13:54:24.189984 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-pqqzz"
Apr 02 13:54:24 crc kubenswrapper[4732]: I0402 13:54:24.194146 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-cq88s" event={"ID":"fb4efdc1-4ea6-4068-bea0-8f961de0328b","Type":"ContainerStarted","Data":"e74d2931c51f516699223e7560a2fec01ba1b42bcc3ac5c1851185f1da6c6064"}
Apr 02 13:54:24 crc kubenswrapper[4732]: I0402 13:54:24.212265 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-6mg2l" podStartSLOduration=1.070201262 podStartE2EDuration="8.212248607s" podCreationTimestamp="2026-04-02 13:54:16 +0000 UTC" firstStartedPulling="2026-04-02 13:54:16.476957932 +0000 UTC m=+1013.381365485" lastFinishedPulling="2026-04-02 13:54:23.619005277 +0000 UTC m=+1020.523412830" observedRunningTime="2026-04-02 13:54:24.208120226 +0000 UTC m=+1021.112527789" watchObservedRunningTime="2026-04-02 13:54:24.212248607 +0000 UTC m=+1021.116656160"
Apr 02 13:54:24 crc kubenswrapper[4732]: I0402 13:54:24.233897 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-pqqzz" podStartSLOduration=1.300271285 podStartE2EDuration="8.233873187s" podCreationTimestamp="2026-04-02 13:54:16 +0000 UTC" firstStartedPulling="2026-04-02 13:54:16.68817586 +0000 UTC m=+1013.592583413" lastFinishedPulling="2026-04-02 13:54:23.621777762 +0000 UTC m=+1020.526185315" observedRunningTime="2026-04-02 13:54:24.230091966 +0000 UTC m=+1021.134499529" watchObservedRunningTime="2026-04-02 13:54:24.233873187 +0000 UTC m=+1021.138280760"
Apr 02 13:54:25 crc kubenswrapper[4732]: I0402 13:54:25.203816 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7b5ddc4dc7-rpk7t" event={"ID":"eae3a07b-45a3-4f0e-8fd6-ac653ab24deb","Type":"ContainerStarted","Data":"355a0e2195f4839e09d040f487be0a399d89c9552e67ad02e44a94ae5e46d138"}
Apr 02 13:54:25 crc kubenswrapper[4732]: I0402 13:54:25.224994 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7b5ddc4dc7-rpk7t" podStartSLOduration=2.117592008 podStartE2EDuration="9.224977153s" podCreationTimestamp="2026-04-02 13:54:16 +0000 UTC" firstStartedPulling="2026-04-02 13:54:17.304830348 +0000 UTC m=+1014.209237901" lastFinishedPulling="2026-04-02 13:54:24.412215493 +0000 UTC m=+1021.316623046" observedRunningTime="2026-04-02 13:54:25.220254217 +0000 UTC m=+1022.124661770" watchObservedRunningTime="2026-04-02 13:54:25.224977153 +0000 UTC m=+1022.129384706"
Apr 02 13:54:26 crc kubenswrapper[4732]: I0402 13:54:26.679332 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-8d499f5f5-dzmld"
Apr 02 13:54:26 crc kubenswrapper[4732]: I0402 13:54:26.690407 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-8d499f5f5-dzmld"
Apr 02 13:54:26 crc kubenswrapper[4732]: I0402 13:54:26.690490 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-8d499f5f5-dzmld"
Apr 02 13:54:27 crc kubenswrapper[4732]: I0402 13:54:27.225760 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-8d499f5f5-dzmld"
Apr 02 13:54:27 crc kubenswrapper[4732]: I0402 13:54:27.274662 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-68b6f48864-m96db"]
Apr 02 13:54:28 crc kubenswrapper[4732]: I0402 13:54:28.232519 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-cq88s" event={"ID":"fb4efdc1-4ea6-4068-bea0-8f961de0328b","Type":"ContainerStarted","Data":"0bf2d1b2a8f1fb8206d608b57de58852bfca76a65219fd40e6af6fb6ce4bfef0"}
Apr 02 13:54:28 crc kubenswrapper[4732]: I0402 13:54:28.254283 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-cq88s" podStartSLOduration=1.3984297190000001 podStartE2EDuration="12.254262753s" podCreationTimestamp="2026-04-02 13:54:16 +0000 UTC" firstStartedPulling="2026-04-02 13:54:16.859075286 +0000 UTC m=+1013.763482839" lastFinishedPulling="2026-04-02 13:54:27.71490832 +0000 UTC m=+1024.619315873" observedRunningTime="2026-04-02 13:54:28.254012556 +0000 UTC m=+1025.158420129" watchObservedRunningTime="2026-04-02 13:54:28.254262753 +0000 UTC m=+1025.158670306"
Apr 02 13:54:31 crc kubenswrapper[4732]: I0402 13:54:31.423388 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-6mg2l"
Apr 02 13:54:36 crc kubenswrapper[4732]: I0402 13:54:36.376813 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-pqqzz"
Apr 02 13:54:45 crc kubenswrapper[4732]: I0402 13:54:45.738296 4732 scope.go:117] "RemoveContainer" containerID="cf6d40837f21f6428d52f3bb75f667c0ebf856b0af510889a9695093c21b6885"
Apr 02 13:54:45 crc kubenswrapper[4732]: I0402 13:54:45.759272 4732 scope.go:117] "RemoveContainer" containerID="73c0dd13f9fc718080de9764a6d5dbfc2998a838fb22d3778e131c10889b83ec"
Apr 02 13:54:45 crc kubenswrapper[4732]: I0402 13:54:45.777924 4732 scope.go:117] "RemoveContainer" containerID="2c42808e60f473901e6fb42946d6504b0c3182d7fc3b6eeb7b9bc6e4a6e3a37e"
Apr 02 13:54:45 crc kubenswrapper[4732]: I0402 13:54:45.809892 4732 scope.go:117] "RemoveContainer" containerID="5c716ed2094065ff92ec97fb32bca6973e69e1cd0018780759102dde6b4df062"
Apr 02 13:54:45 crc kubenswrapper[4732]: I0402 13:54:45.827260 4732 scope.go:117] "RemoveContainer" containerID="07bfdaa7a1fbf8b3c33ccd90050537f2c5efaab4563f16feace73117b1bcdb91"
Apr 02 13:54:51 crc kubenswrapper[4732]: I0402 13:54:51.785698 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3eflcgrx"]
Apr 02 13:54:51 crc kubenswrapper[4732]: I0402 13:54:51.787275 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3eflcgrx"
Apr 02 13:54:51 crc kubenswrapper[4732]: I0402 13:54:51.789243 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Apr 02 13:54:51 crc kubenswrapper[4732]: I0402 13:54:51.797972 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3eflcgrx"]
Apr 02 13:54:51 crc kubenswrapper[4732]: I0402 13:54:51.881334 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/77a3925c-07b0-47ea-950f-524ce995edbf-util\") pod \"4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3eflcgrx\" (UID: \"77a3925c-07b0-47ea-950f-524ce995edbf\") " pod="openshift-marketplace/4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3eflcgrx"
Apr 02 13:54:51 crc kubenswrapper[4732]: I0402 13:54:51.881457 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgd54\" (UniqueName: \"kubernetes.io/projected/77a3925c-07b0-47ea-950f-524ce995edbf-kube-api-access-kgd54\") pod \"4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3eflcgrx\" (UID: \"77a3925c-07b0-47ea-950f-524ce995edbf\") " pod="openshift-marketplace/4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3eflcgrx"
Apr 02 13:54:51 crc kubenswrapper[4732]: I0402 13:54:51.881604 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/77a3925c-07b0-47ea-950f-524ce995edbf-bundle\") pod \"4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3eflcgrx\" (UID: \"77a3925c-07b0-47ea-950f-524ce995edbf\") " pod="openshift-marketplace/4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3eflcgrx"
Apr 02 13:54:51 crc kubenswrapper[4732]: I0402 13:54:51.982639 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/77a3925c-07b0-47ea-950f-524ce995edbf-bundle\") pod \"4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3eflcgrx\" (UID: \"77a3925c-07b0-47ea-950f-524ce995edbf\") " pod="openshift-marketplace/4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3eflcgrx"
Apr 02 13:54:51 crc kubenswrapper[4732]: I0402 13:54:51.982685 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/77a3925c-07b0-47ea-950f-524ce995edbf-util\") pod \"4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3eflcgrx\" (UID: \"77a3925c-07b0-47ea-950f-524ce995edbf\") " pod="openshift-marketplace/4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3eflcgrx"
Apr 02 13:54:51 crc kubenswrapper[4732]: I0402 13:54:51.982755 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgd54\" (UniqueName: \"kubernetes.io/projected/77a3925c-07b0-47ea-950f-524ce995edbf-kube-api-access-kgd54\") pod \"4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3eflcgrx\" (UID: \"77a3925c-07b0-47ea-950f-524ce995edbf\") " pod="openshift-marketplace/4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3eflcgrx"
Apr 02 13:54:51 crc kubenswrapper[4732]: I0402 13:54:51.983193 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/77a3925c-07b0-47ea-950f-524ce995edbf-bundle\") pod \"4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3eflcgrx\" (UID: \"77a3925c-07b0-47ea-950f-524ce995edbf\") " pod="openshift-marketplace/4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3eflcgrx"
Apr 02 13:54:51 crc kubenswrapper[4732]: I0402 13:54:51.983317 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/77a3925c-07b0-47ea-950f-524ce995edbf-util\") pod \"4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3eflcgrx\" (UID: \"77a3925c-07b0-47ea-950f-524ce995edbf\") " pod="openshift-marketplace/4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3eflcgrx"
Apr 02 13:54:52 crc kubenswrapper[4732]: I0402 13:54:52.005829 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgd54\" (UniqueName: \"kubernetes.io/projected/77a3925c-07b0-47ea-950f-524ce995edbf-kube-api-access-kgd54\") pod \"4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3eflcgrx\" (UID: \"77a3925c-07b0-47ea-950f-524ce995edbf\") " pod="openshift-marketplace/4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3eflcgrx"
Apr 02 13:54:52 crc kubenswrapper[4732]: I0402 13:54:52.102733 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3eflcgrx"
Apr 02 13:54:52 crc kubenswrapper[4732]: I0402 13:54:52.325393 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-68b6f48864-m96db" podUID="a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828" containerName="console" containerID="cri-o://6c36131e366d0172e2e61f924497f33b1ecfc3b51ebfdbbb9cc2bae1361382ce" gracePeriod=15
Apr 02 13:54:52 crc kubenswrapper[4732]: I0402 13:54:52.489431 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3eflcgrx"]
Apr 02 13:54:53 crc kubenswrapper[4732]: I0402 13:54:53.260109 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-68b6f48864-m96db_a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828/console/0.log"
Apr 02 13:54:53 crc kubenswrapper[4732]: I0402 13:54:53.260212 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-68b6f48864-m96db" Apr 02 13:54:53 crc kubenswrapper[4732]: I0402 13:54:53.405199 4732 generic.go:334] "Generic (PLEG): container finished" podID="77a3925c-07b0-47ea-950f-524ce995edbf" containerID="910c403e23f92c7e04b9faebd8d5b0c91a4dcdbe25d6892855ccd6ab4330eec9" exitCode=0 Apr 02 13:54:53 crc kubenswrapper[4732]: I0402 13:54:53.405296 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3eflcgrx" event={"ID":"77a3925c-07b0-47ea-950f-524ce995edbf","Type":"ContainerDied","Data":"910c403e23f92c7e04b9faebd8d5b0c91a4dcdbe25d6892855ccd6ab4330eec9"} Apr 02 13:54:53 crc kubenswrapper[4732]: I0402 13:54:53.405333 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3eflcgrx" event={"ID":"77a3925c-07b0-47ea-950f-524ce995edbf","Type":"ContainerStarted","Data":"cb1778828abea2e5e927d106aef3f34ae2f7e9cfd442a5379cca39f2403b8ef9"} Apr 02 13:54:53 crc kubenswrapper[4732]: I0402 13:54:53.405570 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828-console-config\") pod \"a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828\" (UID: \"a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828\") " Apr 02 13:54:53 crc kubenswrapper[4732]: I0402 13:54:53.405647 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdbtg\" (UniqueName: \"kubernetes.io/projected/a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828-kube-api-access-gdbtg\") pod \"a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828\" (UID: \"a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828\") " Apr 02 13:54:53 crc kubenswrapper[4732]: I0402 13:54:53.405709 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828-oauth-serving-cert\") pod \"a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828\" (UID: \"a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828\") " Apr 02 13:54:53 crc kubenswrapper[4732]: I0402 13:54:53.405724 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828-service-ca\") pod \"a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828\" (UID: \"a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828\") " Apr 02 13:54:53 crc kubenswrapper[4732]: I0402 13:54:53.405745 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828-console-oauth-config\") pod \"a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828\" (UID: \"a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828\") " Apr 02 13:54:53 crc kubenswrapper[4732]: I0402 13:54:53.405790 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828-console-serving-cert\") pod \"a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828\" (UID: \"a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828\") " Apr 02 13:54:53 crc kubenswrapper[4732]: I0402 13:54:53.405829 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828-trusted-ca-bundle\") pod \"a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828\" (UID: \"a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828\") " Apr 02 13:54:53 crc kubenswrapper[4732]: I0402 13:54:53.406388 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828-console-config" (OuterVolumeSpecName: "console-config") pod "a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828" (UID: "a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:54:53 crc kubenswrapper[4732]: I0402 13:54:53.406655 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828-service-ca" (OuterVolumeSpecName: "service-ca") pod "a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828" (UID: "a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:54:53 crc kubenswrapper[4732]: I0402 13:54:53.406684 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828" (UID: "a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:54:53 crc kubenswrapper[4732]: I0402 13:54:53.406704 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828" (UID: "a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:54:53 crc kubenswrapper[4732]: I0402 13:54:53.410328 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-68b6f48864-m96db_a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828/console/0.log" Apr 02 13:54:53 crc kubenswrapper[4732]: I0402 13:54:53.410386 4732 generic.go:334] "Generic (PLEG): container finished" podID="a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828" containerID="6c36131e366d0172e2e61f924497f33b1ecfc3b51ebfdbbb9cc2bae1361382ce" exitCode=2 Apr 02 13:54:53 crc kubenswrapper[4732]: I0402 13:54:53.410416 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68b6f48864-m96db" event={"ID":"a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828","Type":"ContainerDied","Data":"6c36131e366d0172e2e61f924497f33b1ecfc3b51ebfdbbb9cc2bae1361382ce"} Apr 02 13:54:53 crc kubenswrapper[4732]: I0402 13:54:53.410441 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68b6f48864-m96db" event={"ID":"a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828","Type":"ContainerDied","Data":"00fb02962ffae3b81c5144caecff72c5ddef198999b02158944d81f60d844b61"} Apr 02 13:54:53 crc kubenswrapper[4732]: I0402 13:54:53.410484 4732 scope.go:117] "RemoveContainer" containerID="6c36131e366d0172e2e61f924497f33b1ecfc3b51ebfdbbb9cc2bae1361382ce" Apr 02 13:54:53 crc kubenswrapper[4732]: I0402 13:54:53.410651 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-68b6f48864-m96db" Apr 02 13:54:53 crc kubenswrapper[4732]: I0402 13:54:53.411354 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828" (UID: "a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 13:54:53 crc kubenswrapper[4732]: I0402 13:54:53.414017 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828-kube-api-access-gdbtg" (OuterVolumeSpecName: "kube-api-access-gdbtg") pod "a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828" (UID: "a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828"). InnerVolumeSpecName "kube-api-access-gdbtg". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:54:53 crc kubenswrapper[4732]: I0402 13:54:53.420300 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828" (UID: "a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 13:54:53 crc kubenswrapper[4732]: I0402 13:54:53.459959 4732 scope.go:117] "RemoveContainer" containerID="6c36131e366d0172e2e61f924497f33b1ecfc3b51ebfdbbb9cc2bae1361382ce" Apr 02 13:54:53 crc kubenswrapper[4732]: E0402 13:54:53.460451 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c36131e366d0172e2e61f924497f33b1ecfc3b51ebfdbbb9cc2bae1361382ce\": container with ID starting with 6c36131e366d0172e2e61f924497f33b1ecfc3b51ebfdbbb9cc2bae1361382ce not found: ID does not exist" containerID="6c36131e366d0172e2e61f924497f33b1ecfc3b51ebfdbbb9cc2bae1361382ce" Apr 02 13:54:53 crc kubenswrapper[4732]: I0402 13:54:53.460489 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c36131e366d0172e2e61f924497f33b1ecfc3b51ebfdbbb9cc2bae1361382ce"} err="failed to get container status \"6c36131e366d0172e2e61f924497f33b1ecfc3b51ebfdbbb9cc2bae1361382ce\": rpc error: code = NotFound desc = could not find 
container \"6c36131e366d0172e2e61f924497f33b1ecfc3b51ebfdbbb9cc2bae1361382ce\": container with ID starting with 6c36131e366d0172e2e61f924497f33b1ecfc3b51ebfdbbb9cc2bae1361382ce not found: ID does not exist" Apr 02 13:54:53 crc kubenswrapper[4732]: I0402 13:54:53.507702 4732 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 02 13:54:53 crc kubenswrapper[4732]: I0402 13:54:53.507739 4732 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828-console-config\") on node \"crc\" DevicePath \"\"" Apr 02 13:54:53 crc kubenswrapper[4732]: I0402 13:54:53.507749 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gdbtg\" (UniqueName: \"kubernetes.io/projected/a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828-kube-api-access-gdbtg\") on node \"crc\" DevicePath \"\"" Apr 02 13:54:53 crc kubenswrapper[4732]: I0402 13:54:53.507760 4732 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Apr 02 13:54:53 crc kubenswrapper[4732]: I0402 13:54:53.507772 4732 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828-service-ca\") on node \"crc\" DevicePath \"\"" Apr 02 13:54:53 crc kubenswrapper[4732]: I0402 13:54:53.507781 4732 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828-console-oauth-config\") on node \"crc\" DevicePath \"\"" Apr 02 13:54:53 crc kubenswrapper[4732]: I0402 13:54:53.507789 4732 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828-console-serving-cert\") on node \"crc\" DevicePath \"\"" Apr 02 13:54:53 crc kubenswrapper[4732]: I0402 13:54:53.737182 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-68b6f48864-m96db"] Apr 02 13:54:53 crc kubenswrapper[4732]: I0402 13:54:53.741743 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-68b6f48864-m96db"] Apr 02 13:54:54 crc kubenswrapper[4732]: I0402 13:54:54.688632 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828" path="/var/lib/kubelet/pods/a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828/volumes" Apr 02 13:55:00 crc kubenswrapper[4732]: I0402 13:55:00.458413 4732 generic.go:334] "Generic (PLEG): container finished" podID="77a3925c-07b0-47ea-950f-524ce995edbf" containerID="2b31babdd26136c9b8780d7225626e138089469e2738f99996a59d6468c223c1" exitCode=0 Apr 02 13:55:00 crc kubenswrapper[4732]: I0402 13:55:00.458769 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3eflcgrx" event={"ID":"77a3925c-07b0-47ea-950f-524ce995edbf","Type":"ContainerDied","Data":"2b31babdd26136c9b8780d7225626e138089469e2738f99996a59d6468c223c1"} Apr 02 13:55:01 crc kubenswrapper[4732]: I0402 13:55:01.468736 4732 generic.go:334] "Generic (PLEG): container finished" podID="77a3925c-07b0-47ea-950f-524ce995edbf" containerID="d60d55f2875de912994b0c0811af7c4ff06a9171e4947762a9be9a125b004dc2" exitCode=0 Apr 02 13:55:01 crc kubenswrapper[4732]: I0402 13:55:01.468791 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3eflcgrx" event={"ID":"77a3925c-07b0-47ea-950f-524ce995edbf","Type":"ContainerDied","Data":"d60d55f2875de912994b0c0811af7c4ff06a9171e4947762a9be9a125b004dc2"} Apr 02 13:55:02 crc kubenswrapper[4732]: I0402 
13:55:02.742792 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3eflcgrx" Apr 02 13:55:02 crc kubenswrapper[4732]: I0402 13:55:02.826678 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/77a3925c-07b0-47ea-950f-524ce995edbf-util\") pod \"77a3925c-07b0-47ea-950f-524ce995edbf\" (UID: \"77a3925c-07b0-47ea-950f-524ce995edbf\") " Apr 02 13:55:02 crc kubenswrapper[4732]: I0402 13:55:02.826724 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/77a3925c-07b0-47ea-950f-524ce995edbf-bundle\") pod \"77a3925c-07b0-47ea-950f-524ce995edbf\" (UID: \"77a3925c-07b0-47ea-950f-524ce995edbf\") " Apr 02 13:55:02 crc kubenswrapper[4732]: I0402 13:55:02.826751 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgd54\" (UniqueName: \"kubernetes.io/projected/77a3925c-07b0-47ea-950f-524ce995edbf-kube-api-access-kgd54\") pod \"77a3925c-07b0-47ea-950f-524ce995edbf\" (UID: \"77a3925c-07b0-47ea-950f-524ce995edbf\") " Apr 02 13:55:02 crc kubenswrapper[4732]: I0402 13:55:02.827857 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77a3925c-07b0-47ea-950f-524ce995edbf-bundle" (OuterVolumeSpecName: "bundle") pod "77a3925c-07b0-47ea-950f-524ce995edbf" (UID: "77a3925c-07b0-47ea-950f-524ce995edbf"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 13:55:02 crc kubenswrapper[4732]: I0402 13:55:02.832256 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77a3925c-07b0-47ea-950f-524ce995edbf-kube-api-access-kgd54" (OuterVolumeSpecName: "kube-api-access-kgd54") pod "77a3925c-07b0-47ea-950f-524ce995edbf" (UID: "77a3925c-07b0-47ea-950f-524ce995edbf"). InnerVolumeSpecName "kube-api-access-kgd54". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:55:02 crc kubenswrapper[4732]: I0402 13:55:02.838263 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77a3925c-07b0-47ea-950f-524ce995edbf-util" (OuterVolumeSpecName: "util") pod "77a3925c-07b0-47ea-950f-524ce995edbf" (UID: "77a3925c-07b0-47ea-950f-524ce995edbf"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 13:55:02 crc kubenswrapper[4732]: I0402 13:55:02.927969 4732 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/77a3925c-07b0-47ea-950f-524ce995edbf-util\") on node \"crc\" DevicePath \"\"" Apr 02 13:55:02 crc kubenswrapper[4732]: I0402 13:55:02.928023 4732 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/77a3925c-07b0-47ea-950f-524ce995edbf-bundle\") on node \"crc\" DevicePath \"\"" Apr 02 13:55:02 crc kubenswrapper[4732]: I0402 13:55:02.928035 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kgd54\" (UniqueName: \"kubernetes.io/projected/77a3925c-07b0-47ea-950f-524ce995edbf-kube-api-access-kgd54\") on node \"crc\" DevicePath \"\"" Apr 02 13:55:03 crc kubenswrapper[4732]: I0402 13:55:03.484799 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3eflcgrx" 
event={"ID":"77a3925c-07b0-47ea-950f-524ce995edbf","Type":"ContainerDied","Data":"cb1778828abea2e5e927d106aef3f34ae2f7e9cfd442a5379cca39f2403b8ef9"} Apr 02 13:55:03 crc kubenswrapper[4732]: I0402 13:55:03.484840 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb1778828abea2e5e927d106aef3f34ae2f7e9cfd442a5379cca39f2403b8ef9" Apr 02 13:55:03 crc kubenswrapper[4732]: I0402 13:55:03.484879 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3eflcgrx" Apr 02 13:55:15 crc kubenswrapper[4732]: I0402 13:55:15.034673 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-86c87c56d7-qlfzr"] Apr 02 13:55:15 crc kubenswrapper[4732]: E0402 13:55:15.035504 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77a3925c-07b0-47ea-950f-524ce995edbf" containerName="pull" Apr 02 13:55:15 crc kubenswrapper[4732]: I0402 13:55:15.035519 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="77a3925c-07b0-47ea-950f-524ce995edbf" containerName="pull" Apr 02 13:55:15 crc kubenswrapper[4732]: E0402 13:55:15.035533 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77a3925c-07b0-47ea-950f-524ce995edbf" containerName="extract" Apr 02 13:55:15 crc kubenswrapper[4732]: I0402 13:55:15.035540 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="77a3925c-07b0-47ea-950f-524ce995edbf" containerName="extract" Apr 02 13:55:15 crc kubenswrapper[4732]: E0402 13:55:15.035560 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828" containerName="console" Apr 02 13:55:15 crc kubenswrapper[4732]: I0402 13:55:15.035568 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828" containerName="console" Apr 02 13:55:15 crc kubenswrapper[4732]: E0402 13:55:15.035585 4732 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77a3925c-07b0-47ea-950f-524ce995edbf" containerName="util" Apr 02 13:55:15 crc kubenswrapper[4732]: I0402 13:55:15.035593 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="77a3925c-07b0-47ea-950f-524ce995edbf" containerName="util" Apr 02 13:55:15 crc kubenswrapper[4732]: I0402 13:55:15.035748 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6ca8cc0-a0bc-47a8-bd4a-d4b69a3f0828" containerName="console" Apr 02 13:55:15 crc kubenswrapper[4732]: I0402 13:55:15.035769 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="77a3925c-07b0-47ea-950f-524ce995edbf" containerName="extract" Apr 02 13:55:15 crc kubenswrapper[4732]: I0402 13:55:15.036291 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-86c87c56d7-qlfzr" Apr 02 13:55:15 crc kubenswrapper[4732]: I0402 13:55:15.038422 4732 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-w6m9g" Apr 02 13:55:15 crc kubenswrapper[4732]: I0402 13:55:15.039390 4732 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Apr 02 13:55:15 crc kubenswrapper[4732]: I0402 13:55:15.039482 4732 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Apr 02 13:55:15 crc kubenswrapper[4732]: I0402 13:55:15.039554 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Apr 02 13:55:15 crc kubenswrapper[4732]: I0402 13:55:15.040900 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Apr 02 13:55:15 crc kubenswrapper[4732]: I0402 13:55:15.057155 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["metallb-system/metallb-operator-controller-manager-86c87c56d7-qlfzr"] Apr 02 13:55:15 crc kubenswrapper[4732]: I0402 13:55:15.187634 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9c7ll\" (UniqueName: \"kubernetes.io/projected/bd386538-6696-4b2c-96e4-6f8e4b949364-kube-api-access-9c7ll\") pod \"metallb-operator-controller-manager-86c87c56d7-qlfzr\" (UID: \"bd386538-6696-4b2c-96e4-6f8e4b949364\") " pod="metallb-system/metallb-operator-controller-manager-86c87c56d7-qlfzr" Apr 02 13:55:15 crc kubenswrapper[4732]: I0402 13:55:15.188019 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bd386538-6696-4b2c-96e4-6f8e4b949364-apiservice-cert\") pod \"metallb-operator-controller-manager-86c87c56d7-qlfzr\" (UID: \"bd386538-6696-4b2c-96e4-6f8e4b949364\") " pod="metallb-system/metallb-operator-controller-manager-86c87c56d7-qlfzr" Apr 02 13:55:15 crc kubenswrapper[4732]: I0402 13:55:15.188067 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bd386538-6696-4b2c-96e4-6f8e4b949364-webhook-cert\") pod \"metallb-operator-controller-manager-86c87c56d7-qlfzr\" (UID: \"bd386538-6696-4b2c-96e4-6f8e4b949364\") " pod="metallb-system/metallb-operator-controller-manager-86c87c56d7-qlfzr" Apr 02 13:55:15 crc kubenswrapper[4732]: I0402 13:55:15.272128 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-6948d8cf8d-vd8rt"] Apr 02 13:55:15 crc kubenswrapper[4732]: I0402 13:55:15.273013 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6948d8cf8d-vd8rt" Apr 02 13:55:15 crc kubenswrapper[4732]: I0402 13:55:15.274640 4732 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Apr 02 13:55:15 crc kubenswrapper[4732]: I0402 13:55:15.275429 4732 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Apr 02 13:55:15 crc kubenswrapper[4732]: I0402 13:55:15.275486 4732 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-rt5rc" Apr 02 13:55:15 crc kubenswrapper[4732]: I0402 13:55:15.290022 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9c7ll\" (UniqueName: \"kubernetes.io/projected/bd386538-6696-4b2c-96e4-6f8e4b949364-kube-api-access-9c7ll\") pod \"metallb-operator-controller-manager-86c87c56d7-qlfzr\" (UID: \"bd386538-6696-4b2c-96e4-6f8e4b949364\") " pod="metallb-system/metallb-operator-controller-manager-86c87c56d7-qlfzr" Apr 02 13:55:15 crc kubenswrapper[4732]: I0402 13:55:15.290082 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bd386538-6696-4b2c-96e4-6f8e4b949364-apiservice-cert\") pod \"metallb-operator-controller-manager-86c87c56d7-qlfzr\" (UID: \"bd386538-6696-4b2c-96e4-6f8e4b949364\") " pod="metallb-system/metallb-operator-controller-manager-86c87c56d7-qlfzr" Apr 02 13:55:15 crc kubenswrapper[4732]: I0402 13:55:15.290140 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bd386538-6696-4b2c-96e4-6f8e4b949364-webhook-cert\") pod \"metallb-operator-controller-manager-86c87c56d7-qlfzr\" (UID: \"bd386538-6696-4b2c-96e4-6f8e4b949364\") " pod="metallb-system/metallb-operator-controller-manager-86c87c56d7-qlfzr" Apr 02 13:55:15 crc 
kubenswrapper[4732]: I0402 13:55:15.290189 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/329919c3-94d2-43c2-94a8-2ba9518b98fa-apiservice-cert\") pod \"metallb-operator-webhook-server-6948d8cf8d-vd8rt\" (UID: \"329919c3-94d2-43c2-94a8-2ba9518b98fa\") " pod="metallb-system/metallb-operator-webhook-server-6948d8cf8d-vd8rt" Apr 02 13:55:15 crc kubenswrapper[4732]: I0402 13:55:15.290255 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sqv8\" (UniqueName: \"kubernetes.io/projected/329919c3-94d2-43c2-94a8-2ba9518b98fa-kube-api-access-5sqv8\") pod \"metallb-operator-webhook-server-6948d8cf8d-vd8rt\" (UID: \"329919c3-94d2-43c2-94a8-2ba9518b98fa\") " pod="metallb-system/metallb-operator-webhook-server-6948d8cf8d-vd8rt" Apr 02 13:55:15 crc kubenswrapper[4732]: I0402 13:55:15.290282 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/329919c3-94d2-43c2-94a8-2ba9518b98fa-webhook-cert\") pod \"metallb-operator-webhook-server-6948d8cf8d-vd8rt\" (UID: \"329919c3-94d2-43c2-94a8-2ba9518b98fa\") " pod="metallb-system/metallb-operator-webhook-server-6948d8cf8d-vd8rt" Apr 02 13:55:15 crc kubenswrapper[4732]: I0402 13:55:15.296420 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bd386538-6696-4b2c-96e4-6f8e4b949364-apiservice-cert\") pod \"metallb-operator-controller-manager-86c87c56d7-qlfzr\" (UID: \"bd386538-6696-4b2c-96e4-6f8e4b949364\") " pod="metallb-system/metallb-operator-controller-manager-86c87c56d7-qlfzr" Apr 02 13:55:15 crc kubenswrapper[4732]: I0402 13:55:15.296507 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/bd386538-6696-4b2c-96e4-6f8e4b949364-webhook-cert\") pod \"metallb-operator-controller-manager-86c87c56d7-qlfzr\" (UID: \"bd386538-6696-4b2c-96e4-6f8e4b949364\") " pod="metallb-system/metallb-operator-controller-manager-86c87c56d7-qlfzr" Apr 02 13:55:15 crc kubenswrapper[4732]: I0402 13:55:15.315408 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6948d8cf8d-vd8rt"] Apr 02 13:55:15 crc kubenswrapper[4732]: I0402 13:55:15.330746 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9c7ll\" (UniqueName: \"kubernetes.io/projected/bd386538-6696-4b2c-96e4-6f8e4b949364-kube-api-access-9c7ll\") pod \"metallb-operator-controller-manager-86c87c56d7-qlfzr\" (UID: \"bd386538-6696-4b2c-96e4-6f8e4b949364\") " pod="metallb-system/metallb-operator-controller-manager-86c87c56d7-qlfzr" Apr 02 13:55:15 crc kubenswrapper[4732]: I0402 13:55:15.353318 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-86c87c56d7-qlfzr" Apr 02 13:55:15 crc kubenswrapper[4732]: I0402 13:55:15.393908 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/329919c3-94d2-43c2-94a8-2ba9518b98fa-apiservice-cert\") pod \"metallb-operator-webhook-server-6948d8cf8d-vd8rt\" (UID: \"329919c3-94d2-43c2-94a8-2ba9518b98fa\") " pod="metallb-system/metallb-operator-webhook-server-6948d8cf8d-vd8rt" Apr 02 13:55:15 crc kubenswrapper[4732]: I0402 13:55:15.393977 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sqv8\" (UniqueName: \"kubernetes.io/projected/329919c3-94d2-43c2-94a8-2ba9518b98fa-kube-api-access-5sqv8\") pod \"metallb-operator-webhook-server-6948d8cf8d-vd8rt\" (UID: \"329919c3-94d2-43c2-94a8-2ba9518b98fa\") " pod="metallb-system/metallb-operator-webhook-server-6948d8cf8d-vd8rt" Apr 02 13:55:15 crc kubenswrapper[4732]: I0402 13:55:15.393998 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/329919c3-94d2-43c2-94a8-2ba9518b98fa-webhook-cert\") pod \"metallb-operator-webhook-server-6948d8cf8d-vd8rt\" (UID: \"329919c3-94d2-43c2-94a8-2ba9518b98fa\") " pod="metallb-system/metallb-operator-webhook-server-6948d8cf8d-vd8rt" Apr 02 13:55:15 crc kubenswrapper[4732]: I0402 13:55:15.398190 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/329919c3-94d2-43c2-94a8-2ba9518b98fa-webhook-cert\") pod \"metallb-operator-webhook-server-6948d8cf8d-vd8rt\" (UID: \"329919c3-94d2-43c2-94a8-2ba9518b98fa\") " pod="metallb-system/metallb-operator-webhook-server-6948d8cf8d-vd8rt" Apr 02 13:55:15 crc kubenswrapper[4732]: I0402 13:55:15.400871 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/329919c3-94d2-43c2-94a8-2ba9518b98fa-apiservice-cert\") pod \"metallb-operator-webhook-server-6948d8cf8d-vd8rt\" (UID: \"329919c3-94d2-43c2-94a8-2ba9518b98fa\") " pod="metallb-system/metallb-operator-webhook-server-6948d8cf8d-vd8rt" Apr 02 13:55:15 crc kubenswrapper[4732]: I0402 13:55:15.438099 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sqv8\" (UniqueName: \"kubernetes.io/projected/329919c3-94d2-43c2-94a8-2ba9518b98fa-kube-api-access-5sqv8\") pod \"metallb-operator-webhook-server-6948d8cf8d-vd8rt\" (UID: \"329919c3-94d2-43c2-94a8-2ba9518b98fa\") " pod="metallb-system/metallb-operator-webhook-server-6948d8cf8d-vd8rt" Apr 02 13:55:15 crc kubenswrapper[4732]: I0402 13:55:15.587405 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6948d8cf8d-vd8rt" Apr 02 13:55:15 crc kubenswrapper[4732]: I0402 13:55:15.710477 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-86c87c56d7-qlfzr"] Apr 02 13:55:16 crc kubenswrapper[4732]: I0402 13:55:16.040941 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6948d8cf8d-vd8rt"] Apr 02 13:55:16 crc kubenswrapper[4732]: W0402 13:55:16.043158 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod329919c3_94d2_43c2_94a8_2ba9518b98fa.slice/crio-3a5b813654106e3f0020663a051e5411d2a4659a32ca7d58792eaacace97bada WatchSource:0}: Error finding container 3a5b813654106e3f0020663a051e5411d2a4659a32ca7d58792eaacace97bada: Status 404 returned error can't find the container with id 3a5b813654106e3f0020663a051e5411d2a4659a32ca7d58792eaacace97bada Apr 02 13:55:16 crc kubenswrapper[4732]: I0402 13:55:16.563316 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/metallb-operator-controller-manager-86c87c56d7-qlfzr" event={"ID":"bd386538-6696-4b2c-96e4-6f8e4b949364","Type":"ContainerStarted","Data":"b4db3b6533bd0b8601d5004f433dbebd4afd1160bd06352fb59bd43d5264c153"} Apr 02 13:55:16 crc kubenswrapper[4732]: I0402 13:55:16.564830 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6948d8cf8d-vd8rt" event={"ID":"329919c3-94d2-43c2-94a8-2ba9518b98fa","Type":"ContainerStarted","Data":"3a5b813654106e3f0020663a051e5411d2a4659a32ca7d58792eaacace97bada"} Apr 02 13:55:21 crc kubenswrapper[4732]: I0402 13:55:21.600965 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-86c87c56d7-qlfzr" event={"ID":"bd386538-6696-4b2c-96e4-6f8e4b949364","Type":"ContainerStarted","Data":"4739d47e29b5d24e7315798fa92419e9d25a8d4e022e246c2485469f7a9ab78f"} Apr 02 13:55:21 crc kubenswrapper[4732]: I0402 13:55:21.601568 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-86c87c56d7-qlfzr" Apr 02 13:55:21 crc kubenswrapper[4732]: I0402 13:55:21.629194 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-86c87c56d7-qlfzr" podStartSLOduration=1.18292631 podStartE2EDuration="6.629174353s" podCreationTimestamp="2026-04-02 13:55:15 +0000 UTC" firstStartedPulling="2026-04-02 13:55:15.725534855 +0000 UTC m=+1072.629942408" lastFinishedPulling="2026-04-02 13:55:21.171782898 +0000 UTC m=+1078.076190451" observedRunningTime="2026-04-02 13:55:21.620806788 +0000 UTC m=+1078.525214371" watchObservedRunningTime="2026-04-02 13:55:21.629174353 +0000 UTC m=+1078.533581906" Apr 02 13:55:24 crc kubenswrapper[4732]: I0402 13:55:24.619568 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6948d8cf8d-vd8rt" 
event={"ID":"329919c3-94d2-43c2-94a8-2ba9518b98fa","Type":"ContainerStarted","Data":"8eea6943b64bf2d361fa7efd9671634bb0d51fa33dc0abdbeef835c64883c299"} Apr 02 13:55:24 crc kubenswrapper[4732]: I0402 13:55:24.620205 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-6948d8cf8d-vd8rt" Apr 02 13:55:24 crc kubenswrapper[4732]: I0402 13:55:24.640793 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-6948d8cf8d-vd8rt" podStartSLOduration=1.56568917 podStartE2EDuration="9.640768159s" podCreationTimestamp="2026-04-02 13:55:15 +0000 UTC" firstStartedPulling="2026-04-02 13:55:16.045114826 +0000 UTC m=+1072.949522379" lastFinishedPulling="2026-04-02 13:55:24.120193815 +0000 UTC m=+1081.024601368" observedRunningTime="2026-04-02 13:55:24.635019014 +0000 UTC m=+1081.539426587" watchObservedRunningTime="2026-04-02 13:55:24.640768159 +0000 UTC m=+1081.545175732" Apr 02 13:55:35 crc kubenswrapper[4732]: I0402 13:55:35.592953 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-6948d8cf8d-vd8rt" Apr 02 13:55:55 crc kubenswrapper[4732]: I0402 13:55:55.356288 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-86c87c56d7-qlfzr" Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.147056 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-hjkgp"] Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.150393 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-hjkgp" Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.153361 4732 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-vk878" Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.153529 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.155646 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-rrzxj"] Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.156595 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-rrzxj" Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.160592 4732 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.161129 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-rrzxj"] Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.162279 4732 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.243101 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-ng8gx"] Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.244368 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-ng8gx" Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.247561 4732 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.247855 4732 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-wqccl" Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.248744 4732 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.249721 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.259008 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-5bb64cd5d7-5795k"] Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.260159 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-5bb64cd5d7-5795k" Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.262857 4732 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.270932 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5bb64cd5d7-5795k"] Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.320858 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6dm8\" (UniqueName: \"kubernetes.io/projected/c89914a1-adc8-4baa-ae73-ec02091fca58-kube-api-access-k6dm8\") pod \"speaker-ng8gx\" (UID: \"c89914a1-adc8-4baa-ae73-ec02091fca58\") " pod="metallb-system/speaker-ng8gx" Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.320908 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/268c93ae-fdc1-424e-806c-a4272b6e6ba0-frr-startup\") pod \"frr-k8s-hjkgp\" (UID: \"268c93ae-fdc1-424e-806c-a4272b6e6ba0\") " pod="metallb-system/frr-k8s-hjkgp" Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.320946 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/268c93ae-fdc1-424e-806c-a4272b6e6ba0-metrics\") pod \"frr-k8s-hjkgp\" (UID: \"268c93ae-fdc1-424e-806c-a4272b6e6ba0\") " pod="metallb-system/frr-k8s-hjkgp" Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.320962 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/15b5f14a-3755-4967-b789-555f8ac970a2-cert\") pod \"controller-5bb64cd5d7-5795k\" (UID: \"15b5f14a-3755-4967-b789-555f8ac970a2\") " pod="metallb-system/controller-5bb64cd5d7-5795k" Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 
13:55:56.320981 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/268c93ae-fdc1-424e-806c-a4272b6e6ba0-frr-sockets\") pod \"frr-k8s-hjkgp\" (UID: \"268c93ae-fdc1-424e-806c-a4272b6e6ba0\") " pod="metallb-system/frr-k8s-hjkgp" Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.320999 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/268c93ae-fdc1-424e-806c-a4272b6e6ba0-metrics-certs\") pod \"frr-k8s-hjkgp\" (UID: \"268c93ae-fdc1-424e-806c-a4272b6e6ba0\") " pod="metallb-system/frr-k8s-hjkgp" Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.321017 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c89914a1-adc8-4baa-ae73-ec02091fca58-metrics-certs\") pod \"speaker-ng8gx\" (UID: \"c89914a1-adc8-4baa-ae73-ec02091fca58\") " pod="metallb-system/speaker-ng8gx" Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.321053 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/777a90c8-0e68-4362-a696-c92e0a49253f-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-rrzxj\" (UID: \"777a90c8-0e68-4362-a696-c92e0a49253f\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-rrzxj" Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.321068 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngzc4\" (UniqueName: \"kubernetes.io/projected/15b5f14a-3755-4967-b789-555f8ac970a2-kube-api-access-ngzc4\") pod \"controller-5bb64cd5d7-5795k\" (UID: \"15b5f14a-3755-4967-b789-555f8ac970a2\") " pod="metallb-system/controller-5bb64cd5d7-5795k" Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.321087 4732 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c89914a1-adc8-4baa-ae73-ec02091fca58-memberlist\") pod \"speaker-ng8gx\" (UID: \"c89914a1-adc8-4baa-ae73-ec02091fca58\") " pod="metallb-system/speaker-ng8gx" Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.321102 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/268c93ae-fdc1-424e-806c-a4272b6e6ba0-frr-conf\") pod \"frr-k8s-hjkgp\" (UID: \"268c93ae-fdc1-424e-806c-a4272b6e6ba0\") " pod="metallb-system/frr-k8s-hjkgp" Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.321122 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/c89914a1-adc8-4baa-ae73-ec02091fca58-metallb-excludel2\") pod \"speaker-ng8gx\" (UID: \"c89914a1-adc8-4baa-ae73-ec02091fca58\") " pod="metallb-system/speaker-ng8gx" Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.321139 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bfff\" (UniqueName: \"kubernetes.io/projected/268c93ae-fdc1-424e-806c-a4272b6e6ba0-kube-api-access-4bfff\") pod \"frr-k8s-hjkgp\" (UID: \"268c93ae-fdc1-424e-806c-a4272b6e6ba0\") " pod="metallb-system/frr-k8s-hjkgp" Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.321156 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/268c93ae-fdc1-424e-806c-a4272b6e6ba0-reloader\") pod \"frr-k8s-hjkgp\" (UID: \"268c93ae-fdc1-424e-806c-a4272b6e6ba0\") " pod="metallb-system/frr-k8s-hjkgp" Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.321171 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/15b5f14a-3755-4967-b789-555f8ac970a2-metrics-certs\") pod \"controller-5bb64cd5d7-5795k\" (UID: \"15b5f14a-3755-4967-b789-555f8ac970a2\") " pod="metallb-system/controller-5bb64cd5d7-5795k" Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.321189 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvxwx\" (UniqueName: \"kubernetes.io/projected/777a90c8-0e68-4362-a696-c92e0a49253f-kube-api-access-vvxwx\") pod \"frr-k8s-webhook-server-bcc4b6f68-rrzxj\" (UID: \"777a90c8-0e68-4362-a696-c92e0a49253f\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-rrzxj" Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.422335 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c89914a1-adc8-4baa-ae73-ec02091fca58-memberlist\") pod \"speaker-ng8gx\" (UID: \"c89914a1-adc8-4baa-ae73-ec02091fca58\") " pod="metallb-system/speaker-ng8gx" Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.422378 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/268c93ae-fdc1-424e-806c-a4272b6e6ba0-frr-conf\") pod \"frr-k8s-hjkgp\" (UID: \"268c93ae-fdc1-424e-806c-a4272b6e6ba0\") " pod="metallb-system/frr-k8s-hjkgp" Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.422405 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/c89914a1-adc8-4baa-ae73-ec02091fca58-metallb-excludel2\") pod \"speaker-ng8gx\" (UID: \"c89914a1-adc8-4baa-ae73-ec02091fca58\") " pod="metallb-system/speaker-ng8gx" Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.422427 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bfff\" (UniqueName: 
\"kubernetes.io/projected/268c93ae-fdc1-424e-806c-a4272b6e6ba0-kube-api-access-4bfff\") pod \"frr-k8s-hjkgp\" (UID: \"268c93ae-fdc1-424e-806c-a4272b6e6ba0\") " pod="metallb-system/frr-k8s-hjkgp" Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.422449 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/268c93ae-fdc1-424e-806c-a4272b6e6ba0-reloader\") pod \"frr-k8s-hjkgp\" (UID: \"268c93ae-fdc1-424e-806c-a4272b6e6ba0\") " pod="metallb-system/frr-k8s-hjkgp" Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.422472 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/15b5f14a-3755-4967-b789-555f8ac970a2-metrics-certs\") pod \"controller-5bb64cd5d7-5795k\" (UID: \"15b5f14a-3755-4967-b789-555f8ac970a2\") " pod="metallb-system/controller-5bb64cd5d7-5795k" Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.422497 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvxwx\" (UniqueName: \"kubernetes.io/projected/777a90c8-0e68-4362-a696-c92e0a49253f-kube-api-access-vvxwx\") pod \"frr-k8s-webhook-server-bcc4b6f68-rrzxj\" (UID: \"777a90c8-0e68-4362-a696-c92e0a49253f\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-rrzxj" Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.422527 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6dm8\" (UniqueName: \"kubernetes.io/projected/c89914a1-adc8-4baa-ae73-ec02091fca58-kube-api-access-k6dm8\") pod \"speaker-ng8gx\" (UID: \"c89914a1-adc8-4baa-ae73-ec02091fca58\") " pod="metallb-system/speaker-ng8gx" Apr 02 13:55:56 crc kubenswrapper[4732]: E0402 13:55:56.422545 4732 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Apr 02 13:55:56 crc kubenswrapper[4732]: E0402 13:55:56.422662 4732 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c89914a1-adc8-4baa-ae73-ec02091fca58-memberlist podName:c89914a1-adc8-4baa-ae73-ec02091fca58 nodeName:}" failed. No retries permitted until 2026-04-02 13:55:56.922636607 +0000 UTC m=+1113.827044170 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/c89914a1-adc8-4baa-ae73-ec02091fca58-memberlist") pod "speaker-ng8gx" (UID: "c89914a1-adc8-4baa-ae73-ec02091fca58") : secret "metallb-memberlist" not found Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.422554 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/268c93ae-fdc1-424e-806c-a4272b6e6ba0-frr-startup\") pod \"frr-k8s-hjkgp\" (UID: \"268c93ae-fdc1-424e-806c-a4272b6e6ba0\") " pod="metallb-system/frr-k8s-hjkgp" Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.422958 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/268c93ae-fdc1-424e-806c-a4272b6e6ba0-metrics\") pod \"frr-k8s-hjkgp\" (UID: \"268c93ae-fdc1-424e-806c-a4272b6e6ba0\") " pod="metallb-system/frr-k8s-hjkgp" Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.422985 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/15b5f14a-3755-4967-b789-555f8ac970a2-cert\") pod \"controller-5bb64cd5d7-5795k\" (UID: \"15b5f14a-3755-4967-b789-555f8ac970a2\") " pod="metallb-system/controller-5bb64cd5d7-5795k" Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.423010 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/268c93ae-fdc1-424e-806c-a4272b6e6ba0-frr-sockets\") pod \"frr-k8s-hjkgp\" (UID: \"268c93ae-fdc1-424e-806c-a4272b6e6ba0\") " pod="metallb-system/frr-k8s-hjkgp" Apr 02 
13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.423033 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/268c93ae-fdc1-424e-806c-a4272b6e6ba0-metrics-certs\") pod \"frr-k8s-hjkgp\" (UID: \"268c93ae-fdc1-424e-806c-a4272b6e6ba0\") " pod="metallb-system/frr-k8s-hjkgp" Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.423053 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c89914a1-adc8-4baa-ae73-ec02091fca58-metrics-certs\") pod \"speaker-ng8gx\" (UID: \"c89914a1-adc8-4baa-ae73-ec02091fca58\") " pod="metallb-system/speaker-ng8gx" Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.423098 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/777a90c8-0e68-4362-a696-c92e0a49253f-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-rrzxj\" (UID: \"777a90c8-0e68-4362-a696-c92e0a49253f\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-rrzxj" Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.423121 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngzc4\" (UniqueName: \"kubernetes.io/projected/15b5f14a-3755-4967-b789-555f8ac970a2-kube-api-access-ngzc4\") pod \"controller-5bb64cd5d7-5795k\" (UID: \"15b5f14a-3755-4967-b789-555f8ac970a2\") " pod="metallb-system/controller-5bb64cd5d7-5795k" Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.423526 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/268c93ae-fdc1-424e-806c-a4272b6e6ba0-frr-startup\") pod \"frr-k8s-hjkgp\" (UID: \"268c93ae-fdc1-424e-806c-a4272b6e6ba0\") " pod="metallb-system/frr-k8s-hjkgp" Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.423726 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"metrics\" (UniqueName: \"kubernetes.io/empty-dir/268c93ae-fdc1-424e-806c-a4272b6e6ba0-metrics\") pod \"frr-k8s-hjkgp\" (UID: \"268c93ae-fdc1-424e-806c-a4272b6e6ba0\") " pod="metallb-system/frr-k8s-hjkgp" Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.423843 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/268c93ae-fdc1-424e-806c-a4272b6e6ba0-frr-conf\") pod \"frr-k8s-hjkgp\" (UID: \"268c93ae-fdc1-424e-806c-a4272b6e6ba0\") " pod="metallb-system/frr-k8s-hjkgp" Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.424015 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/268c93ae-fdc1-424e-806c-a4272b6e6ba0-frr-sockets\") pod \"frr-k8s-hjkgp\" (UID: \"268c93ae-fdc1-424e-806c-a4272b6e6ba0\") " pod="metallb-system/frr-k8s-hjkgp" Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.424503 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/c89914a1-adc8-4baa-ae73-ec02091fca58-metallb-excludel2\") pod \"speaker-ng8gx\" (UID: \"c89914a1-adc8-4baa-ae73-ec02091fca58\") " pod="metallb-system/speaker-ng8gx" Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.425013 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/268c93ae-fdc1-424e-806c-a4272b6e6ba0-reloader\") pod \"frr-k8s-hjkgp\" (UID: \"268c93ae-fdc1-424e-806c-a4272b6e6ba0\") " pod="metallb-system/frr-k8s-hjkgp" Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.427531 4732 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.431392 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/777a90c8-0e68-4362-a696-c92e0a49253f-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-rrzxj\" (UID: \"777a90c8-0e68-4362-a696-c92e0a49253f\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-rrzxj" Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.431903 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/268c93ae-fdc1-424e-806c-a4272b6e6ba0-metrics-certs\") pod \"frr-k8s-hjkgp\" (UID: \"268c93ae-fdc1-424e-806c-a4272b6e6ba0\") " pod="metallb-system/frr-k8s-hjkgp" Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.437144 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/15b5f14a-3755-4967-b789-555f8ac970a2-metrics-certs\") pod \"controller-5bb64cd5d7-5795k\" (UID: \"15b5f14a-3755-4967-b789-555f8ac970a2\") " pod="metallb-system/controller-5bb64cd5d7-5795k" Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.437519 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/15b5f14a-3755-4967-b789-555f8ac970a2-cert\") pod \"controller-5bb64cd5d7-5795k\" (UID: \"15b5f14a-3755-4967-b789-555f8ac970a2\") " pod="metallb-system/controller-5bb64cd5d7-5795k" Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.442495 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6dm8\" (UniqueName: \"kubernetes.io/projected/c89914a1-adc8-4baa-ae73-ec02091fca58-kube-api-access-k6dm8\") pod \"speaker-ng8gx\" (UID: \"c89914a1-adc8-4baa-ae73-ec02091fca58\") " pod="metallb-system/speaker-ng8gx" Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.444260 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c89914a1-adc8-4baa-ae73-ec02091fca58-metrics-certs\") pod \"speaker-ng8gx\" (UID: \"c89914a1-adc8-4baa-ae73-ec02091fca58\") 
" pod="metallb-system/speaker-ng8gx" Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.445703 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngzc4\" (UniqueName: \"kubernetes.io/projected/15b5f14a-3755-4967-b789-555f8ac970a2-kube-api-access-ngzc4\") pod \"controller-5bb64cd5d7-5795k\" (UID: \"15b5f14a-3755-4967-b789-555f8ac970a2\") " pod="metallb-system/controller-5bb64cd5d7-5795k" Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.446101 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bfff\" (UniqueName: \"kubernetes.io/projected/268c93ae-fdc1-424e-806c-a4272b6e6ba0-kube-api-access-4bfff\") pod \"frr-k8s-hjkgp\" (UID: \"268c93ae-fdc1-424e-806c-a4272b6e6ba0\") " pod="metallb-system/frr-k8s-hjkgp" Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.447131 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvxwx\" (UniqueName: \"kubernetes.io/projected/777a90c8-0e68-4362-a696-c92e0a49253f-kube-api-access-vvxwx\") pod \"frr-k8s-webhook-server-bcc4b6f68-rrzxj\" (UID: \"777a90c8-0e68-4362-a696-c92e0a49253f\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-rrzxj" Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.469830 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-hjkgp" Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.478218 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-rrzxj" Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.575894 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-5bb64cd5d7-5795k" Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.778152 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5bb64cd5d7-5795k"] Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.824836 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5bb64cd5d7-5795k" event={"ID":"15b5f14a-3755-4967-b789-555f8ac970a2","Type":"ContainerStarted","Data":"fe1c127a32cb4aaf54c3f10ff765e551d20c7f70ba931e4ef1c0fc9eae05855c"} Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.825956 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hjkgp" event={"ID":"268c93ae-fdc1-424e-806c-a4272b6e6ba0","Type":"ContainerStarted","Data":"9338eba4091e43f148da9f0765c48d82f39f18d2ddd339368345b8524a847c7c"} Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.888826 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-rrzxj"] Apr 02 13:55:56 crc kubenswrapper[4732]: I0402 13:55:56.942234 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c89914a1-adc8-4baa-ae73-ec02091fca58-memberlist\") pod \"speaker-ng8gx\" (UID: \"c89914a1-adc8-4baa-ae73-ec02091fca58\") " pod="metallb-system/speaker-ng8gx" Apr 02 13:55:56 crc kubenswrapper[4732]: E0402 13:55:56.942371 4732 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Apr 02 13:55:56 crc kubenswrapper[4732]: E0402 13:55:56.942436 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c89914a1-adc8-4baa-ae73-ec02091fca58-memberlist podName:c89914a1-adc8-4baa-ae73-ec02091fca58 nodeName:}" failed. No retries permitted until 2026-04-02 13:55:57.942419459 +0000 UTC m=+1114.846827012 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/c89914a1-adc8-4baa-ae73-ec02091fca58-memberlist") pod "speaker-ng8gx" (UID: "c89914a1-adc8-4baa-ae73-ec02091fca58") : secret "metallb-memberlist" not found Apr 02 13:55:57 crc kubenswrapper[4732]: I0402 13:55:57.833816 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-rrzxj" event={"ID":"777a90c8-0e68-4362-a696-c92e0a49253f","Type":"ContainerStarted","Data":"bdbd196a699b03df2c05b0d1ac1d0d80c3d6fb0d1b970def74db74c353083d0a"} Apr 02 13:55:57 crc kubenswrapper[4732]: I0402 13:55:57.836378 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5bb64cd5d7-5795k" event={"ID":"15b5f14a-3755-4967-b789-555f8ac970a2","Type":"ContainerStarted","Data":"63cce4881d9147c5c55ce70bacc9a697de540f454d6cbb67c67cc045e6f9876f"} Apr 02 13:55:57 crc kubenswrapper[4732]: I0402 13:55:57.836426 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5bb64cd5d7-5795k" event={"ID":"15b5f14a-3755-4967-b789-555f8ac970a2","Type":"ContainerStarted","Data":"583d7bd32eacaf766419c5b591915e46923a6b73c3d598c9ad8dc18366b66d89"} Apr 02 13:55:57 crc kubenswrapper[4732]: I0402 13:55:57.836543 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-5bb64cd5d7-5795k" Apr 02 13:55:57 crc kubenswrapper[4732]: I0402 13:55:57.854712 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-5bb64cd5d7-5795k" podStartSLOduration=1.8546953529999999 podStartE2EDuration="1.854695353s" podCreationTimestamp="2026-04-02 13:55:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:55:57.85013863 +0000 UTC m=+1114.754546223" watchObservedRunningTime="2026-04-02 13:55:57.854695353 +0000 UTC m=+1114.759102906" Apr 02 13:55:57 crc 
kubenswrapper[4732]: I0402 13:55:57.955383 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c89914a1-adc8-4baa-ae73-ec02091fca58-memberlist\") pod \"speaker-ng8gx\" (UID: \"c89914a1-adc8-4baa-ae73-ec02091fca58\") " pod="metallb-system/speaker-ng8gx" Apr 02 13:55:57 crc kubenswrapper[4732]: I0402 13:55:57.960178 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c89914a1-adc8-4baa-ae73-ec02091fca58-memberlist\") pod \"speaker-ng8gx\" (UID: \"c89914a1-adc8-4baa-ae73-ec02091fca58\") " pod="metallb-system/speaker-ng8gx" Apr 02 13:55:58 crc kubenswrapper[4732]: I0402 13:55:58.057891 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-ng8gx" Apr 02 13:55:58 crc kubenswrapper[4732]: W0402 13:55:58.081502 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc89914a1_adc8_4baa_ae73_ec02091fca58.slice/crio-986e054ddf4408abc46f585ebae20814f515e8e9e38ca7e87bdcb12b02c61507 WatchSource:0}: Error finding container 986e054ddf4408abc46f585ebae20814f515e8e9e38ca7e87bdcb12b02c61507: Status 404 returned error can't find the container with id 986e054ddf4408abc46f585ebae20814f515e8e9e38ca7e87bdcb12b02c61507 Apr 02 13:55:58 crc kubenswrapper[4732]: I0402 13:55:58.852468 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-ng8gx" event={"ID":"c89914a1-adc8-4baa-ae73-ec02091fca58","Type":"ContainerStarted","Data":"150bb5425a0cb97bda293cc449219d17ea8f6bf5060e37e4afbc98080561d541"} Apr 02 13:55:58 crc kubenswrapper[4732]: I0402 13:55:58.852760 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-ng8gx" 
event={"ID":"c89914a1-adc8-4baa-ae73-ec02091fca58","Type":"ContainerStarted","Data":"80bf796905e5b6cb0114b0cdc82e5aeced2c3966b3ef8e5c881e3d1a12cbaa6d"} Apr 02 13:55:58 crc kubenswrapper[4732]: I0402 13:55:58.852770 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-ng8gx" event={"ID":"c89914a1-adc8-4baa-ae73-ec02091fca58","Type":"ContainerStarted","Data":"986e054ddf4408abc46f585ebae20814f515e8e9e38ca7e87bdcb12b02c61507"} Apr 02 13:55:58 crc kubenswrapper[4732]: I0402 13:55:58.853331 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-ng8gx" Apr 02 13:55:58 crc kubenswrapper[4732]: I0402 13:55:58.877402 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-ng8gx" podStartSLOduration=2.877379344 podStartE2EDuration="2.877379344s" podCreationTimestamp="2026-04-02 13:55:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:55:58.871182228 +0000 UTC m=+1115.775589801" watchObservedRunningTime="2026-04-02 13:55:58.877379344 +0000 UTC m=+1115.781786907" Apr 02 13:56:00 crc kubenswrapper[4732]: I0402 13:56:00.129929 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29585636-nrxqm"] Apr 02 13:56:00 crc kubenswrapper[4732]: I0402 13:56:00.130825 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29585636-nrxqm" Apr 02 13:56:00 crc kubenswrapper[4732]: I0402 13:56:00.134991 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-69g42" Apr 02 13:56:00 crc kubenswrapper[4732]: I0402 13:56:00.135040 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 02 13:56:00 crc kubenswrapper[4732]: I0402 13:56:00.135942 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 02 13:56:00 crc kubenswrapper[4732]: I0402 13:56:00.155481 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585636-nrxqm"] Apr 02 13:56:00 crc kubenswrapper[4732]: I0402 13:56:00.186548 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nj89v\" (UniqueName: \"kubernetes.io/projected/62eb3313-3f3d-4378-9427-ed1985cffffe-kube-api-access-nj89v\") pod \"auto-csr-approver-29585636-nrxqm\" (UID: \"62eb3313-3f3d-4378-9427-ed1985cffffe\") " pod="openshift-infra/auto-csr-approver-29585636-nrxqm" Apr 02 13:56:00 crc kubenswrapper[4732]: I0402 13:56:00.288400 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nj89v\" (UniqueName: \"kubernetes.io/projected/62eb3313-3f3d-4378-9427-ed1985cffffe-kube-api-access-nj89v\") pod \"auto-csr-approver-29585636-nrxqm\" (UID: \"62eb3313-3f3d-4378-9427-ed1985cffffe\") " pod="openshift-infra/auto-csr-approver-29585636-nrxqm" Apr 02 13:56:00 crc kubenswrapper[4732]: I0402 13:56:00.307041 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nj89v\" (UniqueName: \"kubernetes.io/projected/62eb3313-3f3d-4378-9427-ed1985cffffe-kube-api-access-nj89v\") pod \"auto-csr-approver-29585636-nrxqm\" (UID: \"62eb3313-3f3d-4378-9427-ed1985cffffe\") " 
pod="openshift-infra/auto-csr-approver-29585636-nrxqm" Apr 02 13:56:00 crc kubenswrapper[4732]: I0402 13:56:00.455050 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585636-nrxqm" Apr 02 13:56:01 crc kubenswrapper[4732]: I0402 13:56:01.038314 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585636-nrxqm"] Apr 02 13:56:01 crc kubenswrapper[4732]: W0402 13:56:01.067960 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62eb3313_3f3d_4378_9427_ed1985cffffe.slice/crio-4e06dd7122b34c21e98857fc7af823fdc31b0ca0bae2e7f70876e29372087f39 WatchSource:0}: Error finding container 4e06dd7122b34c21e98857fc7af823fdc31b0ca0bae2e7f70876e29372087f39: Status 404 returned error can't find the container with id 4e06dd7122b34c21e98857fc7af823fdc31b0ca0bae2e7f70876e29372087f39 Apr 02 13:56:01 crc kubenswrapper[4732]: I0402 13:56:01.877688 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585636-nrxqm" event={"ID":"62eb3313-3f3d-4378-9427-ed1985cffffe","Type":"ContainerStarted","Data":"4e06dd7122b34c21e98857fc7af823fdc31b0ca0bae2e7f70876e29372087f39"} Apr 02 13:56:01 crc kubenswrapper[4732]: I0402 13:56:01.937159 4732 patch_prober.go:28] interesting pod/machine-config-daemon-6vtmw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 02 13:56:01 crc kubenswrapper[4732]: I0402 13:56:01.937214 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Apr 02 13:56:05 crc kubenswrapper[4732]: I0402 13:56:05.912676 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585636-nrxqm" event={"ID":"62eb3313-3f3d-4378-9427-ed1985cffffe","Type":"ContainerStarted","Data":"bf88c2f3afd541dea90d3b34e2a4249e0c2216413015300809f098f9a9c8ecd1"} Apr 02 13:56:05 crc kubenswrapper[4732]: I0402 13:56:05.914318 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-rrzxj" event={"ID":"777a90c8-0e68-4362-a696-c92e0a49253f","Type":"ContainerStarted","Data":"ba54857c73e844bf6e8e737df6b25839a9b44c86b94e63b93c53a820b07dbfaf"} Apr 02 13:56:05 crc kubenswrapper[4732]: I0402 13:56:05.914448 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-rrzxj" Apr 02 13:56:05 crc kubenswrapper[4732]: I0402 13:56:05.916502 4732 generic.go:334] "Generic (PLEG): container finished" podID="268c93ae-fdc1-424e-806c-a4272b6e6ba0" containerID="836d3246e844782659a3a0cd188161744c6e92a5f19dfd8995e60403cbb3bdee" exitCode=0 Apr 02 13:56:05 crc kubenswrapper[4732]: I0402 13:56:05.916541 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hjkgp" event={"ID":"268c93ae-fdc1-424e-806c-a4272b6e6ba0","Type":"ContainerDied","Data":"836d3246e844782659a3a0cd188161744c6e92a5f19dfd8995e60403cbb3bdee"} Apr 02 13:56:05 crc kubenswrapper[4732]: I0402 13:56:05.922742 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29585636-nrxqm" podStartSLOduration=1.395602617 podStartE2EDuration="5.922724014s" podCreationTimestamp="2026-04-02 13:56:00 +0000 UTC" firstStartedPulling="2026-04-02 13:56:01.070419906 +0000 UTC m=+1117.974827459" lastFinishedPulling="2026-04-02 13:56:05.597541303 +0000 UTC m=+1122.501948856" observedRunningTime="2026-04-02 13:56:05.92217539 +0000 UTC m=+1122.826582953" 
watchObservedRunningTime="2026-04-02 13:56:05.922724014 +0000 UTC m=+1122.827131567" Apr 02 13:56:05 crc kubenswrapper[4732]: I0402 13:56:05.938266 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-rrzxj" podStartSLOduration=1.230205305 podStartE2EDuration="9.938246072s" podCreationTimestamp="2026-04-02 13:55:56 +0000 UTC" firstStartedPulling="2026-04-02 13:55:56.890272657 +0000 UTC m=+1113.794680210" lastFinishedPulling="2026-04-02 13:56:05.598313424 +0000 UTC m=+1122.502720977" observedRunningTime="2026-04-02 13:56:05.936257628 +0000 UTC m=+1122.840665191" watchObservedRunningTime="2026-04-02 13:56:05.938246072 +0000 UTC m=+1122.842653615" Apr 02 13:56:06 crc kubenswrapper[4732]: I0402 13:56:06.581319 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-5bb64cd5d7-5795k" Apr 02 13:56:06 crc kubenswrapper[4732]: I0402 13:56:06.925785 4732 generic.go:334] "Generic (PLEG): container finished" podID="268c93ae-fdc1-424e-806c-a4272b6e6ba0" containerID="f8565c6a06b2e846fa17340832956dff94695cf7305c43a0ae40461c63909b55" exitCode=0 Apr 02 13:56:06 crc kubenswrapper[4732]: I0402 13:56:06.925822 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hjkgp" event={"ID":"268c93ae-fdc1-424e-806c-a4272b6e6ba0","Type":"ContainerDied","Data":"f8565c6a06b2e846fa17340832956dff94695cf7305c43a0ae40461c63909b55"} Apr 02 13:56:06 crc kubenswrapper[4732]: I0402 13:56:06.927322 4732 generic.go:334] "Generic (PLEG): container finished" podID="62eb3313-3f3d-4378-9427-ed1985cffffe" containerID="bf88c2f3afd541dea90d3b34e2a4249e0c2216413015300809f098f9a9c8ecd1" exitCode=0 Apr 02 13:56:06 crc kubenswrapper[4732]: I0402 13:56:06.927393 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585636-nrxqm" 
event={"ID":"62eb3313-3f3d-4378-9427-ed1985cffffe","Type":"ContainerDied","Data":"bf88c2f3afd541dea90d3b34e2a4249e0c2216413015300809f098f9a9c8ecd1"} Apr 02 13:56:07 crc kubenswrapper[4732]: I0402 13:56:07.936749 4732 generic.go:334] "Generic (PLEG): container finished" podID="268c93ae-fdc1-424e-806c-a4272b6e6ba0" containerID="4fb73e94843625d5518aa310c8d3cc81874ae8891f8205eaffe82045caec7084" exitCode=0 Apr 02 13:56:07 crc kubenswrapper[4732]: I0402 13:56:07.936812 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hjkgp" event={"ID":"268c93ae-fdc1-424e-806c-a4272b6e6ba0","Type":"ContainerDied","Data":"4fb73e94843625d5518aa310c8d3cc81874ae8891f8205eaffe82045caec7084"} Apr 02 13:56:08 crc kubenswrapper[4732]: I0402 13:56:08.061940 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-ng8gx" Apr 02 13:56:08 crc kubenswrapper[4732]: I0402 13:56:08.292944 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585636-nrxqm" Apr 02 13:56:08 crc kubenswrapper[4732]: I0402 13:56:08.405906 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nj89v\" (UniqueName: \"kubernetes.io/projected/62eb3313-3f3d-4378-9427-ed1985cffffe-kube-api-access-nj89v\") pod \"62eb3313-3f3d-4378-9427-ed1985cffffe\" (UID: \"62eb3313-3f3d-4378-9427-ed1985cffffe\") " Apr 02 13:56:08 crc kubenswrapper[4732]: I0402 13:56:08.410985 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62eb3313-3f3d-4378-9427-ed1985cffffe-kube-api-access-nj89v" (OuterVolumeSpecName: "kube-api-access-nj89v") pod "62eb3313-3f3d-4378-9427-ed1985cffffe" (UID: "62eb3313-3f3d-4378-9427-ed1985cffffe"). InnerVolumeSpecName "kube-api-access-nj89v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:56:08 crc kubenswrapper[4732]: I0402 13:56:08.508382 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nj89v\" (UniqueName: \"kubernetes.io/projected/62eb3313-3f3d-4378-9427-ed1985cffffe-kube-api-access-nj89v\") on node \"crc\" DevicePath \"\"" Apr 02 13:56:08 crc kubenswrapper[4732]: I0402 13:56:08.946170 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585636-nrxqm" Apr 02 13:56:08 crc kubenswrapper[4732]: I0402 13:56:08.946384 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585636-nrxqm" event={"ID":"62eb3313-3f3d-4378-9427-ed1985cffffe","Type":"ContainerDied","Data":"4e06dd7122b34c21e98857fc7af823fdc31b0ca0bae2e7f70876e29372087f39"} Apr 02 13:56:08 crc kubenswrapper[4732]: I0402 13:56:08.946430 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e06dd7122b34c21e98857fc7af823fdc31b0ca0bae2e7f70876e29372087f39" Apr 02 13:56:08 crc kubenswrapper[4732]: I0402 13:56:08.952880 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hjkgp" event={"ID":"268c93ae-fdc1-424e-806c-a4272b6e6ba0","Type":"ContainerStarted","Data":"37973308235d755e066c4c3293dabc1e8c896fce9d04228d35e6e09624cb9343"} Apr 02 13:56:08 crc kubenswrapper[4732]: I0402 13:56:08.952930 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hjkgp" event={"ID":"268c93ae-fdc1-424e-806c-a4272b6e6ba0","Type":"ContainerStarted","Data":"52df2197a6807bfddeb6bd648249147f70d7bd18cbeac86c07a839e52bb4a0bb"} Apr 02 13:56:08 crc kubenswrapper[4732]: I0402 13:56:08.952943 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hjkgp" event={"ID":"268c93ae-fdc1-424e-806c-a4272b6e6ba0","Type":"ContainerStarted","Data":"6e6e36d1ffe03f18d408854d43d1f6a926c3e1548676085c50c65f5e4ece4474"} Apr 02 
13:56:08 crc kubenswrapper[4732]: I0402 13:56:08.952954 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hjkgp" event={"ID":"268c93ae-fdc1-424e-806c-a4272b6e6ba0","Type":"ContainerStarted","Data":"920dc1d8d59b2dc17c97977f1ede30b65cc8710afed780e03d5b67f9ea8f3ef0"} Apr 02 13:56:08 crc kubenswrapper[4732]: I0402 13:56:08.952965 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hjkgp" event={"ID":"268c93ae-fdc1-424e-806c-a4272b6e6ba0","Type":"ContainerStarted","Data":"bdcb6f4b1f437ad30dd85cae29d9d2c17ecb1586eef52bf2991651ab446adfc9"} Apr 02 13:56:08 crc kubenswrapper[4732]: I0402 13:56:08.967693 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29585630-gh7px"] Apr 02 13:56:08 crc kubenswrapper[4732]: I0402 13:56:08.975467 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29585630-gh7px"] Apr 02 13:56:09 crc kubenswrapper[4732]: I0402 13:56:09.963332 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hjkgp" event={"ID":"268c93ae-fdc1-424e-806c-a4272b6e6ba0","Type":"ContainerStarted","Data":"30eea8c362bf555b5a2c447a1082a5447b56d62b8d8f2844ca84e0379725d282"} Apr 02 13:56:09 crc kubenswrapper[4732]: I0402 13:56:09.963633 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-hjkgp" Apr 02 13:56:09 crc kubenswrapper[4732]: I0402 13:56:09.984427 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-hjkgp" podStartSLOduration=5.111255583 podStartE2EDuration="13.984406988s" podCreationTimestamp="2026-04-02 13:55:56 +0000 UTC" firstStartedPulling="2026-04-02 13:55:56.724943893 +0000 UTC m=+1113.629351446" lastFinishedPulling="2026-04-02 13:56:05.598095258 +0000 UTC m=+1122.502502851" observedRunningTime="2026-04-02 13:56:09.98077489 +0000 UTC m=+1126.885182463" watchObservedRunningTime="2026-04-02 
13:56:09.984406988 +0000 UTC m=+1126.888814541" Apr 02 13:56:10 crc kubenswrapper[4732]: I0402 13:56:10.695177 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c553e4fa-2f84-4e9c-be8f-bd59677b63b5" path="/var/lib/kubelet/pods/c553e4fa-2f84-4e9c-be8f-bd59677b63b5/volumes" Apr 02 13:56:11 crc kubenswrapper[4732]: I0402 13:56:11.401639 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-r7tzm"] Apr 02 13:56:11 crc kubenswrapper[4732]: E0402 13:56:11.403248 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62eb3313-3f3d-4378-9427-ed1985cffffe" containerName="oc" Apr 02 13:56:11 crc kubenswrapper[4732]: I0402 13:56:11.403370 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="62eb3313-3f3d-4378-9427-ed1985cffffe" containerName="oc" Apr 02 13:56:11 crc kubenswrapper[4732]: I0402 13:56:11.403575 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="62eb3313-3f3d-4378-9427-ed1985cffffe" containerName="oc" Apr 02 13:56:11 crc kubenswrapper[4732]: I0402 13:56:11.404212 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-r7tzm" Apr 02 13:56:11 crc kubenswrapper[4732]: I0402 13:56:11.408914 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-r7tzm"] Apr 02 13:56:11 crc kubenswrapper[4732]: I0402 13:56:11.409102 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Apr 02 13:56:11 crc kubenswrapper[4732]: I0402 13:56:11.409360 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-gv2rz" Apr 02 13:56:11 crc kubenswrapper[4732]: I0402 13:56:11.409758 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Apr 02 13:56:11 crc kubenswrapper[4732]: I0402 13:56:11.443694 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qr4j8\" (UniqueName: \"kubernetes.io/projected/7dea8805-bd1c-400b-bddf-d3ac2cd57617-kube-api-access-qr4j8\") pod \"openstack-operator-index-r7tzm\" (UID: \"7dea8805-bd1c-400b-bddf-d3ac2cd57617\") " pod="openstack-operators/openstack-operator-index-r7tzm" Apr 02 13:56:11 crc kubenswrapper[4732]: I0402 13:56:11.470698 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-hjkgp" Apr 02 13:56:11 crc kubenswrapper[4732]: I0402 13:56:11.513836 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-hjkgp" Apr 02 13:56:11 crc kubenswrapper[4732]: I0402 13:56:11.545137 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qr4j8\" (UniqueName: \"kubernetes.io/projected/7dea8805-bd1c-400b-bddf-d3ac2cd57617-kube-api-access-qr4j8\") pod \"openstack-operator-index-r7tzm\" (UID: \"7dea8805-bd1c-400b-bddf-d3ac2cd57617\") " 
pod="openstack-operators/openstack-operator-index-r7tzm" Apr 02 13:56:11 crc kubenswrapper[4732]: I0402 13:56:11.572246 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qr4j8\" (UniqueName: \"kubernetes.io/projected/7dea8805-bd1c-400b-bddf-d3ac2cd57617-kube-api-access-qr4j8\") pod \"openstack-operator-index-r7tzm\" (UID: \"7dea8805-bd1c-400b-bddf-d3ac2cd57617\") " pod="openstack-operators/openstack-operator-index-r7tzm" Apr 02 13:56:11 crc kubenswrapper[4732]: I0402 13:56:11.723567 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-r7tzm" Apr 02 13:56:11 crc kubenswrapper[4732]: I0402 13:56:11.941221 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-r7tzm"] Apr 02 13:56:11 crc kubenswrapper[4732]: I0402 13:56:11.984225 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-r7tzm" event={"ID":"7dea8805-bd1c-400b-bddf-d3ac2cd57617","Type":"ContainerStarted","Data":"fb5d7fcb9d269b6e1e71ef097523a562c560528ec7bf37cda705468384be5d6e"} Apr 02 13:56:16 crc kubenswrapper[4732]: I0402 13:56:16.483985 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-rrzxj" Apr 02 13:56:18 crc kubenswrapper[4732]: I0402 13:56:18.018908 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-r7tzm" event={"ID":"7dea8805-bd1c-400b-bddf-d3ac2cd57617","Type":"ContainerStarted","Data":"3b990ceb0365e9098d26bf2079cf1655a7588b12a4b3e6140c9353c1b44db66d"} Apr 02 13:56:18 crc kubenswrapper[4732]: I0402 13:56:18.035941 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-r7tzm" podStartSLOduration=1.831835419 podStartE2EDuration="7.035916813s" podCreationTimestamp="2026-04-02 13:56:11 +0000 
UTC" firstStartedPulling="2026-04-02 13:56:11.952629406 +0000 UTC m=+1128.857036969" lastFinishedPulling="2026-04-02 13:56:17.15671081 +0000 UTC m=+1134.061118363" observedRunningTime="2026-04-02 13:56:18.029891311 +0000 UTC m=+1134.934298874" watchObservedRunningTime="2026-04-02 13:56:18.035916813 +0000 UTC m=+1134.940324366" Apr 02 13:56:21 crc kubenswrapper[4732]: I0402 13:56:21.724662 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-r7tzm" Apr 02 13:56:21 crc kubenswrapper[4732]: I0402 13:56:21.725076 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-r7tzm" Apr 02 13:56:21 crc kubenswrapper[4732]: I0402 13:56:21.751807 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-r7tzm" Apr 02 13:56:22 crc kubenswrapper[4732]: I0402 13:56:22.062688 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-r7tzm" Apr 02 13:56:26 crc kubenswrapper[4732]: I0402 13:56:26.473480 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-hjkgp" Apr 02 13:56:28 crc kubenswrapper[4732]: I0402 13:56:28.412455 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/59b48476de953ff34716fd3dc2a51be4f696c7acf616b0748be8aa4a1db4vvf"] Apr 02 13:56:28 crc kubenswrapper[4732]: I0402 13:56:28.413840 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/59b48476de953ff34716fd3dc2a51be4f696c7acf616b0748be8aa4a1db4vvf" Apr 02 13:56:28 crc kubenswrapper[4732]: I0402 13:56:28.415971 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-cvscd" Apr 02 13:56:28 crc kubenswrapper[4732]: I0402 13:56:28.423156 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/59b48476de953ff34716fd3dc2a51be4f696c7acf616b0748be8aa4a1db4vvf"] Apr 02 13:56:28 crc kubenswrapper[4732]: I0402 13:56:28.490302 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/85e10d53-2bd5-4a87-8ec8-89a9ef13f766-bundle\") pod \"59b48476de953ff34716fd3dc2a51be4f696c7acf616b0748be8aa4a1db4vvf\" (UID: \"85e10d53-2bd5-4a87-8ec8-89a9ef13f766\") " pod="openstack-operators/59b48476de953ff34716fd3dc2a51be4f696c7acf616b0748be8aa4a1db4vvf" Apr 02 13:56:28 crc kubenswrapper[4732]: I0402 13:56:28.490570 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/85e10d53-2bd5-4a87-8ec8-89a9ef13f766-util\") pod \"59b48476de953ff34716fd3dc2a51be4f696c7acf616b0748be8aa4a1db4vvf\" (UID: \"85e10d53-2bd5-4a87-8ec8-89a9ef13f766\") " pod="openstack-operators/59b48476de953ff34716fd3dc2a51be4f696c7acf616b0748be8aa4a1db4vvf" Apr 02 13:56:28 crc kubenswrapper[4732]: I0402 13:56:28.490663 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55sng\" (UniqueName: \"kubernetes.io/projected/85e10d53-2bd5-4a87-8ec8-89a9ef13f766-kube-api-access-55sng\") pod \"59b48476de953ff34716fd3dc2a51be4f696c7acf616b0748be8aa4a1db4vvf\" (UID: \"85e10d53-2bd5-4a87-8ec8-89a9ef13f766\") " pod="openstack-operators/59b48476de953ff34716fd3dc2a51be4f696c7acf616b0748be8aa4a1db4vvf" Apr 02 13:56:28 crc kubenswrapper[4732]: I0402 
13:56:28.592176 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/85e10d53-2bd5-4a87-8ec8-89a9ef13f766-bundle\") pod \"59b48476de953ff34716fd3dc2a51be4f696c7acf616b0748be8aa4a1db4vvf\" (UID: \"85e10d53-2bd5-4a87-8ec8-89a9ef13f766\") " pod="openstack-operators/59b48476de953ff34716fd3dc2a51be4f696c7acf616b0748be8aa4a1db4vvf" Apr 02 13:56:28 crc kubenswrapper[4732]: I0402 13:56:28.592284 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/85e10d53-2bd5-4a87-8ec8-89a9ef13f766-util\") pod \"59b48476de953ff34716fd3dc2a51be4f696c7acf616b0748be8aa4a1db4vvf\" (UID: \"85e10d53-2bd5-4a87-8ec8-89a9ef13f766\") " pod="openstack-operators/59b48476de953ff34716fd3dc2a51be4f696c7acf616b0748be8aa4a1db4vvf" Apr 02 13:56:28 crc kubenswrapper[4732]: I0402 13:56:28.592418 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55sng\" (UniqueName: \"kubernetes.io/projected/85e10d53-2bd5-4a87-8ec8-89a9ef13f766-kube-api-access-55sng\") pod \"59b48476de953ff34716fd3dc2a51be4f696c7acf616b0748be8aa4a1db4vvf\" (UID: \"85e10d53-2bd5-4a87-8ec8-89a9ef13f766\") " pod="openstack-operators/59b48476de953ff34716fd3dc2a51be4f696c7acf616b0748be8aa4a1db4vvf" Apr 02 13:56:28 crc kubenswrapper[4732]: I0402 13:56:28.592883 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/85e10d53-2bd5-4a87-8ec8-89a9ef13f766-bundle\") pod \"59b48476de953ff34716fd3dc2a51be4f696c7acf616b0748be8aa4a1db4vvf\" (UID: \"85e10d53-2bd5-4a87-8ec8-89a9ef13f766\") " pod="openstack-operators/59b48476de953ff34716fd3dc2a51be4f696c7acf616b0748be8aa4a1db4vvf" Apr 02 13:56:28 crc kubenswrapper[4732]: I0402 13:56:28.592921 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/85e10d53-2bd5-4a87-8ec8-89a9ef13f766-util\") pod \"59b48476de953ff34716fd3dc2a51be4f696c7acf616b0748be8aa4a1db4vvf\" (UID: \"85e10d53-2bd5-4a87-8ec8-89a9ef13f766\") " pod="openstack-operators/59b48476de953ff34716fd3dc2a51be4f696c7acf616b0748be8aa4a1db4vvf" Apr 02 13:56:28 crc kubenswrapper[4732]: I0402 13:56:28.611478 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55sng\" (UniqueName: \"kubernetes.io/projected/85e10d53-2bd5-4a87-8ec8-89a9ef13f766-kube-api-access-55sng\") pod \"59b48476de953ff34716fd3dc2a51be4f696c7acf616b0748be8aa4a1db4vvf\" (UID: \"85e10d53-2bd5-4a87-8ec8-89a9ef13f766\") " pod="openstack-operators/59b48476de953ff34716fd3dc2a51be4f696c7acf616b0748be8aa4a1db4vvf" Apr 02 13:56:28 crc kubenswrapper[4732]: I0402 13:56:28.742745 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/59b48476de953ff34716fd3dc2a51be4f696c7acf616b0748be8aa4a1db4vvf" Apr 02 13:56:29 crc kubenswrapper[4732]: I0402 13:56:29.150705 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/59b48476de953ff34716fd3dc2a51be4f696c7acf616b0748be8aa4a1db4vvf"] Apr 02 13:56:30 crc kubenswrapper[4732]: I0402 13:56:30.090967 4732 generic.go:334] "Generic (PLEG): container finished" podID="85e10d53-2bd5-4a87-8ec8-89a9ef13f766" containerID="a4834f9f27f56517ba8f19d29eccd5e4efd4838cceb383dd7aa95f0423c2c8bb" exitCode=0 Apr 02 13:56:30 crc kubenswrapper[4732]: I0402 13:56:30.091014 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/59b48476de953ff34716fd3dc2a51be4f696c7acf616b0748be8aa4a1db4vvf" event={"ID":"85e10d53-2bd5-4a87-8ec8-89a9ef13f766","Type":"ContainerDied","Data":"a4834f9f27f56517ba8f19d29eccd5e4efd4838cceb383dd7aa95f0423c2c8bb"} Apr 02 13:56:30 crc kubenswrapper[4732]: I0402 13:56:30.091041 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/59b48476de953ff34716fd3dc2a51be4f696c7acf616b0748be8aa4a1db4vvf" event={"ID":"85e10d53-2bd5-4a87-8ec8-89a9ef13f766","Type":"ContainerStarted","Data":"2ae08aad08b6ce09929dfe9aba85673734edb5ba778156e6268c62ae890ee0cb"} Apr 02 13:56:31 crc kubenswrapper[4732]: I0402 13:56:31.098960 4732 generic.go:334] "Generic (PLEG): container finished" podID="85e10d53-2bd5-4a87-8ec8-89a9ef13f766" containerID="1656da318f5014bd901233bea565a9a616480c59b67309a511e1b1fc041bd0ac" exitCode=0 Apr 02 13:56:31 crc kubenswrapper[4732]: I0402 13:56:31.099158 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/59b48476de953ff34716fd3dc2a51be4f696c7acf616b0748be8aa4a1db4vvf" event={"ID":"85e10d53-2bd5-4a87-8ec8-89a9ef13f766","Type":"ContainerDied","Data":"1656da318f5014bd901233bea565a9a616480c59b67309a511e1b1fc041bd0ac"} Apr 02 13:56:31 crc kubenswrapper[4732]: I0402 13:56:31.924643 4732 patch_prober.go:28] interesting pod/machine-config-daemon-6vtmw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 02 13:56:31 crc kubenswrapper[4732]: I0402 13:56:31.924999 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 02 13:56:32 crc kubenswrapper[4732]: I0402 13:56:32.111163 4732 generic.go:334] "Generic (PLEG): container finished" podID="85e10d53-2bd5-4a87-8ec8-89a9ef13f766" containerID="d37ec8cb1a5932d81a4ef7dc6e2e1e1b855552d694d19ec9d6048ff2b47cce3c" exitCode=0 Apr 02 13:56:32 crc kubenswrapper[4732]: I0402 13:56:32.111272 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/59b48476de953ff34716fd3dc2a51be4f696c7acf616b0748be8aa4a1db4vvf" event={"ID":"85e10d53-2bd5-4a87-8ec8-89a9ef13f766","Type":"ContainerDied","Data":"d37ec8cb1a5932d81a4ef7dc6e2e1e1b855552d694d19ec9d6048ff2b47cce3c"} Apr 02 13:56:33 crc kubenswrapper[4732]: I0402 13:56:33.389479 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/59b48476de953ff34716fd3dc2a51be4f696c7acf616b0748be8aa4a1db4vvf" Apr 02 13:56:33 crc kubenswrapper[4732]: I0402 13:56:33.451412 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/85e10d53-2bd5-4a87-8ec8-89a9ef13f766-util\") pod \"85e10d53-2bd5-4a87-8ec8-89a9ef13f766\" (UID: \"85e10d53-2bd5-4a87-8ec8-89a9ef13f766\") " Apr 02 13:56:33 crc kubenswrapper[4732]: I0402 13:56:33.451495 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/85e10d53-2bd5-4a87-8ec8-89a9ef13f766-bundle\") pod \"85e10d53-2bd5-4a87-8ec8-89a9ef13f766\" (UID: \"85e10d53-2bd5-4a87-8ec8-89a9ef13f766\") " Apr 02 13:56:33 crc kubenswrapper[4732]: I0402 13:56:33.451598 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55sng\" (UniqueName: \"kubernetes.io/projected/85e10d53-2bd5-4a87-8ec8-89a9ef13f766-kube-api-access-55sng\") pod \"85e10d53-2bd5-4a87-8ec8-89a9ef13f766\" (UID: \"85e10d53-2bd5-4a87-8ec8-89a9ef13f766\") " Apr 02 13:56:33 crc kubenswrapper[4732]: I0402 13:56:33.452756 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85e10d53-2bd5-4a87-8ec8-89a9ef13f766-bundle" (OuterVolumeSpecName: "bundle") pod "85e10d53-2bd5-4a87-8ec8-89a9ef13f766" (UID: "85e10d53-2bd5-4a87-8ec8-89a9ef13f766"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 13:56:33 crc kubenswrapper[4732]: I0402 13:56:33.457104 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85e10d53-2bd5-4a87-8ec8-89a9ef13f766-kube-api-access-55sng" (OuterVolumeSpecName: "kube-api-access-55sng") pod "85e10d53-2bd5-4a87-8ec8-89a9ef13f766" (UID: "85e10d53-2bd5-4a87-8ec8-89a9ef13f766"). InnerVolumeSpecName "kube-api-access-55sng". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:56:33 crc kubenswrapper[4732]: I0402 13:56:33.468000 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85e10d53-2bd5-4a87-8ec8-89a9ef13f766-util" (OuterVolumeSpecName: "util") pod "85e10d53-2bd5-4a87-8ec8-89a9ef13f766" (UID: "85e10d53-2bd5-4a87-8ec8-89a9ef13f766"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 13:56:33 crc kubenswrapper[4732]: I0402 13:56:33.553491 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55sng\" (UniqueName: \"kubernetes.io/projected/85e10d53-2bd5-4a87-8ec8-89a9ef13f766-kube-api-access-55sng\") on node \"crc\" DevicePath \"\"" Apr 02 13:56:33 crc kubenswrapper[4732]: I0402 13:56:33.553525 4732 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/85e10d53-2bd5-4a87-8ec8-89a9ef13f766-util\") on node \"crc\" DevicePath \"\"" Apr 02 13:56:33 crc kubenswrapper[4732]: I0402 13:56:33.553535 4732 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/85e10d53-2bd5-4a87-8ec8-89a9ef13f766-bundle\") on node \"crc\" DevicePath \"\"" Apr 02 13:56:34 crc kubenswrapper[4732]: I0402 13:56:34.127042 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/59b48476de953ff34716fd3dc2a51be4f696c7acf616b0748be8aa4a1db4vvf" 
event={"ID":"85e10d53-2bd5-4a87-8ec8-89a9ef13f766","Type":"ContainerDied","Data":"2ae08aad08b6ce09929dfe9aba85673734edb5ba778156e6268c62ae890ee0cb"} Apr 02 13:56:34 crc kubenswrapper[4732]: I0402 13:56:34.127485 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ae08aad08b6ce09929dfe9aba85673734edb5ba778156e6268c62ae890ee0cb" Apr 02 13:56:34 crc kubenswrapper[4732]: I0402 13:56:34.127082 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/59b48476de953ff34716fd3dc2a51be4f696c7acf616b0748be8aa4a1db4vvf" Apr 02 13:56:40 crc kubenswrapper[4732]: I0402 13:56:40.427400 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-f786688f5-tv7s7"] Apr 02 13:56:40 crc kubenswrapper[4732]: E0402 13:56:40.427965 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85e10d53-2bd5-4a87-8ec8-89a9ef13f766" containerName="pull" Apr 02 13:56:40 crc kubenswrapper[4732]: I0402 13:56:40.427983 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="85e10d53-2bd5-4a87-8ec8-89a9ef13f766" containerName="pull" Apr 02 13:56:40 crc kubenswrapper[4732]: E0402 13:56:40.428004 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85e10d53-2bd5-4a87-8ec8-89a9ef13f766" containerName="util" Apr 02 13:56:40 crc kubenswrapper[4732]: I0402 13:56:40.428012 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="85e10d53-2bd5-4a87-8ec8-89a9ef13f766" containerName="util" Apr 02 13:56:40 crc kubenswrapper[4732]: E0402 13:56:40.428021 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85e10d53-2bd5-4a87-8ec8-89a9ef13f766" containerName="extract" Apr 02 13:56:40 crc kubenswrapper[4732]: I0402 13:56:40.428028 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="85e10d53-2bd5-4a87-8ec8-89a9ef13f766" containerName="extract" Apr 02 13:56:40 crc kubenswrapper[4732]: I0402 13:56:40.428142 4732 
memory_manager.go:354] "RemoveStaleState removing state" podUID="85e10d53-2bd5-4a87-8ec8-89a9ef13f766" containerName="extract" Apr 02 13:56:40 crc kubenswrapper[4732]: I0402 13:56:40.428537 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-f786688f5-tv7s7" Apr 02 13:56:40 crc kubenswrapper[4732]: I0402 13:56:40.430113 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-pphzg" Apr 02 13:56:40 crc kubenswrapper[4732]: I0402 13:56:40.510174 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-f786688f5-tv7s7"] Apr 02 13:56:40 crc kubenswrapper[4732]: I0402 13:56:40.543304 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfkdv\" (UniqueName: \"kubernetes.io/projected/1e51ae33-6c4c-4e7f-8309-ef0c8901b6ed-kube-api-access-zfkdv\") pod \"openstack-operator-controller-init-f786688f5-tv7s7\" (UID: \"1e51ae33-6c4c-4e7f-8309-ef0c8901b6ed\") " pod="openstack-operators/openstack-operator-controller-init-f786688f5-tv7s7" Apr 02 13:56:40 crc kubenswrapper[4732]: I0402 13:56:40.644251 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfkdv\" (UniqueName: \"kubernetes.io/projected/1e51ae33-6c4c-4e7f-8309-ef0c8901b6ed-kube-api-access-zfkdv\") pod \"openstack-operator-controller-init-f786688f5-tv7s7\" (UID: \"1e51ae33-6c4c-4e7f-8309-ef0c8901b6ed\") " pod="openstack-operators/openstack-operator-controller-init-f786688f5-tv7s7" Apr 02 13:56:40 crc kubenswrapper[4732]: I0402 13:56:40.662351 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfkdv\" (UniqueName: \"kubernetes.io/projected/1e51ae33-6c4c-4e7f-8309-ef0c8901b6ed-kube-api-access-zfkdv\") pod \"openstack-operator-controller-init-f786688f5-tv7s7\" 
(UID: \"1e51ae33-6c4c-4e7f-8309-ef0c8901b6ed\") " pod="openstack-operators/openstack-operator-controller-init-f786688f5-tv7s7" Apr 02 13:56:40 crc kubenswrapper[4732]: I0402 13:56:40.748948 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-f786688f5-tv7s7" Apr 02 13:56:41 crc kubenswrapper[4732]: I0402 13:56:41.240933 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-f786688f5-tv7s7"] Apr 02 13:56:42 crc kubenswrapper[4732]: I0402 13:56:42.176879 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-f786688f5-tv7s7" event={"ID":"1e51ae33-6c4c-4e7f-8309-ef0c8901b6ed","Type":"ContainerStarted","Data":"6e924f9d10ea425094e478534709848c57a65ea4191aca02d8fa0b95d2d0227e"} Apr 02 13:56:45 crc kubenswrapper[4732]: I0402 13:56:45.900635 4732 scope.go:117] "RemoveContainer" containerID="0ad2dae1a4cd4b1fd0cf7f3610ac1b2608ff38d2b83dc2c4bbc87f7fe141ef48" Apr 02 13:56:46 crc kubenswrapper[4732]: I0402 13:56:46.211052 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-f786688f5-tv7s7" event={"ID":"1e51ae33-6c4c-4e7f-8309-ef0c8901b6ed","Type":"ContainerStarted","Data":"0a4a9c59d903b3bac316f065e168fe95261783bfaa89302a000760bd0acfd4fd"} Apr 02 13:56:46 crc kubenswrapper[4732]: I0402 13:56:46.211361 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-f786688f5-tv7s7" Apr 02 13:56:46 crc kubenswrapper[4732]: I0402 13:56:46.249408 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-f786688f5-tv7s7" podStartSLOduration=2.381137346 podStartE2EDuration="6.249391521s" podCreationTimestamp="2026-04-02 13:56:40 +0000 UTC" firstStartedPulling="2026-04-02 
13:56:41.254819439 +0000 UTC m=+1158.159226992" lastFinishedPulling="2026-04-02 13:56:45.123073614 +0000 UTC m=+1162.027481167" observedRunningTime="2026-04-02 13:56:46.246452182 +0000 UTC m=+1163.150859745" watchObservedRunningTime="2026-04-02 13:56:46.249391521 +0000 UTC m=+1163.153799084" Apr 02 13:56:50 crc kubenswrapper[4732]: I0402 13:56:50.754215 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-f786688f5-tv7s7" Apr 02 13:56:54 crc kubenswrapper[4732]: I0402 13:56:54.675205 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9ss2c"] Apr 02 13:56:54 crc kubenswrapper[4732]: I0402 13:56:54.676654 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9ss2c" Apr 02 13:56:54 crc kubenswrapper[4732]: I0402 13:56:54.704576 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9ss2c"] Apr 02 13:56:54 crc kubenswrapper[4732]: I0402 13:56:54.763372 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f0b4612-5b96-4c2c-a533-873713bb8e56-catalog-content\") pod \"community-operators-9ss2c\" (UID: \"3f0b4612-5b96-4c2c-a533-873713bb8e56\") " pod="openshift-marketplace/community-operators-9ss2c" Apr 02 13:56:54 crc kubenswrapper[4732]: I0402 13:56:54.763526 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f0b4612-5b96-4c2c-a533-873713bb8e56-utilities\") pod \"community-operators-9ss2c\" (UID: \"3f0b4612-5b96-4c2c-a533-873713bb8e56\") " pod="openshift-marketplace/community-operators-9ss2c" Apr 02 13:56:54 crc kubenswrapper[4732]: I0402 13:56:54.763564 4732 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8mtv\" (UniqueName: \"kubernetes.io/projected/3f0b4612-5b96-4c2c-a533-873713bb8e56-kube-api-access-s8mtv\") pod \"community-operators-9ss2c\" (UID: \"3f0b4612-5b96-4c2c-a533-873713bb8e56\") " pod="openshift-marketplace/community-operators-9ss2c" Apr 02 13:56:54 crc kubenswrapper[4732]: I0402 13:56:54.864369 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f0b4612-5b96-4c2c-a533-873713bb8e56-utilities\") pod \"community-operators-9ss2c\" (UID: \"3f0b4612-5b96-4c2c-a533-873713bb8e56\") " pod="openshift-marketplace/community-operators-9ss2c" Apr 02 13:56:54 crc kubenswrapper[4732]: I0402 13:56:54.864432 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8mtv\" (UniqueName: \"kubernetes.io/projected/3f0b4612-5b96-4c2c-a533-873713bb8e56-kube-api-access-s8mtv\") pod \"community-operators-9ss2c\" (UID: \"3f0b4612-5b96-4c2c-a533-873713bb8e56\") " pod="openshift-marketplace/community-operators-9ss2c" Apr 02 13:56:54 crc kubenswrapper[4732]: I0402 13:56:54.864478 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f0b4612-5b96-4c2c-a533-873713bb8e56-catalog-content\") pod \"community-operators-9ss2c\" (UID: \"3f0b4612-5b96-4c2c-a533-873713bb8e56\") " pod="openshift-marketplace/community-operators-9ss2c" Apr 02 13:56:54 crc kubenswrapper[4732]: I0402 13:56:54.865002 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f0b4612-5b96-4c2c-a533-873713bb8e56-catalog-content\") pod \"community-operators-9ss2c\" (UID: \"3f0b4612-5b96-4c2c-a533-873713bb8e56\") " pod="openshift-marketplace/community-operators-9ss2c" Apr 02 13:56:54 crc kubenswrapper[4732]: I0402 13:56:54.865083 4732 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f0b4612-5b96-4c2c-a533-873713bb8e56-utilities\") pod \"community-operators-9ss2c\" (UID: \"3f0b4612-5b96-4c2c-a533-873713bb8e56\") " pod="openshift-marketplace/community-operators-9ss2c" Apr 02 13:56:54 crc kubenswrapper[4732]: I0402 13:56:54.886518 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8mtv\" (UniqueName: \"kubernetes.io/projected/3f0b4612-5b96-4c2c-a533-873713bb8e56-kube-api-access-s8mtv\") pod \"community-operators-9ss2c\" (UID: \"3f0b4612-5b96-4c2c-a533-873713bb8e56\") " pod="openshift-marketplace/community-operators-9ss2c" Apr 02 13:56:54 crc kubenswrapper[4732]: I0402 13:56:54.994577 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9ss2c" Apr 02 13:56:55 crc kubenswrapper[4732]: I0402 13:56:55.479890 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9ss2c"] Apr 02 13:56:56 crc kubenswrapper[4732]: I0402 13:56:56.296285 4732 generic.go:334] "Generic (PLEG): container finished" podID="3f0b4612-5b96-4c2c-a533-873713bb8e56" containerID="72732e6390cc6f0ad704dc9674c554f2c0e11daf6ad7d7bb1fe75cca07edb911" exitCode=0 Apr 02 13:56:56 crc kubenswrapper[4732]: I0402 13:56:56.296336 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9ss2c" event={"ID":"3f0b4612-5b96-4c2c-a533-873713bb8e56","Type":"ContainerDied","Data":"72732e6390cc6f0ad704dc9674c554f2c0e11daf6ad7d7bb1fe75cca07edb911"} Apr 02 13:56:56 crc kubenswrapper[4732]: I0402 13:56:56.296369 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9ss2c" event={"ID":"3f0b4612-5b96-4c2c-a533-873713bb8e56","Type":"ContainerStarted","Data":"68c004896b149b2e72905493aa2d1e561d51da4e8c4665339c3013b4c6bfd9dd"} Apr 02 13:56:58 crc 
kubenswrapper[4732]: I0402 13:56:58.309413 4732 generic.go:334] "Generic (PLEG): container finished" podID="3f0b4612-5b96-4c2c-a533-873713bb8e56" containerID="b2492831a9a05c6b600f0dfbc6599ac8766f278e618843b65219e11562e96621" exitCode=0 Apr 02 13:56:58 crc kubenswrapper[4732]: I0402 13:56:58.309513 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9ss2c" event={"ID":"3f0b4612-5b96-4c2c-a533-873713bb8e56","Type":"ContainerDied","Data":"b2492831a9a05c6b600f0dfbc6599ac8766f278e618843b65219e11562e96621"} Apr 02 13:57:00 crc kubenswrapper[4732]: I0402 13:57:00.325630 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9ss2c" event={"ID":"3f0b4612-5b96-4c2c-a533-873713bb8e56","Type":"ContainerStarted","Data":"1a2b0d4093d95aefbe87e4c30c57a798d0676b24f4106da1a77a10c8de2e5f49"} Apr 02 13:57:00 crc kubenswrapper[4732]: I0402 13:57:00.347611 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9ss2c" podStartSLOduration=3.6643142600000003 podStartE2EDuration="6.34758927s" podCreationTimestamp="2026-04-02 13:56:54 +0000 UTC" firstStartedPulling="2026-04-02 13:56:56.297842977 +0000 UTC m=+1173.202250530" lastFinishedPulling="2026-04-02 13:56:58.981117987 +0000 UTC m=+1175.885525540" observedRunningTime="2026-04-02 13:57:00.344301052 +0000 UTC m=+1177.248708605" watchObservedRunningTime="2026-04-02 13:57:00.34758927 +0000 UTC m=+1177.251996813" Apr 02 13:57:01 crc kubenswrapper[4732]: I0402 13:57:01.924903 4732 patch_prober.go:28] interesting pod/machine-config-daemon-6vtmw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 02 13:57:01 crc kubenswrapper[4732]: I0402 13:57:01.924971 4732 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 02 13:57:01 crc kubenswrapper[4732]: I0402 13:57:01.925022 4732 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" Apr 02 13:57:02 crc kubenswrapper[4732]: I0402 13:57:02.339231 4732 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6beef2fa99836ab6f985ec458e30c6e22b8f1d0b42722462a9fe13d02e226853"} pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Apr 02 13:57:02 crc kubenswrapper[4732]: I0402 13:57:02.339506 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" containerName="machine-config-daemon" containerID="cri-o://6beef2fa99836ab6f985ec458e30c6e22b8f1d0b42722462a9fe13d02e226853" gracePeriod=600 Apr 02 13:57:03 crc kubenswrapper[4732]: I0402 13:57:03.348064 4732 generic.go:334] "Generic (PLEG): container finished" podID="38409e5e-4545-49da-8f6c-4bfb30582878" containerID="6beef2fa99836ab6f985ec458e30c6e22b8f1d0b42722462a9fe13d02e226853" exitCode=0 Apr 02 13:57:03 crc kubenswrapper[4732]: I0402 13:57:03.348131 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" event={"ID":"38409e5e-4545-49da-8f6c-4bfb30582878","Type":"ContainerDied","Data":"6beef2fa99836ab6f985ec458e30c6e22b8f1d0b42722462a9fe13d02e226853"} Apr 02 13:57:03 crc kubenswrapper[4732]: I0402 13:57:03.348401 4732 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" event={"ID":"38409e5e-4545-49da-8f6c-4bfb30582878","Type":"ContainerStarted","Data":"7fb2687018e193fb92c41619c313936d4cbab14821cf21277c10428a796150c1"} Apr 02 13:57:03 crc kubenswrapper[4732]: I0402 13:57:03.348425 4732 scope.go:117] "RemoveContainer" containerID="7878e048a4d64163e8a31b7ae9f684fbec512dadbd638965377c01f618d3ee60" Apr 02 13:57:03 crc kubenswrapper[4732]: I0402 13:57:03.776201 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-n9mhr"] Apr 02 13:57:03 crc kubenswrapper[4732]: I0402 13:57:03.777686 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n9mhr" Apr 02 13:57:03 crc kubenswrapper[4732]: I0402 13:57:03.790010 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-n9mhr"] Apr 02 13:57:03 crc kubenswrapper[4732]: I0402 13:57:03.977337 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e050bf65-a203-40f8-9445-fbdcb87b7e91-catalog-content\") pod \"redhat-marketplace-n9mhr\" (UID: \"e050bf65-a203-40f8-9445-fbdcb87b7e91\") " pod="openshift-marketplace/redhat-marketplace-n9mhr" Apr 02 13:57:03 crc kubenswrapper[4732]: I0402 13:57:03.977453 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4tkv\" (UniqueName: \"kubernetes.io/projected/e050bf65-a203-40f8-9445-fbdcb87b7e91-kube-api-access-m4tkv\") pod \"redhat-marketplace-n9mhr\" (UID: \"e050bf65-a203-40f8-9445-fbdcb87b7e91\") " pod="openshift-marketplace/redhat-marketplace-n9mhr" Apr 02 13:57:03 crc kubenswrapper[4732]: I0402 13:57:03.977476 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/e050bf65-a203-40f8-9445-fbdcb87b7e91-utilities\") pod \"redhat-marketplace-n9mhr\" (UID: \"e050bf65-a203-40f8-9445-fbdcb87b7e91\") " pod="openshift-marketplace/redhat-marketplace-n9mhr" Apr 02 13:57:04 crc kubenswrapper[4732]: I0402 13:57:04.165142 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4tkv\" (UniqueName: \"kubernetes.io/projected/e050bf65-a203-40f8-9445-fbdcb87b7e91-kube-api-access-m4tkv\") pod \"redhat-marketplace-n9mhr\" (UID: \"e050bf65-a203-40f8-9445-fbdcb87b7e91\") " pod="openshift-marketplace/redhat-marketplace-n9mhr" Apr 02 13:57:04 crc kubenswrapper[4732]: I0402 13:57:04.165460 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e050bf65-a203-40f8-9445-fbdcb87b7e91-utilities\") pod \"redhat-marketplace-n9mhr\" (UID: \"e050bf65-a203-40f8-9445-fbdcb87b7e91\") " pod="openshift-marketplace/redhat-marketplace-n9mhr" Apr 02 13:57:04 crc kubenswrapper[4732]: I0402 13:57:04.165493 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e050bf65-a203-40f8-9445-fbdcb87b7e91-catalog-content\") pod \"redhat-marketplace-n9mhr\" (UID: \"e050bf65-a203-40f8-9445-fbdcb87b7e91\") " pod="openshift-marketplace/redhat-marketplace-n9mhr" Apr 02 13:57:04 crc kubenswrapper[4732]: I0402 13:57:04.166017 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e050bf65-a203-40f8-9445-fbdcb87b7e91-catalog-content\") pod \"redhat-marketplace-n9mhr\" (UID: \"e050bf65-a203-40f8-9445-fbdcb87b7e91\") " pod="openshift-marketplace/redhat-marketplace-n9mhr" Apr 02 13:57:04 crc kubenswrapper[4732]: I0402 13:57:04.166306 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/e050bf65-a203-40f8-9445-fbdcb87b7e91-utilities\") pod \"redhat-marketplace-n9mhr\" (UID: \"e050bf65-a203-40f8-9445-fbdcb87b7e91\") " pod="openshift-marketplace/redhat-marketplace-n9mhr" Apr 02 13:57:04 crc kubenswrapper[4732]: I0402 13:57:04.207073 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4tkv\" (UniqueName: \"kubernetes.io/projected/e050bf65-a203-40f8-9445-fbdcb87b7e91-kube-api-access-m4tkv\") pod \"redhat-marketplace-n9mhr\" (UID: \"e050bf65-a203-40f8-9445-fbdcb87b7e91\") " pod="openshift-marketplace/redhat-marketplace-n9mhr" Apr 02 13:57:04 crc kubenswrapper[4732]: I0402 13:57:04.395737 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n9mhr" Apr 02 13:57:04 crc kubenswrapper[4732]: I0402 13:57:04.958273 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-n9mhr"] Apr 02 13:57:04 crc kubenswrapper[4732]: W0402 13:57:04.963253 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode050bf65_a203_40f8_9445_fbdcb87b7e91.slice/crio-7bcad61ad3ae14eae9a9ca84d4c08ca7f9c84e3fa7fe8ac057586efd691e3b58 WatchSource:0}: Error finding container 7bcad61ad3ae14eae9a9ca84d4c08ca7f9c84e3fa7fe8ac057586efd691e3b58: Status 404 returned error can't find the container with id 7bcad61ad3ae14eae9a9ca84d4c08ca7f9c84e3fa7fe8ac057586efd691e3b58 Apr 02 13:57:04 crc kubenswrapper[4732]: I0402 13:57:04.994868 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9ss2c" Apr 02 13:57:04 crc kubenswrapper[4732]: I0402 13:57:04.995124 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9ss2c" Apr 02 13:57:05 crc kubenswrapper[4732]: I0402 13:57:05.058843 4732 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/community-operators-9ss2c" Apr 02 13:57:05 crc kubenswrapper[4732]: I0402 13:57:05.364498 4732 generic.go:334] "Generic (PLEG): container finished" podID="e050bf65-a203-40f8-9445-fbdcb87b7e91" containerID="c0edd05d6ebe2c4e4ba7d14a06d43b65488f00e63d2ea851f836d016d04cbff1" exitCode=0 Apr 02 13:57:05 crc kubenswrapper[4732]: I0402 13:57:05.364603 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n9mhr" event={"ID":"e050bf65-a203-40f8-9445-fbdcb87b7e91","Type":"ContainerDied","Data":"c0edd05d6ebe2c4e4ba7d14a06d43b65488f00e63d2ea851f836d016d04cbff1"} Apr 02 13:57:05 crc kubenswrapper[4732]: I0402 13:57:05.364665 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n9mhr" event={"ID":"e050bf65-a203-40f8-9445-fbdcb87b7e91","Type":"ContainerStarted","Data":"7bcad61ad3ae14eae9a9ca84d4c08ca7f9c84e3fa7fe8ac057586efd691e3b58"} Apr 02 13:57:05 crc kubenswrapper[4732]: I0402 13:57:05.367537 4732 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 02 13:57:05 crc kubenswrapper[4732]: I0402 13:57:05.425989 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9ss2c" Apr 02 13:57:06 crc kubenswrapper[4732]: I0402 13:57:06.373874 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n9mhr" event={"ID":"e050bf65-a203-40f8-9445-fbdcb87b7e91","Type":"ContainerStarted","Data":"1742237420f158c0a12e33a4c5859e58ad7618b20fcaf13f6ccbd243fb2e2c38"} Apr 02 13:57:07 crc kubenswrapper[4732]: I0402 13:57:07.354724 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9ss2c"] Apr 02 13:57:07 crc kubenswrapper[4732]: I0402 13:57:07.381009 4732 generic.go:334] "Generic (PLEG): container finished" 
podID="e050bf65-a203-40f8-9445-fbdcb87b7e91" containerID="1742237420f158c0a12e33a4c5859e58ad7618b20fcaf13f6ccbd243fb2e2c38" exitCode=0 Apr 02 13:57:07 crc kubenswrapper[4732]: I0402 13:57:07.381116 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n9mhr" event={"ID":"e050bf65-a203-40f8-9445-fbdcb87b7e91","Type":"ContainerDied","Data":"1742237420f158c0a12e33a4c5859e58ad7618b20fcaf13f6ccbd243fb2e2c38"} Apr 02 13:57:07 crc kubenswrapper[4732]: I0402 13:57:07.381209 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9ss2c" podUID="3f0b4612-5b96-4c2c-a533-873713bb8e56" containerName="registry-server" containerID="cri-o://1a2b0d4093d95aefbe87e4c30c57a798d0676b24f4106da1a77a10c8de2e5f49" gracePeriod=2 Apr 02 13:57:07 crc kubenswrapper[4732]: I0402 13:57:07.844726 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9ss2c" Apr 02 13:57:08 crc kubenswrapper[4732]: I0402 13:57:08.015212 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8mtv\" (UniqueName: \"kubernetes.io/projected/3f0b4612-5b96-4c2c-a533-873713bb8e56-kube-api-access-s8mtv\") pod \"3f0b4612-5b96-4c2c-a533-873713bb8e56\" (UID: \"3f0b4612-5b96-4c2c-a533-873713bb8e56\") " Apr 02 13:57:08 crc kubenswrapper[4732]: I0402 13:57:08.015322 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f0b4612-5b96-4c2c-a533-873713bb8e56-utilities\") pod \"3f0b4612-5b96-4c2c-a533-873713bb8e56\" (UID: \"3f0b4612-5b96-4c2c-a533-873713bb8e56\") " Apr 02 13:57:08 crc kubenswrapper[4732]: I0402 13:57:08.015418 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f0b4612-5b96-4c2c-a533-873713bb8e56-catalog-content\") 
pod \"3f0b4612-5b96-4c2c-a533-873713bb8e56\" (UID: \"3f0b4612-5b96-4c2c-a533-873713bb8e56\") " Apr 02 13:57:08 crc kubenswrapper[4732]: I0402 13:57:08.016550 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f0b4612-5b96-4c2c-a533-873713bb8e56-utilities" (OuterVolumeSpecName: "utilities") pod "3f0b4612-5b96-4c2c-a533-873713bb8e56" (UID: "3f0b4612-5b96-4c2c-a533-873713bb8e56"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 13:57:08 crc kubenswrapper[4732]: I0402 13:57:08.016980 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f0b4612-5b96-4c2c-a533-873713bb8e56-utilities\") on node \"crc\" DevicePath \"\"" Apr 02 13:57:08 crc kubenswrapper[4732]: I0402 13:57:08.034420 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f0b4612-5b96-4c2c-a533-873713bb8e56-kube-api-access-s8mtv" (OuterVolumeSpecName: "kube-api-access-s8mtv") pod "3f0b4612-5b96-4c2c-a533-873713bb8e56" (UID: "3f0b4612-5b96-4c2c-a533-873713bb8e56"). InnerVolumeSpecName "kube-api-access-s8mtv". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:57:08 crc kubenswrapper[4732]: I0402 13:57:08.091390 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f0b4612-5b96-4c2c-a533-873713bb8e56-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3f0b4612-5b96-4c2c-a533-873713bb8e56" (UID: "3f0b4612-5b96-4c2c-a533-873713bb8e56"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 13:57:08 crc kubenswrapper[4732]: I0402 13:57:08.120499 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8mtv\" (UniqueName: \"kubernetes.io/projected/3f0b4612-5b96-4c2c-a533-873713bb8e56-kube-api-access-s8mtv\") on node \"crc\" DevicePath \"\"" Apr 02 13:57:08 crc kubenswrapper[4732]: I0402 13:57:08.120553 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f0b4612-5b96-4c2c-a533-873713bb8e56-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 02 13:57:08 crc kubenswrapper[4732]: I0402 13:57:08.392664 4732 generic.go:334] "Generic (PLEG): container finished" podID="3f0b4612-5b96-4c2c-a533-873713bb8e56" containerID="1a2b0d4093d95aefbe87e4c30c57a798d0676b24f4106da1a77a10c8de2e5f49" exitCode=0 Apr 02 13:57:08 crc kubenswrapper[4732]: I0402 13:57:08.392715 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9ss2c" event={"ID":"3f0b4612-5b96-4c2c-a533-873713bb8e56","Type":"ContainerDied","Data":"1a2b0d4093d95aefbe87e4c30c57a798d0676b24f4106da1a77a10c8de2e5f49"} Apr 02 13:57:08 crc kubenswrapper[4732]: I0402 13:57:08.392770 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9ss2c" event={"ID":"3f0b4612-5b96-4c2c-a533-873713bb8e56","Type":"ContainerDied","Data":"68c004896b149b2e72905493aa2d1e561d51da4e8c4665339c3013b4c6bfd9dd"} Apr 02 13:57:08 crc kubenswrapper[4732]: I0402 13:57:08.392783 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9ss2c" Apr 02 13:57:08 crc kubenswrapper[4732]: I0402 13:57:08.392789 4732 scope.go:117] "RemoveContainer" containerID="1a2b0d4093d95aefbe87e4c30c57a798d0676b24f4106da1a77a10c8de2e5f49" Apr 02 13:57:08 crc kubenswrapper[4732]: I0402 13:57:08.410570 4732 scope.go:117] "RemoveContainer" containerID="b2492831a9a05c6b600f0dfbc6599ac8766f278e618843b65219e11562e96621" Apr 02 13:57:08 crc kubenswrapper[4732]: I0402 13:57:08.427541 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9ss2c"] Apr 02 13:57:08 crc kubenswrapper[4732]: I0402 13:57:08.432749 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9ss2c"] Apr 02 13:57:08 crc kubenswrapper[4732]: I0402 13:57:08.441283 4732 scope.go:117] "RemoveContainer" containerID="72732e6390cc6f0ad704dc9674c554f2c0e11daf6ad7d7bb1fe75cca07edb911" Apr 02 13:57:08 crc kubenswrapper[4732]: I0402 13:57:08.460982 4732 scope.go:117] "RemoveContainer" containerID="1a2b0d4093d95aefbe87e4c30c57a798d0676b24f4106da1a77a10c8de2e5f49" Apr 02 13:57:08 crc kubenswrapper[4732]: E0402 13:57:08.461409 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a2b0d4093d95aefbe87e4c30c57a798d0676b24f4106da1a77a10c8de2e5f49\": container with ID starting with 1a2b0d4093d95aefbe87e4c30c57a798d0676b24f4106da1a77a10c8de2e5f49 not found: ID does not exist" containerID="1a2b0d4093d95aefbe87e4c30c57a798d0676b24f4106da1a77a10c8de2e5f49" Apr 02 13:57:08 crc kubenswrapper[4732]: I0402 13:57:08.461471 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a2b0d4093d95aefbe87e4c30c57a798d0676b24f4106da1a77a10c8de2e5f49"} err="failed to get container status \"1a2b0d4093d95aefbe87e4c30c57a798d0676b24f4106da1a77a10c8de2e5f49\": rpc error: code = NotFound desc = could not find 
container \"1a2b0d4093d95aefbe87e4c30c57a798d0676b24f4106da1a77a10c8de2e5f49\": container with ID starting with 1a2b0d4093d95aefbe87e4c30c57a798d0676b24f4106da1a77a10c8de2e5f49 not found: ID does not exist" Apr 02 13:57:08 crc kubenswrapper[4732]: I0402 13:57:08.461496 4732 scope.go:117] "RemoveContainer" containerID="b2492831a9a05c6b600f0dfbc6599ac8766f278e618843b65219e11562e96621" Apr 02 13:57:08 crc kubenswrapper[4732]: E0402 13:57:08.462033 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2492831a9a05c6b600f0dfbc6599ac8766f278e618843b65219e11562e96621\": container with ID starting with b2492831a9a05c6b600f0dfbc6599ac8766f278e618843b65219e11562e96621 not found: ID does not exist" containerID="b2492831a9a05c6b600f0dfbc6599ac8766f278e618843b65219e11562e96621" Apr 02 13:57:08 crc kubenswrapper[4732]: I0402 13:57:08.462081 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2492831a9a05c6b600f0dfbc6599ac8766f278e618843b65219e11562e96621"} err="failed to get container status \"b2492831a9a05c6b600f0dfbc6599ac8766f278e618843b65219e11562e96621\": rpc error: code = NotFound desc = could not find container \"b2492831a9a05c6b600f0dfbc6599ac8766f278e618843b65219e11562e96621\": container with ID starting with b2492831a9a05c6b600f0dfbc6599ac8766f278e618843b65219e11562e96621 not found: ID does not exist" Apr 02 13:57:08 crc kubenswrapper[4732]: I0402 13:57:08.462124 4732 scope.go:117] "RemoveContainer" containerID="72732e6390cc6f0ad704dc9674c554f2c0e11daf6ad7d7bb1fe75cca07edb911" Apr 02 13:57:08 crc kubenswrapper[4732]: E0402 13:57:08.462416 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72732e6390cc6f0ad704dc9674c554f2c0e11daf6ad7d7bb1fe75cca07edb911\": container with ID starting with 72732e6390cc6f0ad704dc9674c554f2c0e11daf6ad7d7bb1fe75cca07edb911 not found: ID does 
not exist" containerID="72732e6390cc6f0ad704dc9674c554f2c0e11daf6ad7d7bb1fe75cca07edb911" Apr 02 13:57:08 crc kubenswrapper[4732]: I0402 13:57:08.462439 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72732e6390cc6f0ad704dc9674c554f2c0e11daf6ad7d7bb1fe75cca07edb911"} err="failed to get container status \"72732e6390cc6f0ad704dc9674c554f2c0e11daf6ad7d7bb1fe75cca07edb911\": rpc error: code = NotFound desc = could not find container \"72732e6390cc6f0ad704dc9674c554f2c0e11daf6ad7d7bb1fe75cca07edb911\": container with ID starting with 72732e6390cc6f0ad704dc9674c554f2c0e11daf6ad7d7bb1fe75cca07edb911 not found: ID does not exist" Apr 02 13:57:08 crc kubenswrapper[4732]: I0402 13:57:08.689297 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f0b4612-5b96-4c2c-a533-873713bb8e56" path="/var/lib/kubelet/pods/3f0b4612-5b96-4c2c-a533-873713bb8e56/volumes" Apr 02 13:57:09 crc kubenswrapper[4732]: I0402 13:57:09.406291 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n9mhr" event={"ID":"e050bf65-a203-40f8-9445-fbdcb87b7e91","Type":"ContainerStarted","Data":"86262be09f38d058bf81202ad073b865ec76dce7a4781cd72e14d8ca4c54438a"} Apr 02 13:57:09 crc kubenswrapper[4732]: I0402 13:57:09.431442 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-n9mhr" podStartSLOduration=3.616509827 podStartE2EDuration="6.431422046s" podCreationTimestamp="2026-04-02 13:57:03 +0000 UTC" firstStartedPulling="2026-04-02 13:57:05.367289986 +0000 UTC m=+1182.271697549" lastFinishedPulling="2026-04-02 13:57:08.182202215 +0000 UTC m=+1185.086609768" observedRunningTime="2026-04-02 13:57:09.426300078 +0000 UTC m=+1186.330707631" watchObservedRunningTime="2026-04-02 13:57:09.431422046 +0000 UTC m=+1186.335829599" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.295757 4732 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d46cccfb9-65vqg"] Apr 02 13:57:10 crc kubenswrapper[4732]: E0402 13:57:10.296056 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f0b4612-5b96-4c2c-a533-873713bb8e56" containerName="registry-server" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.296078 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f0b4612-5b96-4c2c-a533-873713bb8e56" containerName="registry-server" Apr 02 13:57:10 crc kubenswrapper[4732]: E0402 13:57:10.296090 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f0b4612-5b96-4c2c-a533-873713bb8e56" containerName="extract-utilities" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.296216 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f0b4612-5b96-4c2c-a533-873713bb8e56" containerName="extract-utilities" Apr 02 13:57:10 crc kubenswrapper[4732]: E0402 13:57:10.296225 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f0b4612-5b96-4c2c-a533-873713bb8e56" containerName="extract-content" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.296231 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f0b4612-5b96-4c2c-a533-873713bb8e56" containerName="extract-content" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.296402 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f0b4612-5b96-4c2c-a533-873713bb8e56" containerName="registry-server" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.296882 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5d46cccfb9-65vqg" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.306065 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-jqlfb" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.313447 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d46cccfb9-65vqg"] Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.320268 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-86644c9c9c-nhxqn"] Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.321405 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-86644c9c9c-nhxqn" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.324642 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-vkrqs" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.334762 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-58689c6fff-47xk7"] Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.335878 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-58689c6fff-47xk7" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.338707 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-j5mx2" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.350038 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28m5z\" (UniqueName: \"kubernetes.io/projected/4c76cc17-ab86-4c9c-9438-7e72e2ce895f-kube-api-access-28m5z\") pod \"designate-operator-controller-manager-58689c6fff-47xk7\" (UID: \"4c76cc17-ab86-4c9c-9438-7e72e2ce895f\") " pod="openstack-operators/designate-operator-controller-manager-58689c6fff-47xk7" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.350109 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82f2g\" (UniqueName: \"kubernetes.io/projected/d925f7c0-af6d-49d5-a09f-82afb7c58a15-kube-api-access-82f2g\") pod \"cinder-operator-controller-manager-5d46cccfb9-65vqg\" (UID: \"d925f7c0-af6d-49d5-a09f-82afb7c58a15\") " pod="openstack-operators/cinder-operator-controller-manager-5d46cccfb9-65vqg" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.350130 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sbbp\" (UniqueName: \"kubernetes.io/projected/08d5eea8-7c67-4aa1-ad91-ab1c60214872-kube-api-access-2sbbp\") pod \"barbican-operator-controller-manager-86644c9c9c-nhxqn\" (UID: \"08d5eea8-7c67-4aa1-ad91-ab1c60214872\") " pod="openstack-operators/barbican-operator-controller-manager-86644c9c9c-nhxqn" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.363845 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-648bdc7f99-vt6x9"] Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 
13:57:10.364829 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-648bdc7f99-vt6x9" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.375805 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-xqbxb" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.377847 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-86644c9c9c-nhxqn"] Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.382190 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-58689c6fff-47xk7"] Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.402782 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-8684f86954-xgncs"] Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.403966 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-8684f86954-xgncs" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.409064 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-d8cpw" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.411685 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-648bdc7f99-vt6x9"] Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.439999 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-58f79b884c-5q7cz"] Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.441043 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-58f79b884c-5q7cz" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.442686 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6ccfd84cb4-hv8p6"] Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.443520 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6ccfd84cb4-hv8p6" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.448286 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-lvqvs" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.448490 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.448689 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-trgmk" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.451222 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/43f86830-d407-4dc4-9b09-388fb5db82c8-cert\") pod \"infra-operator-controller-manager-58f79b884c-5q7cz\" (UID: \"43f86830-d407-4dc4-9b09-388fb5db82c8\") " pod="openstack-operators/infra-operator-controller-manager-58f79b884c-5q7cz" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.451290 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wn8r\" (UniqueName: \"kubernetes.io/projected/5e74dfe1-0e0f-4b70-8b9a-db645eb40e05-kube-api-access-2wn8r\") pod \"glance-operator-controller-manager-648bdc7f99-vt6x9\" (UID: \"5e74dfe1-0e0f-4b70-8b9a-db645eb40e05\") " 
pod="openstack-operators/glance-operator-controller-manager-648bdc7f99-vt6x9" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.451332 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77gjp\" (UniqueName: \"kubernetes.io/projected/12296214-f552-4868-8884-66c241eb973b-kube-api-access-77gjp\") pod \"heat-operator-controller-manager-8684f86954-xgncs\" (UID: \"12296214-f552-4868-8884-66c241eb973b\") " pod="openstack-operators/heat-operator-controller-manager-8684f86954-xgncs" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.451387 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znv82\" (UniqueName: \"kubernetes.io/projected/879197e5-dc13-4c17-b8ac-7e51a97aa0f2-kube-api-access-znv82\") pod \"horizon-operator-controller-manager-6ccfd84cb4-hv8p6\" (UID: \"879197e5-dc13-4c17-b8ac-7e51a97aa0f2\") " pod="openstack-operators/horizon-operator-controller-manager-6ccfd84cb4-hv8p6" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.451427 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28m5z\" (UniqueName: \"kubernetes.io/projected/4c76cc17-ab86-4c9c-9438-7e72e2ce895f-kube-api-access-28m5z\") pod \"designate-operator-controller-manager-58689c6fff-47xk7\" (UID: \"4c76cc17-ab86-4c9c-9438-7e72e2ce895f\") " pod="openstack-operators/designate-operator-controller-manager-58689c6fff-47xk7" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.451475 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4r9t\" (UniqueName: \"kubernetes.io/projected/43f86830-d407-4dc4-9b09-388fb5db82c8-kube-api-access-z4r9t\") pod \"infra-operator-controller-manager-58f79b884c-5q7cz\" (UID: \"43f86830-d407-4dc4-9b09-388fb5db82c8\") " pod="openstack-operators/infra-operator-controller-manager-58f79b884c-5q7cz" Apr 02 13:57:10 crc 
kubenswrapper[4732]: I0402 13:57:10.451530 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82f2g\" (UniqueName: \"kubernetes.io/projected/d925f7c0-af6d-49d5-a09f-82afb7c58a15-kube-api-access-82f2g\") pod \"cinder-operator-controller-manager-5d46cccfb9-65vqg\" (UID: \"d925f7c0-af6d-49d5-a09f-82afb7c58a15\") " pod="openstack-operators/cinder-operator-controller-manager-5d46cccfb9-65vqg" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.451555 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2sbbp\" (UniqueName: \"kubernetes.io/projected/08d5eea8-7c67-4aa1-ad91-ab1c60214872-kube-api-access-2sbbp\") pod \"barbican-operator-controller-manager-86644c9c9c-nhxqn\" (UID: \"08d5eea8-7c67-4aa1-ad91-ab1c60214872\") " pod="openstack-operators/barbican-operator-controller-manager-86644c9c9c-nhxqn" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.464672 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-8684f86954-xgncs"] Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.473415 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-58f79b884c-5q7cz"] Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.488839 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6ccfd84cb4-hv8p6"] Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.513909 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28m5z\" (UniqueName: \"kubernetes.io/projected/4c76cc17-ab86-4c9c-9438-7e72e2ce895f-kube-api-access-28m5z\") pod \"designate-operator-controller-manager-58689c6fff-47xk7\" (UID: \"4c76cc17-ab86-4c9c-9438-7e72e2ce895f\") " pod="openstack-operators/designate-operator-controller-manager-58689c6fff-47xk7" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 
13:57:10.522021 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sbbp\" (UniqueName: \"kubernetes.io/projected/08d5eea8-7c67-4aa1-ad91-ab1c60214872-kube-api-access-2sbbp\") pod \"barbican-operator-controller-manager-86644c9c9c-nhxqn\" (UID: \"08d5eea8-7c67-4aa1-ad91-ab1c60214872\") " pod="openstack-operators/barbican-operator-controller-manager-86644c9c9c-nhxqn" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.534452 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82f2g\" (UniqueName: \"kubernetes.io/projected/d925f7c0-af6d-49d5-a09f-82afb7c58a15-kube-api-access-82f2g\") pod \"cinder-operator-controller-manager-5d46cccfb9-65vqg\" (UID: \"d925f7c0-af6d-49d5-a09f-82afb7c58a15\") " pod="openstack-operators/cinder-operator-controller-manager-5d46cccfb9-65vqg" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.537087 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5f96574b5-nbm76"] Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.538256 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5f96574b5-nbm76" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.548851 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-dbf8bb784-4kz5n"] Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.552548 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-dbf8bb784-4kz5n" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.553425 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4r9t\" (UniqueName: \"kubernetes.io/projected/43f86830-d407-4dc4-9b09-388fb5db82c8-kube-api-access-z4r9t\") pod \"infra-operator-controller-manager-58f79b884c-5q7cz\" (UID: \"43f86830-d407-4dc4-9b09-388fb5db82c8\") " pod="openstack-operators/infra-operator-controller-manager-58f79b884c-5q7cz" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.553530 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/43f86830-d407-4dc4-9b09-388fb5db82c8-cert\") pod \"infra-operator-controller-manager-58f79b884c-5q7cz\" (UID: \"43f86830-d407-4dc4-9b09-388fb5db82c8\") " pod="openstack-operators/infra-operator-controller-manager-58f79b884c-5q7cz" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.553570 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wn8r\" (UniqueName: \"kubernetes.io/projected/5e74dfe1-0e0f-4b70-8b9a-db645eb40e05-kube-api-access-2wn8r\") pod \"glance-operator-controller-manager-648bdc7f99-vt6x9\" (UID: \"5e74dfe1-0e0f-4b70-8b9a-db645eb40e05\") " pod="openstack-operators/glance-operator-controller-manager-648bdc7f99-vt6x9" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.553602 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77gjp\" (UniqueName: \"kubernetes.io/projected/12296214-f552-4868-8884-66c241eb973b-kube-api-access-77gjp\") pod \"heat-operator-controller-manager-8684f86954-xgncs\" (UID: \"12296214-f552-4868-8884-66c241eb973b\") " pod="openstack-operators/heat-operator-controller-manager-8684f86954-xgncs" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.553657 4732 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-znv82\" (UniqueName: \"kubernetes.io/projected/879197e5-dc13-4c17-b8ac-7e51a97aa0f2-kube-api-access-znv82\") pod \"horizon-operator-controller-manager-6ccfd84cb4-hv8p6\" (UID: \"879197e5-dc13-4c17-b8ac-7e51a97aa0f2\") " pod="openstack-operators/horizon-operator-controller-manager-6ccfd84cb4-hv8p6" Apr 02 13:57:10 crc kubenswrapper[4732]: E0402 13:57:10.554239 4732 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Apr 02 13:57:10 crc kubenswrapper[4732]: E0402 13:57:10.554302 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/43f86830-d407-4dc4-9b09-388fb5db82c8-cert podName:43f86830-d407-4dc4-9b09-388fb5db82c8 nodeName:}" failed. No retries permitted until 2026-04-02 13:57:11.05427888 +0000 UTC m=+1187.958686433 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/43f86830-d407-4dc4-9b09-388fb5db82c8-cert") pod "infra-operator-controller-manager-58f79b884c-5q7cz" (UID: "43f86830-d407-4dc4-9b09-388fb5db82c8") : secret "infra-operator-webhook-server-cert" not found Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.558233 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-bs4mv" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.566584 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-8zrts" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.568689 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5f96574b5-nbm76"] Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.580903 4732 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/manila-operator-controller-manager-6b7497dc59-ph5hk"] Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.585369 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6b7497dc59-ph5hk" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.587175 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wn8r\" (UniqueName: \"kubernetes.io/projected/5e74dfe1-0e0f-4b70-8b9a-db645eb40e05-kube-api-access-2wn8r\") pod \"glance-operator-controller-manager-648bdc7f99-vt6x9\" (UID: \"5e74dfe1-0e0f-4b70-8b9a-db645eb40e05\") " pod="openstack-operators/glance-operator-controller-manager-648bdc7f99-vt6x9" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.588805 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-9sz57" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.593039 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znv82\" (UniqueName: \"kubernetes.io/projected/879197e5-dc13-4c17-b8ac-7e51a97aa0f2-kube-api-access-znv82\") pod \"horizon-operator-controller-manager-6ccfd84cb4-hv8p6\" (UID: \"879197e5-dc13-4c17-b8ac-7e51a97aa0f2\") " pod="openstack-operators/horizon-operator-controller-manager-6ccfd84cb4-hv8p6" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.598283 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77gjp\" (UniqueName: \"kubernetes.io/projected/12296214-f552-4868-8884-66c241eb973b-kube-api-access-77gjp\") pod \"heat-operator-controller-manager-8684f86954-xgncs\" (UID: \"12296214-f552-4868-8884-66c241eb973b\") " pod="openstack-operators/heat-operator-controller-manager-8684f86954-xgncs" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.601079 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/keystone-operator-controller-manager-dbf8bb784-4kz5n"] Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.613457 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4r9t\" (UniqueName: \"kubernetes.io/projected/43f86830-d407-4dc4-9b09-388fb5db82c8-kube-api-access-z4r9t\") pod \"infra-operator-controller-manager-58f79b884c-5q7cz\" (UID: \"43f86830-d407-4dc4-9b09-388fb5db82c8\") " pod="openstack-operators/infra-operator-controller-manager-58f79b884c-5q7cz" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.618742 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6554749d88-4cwml"] Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.619783 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6554749d88-4cwml" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.620559 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5d46cccfb9-65vqg" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.626256 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-z9cng" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.646311 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6b7497dc59-ph5hk"] Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.651657 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-86644c9c9c-nhxqn" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.655935 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hws7p\" (UniqueName: \"kubernetes.io/projected/6b75349c-23b4-4dc0-914f-f1dc82b12e18-kube-api-access-hws7p\") pod \"keystone-operator-controller-manager-dbf8bb784-4kz5n\" (UID: \"6b75349c-23b4-4dc0-914f-f1dc82b12e18\") " pod="openstack-operators/keystone-operator-controller-manager-dbf8bb784-4kz5n" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.655986 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9gkr\" (UniqueName: \"kubernetes.io/projected/e46394c5-fd9e-4c0d-8e78-96723f5931d9-kube-api-access-k9gkr\") pod \"ironic-operator-controller-manager-5f96574b5-nbm76\" (UID: \"e46394c5-fd9e-4c0d-8e78-96723f5931d9\") " pod="openstack-operators/ironic-operator-controller-manager-5f96574b5-nbm76" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.656270 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6554749d88-4cwml"] Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.679437 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-58689c6fff-47xk7" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.706198 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-648bdc7f99-vt6x9" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.734233 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-8684f86954-xgncs" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.760167 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwvln\" (UniqueName: \"kubernetes.io/projected/338e9bfc-709f-49f2-8456-9dbe8b815382-kube-api-access-jwvln\") pod \"mariadb-operator-controller-manager-6554749d88-4cwml\" (UID: \"338e9bfc-709f-49f2-8456-9dbe8b815382\") " pod="openstack-operators/mariadb-operator-controller-manager-6554749d88-4cwml" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.760249 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hws7p\" (UniqueName: \"kubernetes.io/projected/6b75349c-23b4-4dc0-914f-f1dc82b12e18-kube-api-access-hws7p\") pod \"keystone-operator-controller-manager-dbf8bb784-4kz5n\" (UID: \"6b75349c-23b4-4dc0-914f-f1dc82b12e18\") " pod="openstack-operators/keystone-operator-controller-manager-dbf8bb784-4kz5n" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.760268 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbpph\" (UniqueName: \"kubernetes.io/projected/084daf4c-82c9-42e7-8eb9-3ae4658c1742-kube-api-access-bbpph\") pod \"manila-operator-controller-manager-6b7497dc59-ph5hk\" (UID: \"084daf4c-82c9-42e7-8eb9-3ae4658c1742\") " pod="openstack-operators/manila-operator-controller-manager-6b7497dc59-ph5hk" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.760294 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9gkr\" (UniqueName: \"kubernetes.io/projected/e46394c5-fd9e-4c0d-8e78-96723f5931d9-kube-api-access-k9gkr\") pod \"ironic-operator-controller-manager-5f96574b5-nbm76\" (UID: \"e46394c5-fd9e-4c0d-8e78-96723f5931d9\") " pod="openstack-operators/ironic-operator-controller-manager-5f96574b5-nbm76" Apr 02 13:57:10 crc 
kubenswrapper[4732]: I0402 13:57:10.771512 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-z49v9"] Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.772756 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d6f9fd68c-mvbqt"] Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.772893 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-z49v9" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.773409 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-z49v9"] Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.773525 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d6f9fd68c-mvbqt" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.773562 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7594f57946-2rvck"] Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.774298 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d6f9fd68c-mvbqt"] Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.774325 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7594f57946-2rvck"] Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.774335 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-b7b49d78f-c6jbh"] Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.774454 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7594f57946-2rvck" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.774848 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-84464c7c78-tgrmk"] Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.775032 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-b7b49d78f-c6jbh" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.775565 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-84464c7c78-tgrmk" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.778202 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-gjrmz" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.778538 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.778745 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-nd28q" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.778922 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-2565f" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.778949 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-fb22z" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.779089 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-d2jqr" Apr 02 13:57:10 crc 
kubenswrapper[4732]: I0402 13:57:10.790253 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hws7p\" (UniqueName: \"kubernetes.io/projected/6b75349c-23b4-4dc0-914f-f1dc82b12e18-kube-api-access-hws7p\") pod \"keystone-operator-controller-manager-dbf8bb784-4kz5n\" (UID: \"6b75349c-23b4-4dc0-914f-f1dc82b12e18\") " pod="openstack-operators/keystone-operator-controller-manager-dbf8bb784-4kz5n" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.791419 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-84464c7c78-tgrmk"] Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.792800 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6ccfd84cb4-hv8p6" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.805676 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9gkr\" (UniqueName: \"kubernetes.io/projected/e46394c5-fd9e-4c0d-8e78-96723f5931d9-kube-api-access-k9gkr\") pod \"ironic-operator-controller-manager-5f96574b5-nbm76\" (UID: \"e46394c5-fd9e-4c0d-8e78-96723f5931d9\") " pod="openstack-operators/ironic-operator-controller-manager-5f96574b5-nbm76" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.810304 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-b7b49d78f-c6jbh"] Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.854137 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-559d8fdb6b-mfzk4"] Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.855296 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-559d8fdb6b-mfzk4" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.863099 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mfzt\" (UniqueName: \"kubernetes.io/projected/d5a07520-1380-45b1-a00a-7148b158711e-kube-api-access-2mfzt\") pod \"openstack-baremetal-operator-controller-manager-b7b49d78f-c6jbh\" (UID: \"d5a07520-1380-45b1-a00a-7148b158711e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-b7b49d78f-c6jbh" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.864406 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4cgl\" (UniqueName: \"kubernetes.io/projected/6296d461-d333-4b9c-a082-e48db64bdd96-kube-api-access-g4cgl\") pod \"placement-operator-controller-manager-559d8fdb6b-mfzk4\" (UID: \"6296d461-d333-4b9c-a082-e48db64bdd96\") " pod="openstack-operators/placement-operator-controller-manager-559d8fdb6b-mfzk4" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.864521 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwvln\" (UniqueName: \"kubernetes.io/projected/338e9bfc-709f-49f2-8456-9dbe8b815382-kube-api-access-jwvln\") pod \"mariadb-operator-controller-manager-6554749d88-4cwml\" (UID: \"338e9bfc-709f-49f2-8456-9dbe8b815382\") " pod="openstack-operators/mariadb-operator-controller-manager-6554749d88-4cwml" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.864565 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drshx\" (UniqueName: \"kubernetes.io/projected/b49d6074-a4b1-4658-b6b8-95bfe63163b0-kube-api-access-drshx\") pod \"nova-operator-controller-manager-5d6f9fd68c-mvbqt\" (UID: \"b49d6074-a4b1-4658-b6b8-95bfe63163b0\") " 
pod="openstack-operators/nova-operator-controller-manager-5d6f9fd68c-mvbqt" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.864595 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnhx9\" (UniqueName: \"kubernetes.io/projected/5319722c-7913-4dcd-a03d-dc7a5040b434-kube-api-access-qnhx9\") pod \"octavia-operator-controller-manager-7594f57946-2rvck\" (UID: \"5319722c-7913-4dcd-a03d-dc7a5040b434\") " pod="openstack-operators/octavia-operator-controller-manager-7594f57946-2rvck" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.865488 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dx6q9\" (UniqueName: \"kubernetes.io/projected/1c68b230-3f85-41f9-a6ed-7da1d0738748-kube-api-access-dx6q9\") pod \"neutron-operator-controller-manager-767865f676-z49v9\" (UID: \"1c68b230-3f85-41f9-a6ed-7da1d0738748\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-z49v9" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.865525 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbpph\" (UniqueName: \"kubernetes.io/projected/084daf4c-82c9-42e7-8eb9-3ae4658c1742-kube-api-access-bbpph\") pod \"manila-operator-controller-manager-6b7497dc59-ph5hk\" (UID: \"084daf4c-82c9-42e7-8eb9-3ae4658c1742\") " pod="openstack-operators/manila-operator-controller-manager-6b7497dc59-ph5hk" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.865558 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nv2z\" (UniqueName: \"kubernetes.io/projected/bec355a9-c60e-4480-a32c-f1a43ef27131-kube-api-access-8nv2z\") pod \"ovn-operator-controller-manager-84464c7c78-tgrmk\" (UID: \"bec355a9-c60e-4480-a32c-f1a43ef27131\") " pod="openstack-operators/ovn-operator-controller-manager-84464c7c78-tgrmk" Apr 02 13:57:10 crc 
kubenswrapper[4732]: I0402 13:57:10.865592 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d5a07520-1380-45b1-a00a-7148b158711e-cert\") pod \"openstack-baremetal-operator-controller-manager-b7b49d78f-c6jbh\" (UID: \"d5a07520-1380-45b1-a00a-7148b158711e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-b7b49d78f-c6jbh" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.869076 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-fbdcf7f7b-bw2df"] Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.883058 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-zlzhd" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.895875 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-fbdcf7f7b-bw2df" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.900730 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-559d8fdb6b-mfzk4"] Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.903625 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-xvwg6" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.912173 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwvln\" (UniqueName: \"kubernetes.io/projected/338e9bfc-709f-49f2-8456-9dbe8b815382-kube-api-access-jwvln\") pod \"mariadb-operator-controller-manager-6554749d88-4cwml\" (UID: \"338e9bfc-709f-49f2-8456-9dbe8b815382\") " pod="openstack-operators/mariadb-operator-controller-manager-6554749d88-4cwml" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.912752 4732 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6f76d4c7-nbxg9"] Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.913485 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6f76d4c7-nbxg9" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.915855 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-g64gz" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.921101 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbpph\" (UniqueName: \"kubernetes.io/projected/084daf4c-82c9-42e7-8eb9-3ae4658c1742-kube-api-access-bbpph\") pod \"manila-operator-controller-manager-6b7497dc59-ph5hk\" (UID: \"084daf4c-82c9-42e7-8eb9-3ae4658c1742\") " pod="openstack-operators/manila-operator-controller-manager-6b7497dc59-ph5hk" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.925300 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-fbdcf7f7b-bw2df"] Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.937268 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6f76d4c7-nbxg9"] Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.964949 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5f96574b5-nbm76" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.975187 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t57kx\" (UniqueName: \"kubernetes.io/projected/5b424dbc-80ac-46ae-90d2-c69fdf4c14d7-kube-api-access-t57kx\") pod \"swift-operator-controller-manager-fbdcf7f7b-bw2df\" (UID: \"5b424dbc-80ac-46ae-90d2-c69fdf4c14d7\") " pod="openstack-operators/swift-operator-controller-manager-fbdcf7f7b-bw2df" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.975251 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mfzt\" (UniqueName: \"kubernetes.io/projected/d5a07520-1380-45b1-a00a-7148b158711e-kube-api-access-2mfzt\") pod \"openstack-baremetal-operator-controller-manager-b7b49d78f-c6jbh\" (UID: \"d5a07520-1380-45b1-a00a-7148b158711e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-b7b49d78f-c6jbh" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.975289 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4cgl\" (UniqueName: \"kubernetes.io/projected/6296d461-d333-4b9c-a082-e48db64bdd96-kube-api-access-g4cgl\") pod \"placement-operator-controller-manager-559d8fdb6b-mfzk4\" (UID: \"6296d461-d333-4b9c-a082-e48db64bdd96\") " pod="openstack-operators/placement-operator-controller-manager-559d8fdb6b-mfzk4" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.975334 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drshx\" (UniqueName: \"kubernetes.io/projected/b49d6074-a4b1-4658-b6b8-95bfe63163b0-kube-api-access-drshx\") pod \"nova-operator-controller-manager-5d6f9fd68c-mvbqt\" (UID: \"b49d6074-a4b1-4658-b6b8-95bfe63163b0\") " pod="openstack-operators/nova-operator-controller-manager-5d6f9fd68c-mvbqt" Apr 02 13:57:10 crc 
kubenswrapper[4732]: I0402 13:57:10.975366 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnhx9\" (UniqueName: \"kubernetes.io/projected/5319722c-7913-4dcd-a03d-dc7a5040b434-kube-api-access-qnhx9\") pod \"octavia-operator-controller-manager-7594f57946-2rvck\" (UID: \"5319722c-7913-4dcd-a03d-dc7a5040b434\") " pod="openstack-operators/octavia-operator-controller-manager-7594f57946-2rvck" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.975437 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dx6q9\" (UniqueName: \"kubernetes.io/projected/1c68b230-3f85-41f9-a6ed-7da1d0738748-kube-api-access-dx6q9\") pod \"neutron-operator-controller-manager-767865f676-z49v9\" (UID: \"1c68b230-3f85-41f9-a6ed-7da1d0738748\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-z49v9" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.975475 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nv2z\" (UniqueName: \"kubernetes.io/projected/bec355a9-c60e-4480-a32c-f1a43ef27131-kube-api-access-8nv2z\") pod \"ovn-operator-controller-manager-84464c7c78-tgrmk\" (UID: \"bec355a9-c60e-4480-a32c-f1a43ef27131\") " pod="openstack-operators/ovn-operator-controller-manager-84464c7c78-tgrmk" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.975496 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hc5mw\" (UniqueName: \"kubernetes.io/projected/426c551b-e661-40e0-9aa3-a83897ce2814-kube-api-access-hc5mw\") pod \"telemetry-operator-controller-manager-d6f76d4c7-nbxg9\" (UID: \"426c551b-e661-40e0-9aa3-a83897ce2814\") " pod="openstack-operators/telemetry-operator-controller-manager-d6f76d4c7-nbxg9" Apr 02 13:57:10 crc kubenswrapper[4732]: I0402 13:57:10.975520 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/d5a07520-1380-45b1-a00a-7148b158711e-cert\") pod \"openstack-baremetal-operator-controller-manager-b7b49d78f-c6jbh\" (UID: \"d5a07520-1380-45b1-a00a-7148b158711e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-b7b49d78f-c6jbh" Apr 02 13:57:10 crc kubenswrapper[4732]: E0402 13:57:10.975729 4732 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Apr 02 13:57:10 crc kubenswrapper[4732]: E0402 13:57:10.975785 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5a07520-1380-45b1-a00a-7148b158711e-cert podName:d5a07520-1380-45b1-a00a-7148b158711e nodeName:}" failed. No retries permitted until 2026-04-02 13:57:11.475763751 +0000 UTC m=+1188.380171304 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d5a07520-1380-45b1-a00a-7148b158711e-cert") pod "openstack-baremetal-operator-controller-manager-b7b49d78f-c6jbh" (UID: "d5a07520-1380-45b1-a00a-7148b158711e") : secret "openstack-baremetal-operator-webhook-server-cert" not found Apr 02 13:57:11 crc kubenswrapper[4732]: I0402 13:57:10.999975 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-56ccc97cf5-ztlzz"] Apr 02 13:57:11 crc kubenswrapper[4732]: I0402 13:57:11.002746 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-56ccc97cf5-ztlzz" Apr 02 13:57:11 crc kubenswrapper[4732]: I0402 13:57:11.014267 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mfzt\" (UniqueName: \"kubernetes.io/projected/d5a07520-1380-45b1-a00a-7148b158711e-kube-api-access-2mfzt\") pod \"openstack-baremetal-operator-controller-manager-b7b49d78f-c6jbh\" (UID: \"d5a07520-1380-45b1-a00a-7148b158711e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-b7b49d78f-c6jbh" Apr 02 13:57:11 crc kubenswrapper[4732]: I0402 13:57:11.019431 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drshx\" (UniqueName: \"kubernetes.io/projected/b49d6074-a4b1-4658-b6b8-95bfe63163b0-kube-api-access-drshx\") pod \"nova-operator-controller-manager-5d6f9fd68c-mvbqt\" (UID: \"b49d6074-a4b1-4658-b6b8-95bfe63163b0\") " pod="openstack-operators/nova-operator-controller-manager-5d6f9fd68c-mvbqt" Apr 02 13:57:11 crc kubenswrapper[4732]: I0402 13:57:11.020815 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-q7srd" Apr 02 13:57:11 crc kubenswrapper[4732]: I0402 13:57:11.050888 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dx6q9\" (UniqueName: \"kubernetes.io/projected/1c68b230-3f85-41f9-a6ed-7da1d0738748-kube-api-access-dx6q9\") pod \"neutron-operator-controller-manager-767865f676-z49v9\" (UID: \"1c68b230-3f85-41f9-a6ed-7da1d0738748\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-z49v9" Apr 02 13:57:11 crc kubenswrapper[4732]: I0402 13:57:11.063979 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nv2z\" (UniqueName: \"kubernetes.io/projected/bec355a9-c60e-4480-a32c-f1a43ef27131-kube-api-access-8nv2z\") pod \"ovn-operator-controller-manager-84464c7c78-tgrmk\" 
(UID: \"bec355a9-c60e-4480-a32c-f1a43ef27131\") " pod="openstack-operators/ovn-operator-controller-manager-84464c7c78-tgrmk" Apr 02 13:57:11 crc kubenswrapper[4732]: I0402 13:57:11.088365 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnhx9\" (UniqueName: \"kubernetes.io/projected/5319722c-7913-4dcd-a03d-dc7a5040b434-kube-api-access-qnhx9\") pod \"octavia-operator-controller-manager-7594f57946-2rvck\" (UID: \"5319722c-7913-4dcd-a03d-dc7a5040b434\") " pod="openstack-operators/octavia-operator-controller-manager-7594f57946-2rvck" Apr 02 13:57:11 crc kubenswrapper[4732]: I0402 13:57:11.101719 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-dbf8bb784-4kz5n" Apr 02 13:57:11 crc kubenswrapper[4732]: I0402 13:57:11.102190 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6b7497dc59-ph5hk" Apr 02 13:57:11 crc kubenswrapper[4732]: I0402 13:57:11.102693 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4cgl\" (UniqueName: \"kubernetes.io/projected/6296d461-d333-4b9c-a082-e48db64bdd96-kube-api-access-g4cgl\") pod \"placement-operator-controller-manager-559d8fdb6b-mfzk4\" (UID: \"6296d461-d333-4b9c-a082-e48db64bdd96\") " pod="openstack-operators/placement-operator-controller-manager-559d8fdb6b-mfzk4" Apr 02 13:57:11 crc kubenswrapper[4732]: I0402 13:57:11.105254 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6554749d88-4cwml" Apr 02 13:57:11 crc kubenswrapper[4732]: I0402 13:57:11.105311 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-56ccc97cf5-ztlzz"] Apr 02 13:57:11 crc kubenswrapper[4732]: I0402 13:57:11.106949 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hc5mw\" (UniqueName: \"kubernetes.io/projected/426c551b-e661-40e0-9aa3-a83897ce2814-kube-api-access-hc5mw\") pod \"telemetry-operator-controller-manager-d6f76d4c7-nbxg9\" (UID: \"426c551b-e661-40e0-9aa3-a83897ce2814\") " pod="openstack-operators/telemetry-operator-controller-manager-d6f76d4c7-nbxg9" Apr 02 13:57:11 crc kubenswrapper[4732]: I0402 13:57:11.107028 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t57kx\" (UniqueName: \"kubernetes.io/projected/5b424dbc-80ac-46ae-90d2-c69fdf4c14d7-kube-api-access-t57kx\") pod \"swift-operator-controller-manager-fbdcf7f7b-bw2df\" (UID: \"5b424dbc-80ac-46ae-90d2-c69fdf4c14d7\") " pod="openstack-operators/swift-operator-controller-manager-fbdcf7f7b-bw2df" Apr 02 13:57:11 crc kubenswrapper[4732]: I0402 13:57:11.107091 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/43f86830-d407-4dc4-9b09-388fb5db82c8-cert\") pod \"infra-operator-controller-manager-58f79b884c-5q7cz\" (UID: \"43f86830-d407-4dc4-9b09-388fb5db82c8\") " pod="openstack-operators/infra-operator-controller-manager-58f79b884c-5q7cz" Apr 02 13:57:11 crc kubenswrapper[4732]: I0402 13:57:11.107140 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4n66\" (UniqueName: \"kubernetes.io/projected/bd54902c-4922-4c49-85c1-280af54370ba-kube-api-access-s4n66\") pod \"test-operator-controller-manager-56ccc97cf5-ztlzz\" (UID: 
\"bd54902c-4922-4c49-85c1-280af54370ba\") " pod="openstack-operators/test-operator-controller-manager-56ccc97cf5-ztlzz" Apr 02 13:57:11 crc kubenswrapper[4732]: I0402 13:57:11.113484 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-989fbd45-w2zrf"] Apr 02 13:57:11 crc kubenswrapper[4732]: I0402 13:57:11.118014 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-989fbd45-w2zrf" Apr 02 13:57:11 crc kubenswrapper[4732]: I0402 13:57:11.129350 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-z49v9" Apr 02 13:57:11 crc kubenswrapper[4732]: E0402 13:57:11.129843 4732 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Apr 02 13:57:11 crc kubenswrapper[4732]: E0402 13:57:11.129885 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/43f86830-d407-4dc4-9b09-388fb5db82c8-cert podName:43f86830-d407-4dc4-9b09-388fb5db82c8 nodeName:}" failed. No retries permitted until 2026-04-02 13:57:12.129867412 +0000 UTC m=+1189.034274965 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/43f86830-d407-4dc4-9b09-388fb5db82c8-cert") pod "infra-operator-controller-manager-58f79b884c-5q7cz" (UID: "43f86830-d407-4dc4-9b09-388fb5db82c8") : secret "infra-operator-webhook-server-cert" not found Apr 02 13:57:11 crc kubenswrapper[4732]: I0402 13:57:11.130091 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7594f57946-2rvck" Apr 02 13:57:11 crc kubenswrapper[4732]: I0402 13:57:11.130162 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-schmg" Apr 02 13:57:11 crc kubenswrapper[4732]: I0402 13:57:11.145691 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-989fbd45-w2zrf"] Apr 02 13:57:11 crc kubenswrapper[4732]: I0402 13:57:11.163645 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-84464c7c78-tgrmk" Apr 02 13:57:11 crc kubenswrapper[4732]: I0402 13:57:11.167661 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hc5mw\" (UniqueName: \"kubernetes.io/projected/426c551b-e661-40e0-9aa3-a83897ce2814-kube-api-access-hc5mw\") pod \"telemetry-operator-controller-manager-d6f76d4c7-nbxg9\" (UID: \"426c551b-e661-40e0-9aa3-a83897ce2814\") " pod="openstack-operators/telemetry-operator-controller-manager-d6f76d4c7-nbxg9" Apr 02 13:57:11 crc kubenswrapper[4732]: I0402 13:57:11.168287 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t57kx\" (UniqueName: \"kubernetes.io/projected/5b424dbc-80ac-46ae-90d2-c69fdf4c14d7-kube-api-access-t57kx\") pod \"swift-operator-controller-manager-fbdcf7f7b-bw2df\" (UID: \"5b424dbc-80ac-46ae-90d2-c69fdf4c14d7\") " pod="openstack-operators/swift-operator-controller-manager-fbdcf7f7b-bw2df" Apr 02 13:57:11 crc kubenswrapper[4732]: I0402 13:57:11.186658 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5985877f6-hxnth"] Apr 02 13:57:11 crc kubenswrapper[4732]: I0402 13:57:11.187634 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5985877f6-hxnth" Apr 02 13:57:11 crc kubenswrapper[4732]: I0402 13:57:11.190026 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Apr 02 13:57:11 crc kubenswrapper[4732]: I0402 13:57:11.190214 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-hpbh8" Apr 02 13:57:11 crc kubenswrapper[4732]: I0402 13:57:11.190364 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Apr 02 13:57:11 crc kubenswrapper[4732]: I0402 13:57:11.203234 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5985877f6-hxnth"] Apr 02 13:57:11 crc kubenswrapper[4732]: I0402 13:57:11.207934 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4n66\" (UniqueName: \"kubernetes.io/projected/bd54902c-4922-4c49-85c1-280af54370ba-kube-api-access-s4n66\") pod \"test-operator-controller-manager-56ccc97cf5-ztlzz\" (UID: \"bd54902c-4922-4c49-85c1-280af54370ba\") " pod="openstack-operators/test-operator-controller-manager-56ccc97cf5-ztlzz" Apr 02 13:57:11 crc kubenswrapper[4732]: I0402 13:57:11.228727 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4n66\" (UniqueName: \"kubernetes.io/projected/bd54902c-4922-4c49-85c1-280af54370ba-kube-api-access-s4n66\") pod \"test-operator-controller-manager-56ccc97cf5-ztlzz\" (UID: \"bd54902c-4922-4c49-85c1-280af54370ba\") " pod="openstack-operators/test-operator-controller-manager-56ccc97cf5-ztlzz" Apr 02 13:57:11 crc kubenswrapper[4732]: I0402 13:57:11.231169 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d6f9fd68c-mvbqt" Apr 02 13:57:11 crc kubenswrapper[4732]: I0402 13:57:11.247989 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-559d8fdb6b-mfzk4" Apr 02 13:57:11 crc kubenswrapper[4732]: I0402 13:57:11.302383 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-fbdcf7f7b-bw2df" Apr 02 13:57:11 crc kubenswrapper[4732]: I0402 13:57:11.309357 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5d7hg\" (UniqueName: \"kubernetes.io/projected/64892a56-9180-4d1d-ad33-d87caa5f2002-kube-api-access-5d7hg\") pod \"watcher-operator-controller-manager-989fbd45-w2zrf\" (UID: \"64892a56-9180-4d1d-ad33-d87caa5f2002\") " pod="openstack-operators/watcher-operator-controller-manager-989fbd45-w2zrf" Apr 02 13:57:11 crc kubenswrapper[4732]: I0402 13:57:11.309490 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbl58\" (UniqueName: \"kubernetes.io/projected/e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f-kube-api-access-dbl58\") pod \"openstack-operator-controller-manager-5985877f6-hxnth\" (UID: \"e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f\") " pod="openstack-operators/openstack-operator-controller-manager-5985877f6-hxnth" Apr 02 13:57:11 crc kubenswrapper[4732]: I0402 13:57:11.309522 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f-metrics-certs\") pod \"openstack-operator-controller-manager-5985877f6-hxnth\" (UID: \"e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f\") " pod="openstack-operators/openstack-operator-controller-manager-5985877f6-hxnth" Apr 02 13:57:11 crc kubenswrapper[4732]: 
I0402 13:57:11.309559 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f-webhook-certs\") pod \"openstack-operator-controller-manager-5985877f6-hxnth\" (UID: \"e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f\") " pod="openstack-operators/openstack-operator-controller-manager-5985877f6-hxnth" Apr 02 13:57:11 crc kubenswrapper[4732]: I0402 13:57:11.334304 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d46cccfb9-65vqg"] Apr 02 13:57:11 crc kubenswrapper[4732]: I0402 13:57:11.355395 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6f76d4c7-nbxg9" Apr 02 13:57:11 crc kubenswrapper[4732]: W0402 13:57:11.385736 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd925f7c0_af6d_49d5_a09f_82afb7c58a15.slice/crio-2081f4e47556ab26ddea9567fa8c53faedbcd0c679bfb6f2f3492b893aa05947 WatchSource:0}: Error finding container 2081f4e47556ab26ddea9567fa8c53faedbcd0c679bfb6f2f3492b893aa05947: Status 404 returned error can't find the container with id 2081f4e47556ab26ddea9567fa8c53faedbcd0c679bfb6f2f3492b893aa05947 Apr 02 13:57:11 crc kubenswrapper[4732]: I0402 13:57:11.412657 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-56ccc97cf5-ztlzz" Apr 02 13:57:11 crc kubenswrapper[4732]: I0402 13:57:11.416685 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbl58\" (UniqueName: \"kubernetes.io/projected/e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f-kube-api-access-dbl58\") pod \"openstack-operator-controller-manager-5985877f6-hxnth\" (UID: \"e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f\") " pod="openstack-operators/openstack-operator-controller-manager-5985877f6-hxnth" Apr 02 13:57:11 crc kubenswrapper[4732]: I0402 13:57:11.416714 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f-metrics-certs\") pod \"openstack-operator-controller-manager-5985877f6-hxnth\" (UID: \"e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f\") " pod="openstack-operators/openstack-operator-controller-manager-5985877f6-hxnth" Apr 02 13:57:11 crc kubenswrapper[4732]: I0402 13:57:11.416766 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f-webhook-certs\") pod \"openstack-operator-controller-manager-5985877f6-hxnth\" (UID: \"e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f\") " pod="openstack-operators/openstack-operator-controller-manager-5985877f6-hxnth" Apr 02 13:57:11 crc kubenswrapper[4732]: I0402 13:57:11.416843 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5d7hg\" (UniqueName: \"kubernetes.io/projected/64892a56-9180-4d1d-ad33-d87caa5f2002-kube-api-access-5d7hg\") pod \"watcher-operator-controller-manager-989fbd45-w2zrf\" (UID: \"64892a56-9180-4d1d-ad33-d87caa5f2002\") " pod="openstack-operators/watcher-operator-controller-manager-989fbd45-w2zrf" Apr 02 13:57:11 crc kubenswrapper[4732]: E0402 13:57:11.417254 4732 secret.go:188] 
Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Apr 02 13:57:11 crc kubenswrapper[4732]: E0402 13:57:11.417280 4732 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Apr 02 13:57:11 crc kubenswrapper[4732]: E0402 13:57:11.417300 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f-webhook-certs podName:e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f nodeName:}" failed. No retries permitted until 2026-04-02 13:57:11.917285498 +0000 UTC m=+1188.821693051 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f-webhook-certs") pod "openstack-operator-controller-manager-5985877f6-hxnth" (UID: "e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f") : secret "webhook-server-cert" not found Apr 02 13:57:11 crc kubenswrapper[4732]: E0402 13:57:11.417429 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f-metrics-certs podName:e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f nodeName:}" failed. No retries permitted until 2026-04-02 13:57:11.917419862 +0000 UTC m=+1188.821827425 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f-metrics-certs") pod "openstack-operator-controller-manager-5985877f6-hxnth" (UID: "e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f") : secret "metrics-server-cert" not found Apr 02 13:57:11 crc kubenswrapper[4732]: I0402 13:57:11.439649 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbl58\" (UniqueName: \"kubernetes.io/projected/e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f-kube-api-access-dbl58\") pod \"openstack-operator-controller-manager-5985877f6-hxnth\" (UID: \"e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f\") " pod="openstack-operators/openstack-operator-controller-manager-5985877f6-hxnth" Apr 02 13:57:11 crc kubenswrapper[4732]: I0402 13:57:11.451339 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5d7hg\" (UniqueName: \"kubernetes.io/projected/64892a56-9180-4d1d-ad33-d87caa5f2002-kube-api-access-5d7hg\") pod \"watcher-operator-controller-manager-989fbd45-w2zrf\" (UID: \"64892a56-9180-4d1d-ad33-d87caa5f2002\") " pod="openstack-operators/watcher-operator-controller-manager-989fbd45-w2zrf" Apr 02 13:57:11 crc kubenswrapper[4732]: I0402 13:57:11.467821 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5d46cccfb9-65vqg" event={"ID":"d925f7c0-af6d-49d5-a09f-82afb7c58a15","Type":"ContainerStarted","Data":"2081f4e47556ab26ddea9567fa8c53faedbcd0c679bfb6f2f3492b893aa05947"} Apr 02 13:57:11 crc kubenswrapper[4732]: I0402 13:57:11.468037 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-989fbd45-w2zrf" Apr 02 13:57:11 crc kubenswrapper[4732]: I0402 13:57:11.517599 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d5a07520-1380-45b1-a00a-7148b158711e-cert\") pod \"openstack-baremetal-operator-controller-manager-b7b49d78f-c6jbh\" (UID: \"d5a07520-1380-45b1-a00a-7148b158711e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-b7b49d78f-c6jbh" Apr 02 13:57:11 crc kubenswrapper[4732]: E0402 13:57:11.517834 4732 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Apr 02 13:57:11 crc kubenswrapper[4732]: E0402 13:57:11.518033 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5a07520-1380-45b1-a00a-7148b158711e-cert podName:d5a07520-1380-45b1-a00a-7148b158711e nodeName:}" failed. No retries permitted until 2026-04-02 13:57:12.518017626 +0000 UTC m=+1189.422425179 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d5a07520-1380-45b1-a00a-7148b158711e-cert") pod "openstack-baremetal-operator-controller-manager-b7b49d78f-c6jbh" (UID: "d5a07520-1380-45b1-a00a-7148b158711e") : secret "openstack-baremetal-operator-webhook-server-cert" not found Apr 02 13:57:11 crc kubenswrapper[4732]: I0402 13:57:11.518308 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-86644c9c9c-nhxqn"] Apr 02 13:57:11 crc kubenswrapper[4732]: I0402 13:57:11.532882 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-58689c6fff-47xk7"] Apr 02 13:57:11 crc kubenswrapper[4732]: I0402 13:57:11.560347 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-648bdc7f99-vt6x9"] Apr 02 13:57:11 crc kubenswrapper[4732]: W0402 13:57:11.706834 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode46394c5_fd9e_4c0d_8e78_96723f5931d9.slice/crio-194fcd16a773f7cbfb27a275926419b1bdf18ec802a4db8281e3e97740e6d0d9 WatchSource:0}: Error finding container 194fcd16a773f7cbfb27a275926419b1bdf18ec802a4db8281e3e97740e6d0d9: Status 404 returned error can't find the container with id 194fcd16a773f7cbfb27a275926419b1bdf18ec802a4db8281e3e97740e6d0d9 Apr 02 13:57:11 crc kubenswrapper[4732]: I0402 13:57:11.715166 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-8684f86954-xgncs"] Apr 02 13:57:11 crc kubenswrapper[4732]: I0402 13:57:11.723912 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5f96574b5-nbm76"] Apr 02 13:57:11 crc kubenswrapper[4732]: I0402 13:57:11.732343 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/horizon-operator-controller-manager-6ccfd84cb4-hv8p6"] Apr 02 13:57:11 crc kubenswrapper[4732]: W0402 13:57:11.738923 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12296214_f552_4868_8884_66c241eb973b.slice/crio-8aac733ce06774b633df3f3958082857aab957d46952b2b758305396560e9296 WatchSource:0}: Error finding container 8aac733ce06774b633df3f3958082857aab957d46952b2b758305396560e9296: Status 404 returned error can't find the container with id 8aac733ce06774b633df3f3958082857aab957d46952b2b758305396560e9296 Apr 02 13:57:11 crc kubenswrapper[4732]: W0402 13:57:11.775485 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod879197e5_dc13_4c17_b8ac_7e51a97aa0f2.slice/crio-cd572d8a44a2a2d09deb1634ddf3e55524a3945a8c5a2b2f54fcde560f0f1388 WatchSource:0}: Error finding container cd572d8a44a2a2d09deb1634ddf3e55524a3945a8c5a2b2f54fcde560f0f1388: Status 404 returned error can't find the container with id cd572d8a44a2a2d09deb1634ddf3e55524a3945a8c5a2b2f54fcde560f0f1388 Apr 02 13:57:11 crc kubenswrapper[4732]: I0402 13:57:11.855843 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-dbf8bb784-4kz5n"] Apr 02 13:57:11 crc kubenswrapper[4732]: W0402 13:57:11.860232 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b75349c_23b4_4dc0_914f_f1dc82b12e18.slice/crio-872f0ffbc8626c5672aee80c7188201e096fb72c657ed8f482fabe25a1a3b921 WatchSource:0}: Error finding container 872f0ffbc8626c5672aee80c7188201e096fb72c657ed8f482fabe25a1a3b921: Status 404 returned error can't find the container with id 872f0ffbc8626c5672aee80c7188201e096fb72c657ed8f482fabe25a1a3b921 Apr 02 13:57:11 crc kubenswrapper[4732]: I0402 13:57:11.864728 4732 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack-operators/manila-operator-controller-manager-6b7497dc59-ph5hk"] Apr 02 13:57:11 crc kubenswrapper[4732]: I0402 13:57:11.925444 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f-metrics-certs\") pod \"openstack-operator-controller-manager-5985877f6-hxnth\" (UID: \"e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f\") " pod="openstack-operators/openstack-operator-controller-manager-5985877f6-hxnth" Apr 02 13:57:11 crc kubenswrapper[4732]: I0402 13:57:11.925485 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f-webhook-certs\") pod \"openstack-operator-controller-manager-5985877f6-hxnth\" (UID: \"e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f\") " pod="openstack-operators/openstack-operator-controller-manager-5985877f6-hxnth" Apr 02 13:57:11 crc kubenswrapper[4732]: E0402 13:57:11.925638 4732 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Apr 02 13:57:11 crc kubenswrapper[4732]: E0402 13:57:11.925685 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f-webhook-certs podName:e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f nodeName:}" failed. No retries permitted until 2026-04-02 13:57:12.925672105 +0000 UTC m=+1189.830079658 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f-webhook-certs") pod "openstack-operator-controller-manager-5985877f6-hxnth" (UID: "e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f") : secret "webhook-server-cert" not found Apr 02 13:57:11 crc kubenswrapper[4732]: E0402 13:57:11.925982 4732 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Apr 02 13:57:11 crc kubenswrapper[4732]: E0402 13:57:11.926005 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f-metrics-certs podName:e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f nodeName:}" failed. No retries permitted until 2026-04-02 13:57:12.925997883 +0000 UTC m=+1189.830405436 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f-metrics-certs") pod "openstack-operator-controller-manager-5985877f6-hxnth" (UID: "e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f") : secret "metrics-server-cert" not found Apr 02 13:57:12 crc kubenswrapper[4732]: I0402 13:57:12.099725 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-fbdcf7f7b-bw2df"] Apr 02 13:57:12 crc kubenswrapper[4732]: I0402 13:57:12.184065 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-84464c7c78-tgrmk"] Apr 02 13:57:12 crc kubenswrapper[4732]: I0402 13:57:12.202766 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7594f57946-2rvck"] Apr 02 13:57:12 crc kubenswrapper[4732]: W0402 13:57:12.209644 4732 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c68b230_3f85_41f9_a6ed_7da1d0738748.slice/crio-68977b0d14c770b0e0c78b9abb59952140543638cd420ab610f25d9ec04ebf64 WatchSource:0}: Error finding container 68977b0d14c770b0e0c78b9abb59952140543638cd420ab610f25d9ec04ebf64: Status 404 returned error can't find the container with id 68977b0d14c770b0e0c78b9abb59952140543638cd420ab610f25d9ec04ebf64 Apr 02 13:57:12 crc kubenswrapper[4732]: I0402 13:57:12.218514 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-z49v9"] Apr 02 13:57:12 crc kubenswrapper[4732]: E0402 13:57:12.227047 4732 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:566b1f4d3f3d50e9620b845e12ef72bf3a27e07233a9c7424c1102045a4e74a2,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hc5mw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-d6f76d4c7-nbxg9_openstack-operators(426c551b-e661-40e0-9aa3-a83897ce2814): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Apr 02 13:57:12 crc kubenswrapper[4732]: I0402 13:57:12.229256 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/43f86830-d407-4dc4-9b09-388fb5db82c8-cert\") pod \"infra-operator-controller-manager-58f79b884c-5q7cz\" (UID: \"43f86830-d407-4dc4-9b09-388fb5db82c8\") " pod="openstack-operators/infra-operator-controller-manager-58f79b884c-5q7cz" Apr 02 13:57:12 crc kubenswrapper[4732]: W0402 13:57:12.229465 4732 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd54902c_4922_4c49_85c1_280af54370ba.slice/crio-f712c7ef101d0ef394cf5a79529db54f3d06b50e61a8a4ca2ba01e076415b846 WatchSource:0}: Error finding container f712c7ef101d0ef394cf5a79529db54f3d06b50e61a8a4ca2ba01e076415b846: Status 404 returned error can't find the container with id f712c7ef101d0ef394cf5a79529db54f3d06b50e61a8a4ca2ba01e076415b846 Apr 02 13:57:12 crc kubenswrapper[4732]: E0402 13:57:12.229493 4732 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Apr 02 13:57:12 crc kubenswrapper[4732]: E0402 13:57:12.229556 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/43f86830-d407-4dc4-9b09-388fb5db82c8-cert podName:43f86830-d407-4dc4-9b09-388fb5db82c8 nodeName:}" failed. No retries permitted until 2026-04-02 13:57:14.229536503 +0000 UTC m=+1191.133944056 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/43f86830-d407-4dc4-9b09-388fb5db82c8-cert") pod "infra-operator-controller-manager-58f79b884c-5q7cz" (UID: "43f86830-d407-4dc4-9b09-388fb5db82c8") : secret "infra-operator-webhook-server-cert" not found Apr 02 13:57:12 crc kubenswrapper[4732]: E0402 13:57:12.230759 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-d6f76d4c7-nbxg9" podUID="426c551b-e661-40e0-9aa3-a83897ce2814" Apr 02 13:57:12 crc kubenswrapper[4732]: I0402 13:57:12.233143 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-559d8fdb6b-mfzk4"] Apr 02 13:57:12 crc kubenswrapper[4732]: E0402 13:57:12.236893 4732 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:2c1ef8575d74ef938c900e7ea7e622afeb589db6b4dcf30da544cc5689775296,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-s4n66,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-56ccc97cf5-ztlzz_openstack-operators(bd54902c-4922-4c49-85c1-280af54370ba): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Apr 02 13:57:12 crc kubenswrapper[4732]: E0402 13:57:12.238412 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-56ccc97cf5-ztlzz" podUID="bd54902c-4922-4c49-85c1-280af54370ba" Apr 02 13:57:12 crc kubenswrapper[4732]: W0402 13:57:12.238582 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64892a56_9180_4d1d_ad33_d87caa5f2002.slice/crio-e0aff2f74eb5ad81f402103f406bedfc27a4f4789cfb3e8da714b254baefdec4 WatchSource:0}: Error finding container e0aff2f74eb5ad81f402103f406bedfc27a4f4789cfb3e8da714b254baefdec4: Status 404 returned error can't find the container with id e0aff2f74eb5ad81f402103f406bedfc27a4f4789cfb3e8da714b254baefdec4 Apr 02 13:57:12 crc kubenswrapper[4732]: E0402 13:57:12.243342 4732 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:cbfb984a8e275ea0a5f80f343d6650d7e9ac0aface0b4f7aa38a2de3b115153c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5d7hg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-989fbd45-w2zrf_openstack-operators(64892a56-9180-4d1d-ad33-d87caa5f2002): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Apr 02 13:57:12 crc kubenswrapper[4732]: E0402 13:57:12.244562 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-989fbd45-w2zrf" podUID="64892a56-9180-4d1d-ad33-d87caa5f2002" Apr 02 13:57:12 crc kubenswrapper[4732]: I0402 13:57:12.246354 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6f76d4c7-nbxg9"] Apr 02 13:57:12 crc kubenswrapper[4732]: I0402 13:57:12.273899 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-989fbd45-w2zrf"] Apr 02 13:57:12 crc kubenswrapper[4732]: I0402 13:57:12.283177 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-56ccc97cf5-ztlzz"] Apr 02 13:57:12 crc kubenswrapper[4732]: W0402 13:57:12.290796 4732 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod338e9bfc_709f_49f2_8456_9dbe8b815382.slice/crio-69421083cac79aab47b01b3c5892e0a723fa8cd8ed374322df56578f9bcad7d2 WatchSource:0}: Error finding container 69421083cac79aab47b01b3c5892e0a723fa8cd8ed374322df56578f9bcad7d2: Status 404 returned error can't find the container with id 69421083cac79aab47b01b3c5892e0a723fa8cd8ed374322df56578f9bcad7d2 Apr 02 13:57:12 crc kubenswrapper[4732]: I0402 13:57:12.291117 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d6f9fd68c-mvbqt"] Apr 02 13:57:12 crc kubenswrapper[4732]: E0402 13:57:12.294637 4732 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:16b276e3f22dc79232c2150ae53becd10ebcf2d9b883f7df4ff98a929eefac91,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jwvln,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-6554749d88-4cwml_openstack-operators(338e9bfc-709f-49f2-8456-9dbe8b815382): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Apr 02 13:57:12 crc kubenswrapper[4732]: E0402 13:57:12.294698 4732 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:388c06cd947e4eaf823e3d64de2d3ba7660dbd9d4c01729d92bd628e5e73bc5f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-drshx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-5d6f9fd68c-mvbqt_openstack-operators(b49d6074-a4b1-4658-b6b8-95bfe63163b0): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Apr 02 13:57:12 crc kubenswrapper[4732]: E0402 13:57:12.295881 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/nova-operator-controller-manager-5d6f9fd68c-mvbqt" podUID="b49d6074-a4b1-4658-b6b8-95bfe63163b0" Apr 02 13:57:12 crc kubenswrapper[4732]: E0402 13:57:12.295932 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/mariadb-operator-controller-manager-6554749d88-4cwml" podUID="338e9bfc-709f-49f2-8456-9dbe8b815382" Apr 02 13:57:12 crc kubenswrapper[4732]: I0402 13:57:12.298140 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6554749d88-4cwml"] Apr 02 13:57:12 crc kubenswrapper[4732]: I0402 13:57:12.480386 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/keystone-operator-controller-manager-dbf8bb784-4kz5n" event={"ID":"6b75349c-23b4-4dc0-914f-f1dc82b12e18","Type":"ContainerStarted","Data":"872f0ffbc8626c5672aee80c7188201e096fb72c657ed8f482fabe25a1a3b921"} Apr 02 13:57:12 crc kubenswrapper[4732]: I0402 13:57:12.482221 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-z49v9" event={"ID":"1c68b230-3f85-41f9-a6ed-7da1d0738748","Type":"ContainerStarted","Data":"68977b0d14c770b0e0c78b9abb59952140543638cd420ab610f25d9ec04ebf64"} Apr 02 13:57:12 crc kubenswrapper[4732]: I0402 13:57:12.488575 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-58689c6fff-47xk7" event={"ID":"4c76cc17-ab86-4c9c-9438-7e72e2ce895f","Type":"ContainerStarted","Data":"a8b56011b230332a58cc5bfed43abb7a1a3407cca4cbb390fa09e30529e317a5"} Apr 02 13:57:12 crc kubenswrapper[4732]: I0402 13:57:12.491569 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7594f57946-2rvck" event={"ID":"5319722c-7913-4dcd-a03d-dc7a5040b434","Type":"ContainerStarted","Data":"d2874d35d03e73e855b876a26156fa4c5d532ae2098826c318d853a08d898a39"} Apr 02 13:57:12 crc kubenswrapper[4732]: I0402 13:57:12.493461 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-559d8fdb6b-mfzk4" event={"ID":"6296d461-d333-4b9c-a082-e48db64bdd96","Type":"ContainerStarted","Data":"a12d7295d0b0da062a242b1a5f95cf9a617bfb1c7158b9b8dec54f1dcd8fc3d8"} Apr 02 13:57:12 crc kubenswrapper[4732]: I0402 13:57:12.495307 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-84464c7c78-tgrmk" event={"ID":"bec355a9-c60e-4480-a32c-f1a43ef27131","Type":"ContainerStarted","Data":"f18c871366dd6086dd2dc13161272f5a685d2eec58e0b9a5174d419bf2e2e3d7"} Apr 02 13:57:12 crc 
kubenswrapper[4732]: I0402 13:57:12.496860 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-648bdc7f99-vt6x9" event={"ID":"5e74dfe1-0e0f-4b70-8b9a-db645eb40e05","Type":"ContainerStarted","Data":"712f0de753ec9a01533eff93b4e3521d14be60e1899eeb41d21449e7e5b81f0a"}
Apr 02 13:57:12 crc kubenswrapper[4732]: I0402 13:57:12.498657 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-fbdcf7f7b-bw2df" event={"ID":"5b424dbc-80ac-46ae-90d2-c69fdf4c14d7","Type":"ContainerStarted","Data":"32b72bc3761cbbc3527ae0dbf31b4c4d3c5d8090c0a93d2debbd80da06f95b14"}
Apr 02 13:57:12 crc kubenswrapper[4732]: I0402 13:57:12.500081 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6b7497dc59-ph5hk" event={"ID":"084daf4c-82c9-42e7-8eb9-3ae4658c1742","Type":"ContainerStarted","Data":"55118c495f15e41d23e1e5643a4bce3bdd8c3c9627655cb42328f8907aafb3ab"}
Apr 02 13:57:12 crc kubenswrapper[4732]: I0402 13:57:12.502203 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-8684f86954-xgncs" event={"ID":"12296214-f552-4868-8884-66c241eb973b","Type":"ContainerStarted","Data":"8aac733ce06774b633df3f3958082857aab957d46952b2b758305396560e9296"}
Apr 02 13:57:12 crc kubenswrapper[4732]: I0402 13:57:12.503583 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-56ccc97cf5-ztlzz" event={"ID":"bd54902c-4922-4c49-85c1-280af54370ba","Type":"ContainerStarted","Data":"f712c7ef101d0ef394cf5a79529db54f3d06b50e61a8a4ca2ba01e076415b846"}
Apr 02 13:57:12 crc kubenswrapper[4732]: I0402 13:57:12.504639 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6f76d4c7-nbxg9" event={"ID":"426c551b-e661-40e0-9aa3-a83897ce2814","Type":"ContainerStarted","Data":"b39d9ade78205e2e20cb04c6d00457d4526c3d8eb40372db8e537f80495a2927"}
Apr 02 13:57:12 crc kubenswrapper[4732]: E0402 13:57:12.505497 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:2c1ef8575d74ef938c900e7ea7e622afeb589db6b4dcf30da544cc5689775296\\\"\"" pod="openstack-operators/test-operator-controller-manager-56ccc97cf5-ztlzz" podUID="bd54902c-4922-4c49-85c1-280af54370ba"
Apr 02 13:57:12 crc kubenswrapper[4732]: I0402 13:57:12.505749 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-989fbd45-w2zrf" event={"ID":"64892a56-9180-4d1d-ad33-d87caa5f2002","Type":"ContainerStarted","Data":"e0aff2f74eb5ad81f402103f406bedfc27a4f4789cfb3e8da714b254baefdec4"}
Apr 02 13:57:12 crc kubenswrapper[4732]: E0402 13:57:12.505898 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:566b1f4d3f3d50e9620b845e12ef72bf3a27e07233a9c7424c1102045a4e74a2\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-d6f76d4c7-nbxg9" podUID="426c551b-e661-40e0-9aa3-a83897ce2814"
Apr 02 13:57:12 crc kubenswrapper[4732]: E0402 13:57:12.506748 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:cbfb984a8e275ea0a5f80f343d6650d7e9ac0aface0b4f7aa38a2de3b115153c\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-989fbd45-w2zrf" podUID="64892a56-9180-4d1d-ad33-d87caa5f2002"
Apr 02 13:57:12 crc kubenswrapper[4732]: I0402 13:57:12.507557 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6ccfd84cb4-hv8p6" event={"ID":"879197e5-dc13-4c17-b8ac-7e51a97aa0f2","Type":"ContainerStarted","Data":"cd572d8a44a2a2d09deb1634ddf3e55524a3945a8c5a2b2f54fcde560f0f1388"}
Apr 02 13:57:12 crc kubenswrapper[4732]: I0402 13:57:12.509791 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d6f9fd68c-mvbqt" event={"ID":"b49d6074-a4b1-4658-b6b8-95bfe63163b0","Type":"ContainerStarted","Data":"6b54552d83b4b5b5c42bb3ae982c757a6e8f04ef69192925ad18c0de969d1e5e"}
Apr 02 13:57:12 crc kubenswrapper[4732]: E0402 13:57:12.511525 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:388c06cd947e4eaf823e3d64de2d3ba7660dbd9d4c01729d92bd628e5e73bc5f\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5d6f9fd68c-mvbqt" podUID="b49d6074-a4b1-4658-b6b8-95bfe63163b0"
Apr 02 13:57:12 crc kubenswrapper[4732]: I0402 13:57:12.517315 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5f96574b5-nbm76" event={"ID":"e46394c5-fd9e-4c0d-8e78-96723f5931d9","Type":"ContainerStarted","Data":"194fcd16a773f7cbfb27a275926419b1bdf18ec802a4db8281e3e97740e6d0d9"}
Apr 02 13:57:12 crc kubenswrapper[4732]: I0402 13:57:12.519483 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-86644c9c9c-nhxqn" event={"ID":"08d5eea8-7c67-4aa1-ad91-ab1c60214872","Type":"ContainerStarted","Data":"5ba65f6ac17975acdb942a26415d2778645ca231ad92a2614d5914dbe8b6ea87"}
Apr 02 13:57:12 crc kubenswrapper[4732]: I0402 13:57:12.528813 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6554749d88-4cwml" event={"ID":"338e9bfc-709f-49f2-8456-9dbe8b815382","Type":"ContainerStarted","Data":"69421083cac79aab47b01b3c5892e0a723fa8cd8ed374322df56578f9bcad7d2"}
Apr 02 13:57:12 crc kubenswrapper[4732]: E0402 13:57:12.532491 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:16b276e3f22dc79232c2150ae53becd10ebcf2d9b883f7df4ff98a929eefac91\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-6554749d88-4cwml" podUID="338e9bfc-709f-49f2-8456-9dbe8b815382"
Apr 02 13:57:12 crc kubenswrapper[4732]: I0402 13:57:12.534039 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d5a07520-1380-45b1-a00a-7148b158711e-cert\") pod \"openstack-baremetal-operator-controller-manager-b7b49d78f-c6jbh\" (UID: \"d5a07520-1380-45b1-a00a-7148b158711e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-b7b49d78f-c6jbh"
Apr 02 13:57:12 crc kubenswrapper[4732]: E0402 13:57:12.534302 4732 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Apr 02 13:57:12 crc kubenswrapper[4732]: E0402 13:57:12.534421 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5a07520-1380-45b1-a00a-7148b158711e-cert podName:d5a07520-1380-45b1-a00a-7148b158711e nodeName:}" failed. No retries permitted until 2026-04-02 13:57:14.534400048 +0000 UTC m=+1191.438807601 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d5a07520-1380-45b1-a00a-7148b158711e-cert") pod "openstack-baremetal-operator-controller-manager-b7b49d78f-c6jbh" (UID: "d5a07520-1380-45b1-a00a-7148b158711e") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Apr 02 13:57:12 crc kubenswrapper[4732]: I0402 13:57:12.940946 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f-metrics-certs\") pod \"openstack-operator-controller-manager-5985877f6-hxnth\" (UID: \"e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f\") " pod="openstack-operators/openstack-operator-controller-manager-5985877f6-hxnth"
Apr 02 13:57:12 crc kubenswrapper[4732]: E0402 13:57:12.941061 4732 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Apr 02 13:57:12 crc kubenswrapper[4732]: I0402 13:57:12.941144 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f-webhook-certs\") pod \"openstack-operator-controller-manager-5985877f6-hxnth\" (UID: \"e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f\") " pod="openstack-operators/openstack-operator-controller-manager-5985877f6-hxnth"
Apr 02 13:57:12 crc kubenswrapper[4732]: E0402 13:57:12.941317 4732 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Apr 02 13:57:12 crc kubenswrapper[4732]: E0402 13:57:12.941321 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f-metrics-certs podName:e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f nodeName:}" failed. No retries permitted until 2026-04-02 13:57:14.941289346 +0000 UTC m=+1191.845696929 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f-metrics-certs") pod "openstack-operator-controller-manager-5985877f6-hxnth" (UID: "e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f") : secret "metrics-server-cert" not found
Apr 02 13:57:12 crc kubenswrapper[4732]: E0402 13:57:12.941520 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f-webhook-certs podName:e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f nodeName:}" failed. No retries permitted until 2026-04-02 13:57:14.941498502 +0000 UTC m=+1191.845906095 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f-webhook-certs") pod "openstack-operator-controller-manager-5985877f6-hxnth" (UID: "e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f") : secret "webhook-server-cert" not found
Apr 02 13:57:13 crc kubenswrapper[4732]: E0402 13:57:13.539876 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:388c06cd947e4eaf823e3d64de2d3ba7660dbd9d4c01729d92bd628e5e73bc5f\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5d6f9fd68c-mvbqt" podUID="b49d6074-a4b1-4658-b6b8-95bfe63163b0"
Apr 02 13:57:13 crc kubenswrapper[4732]: E0402 13:57:13.541633 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:2c1ef8575d74ef938c900e7ea7e622afeb589db6b4dcf30da544cc5689775296\\\"\"" pod="openstack-operators/test-operator-controller-manager-56ccc97cf5-ztlzz" podUID="bd54902c-4922-4c49-85c1-280af54370ba"
Apr 02 13:57:13 crc kubenswrapper[4732]: E0402 13:57:13.541750 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:cbfb984a8e275ea0a5f80f343d6650d7e9ac0aface0b4f7aa38a2de3b115153c\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-989fbd45-w2zrf" podUID="64892a56-9180-4d1d-ad33-d87caa5f2002"
Apr 02 13:57:13 crc kubenswrapper[4732]: E0402 13:57:13.541843 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:566b1f4d3f3d50e9620b845e12ef72bf3a27e07233a9c7424c1102045a4e74a2\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-d6f76d4c7-nbxg9" podUID="426c551b-e661-40e0-9aa3-a83897ce2814"
Apr 02 13:57:13 crc kubenswrapper[4732]: E0402 13:57:13.543474 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:16b276e3f22dc79232c2150ae53becd10ebcf2d9b883f7df4ff98a929eefac91\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-6554749d88-4cwml" podUID="338e9bfc-709f-49f2-8456-9dbe8b815382"
Apr 02 13:57:14 crc kubenswrapper[4732]: I0402 13:57:14.279022 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/43f86830-d407-4dc4-9b09-388fb5db82c8-cert\") pod \"infra-operator-controller-manager-58f79b884c-5q7cz\" (UID: \"43f86830-d407-4dc4-9b09-388fb5db82c8\") " pod="openstack-operators/infra-operator-controller-manager-58f79b884c-5q7cz"
Apr 02 13:57:14 crc kubenswrapper[4732]: E0402 13:57:14.279428 4732 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Apr 02 13:57:14 crc kubenswrapper[4732]: E0402 13:57:14.279481 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/43f86830-d407-4dc4-9b09-388fb5db82c8-cert podName:43f86830-d407-4dc4-9b09-388fb5db82c8 nodeName:}" failed. No retries permitted until 2026-04-02 13:57:18.279467799 +0000 UTC m=+1195.183875352 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/43f86830-d407-4dc4-9b09-388fb5db82c8-cert") pod "infra-operator-controller-manager-58f79b884c-5q7cz" (UID: "43f86830-d407-4dc4-9b09-388fb5db82c8") : secret "infra-operator-webhook-server-cert" not found
Apr 02 13:57:14 crc kubenswrapper[4732]: I0402 13:57:14.396006 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-n9mhr"
Apr 02 13:57:14 crc kubenswrapper[4732]: I0402 13:57:14.396080 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-n9mhr"
Apr 02 13:57:14 crc kubenswrapper[4732]: I0402 13:57:14.453702 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-n9mhr"
Apr 02 13:57:14 crc kubenswrapper[4732]: I0402 13:57:14.597054 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d5a07520-1380-45b1-a00a-7148b158711e-cert\") pod \"openstack-baremetal-operator-controller-manager-b7b49d78f-c6jbh\" (UID: \"d5a07520-1380-45b1-a00a-7148b158711e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-b7b49d78f-c6jbh"
Apr 02 13:57:14 crc kubenswrapper[4732]: E0402 13:57:14.597195 4732 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Apr 02 13:57:14 crc kubenswrapper[4732]: E0402 13:57:14.597254 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5a07520-1380-45b1-a00a-7148b158711e-cert podName:d5a07520-1380-45b1-a00a-7148b158711e nodeName:}" failed. No retries permitted until 2026-04-02 13:57:18.597237421 +0000 UTC m=+1195.501644974 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d5a07520-1380-45b1-a00a-7148b158711e-cert") pod "openstack-baremetal-operator-controller-manager-b7b49d78f-c6jbh" (UID: "d5a07520-1380-45b1-a00a-7148b158711e") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Apr 02 13:57:14 crc kubenswrapper[4732]: I0402 13:57:14.599811 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-n9mhr"
Apr 02 13:57:14 crc kubenswrapper[4732]: I0402 13:57:14.689194 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-n9mhr"]
Apr 02 13:57:15 crc kubenswrapper[4732]: I0402 13:57:15.007850 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f-webhook-certs\") pod \"openstack-operator-controller-manager-5985877f6-hxnth\" (UID: \"e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f\") " pod="openstack-operators/openstack-operator-controller-manager-5985877f6-hxnth"
Apr 02 13:57:15 crc kubenswrapper[4732]: E0402 13:57:15.008012 4732 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Apr 02 13:57:15 crc kubenswrapper[4732]: E0402 13:57:15.008094 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f-webhook-certs podName:e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f nodeName:}" failed. No retries permitted until 2026-04-02 13:57:19.008069334 +0000 UTC m=+1195.912476917 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f-webhook-certs") pod "openstack-operator-controller-manager-5985877f6-hxnth" (UID: "e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f") : secret "webhook-server-cert" not found
Apr 02 13:57:15 crc kubenswrapper[4732]: I0402 13:57:15.008088 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f-metrics-certs\") pod \"openstack-operator-controller-manager-5985877f6-hxnth\" (UID: \"e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f\") " pod="openstack-operators/openstack-operator-controller-manager-5985877f6-hxnth"
Apr 02 13:57:15 crc kubenswrapper[4732]: E0402 13:57:15.008183 4732 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Apr 02 13:57:15 crc kubenswrapper[4732]: E0402 13:57:15.008240 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f-metrics-certs podName:e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f nodeName:}" failed. No retries permitted until 2026-04-02 13:57:19.008225978 +0000 UTC m=+1195.912633531 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f-metrics-certs") pod "openstack-operator-controller-manager-5985877f6-hxnth" (UID: "e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f") : secret "metrics-server-cert" not found
Apr 02 13:57:16 crc kubenswrapper[4732]: I0402 13:57:16.570038 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-n9mhr" podUID="e050bf65-a203-40f8-9445-fbdcb87b7e91" containerName="registry-server" containerID="cri-o://86262be09f38d058bf81202ad073b865ec76dce7a4781cd72e14d8ca4c54438a" gracePeriod=2
Apr 02 13:57:17 crc kubenswrapper[4732]: I0402 13:57:17.580881 4732 generic.go:334] "Generic (PLEG): container finished" podID="e050bf65-a203-40f8-9445-fbdcb87b7e91" containerID="86262be09f38d058bf81202ad073b865ec76dce7a4781cd72e14d8ca4c54438a" exitCode=0
Apr 02 13:57:17 crc kubenswrapper[4732]: I0402 13:57:17.581056 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n9mhr" event={"ID":"e050bf65-a203-40f8-9445-fbdcb87b7e91","Type":"ContainerDied","Data":"86262be09f38d058bf81202ad073b865ec76dce7a4781cd72e14d8ca4c54438a"}
Apr 02 13:57:18 crc kubenswrapper[4732]: I0402 13:57:18.363029 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/43f86830-d407-4dc4-9b09-388fb5db82c8-cert\") pod \"infra-operator-controller-manager-58f79b884c-5q7cz\" (UID: \"43f86830-d407-4dc4-9b09-388fb5db82c8\") " pod="openstack-operators/infra-operator-controller-manager-58f79b884c-5q7cz"
Apr 02 13:57:18 crc kubenswrapper[4732]: E0402 13:57:18.363214 4732 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Apr 02 13:57:18 crc kubenswrapper[4732]: E0402 13:57:18.363543 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/43f86830-d407-4dc4-9b09-388fb5db82c8-cert podName:43f86830-d407-4dc4-9b09-388fb5db82c8 nodeName:}" failed. No retries permitted until 2026-04-02 13:57:26.363525223 +0000 UTC m=+1203.267932776 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/43f86830-d407-4dc4-9b09-388fb5db82c8-cert") pod "infra-operator-controller-manager-58f79b884c-5q7cz" (UID: "43f86830-d407-4dc4-9b09-388fb5db82c8") : secret "infra-operator-webhook-server-cert" not found
Apr 02 13:57:18 crc kubenswrapper[4732]: I0402 13:57:18.666900 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d5a07520-1380-45b1-a00a-7148b158711e-cert\") pod \"openstack-baremetal-operator-controller-manager-b7b49d78f-c6jbh\" (UID: \"d5a07520-1380-45b1-a00a-7148b158711e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-b7b49d78f-c6jbh"
Apr 02 13:57:18 crc kubenswrapper[4732]: E0402 13:57:18.667029 4732 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Apr 02 13:57:18 crc kubenswrapper[4732]: E0402 13:57:18.667081 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5a07520-1380-45b1-a00a-7148b158711e-cert podName:d5a07520-1380-45b1-a00a-7148b158711e nodeName:}" failed. No retries permitted until 2026-04-02 13:57:26.667067983 +0000 UTC m=+1203.571475536 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d5a07520-1380-45b1-a00a-7148b158711e-cert") pod "openstack-baremetal-operator-controller-manager-b7b49d78f-c6jbh" (UID: "d5a07520-1380-45b1-a00a-7148b158711e") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Apr 02 13:57:19 crc kubenswrapper[4732]: I0402 13:57:19.074732 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f-metrics-certs\") pod \"openstack-operator-controller-manager-5985877f6-hxnth\" (UID: \"e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f\") " pod="openstack-operators/openstack-operator-controller-manager-5985877f6-hxnth"
Apr 02 13:57:19 crc kubenswrapper[4732]: I0402 13:57:19.074783 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f-webhook-certs\") pod \"openstack-operator-controller-manager-5985877f6-hxnth\" (UID: \"e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f\") " pod="openstack-operators/openstack-operator-controller-manager-5985877f6-hxnth"
Apr 02 13:57:19 crc kubenswrapper[4732]: E0402 13:57:19.074889 4732 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Apr 02 13:57:19 crc kubenswrapper[4732]: E0402 13:57:19.074943 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f-webhook-certs podName:e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f nodeName:}" failed. No retries permitted until 2026-04-02 13:57:27.074930107 +0000 UTC m=+1203.979337660 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f-webhook-certs") pod "openstack-operator-controller-manager-5985877f6-hxnth" (UID: "e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f") : secret "webhook-server-cert" not found
Apr 02 13:57:19 crc kubenswrapper[4732]: E0402 13:57:19.075253 4732 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Apr 02 13:57:19 crc kubenswrapper[4732]: E0402 13:57:19.075294 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f-metrics-certs podName:e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f nodeName:}" failed. No retries permitted until 2026-04-02 13:57:27.075284587 +0000 UTC m=+1203.979692140 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f-metrics-certs") pod "openstack-operator-controller-manager-5985877f6-hxnth" (UID: "e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f") : secret "metrics-server-cert" not found
Apr 02 13:57:24 crc kubenswrapper[4732]: E0402 13:57:24.397394 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 86262be09f38d058bf81202ad073b865ec76dce7a4781cd72e14d8ca4c54438a is running failed: container process not found" containerID="86262be09f38d058bf81202ad073b865ec76dce7a4781cd72e14d8ca4c54438a" cmd=["grpc_health_probe","-addr=:50051"]
Apr 02 13:57:24 crc kubenswrapper[4732]: E0402 13:57:24.398471 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 86262be09f38d058bf81202ad073b865ec76dce7a4781cd72e14d8ca4c54438a is running failed: container process not found" containerID="86262be09f38d058bf81202ad073b865ec76dce7a4781cd72e14d8ca4c54438a" cmd=["grpc_health_probe","-addr=:50051"]
Apr 02 13:57:24 crc kubenswrapper[4732]: E0402 13:57:24.399396 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 86262be09f38d058bf81202ad073b865ec76dce7a4781cd72e14d8ca4c54438a is running failed: container process not found" containerID="86262be09f38d058bf81202ad073b865ec76dce7a4781cd72e14d8ca4c54438a" cmd=["grpc_health_probe","-addr=:50051"]
Apr 02 13:57:24 crc kubenswrapper[4732]: E0402 13:57:24.399556 4732 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 86262be09f38d058bf81202ad073b865ec76dce7a4781cd72e14d8ca4c54438a is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-n9mhr" podUID="e050bf65-a203-40f8-9445-fbdcb87b7e91" containerName="registry-server"
Apr 02 13:57:24 crc kubenswrapper[4732]: E0402 13:57:24.433663 4732 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/barbican-operator@sha256:8a7004a3835cbb93cdda6d3006cfe098b3333a6010344099a4dfbd2927280cc5"
Apr 02 13:57:24 crc kubenswrapper[4732]: E0402 13:57:24.433880 4732 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/barbican-operator@sha256:8a7004a3835cbb93cdda6d3006cfe098b3333a6010344099a4dfbd2927280cc5,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2sbbp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-operator-controller-manager-86644c9c9c-nhxqn_openstack-operators(08d5eea8-7c67-4aa1-ad91-ab1c60214872): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Apr 02 13:57:24 crc kubenswrapper[4732]: E0402 13:57:24.435041 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/barbican-operator-controller-manager-86644c9c9c-nhxqn" podUID="08d5eea8-7c67-4aa1-ad91-ab1c60214872"
Apr 02 13:57:24 crc kubenswrapper[4732]: E0402 13:57:24.632418 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/barbican-operator@sha256:8a7004a3835cbb93cdda6d3006cfe098b3333a6010344099a4dfbd2927280cc5\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-86644c9c9c-nhxqn" podUID="08d5eea8-7c67-4aa1-ad91-ab1c60214872"
Apr 02 13:57:25 crc kubenswrapper[4732]: E0402 13:57:25.062101 4732 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:741c1bc38c0d9430995a0d0aae6adae5c1f490b23d620564595cdd40683df68b"
Apr 02 13:57:25 crc kubenswrapper[4732]: E0402 13:57:25.062502 4732 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:741c1bc38c0d9430995a0d0aae6adae5c1f490b23d620564595cdd40683df68b,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-k9gkr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-5f96574b5-nbm76_openstack-operators(e46394c5-fd9e-4c0d-8e78-96723f5931d9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Apr 02 13:57:25 crc kubenswrapper[4732]: E0402 13:57:25.063687 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-5f96574b5-nbm76" podUID="e46394c5-fd9e-4c0d-8e78-96723f5931d9"
Apr 02 13:57:25 crc kubenswrapper[4732]: E0402 13:57:25.578166 4732 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:e5f1303497321c083933cd8ab46e0c95c3f7f3f4101e0c2c3df79eb089abab9e"
Apr 02 13:57:25 crc kubenswrapper[4732]: E0402 13:57:25.578368 4732 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:e5f1303497321c083933cd8ab46e0c95c3f7f3f4101e0c2c3df79eb089abab9e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bbpph,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-6b7497dc59-ph5hk_openstack-operators(084daf4c-82c9-42e7-8eb9-3ae4658c1742): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Apr 02 13:57:25 crc kubenswrapper[4732]: E0402 13:57:25.579577 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-6b7497dc59-ph5hk" podUID="084daf4c-82c9-42e7-8eb9-3ae4658c1742"
Apr 02 13:57:25 crc kubenswrapper[4732]: E0402 13:57:25.639918 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:e5f1303497321c083933cd8ab46e0c95c3f7f3f4101e0c2c3df79eb089abab9e\\\"\"" pod="openstack-operators/manila-operator-controller-manager-6b7497dc59-ph5hk" podUID="084daf4c-82c9-42e7-8eb9-3ae4658c1742"
Apr 02 13:57:25 crc kubenswrapper[4732]: E0402 13:57:25.639944 4732 pod_workers.go:1301] "Error syncing pod, skipping"
err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:741c1bc38c0d9430995a0d0aae6adae5c1f490b23d620564595cdd40683df68b\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-5f96574b5-nbm76" podUID="e46394c5-fd9e-4c0d-8e78-96723f5931d9" Apr 02 13:57:26 crc kubenswrapper[4732]: I0402 13:57:26.370976 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/43f86830-d407-4dc4-9b09-388fb5db82c8-cert\") pod \"infra-operator-controller-manager-58f79b884c-5q7cz\" (UID: \"43f86830-d407-4dc4-9b09-388fb5db82c8\") " pod="openstack-operators/infra-operator-controller-manager-58f79b884c-5q7cz" Apr 02 13:57:26 crc kubenswrapper[4732]: E0402 13:57:26.371159 4732 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Apr 02 13:57:26 crc kubenswrapper[4732]: E0402 13:57:26.371284 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/43f86830-d407-4dc4-9b09-388fb5db82c8-cert podName:43f86830-d407-4dc4-9b09-388fb5db82c8 nodeName:}" failed. No retries permitted until 2026-04-02 13:57:42.371260013 +0000 UTC m=+1219.275667566 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/43f86830-d407-4dc4-9b09-388fb5db82c8-cert") pod "infra-operator-controller-manager-58f79b884c-5q7cz" (UID: "43f86830-d407-4dc4-9b09-388fb5db82c8") : secret "infra-operator-webhook-server-cert" not found Apr 02 13:57:26 crc kubenswrapper[4732]: I0402 13:57:26.674750 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d5a07520-1380-45b1-a00a-7148b158711e-cert\") pod \"openstack-baremetal-operator-controller-manager-b7b49d78f-c6jbh\" (UID: \"d5a07520-1380-45b1-a00a-7148b158711e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-b7b49d78f-c6jbh" Apr 02 13:57:26 crc kubenswrapper[4732]: I0402 13:57:26.680870 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d5a07520-1380-45b1-a00a-7148b158711e-cert\") pod \"openstack-baremetal-operator-controller-manager-b7b49d78f-c6jbh\" (UID: \"d5a07520-1380-45b1-a00a-7148b158711e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-b7b49d78f-c6jbh" Apr 02 13:57:26 crc kubenswrapper[4732]: I0402 13:57:26.737441 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ntccm"] Apr 02 13:57:26 crc kubenswrapper[4732]: I0402 13:57:26.739461 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ntccm" Apr 02 13:57:26 crc kubenswrapper[4732]: I0402 13:57:26.747498 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-2565f" Apr 02 13:57:26 crc kubenswrapper[4732]: I0402 13:57:26.757350 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-b7b49d78f-c6jbh" Apr 02 13:57:26 crc kubenswrapper[4732]: I0402 13:57:26.759185 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ntccm"] Apr 02 13:57:26 crc kubenswrapper[4732]: I0402 13:57:26.878651 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27c40a49-d6e1-4a4e-8cd1-b2592a9e99e9-utilities\") pod \"certified-operators-ntccm\" (UID: \"27c40a49-d6e1-4a4e-8cd1-b2592a9e99e9\") " pod="openshift-marketplace/certified-operators-ntccm" Apr 02 13:57:26 crc kubenswrapper[4732]: I0402 13:57:26.878742 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27c40a49-d6e1-4a4e-8cd1-b2592a9e99e9-catalog-content\") pod \"certified-operators-ntccm\" (UID: \"27c40a49-d6e1-4a4e-8cd1-b2592a9e99e9\") " pod="openshift-marketplace/certified-operators-ntccm" Apr 02 13:57:26 crc kubenswrapper[4732]: I0402 13:57:26.879013 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwbjv\" (UniqueName: \"kubernetes.io/projected/27c40a49-d6e1-4a4e-8cd1-b2592a9e99e9-kube-api-access-qwbjv\") pod \"certified-operators-ntccm\" (UID: \"27c40a49-d6e1-4a4e-8cd1-b2592a9e99e9\") " pod="openshift-marketplace/certified-operators-ntccm" Apr 02 13:57:26 crc kubenswrapper[4732]: I0402 13:57:26.979586 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwbjv\" (UniqueName: \"kubernetes.io/projected/27c40a49-d6e1-4a4e-8cd1-b2592a9e99e9-kube-api-access-qwbjv\") pod \"certified-operators-ntccm\" (UID: \"27c40a49-d6e1-4a4e-8cd1-b2592a9e99e9\") " pod="openshift-marketplace/certified-operators-ntccm" Apr 02 13:57:26 crc kubenswrapper[4732]: I0402 
13:57:26.979719 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27c40a49-d6e1-4a4e-8cd1-b2592a9e99e9-utilities\") pod \"certified-operators-ntccm\" (UID: \"27c40a49-d6e1-4a4e-8cd1-b2592a9e99e9\") " pod="openshift-marketplace/certified-operators-ntccm" Apr 02 13:57:26 crc kubenswrapper[4732]: I0402 13:57:26.980204 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27c40a49-d6e1-4a4e-8cd1-b2592a9e99e9-utilities\") pod \"certified-operators-ntccm\" (UID: \"27c40a49-d6e1-4a4e-8cd1-b2592a9e99e9\") " pod="openshift-marketplace/certified-operators-ntccm" Apr 02 13:57:26 crc kubenswrapper[4732]: I0402 13:57:26.980281 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27c40a49-d6e1-4a4e-8cd1-b2592a9e99e9-catalog-content\") pod \"certified-operators-ntccm\" (UID: \"27c40a49-d6e1-4a4e-8cd1-b2592a9e99e9\") " pod="openshift-marketplace/certified-operators-ntccm" Apr 02 13:57:26 crc kubenswrapper[4732]: I0402 13:57:26.980572 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27c40a49-d6e1-4a4e-8cd1-b2592a9e99e9-catalog-content\") pod \"certified-operators-ntccm\" (UID: \"27c40a49-d6e1-4a4e-8cd1-b2592a9e99e9\") " pod="openshift-marketplace/certified-operators-ntccm" Apr 02 13:57:26 crc kubenswrapper[4732]: I0402 13:57:26.997551 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwbjv\" (UniqueName: \"kubernetes.io/projected/27c40a49-d6e1-4a4e-8cd1-b2592a9e99e9-kube-api-access-qwbjv\") pod \"certified-operators-ntccm\" (UID: \"27c40a49-d6e1-4a4e-8cd1-b2592a9e99e9\") " pod="openshift-marketplace/certified-operators-ntccm" Apr 02 13:57:27 crc kubenswrapper[4732]: I0402 13:57:27.059048 4732 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ntccm" Apr 02 13:57:27 crc kubenswrapper[4732]: I0402 13:57:27.081474 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f-metrics-certs\") pod \"openstack-operator-controller-manager-5985877f6-hxnth\" (UID: \"e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f\") " pod="openstack-operators/openstack-operator-controller-manager-5985877f6-hxnth" Apr 02 13:57:27 crc kubenswrapper[4732]: I0402 13:57:27.081533 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f-webhook-certs\") pod \"openstack-operator-controller-manager-5985877f6-hxnth\" (UID: \"e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f\") " pod="openstack-operators/openstack-operator-controller-manager-5985877f6-hxnth" Apr 02 13:57:27 crc kubenswrapper[4732]: E0402 13:57:27.081729 4732 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Apr 02 13:57:27 crc kubenswrapper[4732]: E0402 13:57:27.081786 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f-webhook-certs podName:e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f nodeName:}" failed. No retries permitted until 2026-04-02 13:57:43.081770193 +0000 UTC m=+1219.986177736 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f-webhook-certs") pod "openstack-operator-controller-manager-5985877f6-hxnth" (UID: "e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f") : secret "webhook-server-cert" not found Apr 02 13:57:27 crc kubenswrapper[4732]: I0402 13:57:27.084570 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f-metrics-certs\") pod \"openstack-operator-controller-manager-5985877f6-hxnth\" (UID: \"e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f\") " pod="openstack-operators/openstack-operator-controller-manager-5985877f6-hxnth" Apr 02 13:57:29 crc kubenswrapper[4732]: E0402 13:57:29.222171 4732 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/swift-operator@sha256:cbc03ca8837c64974a4670506a8df688c44432c4aab095f3fa7f1330e72bd3bd" Apr 02 13:57:29 crc kubenswrapper[4732]: E0402 13:57:29.222713 4732 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:cbc03ca8837c64974a4670506a8df688c44432c4aab095f3fa7f1330e72bd3bd,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-t57kx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-fbdcf7f7b-bw2df_openstack-operators(5b424dbc-80ac-46ae-90d2-c69fdf4c14d7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Apr 02 13:57:29 crc kubenswrapper[4732]: E0402 13:57:29.223993 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/swift-operator-controller-manager-fbdcf7f7b-bw2df" podUID="5b424dbc-80ac-46ae-90d2-c69fdf4c14d7" Apr 02 13:57:29 crc kubenswrapper[4732]: E0402 13:57:29.665896 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:cbc03ca8837c64974a4670506a8df688c44432c4aab095f3fa7f1330e72bd3bd\\\"\"" pod="openstack-operators/swift-operator-controller-manager-fbdcf7f7b-bw2df" podUID="5b424dbc-80ac-46ae-90d2-c69fdf4c14d7" Apr 02 13:57:30 crc kubenswrapper[4732]: E0402 13:57:30.770146 4732 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:b6d44a28b047f402b17b4cc07584f04cd6f1168d8742a9a8b17a9ce7c8550c5a" Apr 02 13:57:30 crc kubenswrapper[4732]: E0402 13:57:30.770466 4732 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:b6d44a28b047f402b17b4cc07584f04cd6f1168d8742a9a8b17a9ce7c8550c5a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qnhx9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-7594f57946-2rvck_openstack-operators(5319722c-7913-4dcd-a03d-dc7a5040b434): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Apr 02 13:57:30 crc kubenswrapper[4732]: E0402 13:57:30.773321 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/octavia-operator-controller-manager-7594f57946-2rvck" podUID="5319722c-7913-4dcd-a03d-dc7a5040b434" Apr 02 13:57:30 crc kubenswrapper[4732]: I0402 13:57:30.826647 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n9mhr" Apr 02 13:57:30 crc kubenswrapper[4732]: I0402 13:57:30.935152 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4tkv\" (UniqueName: \"kubernetes.io/projected/e050bf65-a203-40f8-9445-fbdcb87b7e91-kube-api-access-m4tkv\") pod \"e050bf65-a203-40f8-9445-fbdcb87b7e91\" (UID: \"e050bf65-a203-40f8-9445-fbdcb87b7e91\") " Apr 02 13:57:30 crc kubenswrapper[4732]: I0402 13:57:30.935218 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e050bf65-a203-40f8-9445-fbdcb87b7e91-utilities\") pod \"e050bf65-a203-40f8-9445-fbdcb87b7e91\" (UID: \"e050bf65-a203-40f8-9445-fbdcb87b7e91\") " Apr 02 13:57:30 crc kubenswrapper[4732]: I0402 13:57:30.935247 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e050bf65-a203-40f8-9445-fbdcb87b7e91-catalog-content\") pod \"e050bf65-a203-40f8-9445-fbdcb87b7e91\" (UID: \"e050bf65-a203-40f8-9445-fbdcb87b7e91\") " Apr 02 13:57:30 crc kubenswrapper[4732]: I0402 13:57:30.936394 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e050bf65-a203-40f8-9445-fbdcb87b7e91-utilities" (OuterVolumeSpecName: "utilities") pod "e050bf65-a203-40f8-9445-fbdcb87b7e91" (UID: "e050bf65-a203-40f8-9445-fbdcb87b7e91"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 13:57:30 crc kubenswrapper[4732]: I0402 13:57:30.940854 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e050bf65-a203-40f8-9445-fbdcb87b7e91-kube-api-access-m4tkv" (OuterVolumeSpecName: "kube-api-access-m4tkv") pod "e050bf65-a203-40f8-9445-fbdcb87b7e91" (UID: "e050bf65-a203-40f8-9445-fbdcb87b7e91"). InnerVolumeSpecName "kube-api-access-m4tkv". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:57:30 crc kubenswrapper[4732]: I0402 13:57:30.960256 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e050bf65-a203-40f8-9445-fbdcb87b7e91-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e050bf65-a203-40f8-9445-fbdcb87b7e91" (UID: "e050bf65-a203-40f8-9445-fbdcb87b7e91"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 13:57:31 crc kubenswrapper[4732]: I0402 13:57:31.036401 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4tkv\" (UniqueName: \"kubernetes.io/projected/e050bf65-a203-40f8-9445-fbdcb87b7e91-kube-api-access-m4tkv\") on node \"crc\" DevicePath \"\"" Apr 02 13:57:31 crc kubenswrapper[4732]: I0402 13:57:31.036427 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e050bf65-a203-40f8-9445-fbdcb87b7e91-utilities\") on node \"crc\" DevicePath \"\"" Apr 02 13:57:31 crc kubenswrapper[4732]: I0402 13:57:31.036436 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e050bf65-a203-40f8-9445-fbdcb87b7e91-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 02 13:57:31 crc kubenswrapper[4732]: E0402 13:57:31.343928 4732 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/keystone-operator@sha256:9fd3681c6c8549a78b12dc5e83676bc0956558b01327b95598aa424d62acb189" Apr 02 13:57:31 crc kubenswrapper[4732]: E0402 13:57:31.344104 4732 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:9fd3681c6c8549a78b12dc5e83676bc0956558b01327b95598aa424d62acb189,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hws7p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-dbf8bb784-4kz5n_openstack-operators(6b75349c-23b4-4dc0-914f-f1dc82b12e18): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Apr 02 13:57:31 crc kubenswrapper[4732]: E0402 13:57:31.346077 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-dbf8bb784-4kz5n" podUID="6b75349c-23b4-4dc0-914f-f1dc82b12e18" Apr 02 13:57:31 crc kubenswrapper[4732]: I0402 13:57:31.679769 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n9mhr" event={"ID":"e050bf65-a203-40f8-9445-fbdcb87b7e91","Type":"ContainerDied","Data":"7bcad61ad3ae14eae9a9ca84d4c08ca7f9c84e3fa7fe8ac057586efd691e3b58"} Apr 02 13:57:31 crc kubenswrapper[4732]: I0402 13:57:31.679843 4732 scope.go:117] "RemoveContainer" containerID="86262be09f38d058bf81202ad073b865ec76dce7a4781cd72e14d8ca4c54438a" Apr 02 13:57:31 crc kubenswrapper[4732]: I0402 13:57:31.679961 4732 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n9mhr" Apr 02 13:57:31 crc kubenswrapper[4732]: E0402 13:57:31.682346 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:9fd3681c6c8549a78b12dc5e83676bc0956558b01327b95598aa424d62acb189\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-dbf8bb784-4kz5n" podUID="6b75349c-23b4-4dc0-914f-f1dc82b12e18" Apr 02 13:57:31 crc kubenswrapper[4732]: E0402 13:57:31.682775 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:b6d44a28b047f402b17b4cc07584f04cd6f1168d8742a9a8b17a9ce7c8550c5a\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-7594f57946-2rvck" podUID="5319722c-7913-4dcd-a03d-dc7a5040b434" Apr 02 13:57:31 crc kubenswrapper[4732]: I0402 13:57:31.730297 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-n9mhr"] Apr 02 13:57:31 crc kubenswrapper[4732]: I0402 13:57:31.738560 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-n9mhr"] Apr 02 13:57:32 crc kubenswrapper[4732]: I0402 13:57:32.708554 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e050bf65-a203-40f8-9445-fbdcb87b7e91" path="/var/lib/kubelet/pods/e050bf65-a203-40f8-9445-fbdcb87b7e91/volumes" Apr 02 13:57:34 crc kubenswrapper[4732]: I0402 13:57:34.446891 4732 scope.go:117] "RemoveContainer" containerID="1742237420f158c0a12e33a4c5859e58ad7618b20fcaf13f6ccbd243fb2e2c38" Apr 02 13:57:34 crc kubenswrapper[4732]: I0402 13:57:34.568891 4732 scope.go:117] "RemoveContainer" 
containerID="c0edd05d6ebe2c4e4ba7d14a06d43b65488f00e63d2ea851f836d016d04cbff1" Apr 02 13:57:34 crc kubenswrapper[4732]: I0402 13:57:34.716452 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-8684f86954-xgncs" event={"ID":"12296214-f552-4868-8884-66c241eb973b","Type":"ContainerStarted","Data":"8221a39ee4893f07e7492b969be1d155021985c6f0cae7f2856066d28016f25f"} Apr 02 13:57:34 crc kubenswrapper[4732]: I0402 13:57:34.716595 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-8684f86954-xgncs" Apr 02 13:57:34 crc kubenswrapper[4732]: I0402 13:57:34.745058 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-8684f86954-xgncs" podStartSLOduration=5.156388091 podStartE2EDuration="24.745034352s" podCreationTimestamp="2026-04-02 13:57:10 +0000 UTC" firstStartedPulling="2026-04-02 13:57:11.741275898 +0000 UTC m=+1188.645683451" lastFinishedPulling="2026-04-02 13:57:31.329922159 +0000 UTC m=+1208.234329712" observedRunningTime="2026-04-02 13:57:34.738426714 +0000 UTC m=+1211.642834277" watchObservedRunningTime="2026-04-02 13:57:34.745034352 +0000 UTC m=+1211.649441905" Apr 02 13:57:34 crc kubenswrapper[4732]: I0402 13:57:34.772109 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-b7b49d78f-c6jbh"] Apr 02 13:57:34 crc kubenswrapper[4732]: W0402 13:57:34.782971 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5a07520_1380_45b1_a00a_7148b158711e.slice/crio-8d5acb50050f20bdaf09c3ebbbb6174f19aded7497830c205c53c4e59abfcee8 WatchSource:0}: Error finding container 8d5acb50050f20bdaf09c3ebbbb6174f19aded7497830c205c53c4e59abfcee8: Status 404 returned error can't find the container with id 
8d5acb50050f20bdaf09c3ebbbb6174f19aded7497830c205c53c4e59abfcee8
Apr 02 13:57:34 crc kubenswrapper[4732]: I0402 13:57:34.877457 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ntccm"]
Apr 02 13:57:34 crc kubenswrapper[4732]: W0402 13:57:34.889254 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27c40a49_d6e1_4a4e_8cd1_b2592a9e99e9.slice/crio-27c7e06ea572094adec22473853929b09f79c67d53cabfd3c2a25329e18730db WatchSource:0}: Error finding container 27c7e06ea572094adec22473853929b09f79c67d53cabfd3c2a25329e18730db: Status 404 returned error can't find the container with id 27c7e06ea572094adec22473853929b09f79c67d53cabfd3c2a25329e18730db
Apr 02 13:57:35 crc kubenswrapper[4732]: I0402 13:57:35.727741 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-b7b49d78f-c6jbh" event={"ID":"d5a07520-1380-45b1-a00a-7148b158711e","Type":"ContainerStarted","Data":"8d5acb50050f20bdaf09c3ebbbb6174f19aded7497830c205c53c4e59abfcee8"}
Apr 02 13:57:35 crc kubenswrapper[4732]: I0402 13:57:35.729393 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-56ccc97cf5-ztlzz" event={"ID":"bd54902c-4922-4c49-85c1-280af54370ba","Type":"ContainerStarted","Data":"1e02b528f9f3720c7a33e98b848ca71f23fb305520f4075270df2ba768bff907"}
Apr 02 13:57:35 crc kubenswrapper[4732]: I0402 13:57:35.729586 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-56ccc97cf5-ztlzz"
Apr 02 13:57:35 crc kubenswrapper[4732]: I0402 13:57:35.734085 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-84464c7c78-tgrmk" event={"ID":"bec355a9-c60e-4480-a32c-f1a43ef27131","Type":"ContainerStarted","Data":"449120d79b7bfa2edb71c37f9e3a46b600e6e8042456a4f4f93960d235e1d907"}
Apr 02 13:57:35 crc kubenswrapper[4732]: I0402 13:57:35.734541 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-84464c7c78-tgrmk"
Apr 02 13:57:35 crc kubenswrapper[4732]: I0402 13:57:35.743566 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-58689c6fff-47xk7" event={"ID":"4c76cc17-ab86-4c9c-9438-7e72e2ce895f","Type":"ContainerStarted","Data":"a9ca4c2266fff23cd8156fffdaa687a6c209f0e17053624a0e0a88d9153a336f"}
Apr 02 13:57:35 crc kubenswrapper[4732]: I0402 13:57:35.743942 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-58689c6fff-47xk7"
Apr 02 13:57:35 crc kubenswrapper[4732]: I0402 13:57:35.746300 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6f76d4c7-nbxg9" event={"ID":"426c551b-e661-40e0-9aa3-a83897ce2814","Type":"ContainerStarted","Data":"7721235c3c176ad15904d81aee6298d3c64344dde27c89f2b23b65f1d238872f"}
Apr 02 13:57:35 crc kubenswrapper[4732]: I0402 13:57:35.746478 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-d6f76d4c7-nbxg9"
Apr 02 13:57:35 crc kubenswrapper[4732]: I0402 13:57:35.749702 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-559d8fdb6b-mfzk4" event={"ID":"6296d461-d333-4b9c-a082-e48db64bdd96","Type":"ContainerStarted","Data":"aefafd5f00a92ee7121827a9b07f5f8ea14f89993fb4821125afbc98970868c2"}
Apr 02 13:57:35 crc kubenswrapper[4732]: I0402 13:57:35.750079 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-559d8fdb6b-mfzk4"
Apr 02 13:57:35 crc kubenswrapper[4732]: I0402 13:57:35.753391 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-56ccc97cf5-ztlzz" podStartSLOduration=3.519044717 podStartE2EDuration="25.753373518s" podCreationTimestamp="2026-04-02 13:57:10 +0000 UTC" firstStartedPulling="2026-04-02 13:57:12.236685795 +0000 UTC m=+1189.141093348" lastFinishedPulling="2026-04-02 13:57:34.471014596 +0000 UTC m=+1211.375422149" observedRunningTime="2026-04-02 13:57:35.750597453 +0000 UTC m=+1212.655005026" watchObservedRunningTime="2026-04-02 13:57:35.753373518 +0000 UTC m=+1212.657781071"
Apr 02 13:57:35 crc kubenswrapper[4732]: I0402 13:57:35.760259 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6554749d88-4cwml" event={"ID":"338e9bfc-709f-49f2-8456-9dbe8b815382","Type":"ContainerStarted","Data":"952a772c82dfe33bd0b0737c1ecdcc2ac9d54d705b8c173542346d5b76f20a07"}
Apr 02 13:57:35 crc kubenswrapper[4732]: I0402 13:57:35.760598 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6554749d88-4cwml"
Apr 02 13:57:35 crc kubenswrapper[4732]: I0402 13:57:35.762403 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-648bdc7f99-vt6x9" event={"ID":"5e74dfe1-0e0f-4b70-8b9a-db645eb40e05","Type":"ContainerStarted","Data":"510f1732ec725a9cb8939420f8e15ab9d7489bdf378ed5c1dc9e4e1f5a6cac5c"}
Apr 02 13:57:35 crc kubenswrapper[4732]: I0402 13:57:35.762948 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-648bdc7f99-vt6x9"
Apr 02 13:57:35 crc kubenswrapper[4732]: I0402 13:57:35.765918 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-z49v9" event={"ID":"1c68b230-3f85-41f9-a6ed-7da1d0738748","Type":"ContainerStarted","Data":"4b7d6ba4e2abf73ae118baba00cb87b5017951f06a524fbae55548063b8e3acc"}
Apr 02 13:57:35 crc kubenswrapper[4732]: I0402 13:57:35.766568 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-767865f676-z49v9"
Apr 02 13:57:35 crc kubenswrapper[4732]: I0402 13:57:35.770361 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6ccfd84cb4-hv8p6" event={"ID":"879197e5-dc13-4c17-b8ac-7e51a97aa0f2","Type":"ContainerStarted","Data":"118b3d5fc614bf656277fccb6de940bb7398e5dc293c2fc0bf10dab71378a232"}
Apr 02 13:57:35 crc kubenswrapper[4732]: I0402 13:57:35.770655 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-6ccfd84cb4-hv8p6"
Apr 02 13:57:35 crc kubenswrapper[4732]: I0402 13:57:35.773182 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-559d8fdb6b-mfzk4" podStartSLOduration=5.96655518 podStartE2EDuration="25.77316191s" podCreationTimestamp="2026-04-02 13:57:10 +0000 UTC" firstStartedPulling="2026-04-02 13:57:12.209659199 +0000 UTC m=+1189.114066752" lastFinishedPulling="2026-04-02 13:57:32.016265919 +0000 UTC m=+1208.920673482" observedRunningTime="2026-04-02 13:57:35.770029895 +0000 UTC m=+1212.674437458" watchObservedRunningTime="2026-04-02 13:57:35.77316191 +0000 UTC m=+1212.677569463"
Apr 02 13:57:35 crc kubenswrapper[4732]: I0402 13:57:35.776333 4732 generic.go:334] "Generic (PLEG): container finished" podID="27c40a49-d6e1-4a4e-8cd1-b2592a9e99e9" containerID="0c28aad6e93ae6fa3cefb5e3e24bdcfbba4e826b3b5726a29358cc5b741e7a12" exitCode=0
Apr 02 13:57:35 crc kubenswrapper[4732]: I0402 13:57:35.776399 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ntccm" event={"ID":"27c40a49-d6e1-4a4e-8cd1-b2592a9e99e9","Type":"ContainerDied","Data":"0c28aad6e93ae6fa3cefb5e3e24bdcfbba4e826b3b5726a29358cc5b741e7a12"}
Apr 02 13:57:35 crc kubenswrapper[4732]: I0402 13:57:35.776423 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ntccm" event={"ID":"27c40a49-d6e1-4a4e-8cd1-b2592a9e99e9","Type":"ContainerStarted","Data":"27c7e06ea572094adec22473853929b09f79c67d53cabfd3c2a25329e18730db"}
Apr 02 13:57:35 crc kubenswrapper[4732]: I0402 13:57:35.780830 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-989fbd45-w2zrf" event={"ID":"64892a56-9180-4d1d-ad33-d87caa5f2002","Type":"ContainerStarted","Data":"16766526f54dab850bb3ffd972a10a7cd86f5836895c04ea6bb0b83e19e1eaa7"}
Apr 02 13:57:35 crc kubenswrapper[4732]: I0402 13:57:35.781021 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-989fbd45-w2zrf"
Apr 02 13:57:35 crc kubenswrapper[4732]: I0402 13:57:35.795429 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d6f9fd68c-mvbqt" event={"ID":"b49d6074-a4b1-4658-b6b8-95bfe63163b0","Type":"ContainerStarted","Data":"906f84543553b189d6e07f460b2cea187028cc320e6e1205b6532b61c4976a72"}
Apr 02 13:57:35 crc kubenswrapper[4732]: I0402 13:57:35.796131 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5d6f9fd68c-mvbqt"
Apr 02 13:57:35 crc kubenswrapper[4732]: I0402 13:57:35.804054 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5d46cccfb9-65vqg" event={"ID":"d925f7c0-af6d-49d5-a09f-82afb7c58a15","Type":"ContainerStarted","Data":"eabe860c729fd970920111e574b30d7f753e0ac25d6706298b9a0fe404a3854d"}
Apr 02 13:57:35 crc kubenswrapper[4732]: I0402 13:57:35.804099 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-5d46cccfb9-65vqg"
Apr 02 13:57:35 crc kubenswrapper[4732]: I0402 13:57:35.810294 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-d6f76d4c7-nbxg9" podStartSLOduration=3.557949802 podStartE2EDuration="25.810273257s" podCreationTimestamp="2026-04-02 13:57:10 +0000 UTC" firstStartedPulling="2026-04-02 13:57:12.226863671 +0000 UTC m=+1189.131271224" lastFinishedPulling="2026-04-02 13:57:34.479187126 +0000 UTC m=+1211.383594679" observedRunningTime="2026-04-02 13:57:35.793958399 +0000 UTC m=+1212.698365972" watchObservedRunningTime="2026-04-02 13:57:35.810273257 +0000 UTC m=+1212.714680810"
Apr 02 13:57:35 crc kubenswrapper[4732]: I0402 13:57:35.813038 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-58689c6fff-47xk7" podStartSLOduration=5.390670158 podStartE2EDuration="25.813020341s" podCreationTimestamp="2026-04-02 13:57:10 +0000 UTC" firstStartedPulling="2026-04-02 13:57:11.593451884 +0000 UTC m=+1188.497859447" lastFinishedPulling="2026-04-02 13:57:32.015802077 +0000 UTC m=+1208.920209630" observedRunningTime="2026-04-02 13:57:35.80927069 +0000 UTC m=+1212.713678253" watchObservedRunningTime="2026-04-02 13:57:35.813020341 +0000 UTC m=+1212.717427894"
Apr 02 13:57:35 crc kubenswrapper[4732]: I0402 13:57:35.840803 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-84464c7c78-tgrmk" podStartSLOduration=6.70719764 podStartE2EDuration="25.840784868s" podCreationTimestamp="2026-04-02 13:57:10 +0000 UTC" firstStartedPulling="2026-04-02 13:57:12.196523386 +0000 UTC m=+1189.100930939" lastFinishedPulling="2026-04-02 13:57:31.330110614 +0000 UTC m=+1208.234518167" observedRunningTime="2026-04-02 13:57:35.838215548 +0000 UTC m=+1212.742623121" watchObservedRunningTime="2026-04-02 13:57:35.840784868 +0000 UTC m=+1212.745192421"
Apr 02 13:57:35 crc kubenswrapper[4732]: I0402 13:57:35.870314 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-648bdc7f99-vt6x9" podStartSLOduration=6.149028815 podStartE2EDuration="25.870283661s" podCreationTimestamp="2026-04-02 13:57:10 +0000 UTC" firstStartedPulling="2026-04-02 13:57:11.608731265 +0000 UTC m=+1188.513138818" lastFinishedPulling="2026-04-02 13:57:31.329986111 +0000 UTC m=+1208.234393664" observedRunningTime="2026-04-02 13:57:35.85911263 +0000 UTC m=+1212.763520193" watchObservedRunningTime="2026-04-02 13:57:35.870283661 +0000 UTC m=+1212.774691214"
Apr 02 13:57:35 crc kubenswrapper[4732]: I0402 13:57:35.904053 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-6ccfd84cb4-hv8p6" podStartSLOduration=5.679059872 podStartE2EDuration="25.904029618s" podCreationTimestamp="2026-04-02 13:57:10 +0000 UTC" firstStartedPulling="2026-04-02 13:57:11.791722864 +0000 UTC m=+1188.696130427" lastFinishedPulling="2026-04-02 13:57:32.01669261 +0000 UTC m=+1208.921100173" observedRunningTime="2026-04-02 13:57:35.89148348 +0000 UTC m=+1212.795891043" watchObservedRunningTime="2026-04-02 13:57:35.904029618 +0000 UTC m=+1212.808437181"
Apr 02 13:57:35 crc kubenswrapper[4732]: I0402 13:57:35.924034 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5d6f9fd68c-mvbqt" podStartSLOduration=3.721454618 podStartE2EDuration="25.924011795s" podCreationTimestamp="2026-04-02 13:57:10 +0000 UTC" firstStartedPulling="2026-04-02 13:57:12.29454069 +0000 UTC m=+1189.198948243" lastFinishedPulling="2026-04-02 13:57:34.497097867 +0000 UTC m=+1211.401505420" observedRunningTime="2026-04-02 13:57:35.921857167 +0000 UTC m=+1212.826264720" watchObservedRunningTime="2026-04-02 13:57:35.924011795 +0000 UTC m=+1212.828419358"
Apr 02 13:57:35 crc kubenswrapper[4732]: I0402 13:57:35.954005 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-6554749d88-4cwml" podStartSLOduration=3.7061711170000002 podStartE2EDuration="25.953988931s" podCreationTimestamp="2026-04-02 13:57:10 +0000 UTC" firstStartedPulling="2026-04-02 13:57:12.294503369 +0000 UTC m=+1189.198910922" lastFinishedPulling="2026-04-02 13:57:34.542321183 +0000 UTC m=+1211.446728736" observedRunningTime="2026-04-02 13:57:35.950161208 +0000 UTC m=+1212.854568781" watchObservedRunningTime="2026-04-02 13:57:35.953988931 +0000 UTC m=+1212.858396484"
Apr 02 13:57:35 crc kubenswrapper[4732]: I0402 13:57:35.965794 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-767865f676-z49v9" podStartSLOduration=6.8623888 podStartE2EDuration="25.965775627s" podCreationTimestamp="2026-04-02 13:57:10 +0000 UTC" firstStartedPulling="2026-04-02 13:57:12.226563583 +0000 UTC m=+1189.130971136" lastFinishedPulling="2026-04-02 13:57:31.32995041 +0000 UTC m=+1208.234357963" observedRunningTime="2026-04-02 13:57:35.962749556 +0000 UTC m=+1212.867157119" watchObservedRunningTime="2026-04-02 13:57:35.965775627 +0000 UTC m=+1212.870183190"
Apr 02 13:57:35 crc kubenswrapper[4732]: I0402 13:57:35.992281 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-989fbd45-w2zrf" podStartSLOduration=3.7797006140000002 podStartE2EDuration="25.992259279s" podCreationTimestamp="2026-04-02 13:57:10 +0000 UTC" firstStartedPulling="2026-04-02 13:57:12.24317378 +0000 UTC m=+1189.147581333" lastFinishedPulling="2026-04-02 13:57:34.455732435 +0000 UTC m=+1211.360139998" observedRunningTime="2026-04-02 13:57:35.988019725 +0000 UTC m=+1212.892427298" watchObservedRunningTime="2026-04-02 13:57:35.992259279 +0000 UTC m=+1212.896666832"
Apr 02 13:57:36 crc kubenswrapper[4732]: I0402 13:57:36.027414 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-5d46cccfb9-65vqg" podStartSLOduration=6.096215125 podStartE2EDuration="26.027397494s" podCreationTimestamp="2026-04-02 13:57:10 +0000 UTC" firstStartedPulling="2026-04-02 13:57:11.399735767 +0000 UTC m=+1188.304143320" lastFinishedPulling="2026-04-02 13:57:31.330918136 +0000 UTC m=+1208.235325689" observedRunningTime="2026-04-02 13:57:36.023040197 +0000 UTC m=+1212.927447760" watchObservedRunningTime="2026-04-02 13:57:36.027397494 +0000 UTC m=+1212.931805037"
Apr 02 13:57:38 crc kubenswrapper[4732]: I0402 13:57:38.829091 4732 generic.go:334] "Generic (PLEG): container finished" podID="27c40a49-d6e1-4a4e-8cd1-b2592a9e99e9" containerID="2870f349e386d21182b089f4cf77f1315343bcc18790ab8718880bbc39a248b3" exitCode=0
Apr 02 13:57:38 crc kubenswrapper[4732]: I0402 13:57:38.829281 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ntccm" event={"ID":"27c40a49-d6e1-4a4e-8cd1-b2592a9e99e9","Type":"ContainerDied","Data":"2870f349e386d21182b089f4cf77f1315343bcc18790ab8718880bbc39a248b3"}
Apr 02 13:57:39 crc kubenswrapper[4732]: I0402 13:57:39.837574 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5f96574b5-nbm76" event={"ID":"e46394c5-fd9e-4c0d-8e78-96723f5931d9","Type":"ContainerStarted","Data":"53e70b0d6d6e6dc827b78e01dfd93763a5c7478663f15f540b8b7125bd71e2cb"}
Apr 02 13:57:39 crc kubenswrapper[4732]: I0402 13:57:39.838076 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-5f96574b5-nbm76"
Apr 02 13:57:39 crc kubenswrapper[4732]: I0402 13:57:39.839566 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-b7b49d78f-c6jbh" event={"ID":"d5a07520-1380-45b1-a00a-7148b158711e","Type":"ContainerStarted","Data":"ac2b8dde9417f6dffcb4462a97db03b90553afbaec37ff556bd55645f2602c0e"}
Apr 02 13:57:39 crc kubenswrapper[4732]: I0402 13:57:39.839730 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-b7b49d78f-c6jbh"
Apr 02 13:57:39 crc kubenswrapper[4732]: I0402 13:57:39.852095 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-5f96574b5-nbm76" podStartSLOduration=2.956841443 podStartE2EDuration="29.852073665s" podCreationTimestamp="2026-04-02 13:57:10 +0000 UTC" firstStartedPulling="2026-04-02 13:57:11.709583276 +0000 UTC m=+1188.613990829" lastFinishedPulling="2026-04-02 13:57:38.604815498 +0000 UTC m=+1215.509223051" observedRunningTime="2026-04-02 13:57:39.850811062 +0000 UTC m=+1216.755218635" watchObservedRunningTime="2026-04-02 13:57:39.852073665 +0000 UTC m=+1216.756481218"
Apr 02 13:57:39 crc kubenswrapper[4732]: I0402 13:57:39.878121 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-b7b49d78f-c6jbh" podStartSLOduration=26.060306708 podStartE2EDuration="29.878103865s" podCreationTimestamp="2026-04-02 13:57:10 +0000 UTC" firstStartedPulling="2026-04-02 13:57:34.785277974 +0000 UTC m=+1211.689685527" lastFinishedPulling="2026-04-02 13:57:38.603075131 +0000 UTC m=+1215.507482684" observedRunningTime="2026-04-02 13:57:39.87010615 +0000 UTC m=+1216.774513703" watchObservedRunningTime="2026-04-02 13:57:39.878103865 +0000 UTC m=+1216.782511418"
Apr 02 13:57:40 crc kubenswrapper[4732]: I0402 13:57:40.624875 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-5d46cccfb9-65vqg"
Apr 02 13:57:40 crc kubenswrapper[4732]: I0402 13:57:40.689535 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-58689c6fff-47xk7"
Apr 02 13:57:40 crc kubenswrapper[4732]: I0402 13:57:40.711103 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-648bdc7f99-vt6x9"
Apr 02 13:57:40 crc kubenswrapper[4732]: I0402 13:57:40.739230 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-8684f86954-xgncs"
Apr 02 13:57:40 crc kubenswrapper[4732]: I0402 13:57:40.795551 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-6ccfd84cb4-hv8p6"
Apr 02 13:57:40 crc kubenswrapper[4732]: I0402 13:57:40.872928 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6b7497dc59-ph5hk" event={"ID":"084daf4c-82c9-42e7-8eb9-3ae4658c1742","Type":"ContainerStarted","Data":"fc415ef802276b72c0411bad0f229ac0c4bc29c064fe179764ccee209c80b619"}
Apr 02 13:57:40 crc kubenswrapper[4732]: I0402 13:57:40.873931 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-6b7497dc59-ph5hk"
Apr 02 13:57:40 crc kubenswrapper[4732]: I0402 13:57:40.884841 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-86644c9c9c-nhxqn" event={"ID":"08d5eea8-7c67-4aa1-ad91-ab1c60214872","Type":"ContainerStarted","Data":"b5777f8329512a25b92fdfb6df6e332d8dda8bea728333fee04c9a50b90698cc"}
Apr 02 13:57:40 crc kubenswrapper[4732]: I0402 13:57:40.885445 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-86644c9c9c-nhxqn"
Apr 02 13:57:40 crc kubenswrapper[4732]: I0402 13:57:40.900753 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ntccm" event={"ID":"27c40a49-d6e1-4a4e-8cd1-b2592a9e99e9","Type":"ContainerStarted","Data":"2e344d9529490f5f4ab7b9309992a9c868df06af2b5840419ae8f264eb998e54"}
Apr 02 13:57:40 crc kubenswrapper[4732]: I0402 13:57:40.932586 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-6b7497dc59-ph5hk" podStartSLOduration=2.56717752 podStartE2EDuration="30.932564541s" podCreationTimestamp="2026-04-02 13:57:10 +0000 UTC" firstStartedPulling="2026-04-02 13:57:11.870171583 +0000 UTC m=+1188.774579136" lastFinishedPulling="2026-04-02 13:57:40.235558604 +0000 UTC m=+1217.139966157" observedRunningTime="2026-04-02 13:57:40.928016819 +0000 UTC m=+1217.832424382" watchObservedRunningTime="2026-04-02 13:57:40.932564541 +0000 UTC m=+1217.836972094"
Apr 02 13:57:40 crc kubenswrapper[4732]: I0402 13:57:40.972962 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ntccm" podStartSLOduration=10.515341931 podStartE2EDuration="14.972944326s" podCreationTimestamp="2026-04-02 13:57:26 +0000 UTC" firstStartedPulling="2026-04-02 13:57:35.777771314 +0000 UTC m=+1212.682178867" lastFinishedPulling="2026-04-02 13:57:40.235373709 +0000 UTC m=+1217.139781262" observedRunningTime="2026-04-02 13:57:40.96787885 +0000 UTC m=+1217.872286403" watchObservedRunningTime="2026-04-02 13:57:40.972944326 +0000 UTC m=+1217.877351879"
Apr 02 13:57:41 crc kubenswrapper[4732]: I0402 13:57:41.004935 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-86644c9c9c-nhxqn" podStartSLOduration=2.341065951 podStartE2EDuration="31.004918666s" podCreationTimestamp="2026-04-02 13:57:10 +0000 UTC" firstStartedPulling="2026-04-02 13:57:11.571551635 +0000 UTC m=+1188.475959188" lastFinishedPulling="2026-04-02 13:57:40.23540435 +0000 UTC m=+1217.139811903" observedRunningTime="2026-04-02 13:57:41.001896725 +0000 UTC m=+1217.906304288" watchObservedRunningTime="2026-04-02 13:57:41.004918666 +0000 UTC m=+1217.909326219"
Apr 02 13:57:41 crc kubenswrapper[4732]: I0402 13:57:41.110175 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6554749d88-4cwml"
Apr 02 13:57:41 crc kubenswrapper[4732]: I0402 13:57:41.132221 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-767865f676-z49v9"
Apr 02 13:57:41 crc kubenswrapper[4732]: I0402 13:57:41.168423 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-84464c7c78-tgrmk"
Apr 02 13:57:41 crc kubenswrapper[4732]: I0402 13:57:41.234585 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5d6f9fd68c-mvbqt"
Apr 02 13:57:41 crc kubenswrapper[4732]: I0402 13:57:41.251283 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-559d8fdb6b-mfzk4"
Apr 02 13:57:41 crc kubenswrapper[4732]: I0402 13:57:41.358071 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-d6f76d4c7-nbxg9"
Apr 02 13:57:41 crc kubenswrapper[4732]: I0402 13:57:41.416684 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-56ccc97cf5-ztlzz"
Apr 02 13:57:41 crc kubenswrapper[4732]: I0402 13:57:41.470805 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-989fbd45-w2zrf"
Apr 02 13:57:42 crc kubenswrapper[4732]: I0402 13:57:42.418515 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/43f86830-d407-4dc4-9b09-388fb5db82c8-cert\") pod \"infra-operator-controller-manager-58f79b884c-5q7cz\" (UID: \"43f86830-d407-4dc4-9b09-388fb5db82c8\") " pod="openstack-operators/infra-operator-controller-manager-58f79b884c-5q7cz"
Apr 02 13:57:42 crc kubenswrapper[4732]: I0402 13:57:42.426379 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/43f86830-d407-4dc4-9b09-388fb5db82c8-cert\") pod \"infra-operator-controller-manager-58f79b884c-5q7cz\" (UID: \"43f86830-d407-4dc4-9b09-388fb5db82c8\") " pod="openstack-operators/infra-operator-controller-manager-58f79b884c-5q7cz"
Apr 02 13:57:42 crc kubenswrapper[4732]: I0402 13:57:42.580097 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-lvqvs"
Apr 02 13:57:42 crc kubenswrapper[4732]: I0402 13:57:42.589214 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-58f79b884c-5q7cz"
Apr 02 13:57:43 crc kubenswrapper[4732]: I0402 13:57:43.065059 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-58f79b884c-5q7cz"]
Apr 02 13:57:43 crc kubenswrapper[4732]: I0402 13:57:43.127241 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f-webhook-certs\") pod \"openstack-operator-controller-manager-5985877f6-hxnth\" (UID: \"e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f\") " pod="openstack-operators/openstack-operator-controller-manager-5985877f6-hxnth"
Apr 02 13:57:43 crc kubenswrapper[4732]: I0402 13:57:43.132567 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f-webhook-certs\") pod \"openstack-operator-controller-manager-5985877f6-hxnth\" (UID: \"e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f\") " pod="openstack-operators/openstack-operator-controller-manager-5985877f6-hxnth"
Apr 02 13:57:43 crc kubenswrapper[4732]: I0402 13:57:43.313655 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-hpbh8"
Apr 02 13:57:43 crc kubenswrapper[4732]: I0402 13:57:43.322583 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5985877f6-hxnth"
Apr 02 13:57:43 crc kubenswrapper[4732]: I0402 13:57:43.547252 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5985877f6-hxnth"]
Apr 02 13:57:43 crc kubenswrapper[4732]: W0402 13:57:43.559828 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode37f12bd_d8bf_4e9d_86ab_4da2dfbeff5f.slice/crio-a0777abd222c72fa645e0e53f3086f2434dee8607331624b43cfba0f6f6d60e0 WatchSource:0}: Error finding container a0777abd222c72fa645e0e53f3086f2434dee8607331624b43cfba0f6f6d60e0: Status 404 returned error can't find the container with id a0777abd222c72fa645e0e53f3086f2434dee8607331624b43cfba0f6f6d60e0
Apr 02 13:57:43 crc kubenswrapper[4732]: I0402 13:57:43.923169 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-58f79b884c-5q7cz" event={"ID":"43f86830-d407-4dc4-9b09-388fb5db82c8","Type":"ContainerStarted","Data":"85566a048d238a9e748069d06043125c4e274dd352b88ddc40dcbe1e17c8b8eb"}
Apr 02 13:57:43 crc kubenswrapper[4732]: I0402 13:57:43.924657 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5985877f6-hxnth" event={"ID":"e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f","Type":"ContainerStarted","Data":"ef9fad9af96b87dfff0cb8a3c9cfc4f4ae19d3a77da57830bb4cdb3e04e4c439"}
Apr 02 13:57:43 crc kubenswrapper[4732]: I0402 13:57:43.924680 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5985877f6-hxnth" event={"ID":"e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f","Type":"ContainerStarted","Data":"a0777abd222c72fa645e0e53f3086f2434dee8607331624b43cfba0f6f6d60e0"}
Apr 02 13:57:44 crc kubenswrapper[4732]: I0402 13:57:44.953583 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7594f57946-2rvck" event={"ID":"5319722c-7913-4dcd-a03d-dc7a5040b434","Type":"ContainerStarted","Data":"2b9c307174141e259d2bc929af1594b32616f275a53a67b049616d93de718d9a"}
Apr 02 13:57:44 crc kubenswrapper[4732]: I0402 13:57:44.954071 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-7594f57946-2rvck"
Apr 02 13:57:44 crc kubenswrapper[4732]: I0402 13:57:44.955820 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-fbdcf7f7b-bw2df" event={"ID":"5b424dbc-80ac-46ae-90d2-c69fdf4c14d7","Type":"ContainerStarted","Data":"29b49ec94cd847061717cacb78d559e54fd1594c317e160c803516f67f95c430"}
Apr 02 13:57:44 crc kubenswrapper[4732]: I0402 13:57:44.955936 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-5985877f6-hxnth"
Apr 02 13:57:44 crc kubenswrapper[4732]: I0402 13:57:44.977204 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-7594f57946-2rvck" podStartSLOduration=2.524092381 podStartE2EDuration="34.977184696s" podCreationTimestamp="2026-04-02 13:57:10 +0000 UTC" firstStartedPulling="2026-04-02 13:57:12.202131346 +0000 UTC m=+1189.106538899" lastFinishedPulling="2026-04-02 13:57:44.655223651 +0000 UTC m=+1221.559631214" observedRunningTime="2026-04-02 13:57:44.966949301 +0000 UTC m=+1221.871356864" watchObservedRunningTime="2026-04-02 13:57:44.977184696 +0000 UTC m=+1221.881592259"
Apr 02 13:57:44 crc kubenswrapper[4732]: I0402 13:57:44.994321 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-5985877f6-hxnth" podStartSLOduration=33.994299556 podStartE2EDuration="33.994299556s" podCreationTimestamp="2026-04-02 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:57:44.993026862 +0000 UTC m=+1221.897434425" watchObservedRunningTime="2026-04-02 13:57:44.994299556 +0000 UTC m=+1221.898707109"
Apr 02 13:57:45 crc kubenswrapper[4732]: I0402 13:57:45.017591 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-fbdcf7f7b-bw2df" podStartSLOduration=2.490029996 podStartE2EDuration="35.017572002s" podCreationTimestamp="2026-04-02 13:57:10 +0000 UTC" firstStartedPulling="2026-04-02 13:57:12.127279194 +0000 UTC m=+1189.031686747" lastFinishedPulling="2026-04-02 13:57:44.6548212 +0000 UTC m=+1221.559228753" observedRunningTime="2026-04-02 13:57:45.01230384 +0000 UTC m=+1221.916711413" watchObservedRunningTime="2026-04-02 13:57:45.017572002 +0000 UTC m=+1221.921979555"
Apr 02 13:57:45 crc kubenswrapper[4732]: I0402 13:57:45.966176 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-dbf8bb784-4kz5n" event={"ID":"6b75349c-23b4-4dc0-914f-f1dc82b12e18","Type":"ContainerStarted","Data":"a8d5b440ae441314e3d170fe963dc22d9baa83b70bb3fe7e9d14d86dd367f0ee"}
Apr 02 13:57:45 crc kubenswrapper[4732]: I0402 13:57:45.966702 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-dbf8bb784-4kz5n"
Apr 02 13:57:45 crc kubenswrapper[4732]: I0402 13:57:45.991385 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-dbf8bb784-4kz5n" podStartSLOduration=2.707535563 podStartE2EDuration="35.991363819s" podCreationTimestamp="2026-04-02 13:57:10 +0000 UTC" firstStartedPulling="2026-04-02 13:57:11.865715533 +0000 UTC m=+1188.770123086" lastFinishedPulling="2026-04-02 13:57:45.149543789 +0000 UTC m=+1222.053951342" observedRunningTime="2026-04-02 13:57:45.983910038 +0000 UTC m=+1222.888317601" watchObservedRunningTime="2026-04-02 13:57:45.991363819 +0000 UTC m=+1222.895771372"
Apr 02 13:57:46 crc kubenswrapper[4732]: I0402 13:57:46.765995 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-b7b49d78f-c6jbh"
Apr 02 13:57:46 crc kubenswrapper[4732]: I0402 13:57:46.973956 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-58f79b884c-5q7cz" event={"ID":"43f86830-d407-4dc4-9b09-388fb5db82c8","Type":"ContainerStarted","Data":"51b0561b6946f82264b0fb4086f76919be63689400eef4ca0b78e74255d82591"}
Apr 02 13:57:47 crc kubenswrapper[4732]: I0402 13:57:47.001109 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-58f79b884c-5q7cz" podStartSLOduration=33.639506037 podStartE2EDuration="37.001083281s" podCreationTimestamp="2026-04-02 13:57:10 +0000 UTC" firstStartedPulling="2026-04-02 13:57:43.070194724 +0000 UTC m=+1219.974602277" lastFinishedPulling="2026-04-02 13:57:46.431771968 +0000 UTC m=+1223.336179521" observedRunningTime="2026-04-02 13:57:46.993468876 +0000 UTC m=+1223.897876449" watchObservedRunningTime="2026-04-02 13:57:47.001083281 +0000 UTC m=+1223.905490854"
Apr 02 13:57:47 crc kubenswrapper[4732]: I0402 13:57:47.059438 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ntccm"
Apr 02 13:57:47 crc kubenswrapper[4732]: I0402 13:57:47.059519 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ntccm"
Apr 02 13:57:47 crc kubenswrapper[4732]: I0402 13:57:47.106729 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ntccm"
Apr 02 13:57:47 crc kubenswrapper[4732]: I0402 13:57:47.980943 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-58f79b884c-5q7cz"
Apr 02 13:57:48 crc kubenswrapper[4732]: I0402 13:57:48.031727 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ntccm"
Apr 02 13:57:48 crc kubenswrapper[4732]: I0402 13:57:48.095596 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ntccm"]
Apr 02 13:57:49 crc kubenswrapper[4732]: I0402 13:57:49.997912 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ntccm" podUID="27c40a49-d6e1-4a4e-8cd1-b2592a9e99e9" containerName="registry-server" containerID="cri-o://2e344d9529490f5f4ab7b9309992a9c868df06af2b5840419ae8f264eb998e54" gracePeriod=2
Apr 02 13:57:50 crc kubenswrapper[4732]: I0402 13:57:50.351135 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ntccm"
Apr 02 13:57:50 crc kubenswrapper[4732]: I0402 13:57:50.427156 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27c40a49-d6e1-4a4e-8cd1-b2592a9e99e9-utilities\") pod \"27c40a49-d6e1-4a4e-8cd1-b2592a9e99e9\" (UID: \"27c40a49-d6e1-4a4e-8cd1-b2592a9e99e9\") "
Apr 02 13:57:50 crc kubenswrapper[4732]: I0402 13:57:50.427205 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27c40a49-d6e1-4a4e-8cd1-b2592a9e99e9-catalog-content\") pod \"27c40a49-d6e1-4a4e-8cd1-b2592a9e99e9\" (UID: \"27c40a49-d6e1-4a4e-8cd1-b2592a9e99e9\") "
Apr 02 13:57:50 crc kubenswrapper[4732]: I0402 13:57:50.427237 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwbjv\" (UniqueName: \"kubernetes.io/projected/27c40a49-d6e1-4a4e-8cd1-b2592a9e99e9-kube-api-access-qwbjv\") pod \"27c40a49-d6e1-4a4e-8cd1-b2592a9e99e9\" (UID: \"27c40a49-d6e1-4a4e-8cd1-b2592a9e99e9\") "
Apr 02 13:57:50 crc kubenswrapper[4732]: I0402 13:57:50.427895 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27c40a49-d6e1-4a4e-8cd1-b2592a9e99e9-utilities" (OuterVolumeSpecName: "utilities") pod "27c40a49-d6e1-4a4e-8cd1-b2592a9e99e9" (UID: "27c40a49-d6e1-4a4e-8cd1-b2592a9e99e9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 02 13:57:50 crc kubenswrapper[4732]: I0402 13:57:50.431938 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27c40a49-d6e1-4a4e-8cd1-b2592a9e99e9-kube-api-access-qwbjv" (OuterVolumeSpecName: "kube-api-access-qwbjv") pod "27c40a49-d6e1-4a4e-8cd1-b2592a9e99e9" (UID: "27c40a49-d6e1-4a4e-8cd1-b2592a9e99e9"). InnerVolumeSpecName "kube-api-access-qwbjv".
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:57:50 crc kubenswrapper[4732]: I0402 13:57:50.478186 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27c40a49-d6e1-4a4e-8cd1-b2592a9e99e9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "27c40a49-d6e1-4a4e-8cd1-b2592a9e99e9" (UID: "27c40a49-d6e1-4a4e-8cd1-b2592a9e99e9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 13:57:50 crc kubenswrapper[4732]: I0402 13:57:50.528501 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27c40a49-d6e1-4a4e-8cd1-b2592a9e99e9-utilities\") on node \"crc\" DevicePath \"\"" Apr 02 13:57:50 crc kubenswrapper[4732]: I0402 13:57:50.528541 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27c40a49-d6e1-4a4e-8cd1-b2592a9e99e9-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 02 13:57:50 crc kubenswrapper[4732]: I0402 13:57:50.528554 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwbjv\" (UniqueName: \"kubernetes.io/projected/27c40a49-d6e1-4a4e-8cd1-b2592a9e99e9-kube-api-access-qwbjv\") on node \"crc\" DevicePath \"\"" Apr 02 13:57:50 crc kubenswrapper[4732]: I0402 13:57:50.655245 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-86644c9c9c-nhxqn" Apr 02 13:57:50 crc kubenswrapper[4732]: I0402 13:57:50.967488 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-5f96574b5-nbm76" Apr 02 13:57:51 crc kubenswrapper[4732]: I0402 13:57:51.006285 4732 generic.go:334] "Generic (PLEG): container finished" podID="27c40a49-d6e1-4a4e-8cd1-b2592a9e99e9" containerID="2e344d9529490f5f4ab7b9309992a9c868df06af2b5840419ae8f264eb998e54" exitCode=0 
Apr 02 13:57:51 crc kubenswrapper[4732]: I0402 13:57:51.006346 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ntccm" event={"ID":"27c40a49-d6e1-4a4e-8cd1-b2592a9e99e9","Type":"ContainerDied","Data":"2e344d9529490f5f4ab7b9309992a9c868df06af2b5840419ae8f264eb998e54"} Apr 02 13:57:51 crc kubenswrapper[4732]: I0402 13:57:51.006364 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ntccm" Apr 02 13:57:51 crc kubenswrapper[4732]: I0402 13:57:51.006382 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ntccm" event={"ID":"27c40a49-d6e1-4a4e-8cd1-b2592a9e99e9","Type":"ContainerDied","Data":"27c7e06ea572094adec22473853929b09f79c67d53cabfd3c2a25329e18730db"} Apr 02 13:57:51 crc kubenswrapper[4732]: I0402 13:57:51.006404 4732 scope.go:117] "RemoveContainer" containerID="2e344d9529490f5f4ab7b9309992a9c868df06af2b5840419ae8f264eb998e54" Apr 02 13:57:51 crc kubenswrapper[4732]: I0402 13:57:51.030446 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ntccm"] Apr 02 13:57:51 crc kubenswrapper[4732]: I0402 13:57:51.031417 4732 scope.go:117] "RemoveContainer" containerID="2870f349e386d21182b089f4cf77f1315343bcc18790ab8718880bbc39a248b3" Apr 02 13:57:51 crc kubenswrapper[4732]: I0402 13:57:51.038086 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ntccm"] Apr 02 13:57:51 crc kubenswrapper[4732]: I0402 13:57:51.048873 4732 scope.go:117] "RemoveContainer" containerID="0c28aad6e93ae6fa3cefb5e3e24bdcfbba4e826b3b5726a29358cc5b741e7a12" Apr 02 13:57:51 crc kubenswrapper[4732]: I0402 13:57:51.072941 4732 scope.go:117] "RemoveContainer" containerID="2e344d9529490f5f4ab7b9309992a9c868df06af2b5840419ae8f264eb998e54" Apr 02 13:57:51 crc kubenswrapper[4732]: E0402 13:57:51.073440 4732 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e344d9529490f5f4ab7b9309992a9c868df06af2b5840419ae8f264eb998e54\": container with ID starting with 2e344d9529490f5f4ab7b9309992a9c868df06af2b5840419ae8f264eb998e54 not found: ID does not exist" containerID="2e344d9529490f5f4ab7b9309992a9c868df06af2b5840419ae8f264eb998e54" Apr 02 13:57:51 crc kubenswrapper[4732]: I0402 13:57:51.073481 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e344d9529490f5f4ab7b9309992a9c868df06af2b5840419ae8f264eb998e54"} err="failed to get container status \"2e344d9529490f5f4ab7b9309992a9c868df06af2b5840419ae8f264eb998e54\": rpc error: code = NotFound desc = could not find container \"2e344d9529490f5f4ab7b9309992a9c868df06af2b5840419ae8f264eb998e54\": container with ID starting with 2e344d9529490f5f4ab7b9309992a9c868df06af2b5840419ae8f264eb998e54 not found: ID does not exist" Apr 02 13:57:51 crc kubenswrapper[4732]: I0402 13:57:51.073506 4732 scope.go:117] "RemoveContainer" containerID="2870f349e386d21182b089f4cf77f1315343bcc18790ab8718880bbc39a248b3" Apr 02 13:57:51 crc kubenswrapper[4732]: E0402 13:57:51.074144 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2870f349e386d21182b089f4cf77f1315343bcc18790ab8718880bbc39a248b3\": container with ID starting with 2870f349e386d21182b089f4cf77f1315343bcc18790ab8718880bbc39a248b3 not found: ID does not exist" containerID="2870f349e386d21182b089f4cf77f1315343bcc18790ab8718880bbc39a248b3" Apr 02 13:57:51 crc kubenswrapper[4732]: I0402 13:57:51.074179 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2870f349e386d21182b089f4cf77f1315343bcc18790ab8718880bbc39a248b3"} err="failed to get container status \"2870f349e386d21182b089f4cf77f1315343bcc18790ab8718880bbc39a248b3\": rpc error: code = NotFound desc = could not find 
container \"2870f349e386d21182b089f4cf77f1315343bcc18790ab8718880bbc39a248b3\": container with ID starting with 2870f349e386d21182b089f4cf77f1315343bcc18790ab8718880bbc39a248b3 not found: ID does not exist" Apr 02 13:57:51 crc kubenswrapper[4732]: I0402 13:57:51.074200 4732 scope.go:117] "RemoveContainer" containerID="0c28aad6e93ae6fa3cefb5e3e24bdcfbba4e826b3b5726a29358cc5b741e7a12" Apr 02 13:57:51 crc kubenswrapper[4732]: E0402 13:57:51.074626 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c28aad6e93ae6fa3cefb5e3e24bdcfbba4e826b3b5726a29358cc5b741e7a12\": container with ID starting with 0c28aad6e93ae6fa3cefb5e3e24bdcfbba4e826b3b5726a29358cc5b741e7a12 not found: ID does not exist" containerID="0c28aad6e93ae6fa3cefb5e3e24bdcfbba4e826b3b5726a29358cc5b741e7a12" Apr 02 13:57:51 crc kubenswrapper[4732]: I0402 13:57:51.074665 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c28aad6e93ae6fa3cefb5e3e24bdcfbba4e826b3b5726a29358cc5b741e7a12"} err="failed to get container status \"0c28aad6e93ae6fa3cefb5e3e24bdcfbba4e826b3b5726a29358cc5b741e7a12\": rpc error: code = NotFound desc = could not find container \"0c28aad6e93ae6fa3cefb5e3e24bdcfbba4e826b3b5726a29358cc5b741e7a12\": container with ID starting with 0c28aad6e93ae6fa3cefb5e3e24bdcfbba4e826b3b5726a29358cc5b741e7a12 not found: ID does not exist" Apr 02 13:57:51 crc kubenswrapper[4732]: I0402 13:57:51.104950 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-dbf8bb784-4kz5n" Apr 02 13:57:51 crc kubenswrapper[4732]: I0402 13:57:51.105488 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-6b7497dc59-ph5hk" Apr 02 13:57:51 crc kubenswrapper[4732]: I0402 13:57:51.140313 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/octavia-operator-controller-manager-7594f57946-2rvck" Apr 02 13:57:51 crc kubenswrapper[4732]: I0402 13:57:51.302871 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-fbdcf7f7b-bw2df" Apr 02 13:57:51 crc kubenswrapper[4732]: I0402 13:57:51.305340 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-fbdcf7f7b-bw2df" Apr 02 13:57:52 crc kubenswrapper[4732]: I0402 13:57:52.595480 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-58f79b884c-5q7cz" Apr 02 13:57:52 crc kubenswrapper[4732]: I0402 13:57:52.689108 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27c40a49-d6e1-4a4e-8cd1-b2592a9e99e9" path="/var/lib/kubelet/pods/27c40a49-d6e1-4a4e-8cd1-b2592a9e99e9/volumes" Apr 02 13:57:53 crc kubenswrapper[4732]: I0402 13:57:53.328673 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-5985877f6-hxnth" Apr 02 13:58:00 crc kubenswrapper[4732]: I0402 13:58:00.153477 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29585638-glv6j"] Apr 02 13:58:00 crc kubenswrapper[4732]: E0402 13:58:00.154382 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27c40a49-d6e1-4a4e-8cd1-b2592a9e99e9" containerName="registry-server" Apr 02 13:58:00 crc kubenswrapper[4732]: I0402 13:58:00.154396 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="27c40a49-d6e1-4a4e-8cd1-b2592a9e99e9" containerName="registry-server" Apr 02 13:58:00 crc kubenswrapper[4732]: E0402 13:58:00.154407 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e050bf65-a203-40f8-9445-fbdcb87b7e91" containerName="extract-utilities" Apr 02 13:58:00 crc 
kubenswrapper[4732]: I0402 13:58:00.154413 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="e050bf65-a203-40f8-9445-fbdcb87b7e91" containerName="extract-utilities" Apr 02 13:58:00 crc kubenswrapper[4732]: E0402 13:58:00.154423 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e050bf65-a203-40f8-9445-fbdcb87b7e91" containerName="extract-content" Apr 02 13:58:00 crc kubenswrapper[4732]: I0402 13:58:00.154430 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="e050bf65-a203-40f8-9445-fbdcb87b7e91" containerName="extract-content" Apr 02 13:58:00 crc kubenswrapper[4732]: E0402 13:58:00.154445 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e050bf65-a203-40f8-9445-fbdcb87b7e91" containerName="registry-server" Apr 02 13:58:00 crc kubenswrapper[4732]: I0402 13:58:00.154450 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="e050bf65-a203-40f8-9445-fbdcb87b7e91" containerName="registry-server" Apr 02 13:58:00 crc kubenswrapper[4732]: E0402 13:58:00.154461 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27c40a49-d6e1-4a4e-8cd1-b2592a9e99e9" containerName="extract-content" Apr 02 13:58:00 crc kubenswrapper[4732]: I0402 13:58:00.154467 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="27c40a49-d6e1-4a4e-8cd1-b2592a9e99e9" containerName="extract-content" Apr 02 13:58:00 crc kubenswrapper[4732]: E0402 13:58:00.154478 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27c40a49-d6e1-4a4e-8cd1-b2592a9e99e9" containerName="extract-utilities" Apr 02 13:58:00 crc kubenswrapper[4732]: I0402 13:58:00.154483 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="27c40a49-d6e1-4a4e-8cd1-b2592a9e99e9" containerName="extract-utilities" Apr 02 13:58:00 crc kubenswrapper[4732]: I0402 13:58:00.154626 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="e050bf65-a203-40f8-9445-fbdcb87b7e91" containerName="registry-server" Apr 02 13:58:00 crc 
kubenswrapper[4732]: I0402 13:58:00.154649 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="27c40a49-d6e1-4a4e-8cd1-b2592a9e99e9" containerName="registry-server" Apr 02 13:58:00 crc kubenswrapper[4732]: I0402 13:58:00.155090 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585638-glv6j" Apr 02 13:58:00 crc kubenswrapper[4732]: I0402 13:58:00.157040 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-69g42" Apr 02 13:58:00 crc kubenswrapper[4732]: I0402 13:58:00.157277 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 02 13:58:00 crc kubenswrapper[4732]: I0402 13:58:00.157485 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 02 13:58:00 crc kubenswrapper[4732]: I0402 13:58:00.159973 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585638-glv6j"] Apr 02 13:58:00 crc kubenswrapper[4732]: I0402 13:58:00.169266 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9prh\" (UniqueName: \"kubernetes.io/projected/e874a7dc-8608-47fc-bf14-3eefa3fe6e6f-kube-api-access-j9prh\") pod \"auto-csr-approver-29585638-glv6j\" (UID: \"e874a7dc-8608-47fc-bf14-3eefa3fe6e6f\") " pod="openshift-infra/auto-csr-approver-29585638-glv6j" Apr 02 13:58:00 crc kubenswrapper[4732]: I0402 13:58:00.270466 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9prh\" (UniqueName: \"kubernetes.io/projected/e874a7dc-8608-47fc-bf14-3eefa3fe6e6f-kube-api-access-j9prh\") pod \"auto-csr-approver-29585638-glv6j\" (UID: \"e874a7dc-8608-47fc-bf14-3eefa3fe6e6f\") " pod="openshift-infra/auto-csr-approver-29585638-glv6j" Apr 02 13:58:00 crc kubenswrapper[4732]: I0402 13:58:00.293060 
4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9prh\" (UniqueName: \"kubernetes.io/projected/e874a7dc-8608-47fc-bf14-3eefa3fe6e6f-kube-api-access-j9prh\") pod \"auto-csr-approver-29585638-glv6j\" (UID: \"e874a7dc-8608-47fc-bf14-3eefa3fe6e6f\") " pod="openshift-infra/auto-csr-approver-29585638-glv6j" Apr 02 13:58:00 crc kubenswrapper[4732]: I0402 13:58:00.473759 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585638-glv6j" Apr 02 13:58:00 crc kubenswrapper[4732]: I0402 13:58:00.885620 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585638-glv6j"] Apr 02 13:58:01 crc kubenswrapper[4732]: I0402 13:58:01.083737 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585638-glv6j" event={"ID":"e874a7dc-8608-47fc-bf14-3eefa3fe6e6f","Type":"ContainerStarted","Data":"577e76f92cedd213f03746969c1ce543b3ed666d6937cc74bba1a3af75cd0861"} Apr 02 13:58:03 crc kubenswrapper[4732]: I0402 13:58:03.103498 4732 generic.go:334] "Generic (PLEG): container finished" podID="e874a7dc-8608-47fc-bf14-3eefa3fe6e6f" containerID="f4dab2ddf2e82a3432454f31237a43e274f0990c5151a6264a973b31c88c0f1f" exitCode=0 Apr 02 13:58:03 crc kubenswrapper[4732]: I0402 13:58:03.103686 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585638-glv6j" event={"ID":"e874a7dc-8608-47fc-bf14-3eefa3fe6e6f","Type":"ContainerDied","Data":"f4dab2ddf2e82a3432454f31237a43e274f0990c5151a6264a973b31c88c0f1f"} Apr 02 13:58:04 crc kubenswrapper[4732]: I0402 13:58:04.442390 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29585638-glv6j" Apr 02 13:58:04 crc kubenswrapper[4732]: I0402 13:58:04.623530 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9prh\" (UniqueName: \"kubernetes.io/projected/e874a7dc-8608-47fc-bf14-3eefa3fe6e6f-kube-api-access-j9prh\") pod \"e874a7dc-8608-47fc-bf14-3eefa3fe6e6f\" (UID: \"e874a7dc-8608-47fc-bf14-3eefa3fe6e6f\") " Apr 02 13:58:04 crc kubenswrapper[4732]: I0402 13:58:04.628336 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e874a7dc-8608-47fc-bf14-3eefa3fe6e6f-kube-api-access-j9prh" (OuterVolumeSpecName: "kube-api-access-j9prh") pod "e874a7dc-8608-47fc-bf14-3eefa3fe6e6f" (UID: "e874a7dc-8608-47fc-bf14-3eefa3fe6e6f"). InnerVolumeSpecName "kube-api-access-j9prh". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:58:04 crc kubenswrapper[4732]: I0402 13:58:04.724984 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9prh\" (UniqueName: \"kubernetes.io/projected/e874a7dc-8608-47fc-bf14-3eefa3fe6e6f-kube-api-access-j9prh\") on node \"crc\" DevicePath \"\"" Apr 02 13:58:05 crc kubenswrapper[4732]: I0402 13:58:05.127222 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585638-glv6j" event={"ID":"e874a7dc-8608-47fc-bf14-3eefa3fe6e6f","Type":"ContainerDied","Data":"577e76f92cedd213f03746969c1ce543b3ed666d6937cc74bba1a3af75cd0861"} Apr 02 13:58:05 crc kubenswrapper[4732]: I0402 13:58:05.127277 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="577e76f92cedd213f03746969c1ce543b3ed666d6937cc74bba1a3af75cd0861" Apr 02 13:58:05 crc kubenswrapper[4732]: I0402 13:58:05.127348 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29585638-glv6j" Apr 02 13:58:05 crc kubenswrapper[4732]: I0402 13:58:05.514403 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29585632-sh6n2"] Apr 02 13:58:05 crc kubenswrapper[4732]: I0402 13:58:05.521760 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29585632-sh6n2"] Apr 02 13:58:06 crc kubenswrapper[4732]: I0402 13:58:06.688511 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd8b784e-9585-4fa8-b133-7c9b77ff167c" path="/var/lib/kubelet/pods/bd8b784e-9585-4fa8-b133-7c9b77ff167c/volumes" Apr 02 13:58:10 crc kubenswrapper[4732]: I0402 13:58:10.918683 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-rf4w6"] Apr 02 13:58:10 crc kubenswrapper[4732]: E0402 13:58:10.921284 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e874a7dc-8608-47fc-bf14-3eefa3fe6e6f" containerName="oc" Apr 02 13:58:10 crc kubenswrapper[4732]: I0402 13:58:10.921312 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="e874a7dc-8608-47fc-bf14-3eefa3fe6e6f" containerName="oc" Apr 02 13:58:10 crc kubenswrapper[4732]: I0402 13:58:10.921520 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="e874a7dc-8608-47fc-bf14-3eefa3fe6e6f" containerName="oc" Apr 02 13:58:10 crc kubenswrapper[4732]: I0402 13:58:10.922736 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-rf4w6" Apr 02 13:58:10 crc kubenswrapper[4732]: I0402 13:58:10.925690 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Apr 02 13:58:10 crc kubenswrapper[4732]: I0402 13:58:10.925821 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Apr 02 13:58:10 crc kubenswrapper[4732]: I0402 13:58:10.928968 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-rf4w6"] Apr 02 13:58:10 crc kubenswrapper[4732]: I0402 13:58:10.931350 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Apr 02 13:58:10 crc kubenswrapper[4732]: I0402 13:58:10.931543 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-hvmxw" Apr 02 13:58:10 crc kubenswrapper[4732]: I0402 13:58:10.972028 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-8pjqd"] Apr 02 13:58:10 crc kubenswrapper[4732]: I0402 13:58:10.973340 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-8pjqd" Apr 02 13:58:10 crc kubenswrapper[4732]: I0402 13:58:10.976463 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Apr 02 13:58:10 crc kubenswrapper[4732]: I0402 13:58:10.986984 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-8pjqd"] Apr 02 13:58:11 crc kubenswrapper[4732]: I0402 13:58:11.016781 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/81c6acdb-a950-4cc4-a77d-dff56acc00c9-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-8pjqd\" (UID: \"81c6acdb-a950-4cc4-a77d-dff56acc00c9\") " pod="openstack/dnsmasq-dns-78dd6ddcc-8pjqd" Apr 02 13:58:11 crc kubenswrapper[4732]: I0402 13:58:11.016823 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/543e9524-dc5c-4f94-8577-3fa8493c61d4-config\") pod \"dnsmasq-dns-675f4bcbfc-rf4w6\" (UID: \"543e9524-dc5c-4f94-8577-3fa8493c61d4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-rf4w6" Apr 02 13:58:11 crc kubenswrapper[4732]: I0402 13:58:11.016843 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fd4rd\" (UniqueName: \"kubernetes.io/projected/543e9524-dc5c-4f94-8577-3fa8493c61d4-kube-api-access-fd4rd\") pod \"dnsmasq-dns-675f4bcbfc-rf4w6\" (UID: \"543e9524-dc5c-4f94-8577-3fa8493c61d4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-rf4w6" Apr 02 13:58:11 crc kubenswrapper[4732]: I0402 13:58:11.016893 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81c6acdb-a950-4cc4-a77d-dff56acc00c9-config\") pod \"dnsmasq-dns-78dd6ddcc-8pjqd\" (UID: \"81c6acdb-a950-4cc4-a77d-dff56acc00c9\") " pod="openstack/dnsmasq-dns-78dd6ddcc-8pjqd" Apr 02 13:58:11 
crc kubenswrapper[4732]: I0402 13:58:11.017067 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhhpn\" (UniqueName: \"kubernetes.io/projected/81c6acdb-a950-4cc4-a77d-dff56acc00c9-kube-api-access-bhhpn\") pod \"dnsmasq-dns-78dd6ddcc-8pjqd\" (UID: \"81c6acdb-a950-4cc4-a77d-dff56acc00c9\") " pod="openstack/dnsmasq-dns-78dd6ddcc-8pjqd" Apr 02 13:58:11 crc kubenswrapper[4732]: I0402 13:58:11.118152 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81c6acdb-a950-4cc4-a77d-dff56acc00c9-config\") pod \"dnsmasq-dns-78dd6ddcc-8pjqd\" (UID: \"81c6acdb-a950-4cc4-a77d-dff56acc00c9\") " pod="openstack/dnsmasq-dns-78dd6ddcc-8pjqd" Apr 02 13:58:11 crc kubenswrapper[4732]: I0402 13:58:11.118253 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhhpn\" (UniqueName: \"kubernetes.io/projected/81c6acdb-a950-4cc4-a77d-dff56acc00c9-kube-api-access-bhhpn\") pod \"dnsmasq-dns-78dd6ddcc-8pjqd\" (UID: \"81c6acdb-a950-4cc4-a77d-dff56acc00c9\") " pod="openstack/dnsmasq-dns-78dd6ddcc-8pjqd" Apr 02 13:58:11 crc kubenswrapper[4732]: I0402 13:58:11.118313 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/81c6acdb-a950-4cc4-a77d-dff56acc00c9-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-8pjqd\" (UID: \"81c6acdb-a950-4cc4-a77d-dff56acc00c9\") " pod="openstack/dnsmasq-dns-78dd6ddcc-8pjqd" Apr 02 13:58:11 crc kubenswrapper[4732]: I0402 13:58:11.118343 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/543e9524-dc5c-4f94-8577-3fa8493c61d4-config\") pod \"dnsmasq-dns-675f4bcbfc-rf4w6\" (UID: \"543e9524-dc5c-4f94-8577-3fa8493c61d4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-rf4w6" Apr 02 13:58:11 crc kubenswrapper[4732]: I0402 13:58:11.118369 4732 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fd4rd\" (UniqueName: \"kubernetes.io/projected/543e9524-dc5c-4f94-8577-3fa8493c61d4-kube-api-access-fd4rd\") pod \"dnsmasq-dns-675f4bcbfc-rf4w6\" (UID: \"543e9524-dc5c-4f94-8577-3fa8493c61d4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-rf4w6" Apr 02 13:58:11 crc kubenswrapper[4732]: I0402 13:58:11.119345 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/81c6acdb-a950-4cc4-a77d-dff56acc00c9-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-8pjqd\" (UID: \"81c6acdb-a950-4cc4-a77d-dff56acc00c9\") " pod="openstack/dnsmasq-dns-78dd6ddcc-8pjqd" Apr 02 13:58:11 crc kubenswrapper[4732]: I0402 13:58:11.119516 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/543e9524-dc5c-4f94-8577-3fa8493c61d4-config\") pod \"dnsmasq-dns-675f4bcbfc-rf4w6\" (UID: \"543e9524-dc5c-4f94-8577-3fa8493c61d4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-rf4w6" Apr 02 13:58:11 crc kubenswrapper[4732]: I0402 13:58:11.119574 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81c6acdb-a950-4cc4-a77d-dff56acc00c9-config\") pod \"dnsmasq-dns-78dd6ddcc-8pjqd\" (UID: \"81c6acdb-a950-4cc4-a77d-dff56acc00c9\") " pod="openstack/dnsmasq-dns-78dd6ddcc-8pjqd" Apr 02 13:58:11 crc kubenswrapper[4732]: I0402 13:58:11.142070 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fd4rd\" (UniqueName: \"kubernetes.io/projected/543e9524-dc5c-4f94-8577-3fa8493c61d4-kube-api-access-fd4rd\") pod \"dnsmasq-dns-675f4bcbfc-rf4w6\" (UID: \"543e9524-dc5c-4f94-8577-3fa8493c61d4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-rf4w6" Apr 02 13:58:11 crc kubenswrapper[4732]: I0402 13:58:11.144026 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhhpn\" (UniqueName: 
\"kubernetes.io/projected/81c6acdb-a950-4cc4-a77d-dff56acc00c9-kube-api-access-bhhpn\") pod \"dnsmasq-dns-78dd6ddcc-8pjqd\" (UID: \"81c6acdb-a950-4cc4-a77d-dff56acc00c9\") " pod="openstack/dnsmasq-dns-78dd6ddcc-8pjqd" Apr 02 13:58:11 crc kubenswrapper[4732]: I0402 13:58:11.253359 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-rf4w6" Apr 02 13:58:11 crc kubenswrapper[4732]: I0402 13:58:11.299034 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-8pjqd" Apr 02 13:58:11 crc kubenswrapper[4732]: I0402 13:58:11.488971 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-rf4w6"] Apr 02 13:58:11 crc kubenswrapper[4732]: I0402 13:58:11.568374 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-8pjqd"] Apr 02 13:58:12 crc kubenswrapper[4732]: I0402 13:58:12.175319 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-rf4w6" event={"ID":"543e9524-dc5c-4f94-8577-3fa8493c61d4","Type":"ContainerStarted","Data":"64eb3a43004c2166defb609e0bafcd2069de40ea6582f67a3b40535bc5348538"} Apr 02 13:58:12 crc kubenswrapper[4732]: I0402 13:58:12.176260 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-8pjqd" event={"ID":"81c6acdb-a950-4cc4-a77d-dff56acc00c9","Type":"ContainerStarted","Data":"d1f114bde1305d650651f11ceef64292411176f8d0c4539e57e115e41386016c"} Apr 02 13:58:13 crc kubenswrapper[4732]: I0402 13:58:13.419896 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-rf4w6"] Apr 02 13:58:13 crc kubenswrapper[4732]: I0402 13:58:13.446788 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-285k7"] Apr 02 13:58:13 crc kubenswrapper[4732]: I0402 13:58:13.450397 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-285k7"
Apr 02 13:58:13 crc kubenswrapper[4732]: I0402 13:58:13.455405 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-285k7"]
Apr 02 13:58:13 crc kubenswrapper[4732]: I0402 13:58:13.651115 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9rbj\" (UniqueName: \"kubernetes.io/projected/97c8b10a-f931-4b1d-967b-3c3eeac5ca4c-kube-api-access-s9rbj\") pod \"dnsmasq-dns-666b6646f7-285k7\" (UID: \"97c8b10a-f931-4b1d-967b-3c3eeac5ca4c\") " pod="openstack/dnsmasq-dns-666b6646f7-285k7"
Apr 02 13:58:13 crc kubenswrapper[4732]: I0402 13:58:13.651212 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97c8b10a-f931-4b1d-967b-3c3eeac5ca4c-config\") pod \"dnsmasq-dns-666b6646f7-285k7\" (UID: \"97c8b10a-f931-4b1d-967b-3c3eeac5ca4c\") " pod="openstack/dnsmasq-dns-666b6646f7-285k7"
Apr 02 13:58:13 crc kubenswrapper[4732]: I0402 13:58:13.651322 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/97c8b10a-f931-4b1d-967b-3c3eeac5ca4c-dns-svc\") pod \"dnsmasq-dns-666b6646f7-285k7\" (UID: \"97c8b10a-f931-4b1d-967b-3c3eeac5ca4c\") " pod="openstack/dnsmasq-dns-666b6646f7-285k7"
Apr 02 13:58:13 crc kubenswrapper[4732]: I0402 13:58:13.753735 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/97c8b10a-f931-4b1d-967b-3c3eeac5ca4c-dns-svc\") pod \"dnsmasq-dns-666b6646f7-285k7\" (UID: \"97c8b10a-f931-4b1d-967b-3c3eeac5ca4c\") " pod="openstack/dnsmasq-dns-666b6646f7-285k7"
Apr 02 13:58:13 crc kubenswrapper[4732]: I0402 13:58:13.753893 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9rbj\" (UniqueName: \"kubernetes.io/projected/97c8b10a-f931-4b1d-967b-3c3eeac5ca4c-kube-api-access-s9rbj\") pod \"dnsmasq-dns-666b6646f7-285k7\" (UID: \"97c8b10a-f931-4b1d-967b-3c3eeac5ca4c\") " pod="openstack/dnsmasq-dns-666b6646f7-285k7"
Apr 02 13:58:13 crc kubenswrapper[4732]: I0402 13:58:13.754021 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97c8b10a-f931-4b1d-967b-3c3eeac5ca4c-config\") pod \"dnsmasq-dns-666b6646f7-285k7\" (UID: \"97c8b10a-f931-4b1d-967b-3c3eeac5ca4c\") " pod="openstack/dnsmasq-dns-666b6646f7-285k7"
Apr 02 13:58:13 crc kubenswrapper[4732]: I0402 13:58:13.755782 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/97c8b10a-f931-4b1d-967b-3c3eeac5ca4c-dns-svc\") pod \"dnsmasq-dns-666b6646f7-285k7\" (UID: \"97c8b10a-f931-4b1d-967b-3c3eeac5ca4c\") " pod="openstack/dnsmasq-dns-666b6646f7-285k7"
Apr 02 13:58:13 crc kubenswrapper[4732]: I0402 13:58:13.756863 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97c8b10a-f931-4b1d-967b-3c3eeac5ca4c-config\") pod \"dnsmasq-dns-666b6646f7-285k7\" (UID: \"97c8b10a-f931-4b1d-967b-3c3eeac5ca4c\") " pod="openstack/dnsmasq-dns-666b6646f7-285k7"
Apr 02 13:58:13 crc kubenswrapper[4732]: I0402 13:58:13.786316 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9rbj\" (UniqueName: \"kubernetes.io/projected/97c8b10a-f931-4b1d-967b-3c3eeac5ca4c-kube-api-access-s9rbj\") pod \"dnsmasq-dns-666b6646f7-285k7\" (UID: \"97c8b10a-f931-4b1d-967b-3c3eeac5ca4c\") " pod="openstack/dnsmasq-dns-666b6646f7-285k7"
Apr 02 13:58:13 crc kubenswrapper[4732]: I0402 13:58:13.787001 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-8pjqd"]
Apr 02 13:58:13 crc kubenswrapper[4732]: I0402 13:58:13.816741 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-rjf6j"]
Apr 02 13:58:13 crc kubenswrapper[4732]: I0402 13:58:13.818027 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-rjf6j"
Apr 02 13:58:13 crc kubenswrapper[4732]: I0402 13:58:13.851784 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-rjf6j"]
Apr 02 13:58:13 crc kubenswrapper[4732]: I0402 13:58:13.956748 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38a12349-b9ff-4123-ba3b-96edc0cf2bc6-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-rjf6j\" (UID: \"38a12349-b9ff-4123-ba3b-96edc0cf2bc6\") " pod="openstack/dnsmasq-dns-57d769cc4f-rjf6j"
Apr 02 13:58:13 crc kubenswrapper[4732]: I0402 13:58:13.956946 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh982\" (UniqueName: \"kubernetes.io/projected/38a12349-b9ff-4123-ba3b-96edc0cf2bc6-kube-api-access-gh982\") pod \"dnsmasq-dns-57d769cc4f-rjf6j\" (UID: \"38a12349-b9ff-4123-ba3b-96edc0cf2bc6\") " pod="openstack/dnsmasq-dns-57d769cc4f-rjf6j"
Apr 02 13:58:13 crc kubenswrapper[4732]: I0402 13:58:13.957028 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38a12349-b9ff-4123-ba3b-96edc0cf2bc6-config\") pod \"dnsmasq-dns-57d769cc4f-rjf6j\" (UID: \"38a12349-b9ff-4123-ba3b-96edc0cf2bc6\") " pod="openstack/dnsmasq-dns-57d769cc4f-rjf6j"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.058324 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38a12349-b9ff-4123-ba3b-96edc0cf2bc6-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-rjf6j\" (UID: \"38a12349-b9ff-4123-ba3b-96edc0cf2bc6\") " pod="openstack/dnsmasq-dns-57d769cc4f-rjf6j"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.059216 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gh982\" (UniqueName: \"kubernetes.io/projected/38a12349-b9ff-4123-ba3b-96edc0cf2bc6-kube-api-access-gh982\") pod \"dnsmasq-dns-57d769cc4f-rjf6j\" (UID: \"38a12349-b9ff-4123-ba3b-96edc0cf2bc6\") " pod="openstack/dnsmasq-dns-57d769cc4f-rjf6j"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.059261 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38a12349-b9ff-4123-ba3b-96edc0cf2bc6-config\") pod \"dnsmasq-dns-57d769cc4f-rjf6j\" (UID: \"38a12349-b9ff-4123-ba3b-96edc0cf2bc6\") " pod="openstack/dnsmasq-dns-57d769cc4f-rjf6j"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.059904 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38a12349-b9ff-4123-ba3b-96edc0cf2bc6-config\") pod \"dnsmasq-dns-57d769cc4f-rjf6j\" (UID: \"38a12349-b9ff-4123-ba3b-96edc0cf2bc6\") " pod="openstack/dnsmasq-dns-57d769cc4f-rjf6j"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.059140 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38a12349-b9ff-4123-ba3b-96edc0cf2bc6-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-rjf6j\" (UID: \"38a12349-b9ff-4123-ba3b-96edc0cf2bc6\") " pod="openstack/dnsmasq-dns-57d769cc4f-rjf6j"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.078303 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-285k7"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.079437 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh982\" (UniqueName: \"kubernetes.io/projected/38a12349-b9ff-4123-ba3b-96edc0cf2bc6-kube-api-access-gh982\") pod \"dnsmasq-dns-57d769cc4f-rjf6j\" (UID: \"38a12349-b9ff-4123-ba3b-96edc0cf2bc6\") " pod="openstack/dnsmasq-dns-57d769cc4f-rjf6j"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.143691 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-rjf6j"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.431229 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-rjf6j"]
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.438795 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.440325 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.442493 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.442828 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.442860 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.443272 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.443526 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.444213 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.445167 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-smt97"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.451561 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.555341 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-285k7"]
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.564986 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/56762f05-a513-4f47-8cf7-5d19bb58c5bd-server-conf\") pod \"rabbitmq-server-0\" (UID: \"56762f05-a513-4f47-8cf7-5d19bb58c5bd\") " pod="openstack/rabbitmq-server-0"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.565115 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/56762f05-a513-4f47-8cf7-5d19bb58c5bd-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"56762f05-a513-4f47-8cf7-5d19bb58c5bd\") " pod="openstack/rabbitmq-server-0"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.565148 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/56762f05-a513-4f47-8cf7-5d19bb58c5bd-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"56762f05-a513-4f47-8cf7-5d19bb58c5bd\") " pod="openstack/rabbitmq-server-0"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.565251 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"56762f05-a513-4f47-8cf7-5d19bb58c5bd\") " pod="openstack/rabbitmq-server-0"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.565307 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/56762f05-a513-4f47-8cf7-5d19bb58c5bd-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"56762f05-a513-4f47-8cf7-5d19bb58c5bd\") " pod="openstack/rabbitmq-server-0"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.565333 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/56762f05-a513-4f47-8cf7-5d19bb58c5bd-pod-info\") pod \"rabbitmq-server-0\" (UID: \"56762f05-a513-4f47-8cf7-5d19bb58c5bd\") " pod="openstack/rabbitmq-server-0"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.565355 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/56762f05-a513-4f47-8cf7-5d19bb58c5bd-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"56762f05-a513-4f47-8cf7-5d19bb58c5bd\") " pod="openstack/rabbitmq-server-0"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.565412 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/56762f05-a513-4f47-8cf7-5d19bb58c5bd-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"56762f05-a513-4f47-8cf7-5d19bb58c5bd\") " pod="openstack/rabbitmq-server-0"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.565443 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56762f05-a513-4f47-8cf7-5d19bb58c5bd-config-data\") pod \"rabbitmq-server-0\" (UID: \"56762f05-a513-4f47-8cf7-5d19bb58c5bd\") " pod="openstack/rabbitmq-server-0"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.565514 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnhpf\" (UniqueName: \"kubernetes.io/projected/56762f05-a513-4f47-8cf7-5d19bb58c5bd-kube-api-access-dnhpf\") pod \"rabbitmq-server-0\" (UID: \"56762f05-a513-4f47-8cf7-5d19bb58c5bd\") " pod="openstack/rabbitmq-server-0"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.565549 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/56762f05-a513-4f47-8cf7-5d19bb58c5bd-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"56762f05-a513-4f47-8cf7-5d19bb58c5bd\") " pod="openstack/rabbitmq-server-0"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.667293 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/56762f05-a513-4f47-8cf7-5d19bb58c5bd-server-conf\") pod \"rabbitmq-server-0\" (UID: \"56762f05-a513-4f47-8cf7-5d19bb58c5bd\") " pod="openstack/rabbitmq-server-0"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.667365 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/56762f05-a513-4f47-8cf7-5d19bb58c5bd-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"56762f05-a513-4f47-8cf7-5d19bb58c5bd\") " pod="openstack/rabbitmq-server-0"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.667402 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/56762f05-a513-4f47-8cf7-5d19bb58c5bd-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"56762f05-a513-4f47-8cf7-5d19bb58c5bd\") " pod="openstack/rabbitmq-server-0"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.667436 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"56762f05-a513-4f47-8cf7-5d19bb58c5bd\") " pod="openstack/rabbitmq-server-0"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.667459 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/56762f05-a513-4f47-8cf7-5d19bb58c5bd-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"56762f05-a513-4f47-8cf7-5d19bb58c5bd\") " pod="openstack/rabbitmq-server-0"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.667479 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/56762f05-a513-4f47-8cf7-5d19bb58c5bd-pod-info\") pod \"rabbitmq-server-0\" (UID: \"56762f05-a513-4f47-8cf7-5d19bb58c5bd\") " pod="openstack/rabbitmq-server-0"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.667501 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/56762f05-a513-4f47-8cf7-5d19bb58c5bd-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"56762f05-a513-4f47-8cf7-5d19bb58c5bd\") " pod="openstack/rabbitmq-server-0"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.667537 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/56762f05-a513-4f47-8cf7-5d19bb58c5bd-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"56762f05-a513-4f47-8cf7-5d19bb58c5bd\") " pod="openstack/rabbitmq-server-0"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.667561 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56762f05-a513-4f47-8cf7-5d19bb58c5bd-config-data\") pod \"rabbitmq-server-0\" (UID: \"56762f05-a513-4f47-8cf7-5d19bb58c5bd\") " pod="openstack/rabbitmq-server-0"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.667608 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnhpf\" (UniqueName: \"kubernetes.io/projected/56762f05-a513-4f47-8cf7-5d19bb58c5bd-kube-api-access-dnhpf\") pod \"rabbitmq-server-0\" (UID: \"56762f05-a513-4f47-8cf7-5d19bb58c5bd\") " pod="openstack/rabbitmq-server-0"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.667656 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/56762f05-a513-4f47-8cf7-5d19bb58c5bd-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"56762f05-a513-4f47-8cf7-5d19bb58c5bd\") " pod="openstack/rabbitmq-server-0"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.668513 4732 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"56762f05-a513-4f47-8cf7-5d19bb58c5bd\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.668898 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/56762f05-a513-4f47-8cf7-5d19bb58c5bd-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"56762f05-a513-4f47-8cf7-5d19bb58c5bd\") " pod="openstack/rabbitmq-server-0"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.669306 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56762f05-a513-4f47-8cf7-5d19bb58c5bd-config-data\") pod \"rabbitmq-server-0\" (UID: \"56762f05-a513-4f47-8cf7-5d19bb58c5bd\") " pod="openstack/rabbitmq-server-0"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.669825 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/56762f05-a513-4f47-8cf7-5d19bb58c5bd-server-conf\") pod \"rabbitmq-server-0\" (UID: \"56762f05-a513-4f47-8cf7-5d19bb58c5bd\") " pod="openstack/rabbitmq-server-0"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.671160 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/56762f05-a513-4f47-8cf7-5d19bb58c5bd-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"56762f05-a513-4f47-8cf7-5d19bb58c5bd\") " pod="openstack/rabbitmq-server-0"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.673447 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/56762f05-a513-4f47-8cf7-5d19bb58c5bd-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"56762f05-a513-4f47-8cf7-5d19bb58c5bd\") " pod="openstack/rabbitmq-server-0"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.674259 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/56762f05-a513-4f47-8cf7-5d19bb58c5bd-pod-info\") pod \"rabbitmq-server-0\" (UID: \"56762f05-a513-4f47-8cf7-5d19bb58c5bd\") " pod="openstack/rabbitmq-server-0"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.674554 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/56762f05-a513-4f47-8cf7-5d19bb58c5bd-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"56762f05-a513-4f47-8cf7-5d19bb58c5bd\") " pod="openstack/rabbitmq-server-0"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.674562 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/56762f05-a513-4f47-8cf7-5d19bb58c5bd-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"56762f05-a513-4f47-8cf7-5d19bb58c5bd\") " pod="openstack/rabbitmq-server-0"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.675102 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/56762f05-a513-4f47-8cf7-5d19bb58c5bd-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"56762f05-a513-4f47-8cf7-5d19bb58c5bd\") " pod="openstack/rabbitmq-server-0"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.688148 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnhpf\" (UniqueName: \"kubernetes.io/projected/56762f05-a513-4f47-8cf7-5d19bb58c5bd-kube-api-access-dnhpf\") pod \"rabbitmq-server-0\" (UID: \"56762f05-a513-4f47-8cf7-5d19bb58c5bd\") " pod="openstack/rabbitmq-server-0"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.694609 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"56762f05-a513-4f47-8cf7-5d19bb58c5bd\") " pod="openstack/rabbitmq-server-0"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.765368 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.765390 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.766943 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.769699 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.770058 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.770061 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.770301 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.770399 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.770918 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-d2wpm"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.771125 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.775240 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.871366 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b0bb93d2-9da7-4667-9079-b403332d31e0-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b0bb93d2-9da7-4667-9079-b403332d31e0\") " pod="openstack/rabbitmq-cell1-server-0"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.871570 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b0bb93d2-9da7-4667-9079-b403332d31e0-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b0bb93d2-9da7-4667-9079-b403332d31e0\") " pod="openstack/rabbitmq-cell1-server-0"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.871741 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b0bb93d2-9da7-4667-9079-b403332d31e0\") " pod="openstack/rabbitmq-cell1-server-0"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.871809 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b0bb93d2-9da7-4667-9079-b403332d31e0-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b0bb93d2-9da7-4667-9079-b403332d31e0\") " pod="openstack/rabbitmq-cell1-server-0"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.871912 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b0bb93d2-9da7-4667-9079-b403332d31e0-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b0bb93d2-9da7-4667-9079-b403332d31e0\") " pod="openstack/rabbitmq-cell1-server-0"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.871998 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b0bb93d2-9da7-4667-9079-b403332d31e0-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b0bb93d2-9da7-4667-9079-b403332d31e0\") " pod="openstack/rabbitmq-cell1-server-0"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.872051 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b0bb93d2-9da7-4667-9079-b403332d31e0-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b0bb93d2-9da7-4667-9079-b403332d31e0\") " pod="openstack/rabbitmq-cell1-server-0"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.872091 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b0bb93d2-9da7-4667-9079-b403332d31e0-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b0bb93d2-9da7-4667-9079-b403332d31e0\") " pod="openstack/rabbitmq-cell1-server-0"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.872161 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b0bb93d2-9da7-4667-9079-b403332d31e0-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b0bb93d2-9da7-4667-9079-b403332d31e0\") " pod="openstack/rabbitmq-cell1-server-0"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.872244 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mb6bz\" (UniqueName: \"kubernetes.io/projected/b0bb93d2-9da7-4667-9079-b403332d31e0-kube-api-access-mb6bz\") pod \"rabbitmq-cell1-server-0\" (UID: \"b0bb93d2-9da7-4667-9079-b403332d31e0\") " pod="openstack/rabbitmq-cell1-server-0"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.872314 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b0bb93d2-9da7-4667-9079-b403332d31e0-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b0bb93d2-9da7-4667-9079-b403332d31e0\") " pod="openstack/rabbitmq-cell1-server-0"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.974014 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b0bb93d2-9da7-4667-9079-b403332d31e0-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b0bb93d2-9da7-4667-9079-b403332d31e0\") " pod="openstack/rabbitmq-cell1-server-0"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.974067 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b0bb93d2-9da7-4667-9079-b403332d31e0-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b0bb93d2-9da7-4667-9079-b403332d31e0\") " pod="openstack/rabbitmq-cell1-server-0"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.974094 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b0bb93d2-9da7-4667-9079-b403332d31e0-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b0bb93d2-9da7-4667-9079-b403332d31e0\") " pod="openstack/rabbitmq-cell1-server-0"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.974118 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b0bb93d2-9da7-4667-9079-b403332d31e0-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b0bb93d2-9da7-4667-9079-b403332d31e0\") " pod="openstack/rabbitmq-cell1-server-0"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.974155 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mb6bz\" (UniqueName: \"kubernetes.io/projected/b0bb93d2-9da7-4667-9079-b403332d31e0-kube-api-access-mb6bz\") pod \"rabbitmq-cell1-server-0\" (UID: \"b0bb93d2-9da7-4667-9079-b403332d31e0\") " pod="openstack/rabbitmq-cell1-server-0"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.974183 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b0bb93d2-9da7-4667-9079-b403332d31e0-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b0bb93d2-9da7-4667-9079-b403332d31e0\") " pod="openstack/rabbitmq-cell1-server-0"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.974242 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b0bb93d2-9da7-4667-9079-b403332d31e0-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b0bb93d2-9da7-4667-9079-b403332d31e0\") " pod="openstack/rabbitmq-cell1-server-0"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.974267 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b0bb93d2-9da7-4667-9079-b403332d31e0\") " pod="openstack/rabbitmq-cell1-server-0"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.974290 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b0bb93d2-9da7-4667-9079-b403332d31e0-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b0bb93d2-9da7-4667-9079-b403332d31e0\") " pod="openstack/rabbitmq-cell1-server-0"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.974324 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b0bb93d2-9da7-4667-9079-b403332d31e0-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b0bb93d2-9da7-4667-9079-b403332d31e0\") " pod="openstack/rabbitmq-cell1-server-0"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.974355 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b0bb93d2-9da7-4667-9079-b403332d31e0-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b0bb93d2-9da7-4667-9079-b403332d31e0\") " pod="openstack/rabbitmq-cell1-server-0"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.974784 4732 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b0bb93d2-9da7-4667-9079-b403332d31e0\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/rabbitmq-cell1-server-0"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.976942 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b0bb93d2-9da7-4667-9079-b403332d31e0-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b0bb93d2-9da7-4667-9079-b403332d31e0\") " pod="openstack/rabbitmq-cell1-server-0"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.976861 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b0bb93d2-9da7-4667-9079-b403332d31e0-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b0bb93d2-9da7-4667-9079-b403332d31e0\") " pod="openstack/rabbitmq-cell1-server-0"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.976020 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b0bb93d2-9da7-4667-9079-b403332d31e0-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b0bb93d2-9da7-4667-9079-b403332d31e0\") " pod="openstack/rabbitmq-cell1-server-0"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.978317 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b0bb93d2-9da7-4667-9079-b403332d31e0-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b0bb93d2-9da7-4667-9079-b403332d31e0\") " pod="openstack/rabbitmq-cell1-server-0"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.978629 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b0bb93d2-9da7-4667-9079-b403332d31e0-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b0bb93d2-9da7-4667-9079-b403332d31e0\") " pod="openstack/rabbitmq-cell1-server-0"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.980776 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b0bb93d2-9da7-4667-9079-b403332d31e0-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b0bb93d2-9da7-4667-9079-b403332d31e0\") " pod="openstack/rabbitmq-cell1-server-0"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.981005 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b0bb93d2-9da7-4667-9079-b403332d31e0-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b0bb93d2-9da7-4667-9079-b403332d31e0\") " pod="openstack/rabbitmq-cell1-server-0"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.984392 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b0bb93d2-9da7-4667-9079-b403332d31e0-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b0bb93d2-9da7-4667-9079-b403332d31e0\") " pod="openstack/rabbitmq-cell1-server-0"
Apr 02 13:58:14 crc kubenswrapper[4732]: I0402 13:58:14.985699 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b0bb93d2-9da7-4667-9079-b403332d31e0-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b0bb93d2-9da7-4667-9079-b403332d31e0\") " pod="openstack/rabbitmq-cell1-server-0"
Apr 02 13:58:15 crc kubenswrapper[4732]: I0402 13:58:15.006187 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mb6bz\" (UniqueName: \"kubernetes.io/projected/b0bb93d2-9da7-4667-9079-b403332d31e0-kube-api-access-mb6bz\") pod \"rabbitmq-cell1-server-0\" (UID: \"b0bb93d2-9da7-4667-9079-b403332d31e0\") " pod="openstack/rabbitmq-cell1-server-0"
Apr 02 13:58:15 crc kubenswrapper[4732]: I0402 13:58:15.013692 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b0bb93d2-9da7-4667-9079-b403332d31e0\") " pod="openstack/rabbitmq-cell1-server-0"
Apr 02 13:58:15 crc kubenswrapper[4732]: I0402 13:58:15.091589 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Apr 02 13:58:15 crc kubenswrapper[4732]: I0402 13:58:15.232059 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-rjf6j" event={"ID":"38a12349-b9ff-4123-ba3b-96edc0cf2bc6","Type":"ContainerStarted","Data":"3e787762a53d8a1b1175df1be8e3c94d07093c61c7dc9dfe099962875a02710e"}
Apr 02 13:58:16 crc kubenswrapper[4732]: I0402 13:58:16.334444 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"]
Apr 02 13:58:16 crc kubenswrapper[4732]: I0402 13:58:16.336727 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Apr 02 13:58:16 crc kubenswrapper[4732]: I0402 13:58:16.340795 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Apr 02 13:58:16 crc kubenswrapper[4732]: I0402 13:58:16.341985 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Apr 02 13:58:16 crc kubenswrapper[4732]: I0402 13:58:16.348772 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-p96nn" Apr 02 13:58:16 crc kubenswrapper[4732]: I0402 13:58:16.348977 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Apr 02 13:58:16 crc kubenswrapper[4732]: I0402 13:58:16.349378 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Apr 02 13:58:16 crc kubenswrapper[4732]: I0402 13:58:16.351017 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Apr 02 13:58:16 crc kubenswrapper[4732]: I0402 13:58:16.496659 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"2fbe66fb-6f02-432d-8acf-50fec5339d96\") " pod="openstack/openstack-galera-0" Apr 02 13:58:16 crc kubenswrapper[4732]: I0402 13:58:16.496714 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2fbe66fb-6f02-432d-8acf-50fec5339d96-operator-scripts\") pod \"openstack-galera-0\" (UID: \"2fbe66fb-6f02-432d-8acf-50fec5339d96\") " pod="openstack/openstack-galera-0" Apr 02 13:58:16 crc kubenswrapper[4732]: I0402 13:58:16.496742 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2fbe66fb-6f02-432d-8acf-50fec5339d96-config-data-default\") pod \"openstack-galera-0\" (UID: \"2fbe66fb-6f02-432d-8acf-50fec5339d96\") " pod="openstack/openstack-galera-0" Apr 02 13:58:16 crc kubenswrapper[4732]: I0402 13:58:16.496790 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phz8r\" (UniqueName: \"kubernetes.io/projected/2fbe66fb-6f02-432d-8acf-50fec5339d96-kube-api-access-phz8r\") pod \"openstack-galera-0\" (UID: \"2fbe66fb-6f02-432d-8acf-50fec5339d96\") " pod="openstack/openstack-galera-0" Apr 02 13:58:16 crc kubenswrapper[4732]: I0402 13:58:16.496812 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fbe66fb-6f02-432d-8acf-50fec5339d96-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"2fbe66fb-6f02-432d-8acf-50fec5339d96\") " pod="openstack/openstack-galera-0" Apr 02 13:58:16 crc kubenswrapper[4732]: I0402 13:58:16.497010 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2fbe66fb-6f02-432d-8acf-50fec5339d96-config-data-generated\") pod \"openstack-galera-0\" (UID: \"2fbe66fb-6f02-432d-8acf-50fec5339d96\") " pod="openstack/openstack-galera-0" Apr 02 13:58:16 crc kubenswrapper[4732]: I0402 13:58:16.497077 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fbe66fb-6f02-432d-8acf-50fec5339d96-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"2fbe66fb-6f02-432d-8acf-50fec5339d96\") " pod="openstack/openstack-galera-0" Apr 02 13:58:16 crc kubenswrapper[4732]: I0402 13:58:16.497100 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2fbe66fb-6f02-432d-8acf-50fec5339d96-kolla-config\") pod \"openstack-galera-0\" (UID: \"2fbe66fb-6f02-432d-8acf-50fec5339d96\") " pod="openstack/openstack-galera-0" Apr 02 13:58:16 crc kubenswrapper[4732]: I0402 13:58:16.598061 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2fbe66fb-6f02-432d-8acf-50fec5339d96-config-data-generated\") pod \"openstack-galera-0\" (UID: \"2fbe66fb-6f02-432d-8acf-50fec5339d96\") " pod="openstack/openstack-galera-0" Apr 02 13:58:16 crc kubenswrapper[4732]: I0402 13:58:16.598119 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fbe66fb-6f02-432d-8acf-50fec5339d96-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"2fbe66fb-6f02-432d-8acf-50fec5339d96\") " pod="openstack/openstack-galera-0" Apr 02 13:58:16 crc kubenswrapper[4732]: I0402 13:58:16.598138 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2fbe66fb-6f02-432d-8acf-50fec5339d96-kolla-config\") pod \"openstack-galera-0\" (UID: \"2fbe66fb-6f02-432d-8acf-50fec5339d96\") " pod="openstack/openstack-galera-0" Apr 02 13:58:16 crc kubenswrapper[4732]: I0402 13:58:16.598168 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"2fbe66fb-6f02-432d-8acf-50fec5339d96\") " pod="openstack/openstack-galera-0" Apr 02 13:58:16 crc kubenswrapper[4732]: I0402 13:58:16.598198 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2fbe66fb-6f02-432d-8acf-50fec5339d96-operator-scripts\") pod \"openstack-galera-0\" (UID: 
\"2fbe66fb-6f02-432d-8acf-50fec5339d96\") " pod="openstack/openstack-galera-0" Apr 02 13:58:16 crc kubenswrapper[4732]: I0402 13:58:16.598218 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2fbe66fb-6f02-432d-8acf-50fec5339d96-config-data-default\") pod \"openstack-galera-0\" (UID: \"2fbe66fb-6f02-432d-8acf-50fec5339d96\") " pod="openstack/openstack-galera-0" Apr 02 13:58:16 crc kubenswrapper[4732]: I0402 13:58:16.598243 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phz8r\" (UniqueName: \"kubernetes.io/projected/2fbe66fb-6f02-432d-8acf-50fec5339d96-kube-api-access-phz8r\") pod \"openstack-galera-0\" (UID: \"2fbe66fb-6f02-432d-8acf-50fec5339d96\") " pod="openstack/openstack-galera-0" Apr 02 13:58:16 crc kubenswrapper[4732]: I0402 13:58:16.598262 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fbe66fb-6f02-432d-8acf-50fec5339d96-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"2fbe66fb-6f02-432d-8acf-50fec5339d96\") " pod="openstack/openstack-galera-0" Apr 02 13:58:16 crc kubenswrapper[4732]: I0402 13:58:16.598505 4732 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"2fbe66fb-6f02-432d-8acf-50fec5339d96\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/openstack-galera-0" Apr 02 13:58:16 crc kubenswrapper[4732]: I0402 13:58:16.598711 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2fbe66fb-6f02-432d-8acf-50fec5339d96-config-data-generated\") pod \"openstack-galera-0\" (UID: \"2fbe66fb-6f02-432d-8acf-50fec5339d96\") " pod="openstack/openstack-galera-0" Apr 02 13:58:16 crc 
kubenswrapper[4732]: I0402 13:58:16.599503 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2fbe66fb-6f02-432d-8acf-50fec5339d96-config-data-default\") pod \"openstack-galera-0\" (UID: \"2fbe66fb-6f02-432d-8acf-50fec5339d96\") " pod="openstack/openstack-galera-0" Apr 02 13:58:16 crc kubenswrapper[4732]: I0402 13:58:16.599690 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2fbe66fb-6f02-432d-8acf-50fec5339d96-operator-scripts\") pod \"openstack-galera-0\" (UID: \"2fbe66fb-6f02-432d-8acf-50fec5339d96\") " pod="openstack/openstack-galera-0" Apr 02 13:58:16 crc kubenswrapper[4732]: I0402 13:58:16.602370 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2fbe66fb-6f02-432d-8acf-50fec5339d96-kolla-config\") pod \"openstack-galera-0\" (UID: \"2fbe66fb-6f02-432d-8acf-50fec5339d96\") " pod="openstack/openstack-galera-0" Apr 02 13:58:16 crc kubenswrapper[4732]: I0402 13:58:16.603534 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fbe66fb-6f02-432d-8acf-50fec5339d96-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"2fbe66fb-6f02-432d-8acf-50fec5339d96\") " pod="openstack/openstack-galera-0" Apr 02 13:58:16 crc kubenswrapper[4732]: I0402 13:58:16.612595 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fbe66fb-6f02-432d-8acf-50fec5339d96-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"2fbe66fb-6f02-432d-8acf-50fec5339d96\") " pod="openstack/openstack-galera-0" Apr 02 13:58:16 crc kubenswrapper[4732]: I0402 13:58:16.618365 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phz8r\" (UniqueName: 
\"kubernetes.io/projected/2fbe66fb-6f02-432d-8acf-50fec5339d96-kube-api-access-phz8r\") pod \"openstack-galera-0\" (UID: \"2fbe66fb-6f02-432d-8acf-50fec5339d96\") " pod="openstack/openstack-galera-0" Apr 02 13:58:16 crc kubenswrapper[4732]: I0402 13:58:16.630002 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"2fbe66fb-6f02-432d-8acf-50fec5339d96\") " pod="openstack/openstack-galera-0" Apr 02 13:58:16 crc kubenswrapper[4732]: I0402 13:58:16.665807 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Apr 02 13:58:17 crc kubenswrapper[4732]: I0402 13:58:17.809487 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Apr 02 13:58:17 crc kubenswrapper[4732]: I0402 13:58:17.811396 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Apr 02 13:58:17 crc kubenswrapper[4732]: I0402 13:58:17.813468 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-tx7kf" Apr 02 13:58:17 crc kubenswrapper[4732]: I0402 13:58:17.815134 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Apr 02 13:58:17 crc kubenswrapper[4732]: I0402 13:58:17.815294 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Apr 02 13:58:17 crc kubenswrapper[4732]: I0402 13:58:17.815419 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Apr 02 13:58:17 crc kubenswrapper[4732]: I0402 13:58:17.817207 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Apr 02 13:58:17 crc kubenswrapper[4732]: I0402 13:58:17.918838 4732 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/688bac91-aede-4c9f-a063-6469bb03db8c-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"688bac91-aede-4c9f-a063-6469bb03db8c\") " pod="openstack/openstack-cell1-galera-0" Apr 02 13:58:17 crc kubenswrapper[4732]: I0402 13:58:17.918896 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/688bac91-aede-4c9f-a063-6469bb03db8c-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"688bac91-aede-4c9f-a063-6469bb03db8c\") " pod="openstack/openstack-cell1-galera-0" Apr 02 13:58:17 crc kubenswrapper[4732]: I0402 13:58:17.918916 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/688bac91-aede-4c9f-a063-6469bb03db8c-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"688bac91-aede-4c9f-a063-6469bb03db8c\") " pod="openstack/openstack-cell1-galera-0" Apr 02 13:58:17 crc kubenswrapper[4732]: I0402 13:58:17.919050 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/688bac91-aede-4c9f-a063-6469bb03db8c-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"688bac91-aede-4c9f-a063-6469bb03db8c\") " pod="openstack/openstack-cell1-galera-0" Apr 02 13:58:17 crc kubenswrapper[4732]: I0402 13:58:17.919082 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"688bac91-aede-4c9f-a063-6469bb03db8c\") " pod="openstack/openstack-cell1-galera-0" Apr 02 13:58:17 crc kubenswrapper[4732]: I0402 13:58:17.919115 4732 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/688bac91-aede-4c9f-a063-6469bb03db8c-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"688bac91-aede-4c9f-a063-6469bb03db8c\") " pod="openstack/openstack-cell1-galera-0" Apr 02 13:58:17 crc kubenswrapper[4732]: I0402 13:58:17.919277 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/688bac91-aede-4c9f-a063-6469bb03db8c-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"688bac91-aede-4c9f-a063-6469bb03db8c\") " pod="openstack/openstack-cell1-galera-0" Apr 02 13:58:17 crc kubenswrapper[4732]: I0402 13:58:17.919305 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48rzc\" (UniqueName: \"kubernetes.io/projected/688bac91-aede-4c9f-a063-6469bb03db8c-kube-api-access-48rzc\") pod \"openstack-cell1-galera-0\" (UID: \"688bac91-aede-4c9f-a063-6469bb03db8c\") " pod="openstack/openstack-cell1-galera-0" Apr 02 13:58:18 crc kubenswrapper[4732]: I0402 13:58:18.020522 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/688bac91-aede-4c9f-a063-6469bb03db8c-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"688bac91-aede-4c9f-a063-6469bb03db8c\") " pod="openstack/openstack-cell1-galera-0" Apr 02 13:58:18 crc kubenswrapper[4732]: I0402 13:58:18.020576 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"688bac91-aede-4c9f-a063-6469bb03db8c\") " pod="openstack/openstack-cell1-galera-0" Apr 02 13:58:18 crc kubenswrapper[4732]: I0402 13:58:18.020635 4732 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/688bac91-aede-4c9f-a063-6469bb03db8c-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"688bac91-aede-4c9f-a063-6469bb03db8c\") " pod="openstack/openstack-cell1-galera-0" Apr 02 13:58:18 crc kubenswrapper[4732]: I0402 13:58:18.020687 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/688bac91-aede-4c9f-a063-6469bb03db8c-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"688bac91-aede-4c9f-a063-6469bb03db8c\") " pod="openstack/openstack-cell1-galera-0" Apr 02 13:58:18 crc kubenswrapper[4732]: I0402 13:58:18.020709 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48rzc\" (UniqueName: \"kubernetes.io/projected/688bac91-aede-4c9f-a063-6469bb03db8c-kube-api-access-48rzc\") pod \"openstack-cell1-galera-0\" (UID: \"688bac91-aede-4c9f-a063-6469bb03db8c\") " pod="openstack/openstack-cell1-galera-0" Apr 02 13:58:18 crc kubenswrapper[4732]: I0402 13:58:18.020749 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/688bac91-aede-4c9f-a063-6469bb03db8c-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"688bac91-aede-4c9f-a063-6469bb03db8c\") " pod="openstack/openstack-cell1-galera-0" Apr 02 13:58:18 crc kubenswrapper[4732]: I0402 13:58:18.020779 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/688bac91-aede-4c9f-a063-6469bb03db8c-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"688bac91-aede-4c9f-a063-6469bb03db8c\") " pod="openstack/openstack-cell1-galera-0" Apr 02 13:58:18 crc kubenswrapper[4732]: I0402 13:58:18.020801 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/688bac91-aede-4c9f-a063-6469bb03db8c-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"688bac91-aede-4c9f-a063-6469bb03db8c\") " pod="openstack/openstack-cell1-galera-0" Apr 02 13:58:18 crc kubenswrapper[4732]: I0402 13:58:18.022081 4732 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"688bac91-aede-4c9f-a063-6469bb03db8c\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/openstack-cell1-galera-0" Apr 02 13:58:18 crc kubenswrapper[4732]: I0402 13:58:18.022238 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/688bac91-aede-4c9f-a063-6469bb03db8c-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"688bac91-aede-4c9f-a063-6469bb03db8c\") " pod="openstack/openstack-cell1-galera-0" Apr 02 13:58:18 crc kubenswrapper[4732]: I0402 13:58:18.022901 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/688bac91-aede-4c9f-a063-6469bb03db8c-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"688bac91-aede-4c9f-a063-6469bb03db8c\") " pod="openstack/openstack-cell1-galera-0" Apr 02 13:58:18 crc kubenswrapper[4732]: I0402 13:58:18.023965 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/688bac91-aede-4c9f-a063-6469bb03db8c-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"688bac91-aede-4c9f-a063-6469bb03db8c\") " pod="openstack/openstack-cell1-galera-0" Apr 02 13:58:18 crc kubenswrapper[4732]: I0402 13:58:18.024460 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/688bac91-aede-4c9f-a063-6469bb03db8c-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"688bac91-aede-4c9f-a063-6469bb03db8c\") " pod="openstack/openstack-cell1-galera-0" Apr 02 13:58:18 crc kubenswrapper[4732]: I0402 13:58:18.026931 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/688bac91-aede-4c9f-a063-6469bb03db8c-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"688bac91-aede-4c9f-a063-6469bb03db8c\") " pod="openstack/openstack-cell1-galera-0" Apr 02 13:58:18 crc kubenswrapper[4732]: I0402 13:58:18.062088 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48rzc\" (UniqueName: \"kubernetes.io/projected/688bac91-aede-4c9f-a063-6469bb03db8c-kube-api-access-48rzc\") pod \"openstack-cell1-galera-0\" (UID: \"688bac91-aede-4c9f-a063-6469bb03db8c\") " pod="openstack/openstack-cell1-galera-0" Apr 02 13:58:18 crc kubenswrapper[4732]: I0402 13:58:18.062789 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/688bac91-aede-4c9f-a063-6469bb03db8c-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"688bac91-aede-4c9f-a063-6469bb03db8c\") " pod="openstack/openstack-cell1-galera-0" Apr 02 13:58:18 crc kubenswrapper[4732]: I0402 13:58:18.076359 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"688bac91-aede-4c9f-a063-6469bb03db8c\") " pod="openstack/openstack-cell1-galera-0" Apr 02 13:58:18 crc kubenswrapper[4732]: I0402 13:58:18.090729 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Apr 02 13:58:18 crc kubenswrapper[4732]: I0402 13:58:18.094639 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Apr 02 13:58:18 crc kubenswrapper[4732]: I0402 13:58:18.097646 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Apr 02 13:58:18 crc kubenswrapper[4732]: I0402 13:58:18.097838 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Apr 02 13:58:18 crc kubenswrapper[4732]: I0402 13:58:18.097984 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-tsb8z" Apr 02 13:58:18 crc kubenswrapper[4732]: I0402 13:58:18.123020 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Apr 02 13:58:18 crc kubenswrapper[4732]: I0402 13:58:18.138263 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Apr 02 13:58:18 crc kubenswrapper[4732]: I0402 13:58:18.223475 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f87d2b1-82d0-4126-aeae-46aa84ba3d1f-memcached-tls-certs\") pod \"memcached-0\" (UID: \"4f87d2b1-82d0-4126-aeae-46aa84ba3d1f\") " pod="openstack/memcached-0" Apr 02 13:58:18 crc kubenswrapper[4732]: I0402 13:58:18.223653 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4f87d2b1-82d0-4126-aeae-46aa84ba3d1f-config-data\") pod \"memcached-0\" (UID: \"4f87d2b1-82d0-4126-aeae-46aa84ba3d1f\") " pod="openstack/memcached-0" Apr 02 13:58:18 crc kubenswrapper[4732]: I0402 13:58:18.223688 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f87d2b1-82d0-4126-aeae-46aa84ba3d1f-combined-ca-bundle\") pod \"memcached-0\" (UID: \"4f87d2b1-82d0-4126-aeae-46aa84ba3d1f\") " 
pod="openstack/memcached-0" Apr 02 13:58:18 crc kubenswrapper[4732]: I0402 13:58:18.223741 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4f87d2b1-82d0-4126-aeae-46aa84ba3d1f-kolla-config\") pod \"memcached-0\" (UID: \"4f87d2b1-82d0-4126-aeae-46aa84ba3d1f\") " pod="openstack/memcached-0" Apr 02 13:58:18 crc kubenswrapper[4732]: I0402 13:58:18.223780 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7pg6\" (UniqueName: \"kubernetes.io/projected/4f87d2b1-82d0-4126-aeae-46aa84ba3d1f-kube-api-access-q7pg6\") pod \"memcached-0\" (UID: \"4f87d2b1-82d0-4126-aeae-46aa84ba3d1f\") " pod="openstack/memcached-0" Apr 02 13:58:18 crc kubenswrapper[4732]: I0402 13:58:18.254158 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-285k7" event={"ID":"97c8b10a-f931-4b1d-967b-3c3eeac5ca4c","Type":"ContainerStarted","Data":"79d4b465040609920d0d6234d66a5ad8f0ed34392e4095355bf8fa341443985a"} Apr 02 13:58:18 crc kubenswrapper[4732]: I0402 13:58:18.325148 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f87d2b1-82d0-4126-aeae-46aa84ba3d1f-combined-ca-bundle\") pod \"memcached-0\" (UID: \"4f87d2b1-82d0-4126-aeae-46aa84ba3d1f\") " pod="openstack/memcached-0" Apr 02 13:58:18 crc kubenswrapper[4732]: I0402 13:58:18.325216 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4f87d2b1-82d0-4126-aeae-46aa84ba3d1f-kolla-config\") pod \"memcached-0\" (UID: \"4f87d2b1-82d0-4126-aeae-46aa84ba3d1f\") " pod="openstack/memcached-0" Apr 02 13:58:18 crc kubenswrapper[4732]: I0402 13:58:18.325245 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7pg6\" (UniqueName: 
\"kubernetes.io/projected/4f87d2b1-82d0-4126-aeae-46aa84ba3d1f-kube-api-access-q7pg6\") pod \"memcached-0\" (UID: \"4f87d2b1-82d0-4126-aeae-46aa84ba3d1f\") " pod="openstack/memcached-0"
Apr 02 13:58:18 crc kubenswrapper[4732]: I0402 13:58:18.325261 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f87d2b1-82d0-4126-aeae-46aa84ba3d1f-memcached-tls-certs\") pod \"memcached-0\" (UID: \"4f87d2b1-82d0-4126-aeae-46aa84ba3d1f\") " pod="openstack/memcached-0"
Apr 02 13:58:18 crc kubenswrapper[4732]: I0402 13:58:18.325332 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4f87d2b1-82d0-4126-aeae-46aa84ba3d1f-config-data\") pod \"memcached-0\" (UID: \"4f87d2b1-82d0-4126-aeae-46aa84ba3d1f\") " pod="openstack/memcached-0"
Apr 02 13:58:18 crc kubenswrapper[4732]: I0402 13:58:18.326027 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4f87d2b1-82d0-4126-aeae-46aa84ba3d1f-config-data\") pod \"memcached-0\" (UID: \"4f87d2b1-82d0-4126-aeae-46aa84ba3d1f\") " pod="openstack/memcached-0"
Apr 02 13:58:18 crc kubenswrapper[4732]: I0402 13:58:18.327208 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4f87d2b1-82d0-4126-aeae-46aa84ba3d1f-kolla-config\") pod \"memcached-0\" (UID: \"4f87d2b1-82d0-4126-aeae-46aa84ba3d1f\") " pod="openstack/memcached-0"
Apr 02 13:58:18 crc kubenswrapper[4732]: I0402 13:58:18.329324 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f87d2b1-82d0-4126-aeae-46aa84ba3d1f-memcached-tls-certs\") pod \"memcached-0\" (UID: \"4f87d2b1-82d0-4126-aeae-46aa84ba3d1f\") " pod="openstack/memcached-0"
Apr 02 13:58:18 crc kubenswrapper[4732]: I0402 13:58:18.329768 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f87d2b1-82d0-4126-aeae-46aa84ba3d1f-combined-ca-bundle\") pod \"memcached-0\" (UID: \"4f87d2b1-82d0-4126-aeae-46aa84ba3d1f\") " pod="openstack/memcached-0"
Apr 02 13:58:18 crc kubenswrapper[4732]: I0402 13:58:18.347953 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7pg6\" (UniqueName: \"kubernetes.io/projected/4f87d2b1-82d0-4126-aeae-46aa84ba3d1f-kube-api-access-q7pg6\") pod \"memcached-0\" (UID: \"4f87d2b1-82d0-4126-aeae-46aa84ba3d1f\") " pod="openstack/memcached-0"
Apr 02 13:58:18 crc kubenswrapper[4732]: I0402 13:58:18.447681 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Apr 02 13:58:20 crc kubenswrapper[4732]: I0402 13:58:20.223870 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Apr 02 13:58:20 crc kubenswrapper[4732]: I0402 13:58:20.225048 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Apr 02 13:58:20 crc kubenswrapper[4732]: I0402 13:58:20.227197 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-nzspv"
Apr 02 13:58:20 crc kubenswrapper[4732]: I0402 13:58:20.237520 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Apr 02 13:58:20 crc kubenswrapper[4732]: I0402 13:58:20.368741 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwbkx\" (UniqueName: \"kubernetes.io/projected/ac229e50-5412-4f18-be3c-4a364b95dcf2-kube-api-access-hwbkx\") pod \"kube-state-metrics-0\" (UID: \"ac229e50-5412-4f18-be3c-4a364b95dcf2\") " pod="openstack/kube-state-metrics-0"
Apr 02 13:58:20 crc kubenswrapper[4732]: I0402 13:58:20.470446 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwbkx\" (UniqueName: \"kubernetes.io/projected/ac229e50-5412-4f18-be3c-4a364b95dcf2-kube-api-access-hwbkx\") pod \"kube-state-metrics-0\" (UID: \"ac229e50-5412-4f18-be3c-4a364b95dcf2\") " pod="openstack/kube-state-metrics-0"
Apr 02 13:58:20 crc kubenswrapper[4732]: I0402 13:58:20.489390 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwbkx\" (UniqueName: \"kubernetes.io/projected/ac229e50-5412-4f18-be3c-4a364b95dcf2-kube-api-access-hwbkx\") pod \"kube-state-metrics-0\" (UID: \"ac229e50-5412-4f18-be3c-4a364b95dcf2\") " pod="openstack/kube-state-metrics-0"
Apr 02 13:58:20 crc kubenswrapper[4732]: I0402 13:58:20.595996 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Apr 02 13:58:24 crc kubenswrapper[4732]: I0402 13:58:24.084512 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-5222s"]
Apr 02 13:58:24 crc kubenswrapper[4732]: I0402 13:58:24.086263 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-5222s"
Apr 02 13:58:24 crc kubenswrapper[4732]: I0402 13:58:24.088634 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs"
Apr 02 13:58:24 crc kubenswrapper[4732]: I0402 13:58:24.088878 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts"
Apr 02 13:58:24 crc kubenswrapper[4732]: I0402 13:58:24.089444 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-scxg8"
Apr 02 13:58:24 crc kubenswrapper[4732]: I0402 13:58:24.093888 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-l4ttl"]
Apr 02 13:58:24 crc kubenswrapper[4732]: I0402 13:58:24.095536 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-l4ttl"
Apr 02 13:58:24 crc kubenswrapper[4732]: I0402 13:58:24.103469 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-5222s"]
Apr 02 13:58:24 crc kubenswrapper[4732]: I0402 13:58:24.121655 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-l4ttl"]
Apr 02 13:58:24 crc kubenswrapper[4732]: I0402 13:58:24.224414 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5eba7503-ee7b-40ba-a0dc-e11fad40c2b7-var-run\") pod \"ovn-controller-ovs-l4ttl\" (UID: \"5eba7503-ee7b-40ba-a0dc-e11fad40c2b7\") " pod="openstack/ovn-controller-ovs-l4ttl"
Apr 02 13:58:24 crc kubenswrapper[4732]: I0402 13:58:24.224477 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/8af6391f-4f8b-4473-8e7c-186c9c838527-ovn-controller-tls-certs\") pod \"ovn-controller-5222s\" (UID: \"8af6391f-4f8b-4473-8e7c-186c9c838527\") " pod="openstack/ovn-controller-5222s"
Apr 02 13:58:24 crc kubenswrapper[4732]: I0402 13:58:24.224826 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxrzr\" (UniqueName: \"kubernetes.io/projected/5eba7503-ee7b-40ba-a0dc-e11fad40c2b7-kube-api-access-nxrzr\") pod \"ovn-controller-ovs-l4ttl\" (UID: \"5eba7503-ee7b-40ba-a0dc-e11fad40c2b7\") " pod="openstack/ovn-controller-ovs-l4ttl"
Apr 02 13:58:24 crc kubenswrapper[4732]: I0402 13:58:24.224866 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/5eba7503-ee7b-40ba-a0dc-e11fad40c2b7-var-lib\") pod \"ovn-controller-ovs-l4ttl\" (UID: \"5eba7503-ee7b-40ba-a0dc-e11fad40c2b7\") " pod="openstack/ovn-controller-ovs-l4ttl"
Apr 02 13:58:24 crc kubenswrapper[4732]: I0402 13:58:24.224882 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5eba7503-ee7b-40ba-a0dc-e11fad40c2b7-scripts\") pod \"ovn-controller-ovs-l4ttl\" (UID: \"5eba7503-ee7b-40ba-a0dc-e11fad40c2b7\") " pod="openstack/ovn-controller-ovs-l4ttl"
Apr 02 13:58:24 crc kubenswrapper[4732]: I0402 13:58:24.224911 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8af6391f-4f8b-4473-8e7c-186c9c838527-var-log-ovn\") pod \"ovn-controller-5222s\" (UID: \"8af6391f-4f8b-4473-8e7c-186c9c838527\") " pod="openstack/ovn-controller-5222s"
Apr 02 13:58:24 crc kubenswrapper[4732]: I0402 13:58:24.225076 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8af6391f-4f8b-4473-8e7c-186c9c838527-combined-ca-bundle\") pod \"ovn-controller-5222s\" (UID: \"8af6391f-4f8b-4473-8e7c-186c9c838527\") " pod="openstack/ovn-controller-5222s"
Apr 02 13:58:24 crc kubenswrapper[4732]: I0402 13:58:24.225115 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/5eba7503-ee7b-40ba-a0dc-e11fad40c2b7-etc-ovs\") pod \"ovn-controller-ovs-l4ttl\" (UID: \"5eba7503-ee7b-40ba-a0dc-e11fad40c2b7\") " pod="openstack/ovn-controller-ovs-l4ttl"
Apr 02 13:58:24 crc kubenswrapper[4732]: I0402 13:58:24.225164 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8af6391f-4f8b-4473-8e7c-186c9c838527-var-run-ovn\") pod \"ovn-controller-5222s\" (UID: \"8af6391f-4f8b-4473-8e7c-186c9c838527\") " pod="openstack/ovn-controller-5222s"
Apr 02 13:58:24 crc kubenswrapper[4732]: I0402 13:58:24.225261 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5zxn\" (UniqueName: \"kubernetes.io/projected/8af6391f-4f8b-4473-8e7c-186c9c838527-kube-api-access-t5zxn\") pod \"ovn-controller-5222s\" (UID: \"8af6391f-4f8b-4473-8e7c-186c9c838527\") " pod="openstack/ovn-controller-5222s"
Apr 02 13:58:24 crc kubenswrapper[4732]: I0402 13:58:24.225288 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8af6391f-4f8b-4473-8e7c-186c9c838527-var-run\") pod \"ovn-controller-5222s\" (UID: \"8af6391f-4f8b-4473-8e7c-186c9c838527\") " pod="openstack/ovn-controller-5222s"
Apr 02 13:58:24 crc kubenswrapper[4732]: I0402 13:58:24.225340 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8af6391f-4f8b-4473-8e7c-186c9c838527-scripts\") pod \"ovn-controller-5222s\" (UID: \"8af6391f-4f8b-4473-8e7c-186c9c838527\") " pod="openstack/ovn-controller-5222s"
Apr 02 13:58:24 crc kubenswrapper[4732]: I0402 13:58:24.225366 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/5eba7503-ee7b-40ba-a0dc-e11fad40c2b7-var-log\") pod \"ovn-controller-ovs-l4ttl\" (UID: \"5eba7503-ee7b-40ba-a0dc-e11fad40c2b7\") " pod="openstack/ovn-controller-ovs-l4ttl"
Apr 02 13:58:24 crc kubenswrapper[4732]: I0402 13:58:24.327642 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5zxn\" (UniqueName: \"kubernetes.io/projected/8af6391f-4f8b-4473-8e7c-186c9c838527-kube-api-access-t5zxn\") pod \"ovn-controller-5222s\" (UID: \"8af6391f-4f8b-4473-8e7c-186c9c838527\") " pod="openstack/ovn-controller-5222s"
Apr 02 13:58:24 crc kubenswrapper[4732]: I0402 13:58:24.327686 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8af6391f-4f8b-4473-8e7c-186c9c838527-var-run\") pod \"ovn-controller-5222s\" (UID: \"8af6391f-4f8b-4473-8e7c-186c9c838527\") " pod="openstack/ovn-controller-5222s"
Apr 02 13:58:24 crc kubenswrapper[4732]: I0402 13:58:24.327719 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8af6391f-4f8b-4473-8e7c-186c9c838527-scripts\") pod \"ovn-controller-5222s\" (UID: \"8af6391f-4f8b-4473-8e7c-186c9c838527\") " pod="openstack/ovn-controller-5222s"
Apr 02 13:58:24 crc kubenswrapper[4732]: I0402 13:58:24.327738 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/5eba7503-ee7b-40ba-a0dc-e11fad40c2b7-var-log\") pod \"ovn-controller-ovs-l4ttl\" (UID: \"5eba7503-ee7b-40ba-a0dc-e11fad40c2b7\") " pod="openstack/ovn-controller-ovs-l4ttl"
Apr 02 13:58:24 crc kubenswrapper[4732]: I0402 13:58:24.327761 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5eba7503-ee7b-40ba-a0dc-e11fad40c2b7-var-run\") pod \"ovn-controller-ovs-l4ttl\" (UID: \"5eba7503-ee7b-40ba-a0dc-e11fad40c2b7\") " pod="openstack/ovn-controller-ovs-l4ttl"
Apr 02 13:58:24 crc kubenswrapper[4732]: I0402 13:58:24.327799 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/8af6391f-4f8b-4473-8e7c-186c9c838527-ovn-controller-tls-certs\") pod \"ovn-controller-5222s\" (UID: \"8af6391f-4f8b-4473-8e7c-186c9c838527\") " pod="openstack/ovn-controller-5222s"
Apr 02 13:58:24 crc kubenswrapper[4732]: I0402 13:58:24.327821 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxrzr\" (UniqueName: \"kubernetes.io/projected/5eba7503-ee7b-40ba-a0dc-e11fad40c2b7-kube-api-access-nxrzr\") pod \"ovn-controller-ovs-l4ttl\" (UID: \"5eba7503-ee7b-40ba-a0dc-e11fad40c2b7\") " pod="openstack/ovn-controller-ovs-l4ttl"
Apr 02 13:58:24 crc kubenswrapper[4732]: I0402 13:58:24.327864 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/5eba7503-ee7b-40ba-a0dc-e11fad40c2b7-var-lib\") pod \"ovn-controller-ovs-l4ttl\" (UID: \"5eba7503-ee7b-40ba-a0dc-e11fad40c2b7\") " pod="openstack/ovn-controller-ovs-l4ttl"
Apr 02 13:58:24 crc kubenswrapper[4732]: I0402 13:58:24.327880 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5eba7503-ee7b-40ba-a0dc-e11fad40c2b7-scripts\") pod \"ovn-controller-ovs-l4ttl\" (UID: \"5eba7503-ee7b-40ba-a0dc-e11fad40c2b7\") " pod="openstack/ovn-controller-ovs-l4ttl"
Apr 02 13:58:24 crc kubenswrapper[4732]: I0402 13:58:24.327902 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8af6391f-4f8b-4473-8e7c-186c9c838527-var-log-ovn\") pod \"ovn-controller-5222s\" (UID: \"8af6391f-4f8b-4473-8e7c-186c9c838527\") " pod="openstack/ovn-controller-5222s"
Apr 02 13:58:24 crc kubenswrapper[4732]: I0402 13:58:24.327924 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8af6391f-4f8b-4473-8e7c-186c9c838527-combined-ca-bundle\") pod \"ovn-controller-5222s\" (UID: \"8af6391f-4f8b-4473-8e7c-186c9c838527\") " pod="openstack/ovn-controller-5222s"
Apr 02 13:58:24 crc kubenswrapper[4732]: I0402 13:58:24.327939 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/5eba7503-ee7b-40ba-a0dc-e11fad40c2b7-etc-ovs\") pod \"ovn-controller-ovs-l4ttl\" (UID: \"5eba7503-ee7b-40ba-a0dc-e11fad40c2b7\") " pod="openstack/ovn-controller-ovs-l4ttl"
Apr 02 13:58:24 crc kubenswrapper[4732]: I0402 13:58:24.327958 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8af6391f-4f8b-4473-8e7c-186c9c838527-var-run-ovn\") pod \"ovn-controller-5222s\" (UID: \"8af6391f-4f8b-4473-8e7c-186c9c838527\") " pod="openstack/ovn-controller-5222s"
Apr 02 13:58:24 crc kubenswrapper[4732]: I0402 13:58:24.329070 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8af6391f-4f8b-4473-8e7c-186c9c838527-var-log-ovn\") pod \"ovn-controller-5222s\" (UID: \"8af6391f-4f8b-4473-8e7c-186c9c838527\") " pod="openstack/ovn-controller-5222s"
Apr 02 13:58:24 crc kubenswrapper[4732]: I0402 13:58:24.329138 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/5eba7503-ee7b-40ba-a0dc-e11fad40c2b7-etc-ovs\") pod \"ovn-controller-ovs-l4ttl\" (UID: \"5eba7503-ee7b-40ba-a0dc-e11fad40c2b7\") " pod="openstack/ovn-controller-ovs-l4ttl"
Apr 02 13:58:24 crc kubenswrapper[4732]: I0402 13:58:24.329178 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/5eba7503-ee7b-40ba-a0dc-e11fad40c2b7-var-lib\") pod \"ovn-controller-ovs-l4ttl\" (UID: \"5eba7503-ee7b-40ba-a0dc-e11fad40c2b7\") " pod="openstack/ovn-controller-ovs-l4ttl"
Apr 02 13:58:24 crc kubenswrapper[4732]: I0402 13:58:24.329210 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8af6391f-4f8b-4473-8e7c-186c9c838527-var-run\") pod \"ovn-controller-5222s\" (UID: \"8af6391f-4f8b-4473-8e7c-186c9c838527\") " pod="openstack/ovn-controller-5222s"
Apr 02 13:58:24 crc kubenswrapper[4732]: I0402 13:58:24.329229 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5eba7503-ee7b-40ba-a0dc-e11fad40c2b7-var-run\") pod \"ovn-controller-ovs-l4ttl\" (UID: \"5eba7503-ee7b-40ba-a0dc-e11fad40c2b7\") " pod="openstack/ovn-controller-ovs-l4ttl"
Apr 02 13:58:24 crc kubenswrapper[4732]: I0402 13:58:24.329255 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/5eba7503-ee7b-40ba-a0dc-e11fad40c2b7-var-log\") pod \"ovn-controller-ovs-l4ttl\" (UID: \"5eba7503-ee7b-40ba-a0dc-e11fad40c2b7\") " pod="openstack/ovn-controller-ovs-l4ttl"
Apr 02 13:58:24 crc kubenswrapper[4732]: I0402 13:58:24.331417 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5eba7503-ee7b-40ba-a0dc-e11fad40c2b7-scripts\") pod \"ovn-controller-ovs-l4ttl\" (UID: \"5eba7503-ee7b-40ba-a0dc-e11fad40c2b7\") " pod="openstack/ovn-controller-ovs-l4ttl"
Apr 02 13:58:24 crc kubenswrapper[4732]: I0402 13:58:24.334677 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8af6391f-4f8b-4473-8e7c-186c9c838527-combined-ca-bundle\") pod \"ovn-controller-5222s\" (UID: \"8af6391f-4f8b-4473-8e7c-186c9c838527\") " pod="openstack/ovn-controller-5222s"
Apr 02 13:58:24 crc kubenswrapper[4732]: I0402 13:58:24.334974 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8af6391f-4f8b-4473-8e7c-186c9c838527-scripts\") pod \"ovn-controller-5222s\" (UID: \"8af6391f-4f8b-4473-8e7c-186c9c838527\") " pod="openstack/ovn-controller-5222s"
Apr 02 13:58:24 crc kubenswrapper[4732]: I0402 13:58:24.335050 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8af6391f-4f8b-4473-8e7c-186c9c838527-var-run-ovn\") pod \"ovn-controller-5222s\" (UID: \"8af6391f-4f8b-4473-8e7c-186c9c838527\") " pod="openstack/ovn-controller-5222s"
Apr 02 13:58:24 crc kubenswrapper[4732]: I0402 13:58:24.335408 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/8af6391f-4f8b-4473-8e7c-186c9c838527-ovn-controller-tls-certs\") pod \"ovn-controller-5222s\" (UID: \"8af6391f-4f8b-4473-8e7c-186c9c838527\") " pod="openstack/ovn-controller-5222s"
Apr 02 13:58:24 crc kubenswrapper[4732]: I0402 13:58:24.345700 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5zxn\" (UniqueName: \"kubernetes.io/projected/8af6391f-4f8b-4473-8e7c-186c9c838527-kube-api-access-t5zxn\") pod \"ovn-controller-5222s\" (UID: \"8af6391f-4f8b-4473-8e7c-186c9c838527\") " pod="openstack/ovn-controller-5222s"
Apr 02 13:58:24 crc kubenswrapper[4732]: I0402 13:58:24.358264 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxrzr\" (UniqueName: \"kubernetes.io/projected/5eba7503-ee7b-40ba-a0dc-e11fad40c2b7-kube-api-access-nxrzr\") pod \"ovn-controller-ovs-l4ttl\" (UID: \"5eba7503-ee7b-40ba-a0dc-e11fad40c2b7\") " pod="openstack/ovn-controller-ovs-l4ttl"
Apr 02 13:58:24 crc kubenswrapper[4732]: I0402 13:58:24.407465 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-5222s"
Apr 02 13:58:24 crc kubenswrapper[4732]: I0402 13:58:24.416818 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-l4ttl"
Apr 02 13:58:25 crc kubenswrapper[4732]: I0402 13:58:25.341359 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"]
Apr 02 13:58:25 crc kubenswrapper[4732]: I0402 13:58:25.343867 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Apr 02 13:58:25 crc kubenswrapper[4732]: I0402 13:58:25.348872 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config"
Apr 02 13:58:25 crc kubenswrapper[4732]: I0402 13:58:25.349037 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics"
Apr 02 13:58:25 crc kubenswrapper[4732]: I0402 13:58:25.349291 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-ws6jn"
Apr 02 13:58:25 crc kubenswrapper[4732]: I0402 13:58:25.350471 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts"
Apr 02 13:58:25 crc kubenswrapper[4732]: I0402 13:58:25.351240 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs"
Apr 02 13:58:25 crc kubenswrapper[4732]: I0402 13:58:25.401586 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Apr 02 13:58:25 crc kubenswrapper[4732]: I0402 13:58:25.446753 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8dff454-a625-4309-92b6-8ab92d4bd60a-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d8dff454-a625-4309-92b6-8ab92d4bd60a\") " pod="openstack/ovsdbserver-nb-0"
Apr 02 13:58:25 crc kubenswrapper[4732]: I0402 13:58:25.446874 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8dff454-a625-4309-92b6-8ab92d4bd60a-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d8dff454-a625-4309-92b6-8ab92d4bd60a\") " pod="openstack/ovsdbserver-nb-0"
Apr 02 13:58:25 crc kubenswrapper[4732]: I0402 13:58:25.446952 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d8dff454-a625-4309-92b6-8ab92d4bd60a-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d8dff454-a625-4309-92b6-8ab92d4bd60a\") " pod="openstack/ovsdbserver-nb-0"
Apr 02 13:58:25 crc kubenswrapper[4732]: I0402 13:58:25.447068 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8dff454-a625-4309-92b6-8ab92d4bd60a-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d8dff454-a625-4309-92b6-8ab92d4bd60a\") " pod="openstack/ovsdbserver-nb-0"
Apr 02 13:58:25 crc kubenswrapper[4732]: I0402 13:58:25.447105 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tbfw\" (UniqueName: \"kubernetes.io/projected/d8dff454-a625-4309-92b6-8ab92d4bd60a-kube-api-access-5tbfw\") pod \"ovsdbserver-nb-0\" (UID: \"d8dff454-a625-4309-92b6-8ab92d4bd60a\") " pod="openstack/ovsdbserver-nb-0"
Apr 02 13:58:25 crc kubenswrapper[4732]: I0402 13:58:25.447174 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8dff454-a625-4309-92b6-8ab92d4bd60a-config\") pod \"ovsdbserver-nb-0\" (UID: \"d8dff454-a625-4309-92b6-8ab92d4bd60a\") " pod="openstack/ovsdbserver-nb-0"
Apr 02 13:58:25 crc kubenswrapper[4732]: I0402 13:58:25.448984 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d8dff454-a625-4309-92b6-8ab92d4bd60a-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d8dff454-a625-4309-92b6-8ab92d4bd60a\") " pod="openstack/ovsdbserver-nb-0"
Apr 02 13:58:25 crc kubenswrapper[4732]: I0402 13:58:25.449082 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d8dff454-a625-4309-92b6-8ab92d4bd60a\") " pod="openstack/ovsdbserver-nb-0"
Apr 02 13:58:25 crc kubenswrapper[4732]: I0402 13:58:25.550775 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8dff454-a625-4309-92b6-8ab92d4bd60a-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d8dff454-a625-4309-92b6-8ab92d4bd60a\") " pod="openstack/ovsdbserver-nb-0"
Apr 02 13:58:25 crc kubenswrapper[4732]: I0402 13:58:25.550823 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8dff454-a625-4309-92b6-8ab92d4bd60a-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d8dff454-a625-4309-92b6-8ab92d4bd60a\") " pod="openstack/ovsdbserver-nb-0"
Apr 02 13:58:25 crc kubenswrapper[4732]: I0402 13:58:25.550844 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d8dff454-a625-4309-92b6-8ab92d4bd60a-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d8dff454-a625-4309-92b6-8ab92d4bd60a\") " pod="openstack/ovsdbserver-nb-0"
Apr 02 13:58:25 crc kubenswrapper[4732]: I0402 13:58:25.550878 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8dff454-a625-4309-92b6-8ab92d4bd60a-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d8dff454-a625-4309-92b6-8ab92d4bd60a\") " pod="openstack/ovsdbserver-nb-0"
Apr 02 13:58:25 crc kubenswrapper[4732]: I0402 13:58:25.550898 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tbfw\" (UniqueName: \"kubernetes.io/projected/d8dff454-a625-4309-92b6-8ab92d4bd60a-kube-api-access-5tbfw\") pod \"ovsdbserver-nb-0\" (UID: \"d8dff454-a625-4309-92b6-8ab92d4bd60a\") " pod="openstack/ovsdbserver-nb-0"
Apr 02 13:58:25 crc kubenswrapper[4732]: I0402 13:58:25.550921 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8dff454-a625-4309-92b6-8ab92d4bd60a-config\") pod \"ovsdbserver-nb-0\" (UID: \"d8dff454-a625-4309-92b6-8ab92d4bd60a\") " pod="openstack/ovsdbserver-nb-0"
Apr 02 13:58:25 crc kubenswrapper[4732]: I0402 13:58:25.550968 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d8dff454-a625-4309-92b6-8ab92d4bd60a-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d8dff454-a625-4309-92b6-8ab92d4bd60a\") " pod="openstack/ovsdbserver-nb-0"
Apr 02 13:58:25 crc kubenswrapper[4732]: I0402 13:58:25.551008 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d8dff454-a625-4309-92b6-8ab92d4bd60a\") " pod="openstack/ovsdbserver-nb-0"
Apr 02 13:58:25 crc kubenswrapper[4732]: I0402 13:58:25.551293 4732 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d8dff454-a625-4309-92b6-8ab92d4bd60a\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/ovsdbserver-nb-0"
Apr 02 13:58:25 crc kubenswrapper[4732]: I0402 13:58:25.552004 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d8dff454-a625-4309-92b6-8ab92d4bd60a-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d8dff454-a625-4309-92b6-8ab92d4bd60a\") " pod="openstack/ovsdbserver-nb-0"
Apr 02 13:58:25 crc kubenswrapper[4732]: I0402 13:58:25.552254 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8dff454-a625-4309-92b6-8ab92d4bd60a-config\") pod \"ovsdbserver-nb-0\" (UID: \"d8dff454-a625-4309-92b6-8ab92d4bd60a\") " pod="openstack/ovsdbserver-nb-0"
Apr 02 13:58:25 crc kubenswrapper[4732]: I0402 13:58:25.555328 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d8dff454-a625-4309-92b6-8ab92d4bd60a-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d8dff454-a625-4309-92b6-8ab92d4bd60a\") " pod="openstack/ovsdbserver-nb-0"
Apr 02 13:58:25 crc kubenswrapper[4732]: I0402 13:58:25.556055 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8dff454-a625-4309-92b6-8ab92d4bd60a-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d8dff454-a625-4309-92b6-8ab92d4bd60a\") " pod="openstack/ovsdbserver-nb-0"
Apr 02 13:58:25 crc kubenswrapper[4732]: I0402 13:58:25.556328 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8dff454-a625-4309-92b6-8ab92d4bd60a-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d8dff454-a625-4309-92b6-8ab92d4bd60a\") " pod="openstack/ovsdbserver-nb-0"
Apr 02 13:58:25 crc kubenswrapper[4732]: I0402 13:58:25.558656 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8dff454-a625-4309-92b6-8ab92d4bd60a-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d8dff454-a625-4309-92b6-8ab92d4bd60a\") " pod="openstack/ovsdbserver-nb-0"
Apr 02 13:58:25 crc kubenswrapper[4732]: I0402 13:58:25.572770 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tbfw\" (UniqueName: \"kubernetes.io/projected/d8dff454-a625-4309-92b6-8ab92d4bd60a-kube-api-access-5tbfw\") pod \"ovsdbserver-nb-0\" (UID: \"d8dff454-a625-4309-92b6-8ab92d4bd60a\") " pod="openstack/ovsdbserver-nb-0"
Apr 02 13:58:25 crc kubenswrapper[4732]: I0402 13:58:25.577281 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d8dff454-a625-4309-92b6-8ab92d4bd60a\") " pod="openstack/ovsdbserver-nb-0"
Apr 02 13:58:25 crc kubenswrapper[4732]: I0402 13:58:25.664633 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Apr 02 13:58:26 crc kubenswrapper[4732]: I0402 13:58:26.873308 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"]
Apr 02 13:58:26 crc kubenswrapper[4732]: I0402 13:58:26.875141 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Apr 02 13:58:26 crc kubenswrapper[4732]: I0402 13:58:26.879600 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs"
Apr 02 13:58:26 crc kubenswrapper[4732]: I0402 13:58:26.879770 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts"
Apr 02 13:58:26 crc kubenswrapper[4732]: I0402 13:58:26.879635 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-h8r42"
Apr 02 13:58:26 crc kubenswrapper[4732]: I0402 13:58:26.883733 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config"
Apr 02 13:58:26 crc kubenswrapper[4732]: I0402 13:58:26.885829 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Apr 02 13:58:26 crc kubenswrapper[4732]: I0402 13:58:26.972238 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f84d20f6-82ec-45d6-8487-4ed2ed90b286-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"f84d20f6-82ec-45d6-8487-4ed2ed90b286\") " pod="openstack/ovsdbserver-sb-0"
Apr 02 13:58:26 crc kubenswrapper[4732]: I0402 13:58:26.972294 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sj84\" (UniqueName: \"kubernetes.io/projected/f84d20f6-82ec-45d6-8487-4ed2ed90b286-kube-api-access-5sj84\") pod \"ovsdbserver-sb-0\" (UID: \"f84d20f6-82ec-45d6-8487-4ed2ed90b286\") " pod="openstack/ovsdbserver-sb-0"
Apr 02 13:58:26 crc kubenswrapper[4732]: I0402 13:58:26.972347 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f84d20f6-82ec-45d6-8487-4ed2ed90b286-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"f84d20f6-82ec-45d6-8487-4ed2ed90b286\") " pod="openstack/ovsdbserver-sb-0"
Apr 02 13:58:26 crc kubenswrapper[4732]: I0402 13:58:26.972369 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f84d20f6-82ec-45d6-8487-4ed2ed90b286-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f84d20f6-82ec-45d6-8487-4ed2ed90b286\") " pod="openstack/ovsdbserver-sb-0"
Apr 02 13:58:26 crc kubenswrapper[4732]: I0402 13:58:26.972433 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"f84d20f6-82ec-45d6-8487-4ed2ed90b286\") " pod="openstack/ovsdbserver-sb-0"
Apr 02 13:58:26 crc kubenswrapper[4732]: I0402 13:58:26.972468 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f84d20f6-82ec-45d6-8487-4ed2ed90b286-config\") pod \"ovsdbserver-sb-0\" (UID: \"f84d20f6-82ec-45d6-8487-4ed2ed90b286\") " pod="openstack/ovsdbserver-sb-0"
Apr 02 13:58:26 crc kubenswrapper[4732]: I0402 13:58:26.972521 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f84d20f6-82ec-45d6-8487-4ed2ed90b286-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f84d20f6-82ec-45d6-8487-4ed2ed90b286\") " pod="openstack/ovsdbserver-sb-0"
Apr 02 13:58:26 crc kubenswrapper[4732]: I0402 13:58:26.972607 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f84d20f6-82ec-45d6-8487-4ed2ed90b286-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"f84d20f6-82ec-45d6-8487-4ed2ed90b286\") " pod="openstack/ovsdbserver-sb-0"
Apr 02 13:58:27 crc kubenswrapper[4732]: I0402 13:58:27.074760 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f84d20f6-82ec-45d6-8487-4ed2ed90b286-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"f84d20f6-82ec-45d6-8487-4ed2ed90b286\") " pod="openstack/ovsdbserver-sb-0"
Apr 02 13:58:27 crc kubenswrapper[4732]: I0402 13:58:27.074822 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f84d20f6-82ec-45d6-8487-4ed2ed90b286-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f84d20f6-82ec-45d6-8487-4ed2ed90b286\") " pod="openstack/ovsdbserver-sb-0"
Apr 02 13:58:27 crc kubenswrapper[4732]: I0402 13:58:27.074848 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"f84d20f6-82ec-45d6-8487-4ed2ed90b286\") " pod="openstack/ovsdbserver-sb-0"
Apr 02 13:58:27 crc kubenswrapper[4732]: I0402 13:58:27.074872 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f84d20f6-82ec-45d6-8487-4ed2ed90b286-config\") pod \"ovsdbserver-sb-0\" (UID: \"f84d20f6-82ec-45d6-8487-4ed2ed90b286\") " pod="openstack/ovsdbserver-sb-0"
Apr 02 13:58:27 crc kubenswrapper[4732]: I0402 13:58:27.074915 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f84d20f6-82ec-45d6-8487-4ed2ed90b286-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f84d20f6-82ec-45d6-8487-4ed2ed90b286\") " pod="openstack/ovsdbserver-sb-0"
Apr 02 13:58:27 crc kubenswrapper[4732]: I0402 13:58:27.074973 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f84d20f6-82ec-45d6-8487-4ed2ed90b286-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"f84d20f6-82ec-45d6-8487-4ed2ed90b286\") " pod="openstack/ovsdbserver-sb-0"
Apr 02 13:58:27 crc kubenswrapper[4732]: I0402 13:58:27.075022 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f84d20f6-82ec-45d6-8487-4ed2ed90b286-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"f84d20f6-82ec-45d6-8487-4ed2ed90b286\") " pod="openstack/ovsdbserver-sb-0"
Apr 02 13:58:27 crc kubenswrapper[4732]: I0402 13:58:27.075053 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sj84\" (UniqueName: \"kubernetes.io/projected/f84d20f6-82ec-45d6-8487-4ed2ed90b286-kube-api-access-5sj84\") pod \"ovsdbserver-sb-0\" (UID: \"f84d20f6-82ec-45d6-8487-4ed2ed90b286\") " pod="openstack/ovsdbserver-sb-0"
Apr 02 13:58:27 crc kubenswrapper[4732]: I0402 13:58:27.075257 4732 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"f84d20f6-82ec-45d6-8487-4ed2ed90b286\")
device mount path \"/mnt/openstack/pv09\"" pod="openstack/ovsdbserver-sb-0" Apr 02 13:58:27 crc kubenswrapper[4732]: I0402 13:58:27.075398 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f84d20f6-82ec-45d6-8487-4ed2ed90b286-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"f84d20f6-82ec-45d6-8487-4ed2ed90b286\") " pod="openstack/ovsdbserver-sb-0" Apr 02 13:58:27 crc kubenswrapper[4732]: I0402 13:58:27.075962 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f84d20f6-82ec-45d6-8487-4ed2ed90b286-config\") pod \"ovsdbserver-sb-0\" (UID: \"f84d20f6-82ec-45d6-8487-4ed2ed90b286\") " pod="openstack/ovsdbserver-sb-0" Apr 02 13:58:27 crc kubenswrapper[4732]: I0402 13:58:27.076818 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f84d20f6-82ec-45d6-8487-4ed2ed90b286-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"f84d20f6-82ec-45d6-8487-4ed2ed90b286\") " pod="openstack/ovsdbserver-sb-0" Apr 02 13:58:27 crc kubenswrapper[4732]: I0402 13:58:27.081470 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f84d20f6-82ec-45d6-8487-4ed2ed90b286-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f84d20f6-82ec-45d6-8487-4ed2ed90b286\") " pod="openstack/ovsdbserver-sb-0" Apr 02 13:58:27 crc kubenswrapper[4732]: I0402 13:58:27.081972 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f84d20f6-82ec-45d6-8487-4ed2ed90b286-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f84d20f6-82ec-45d6-8487-4ed2ed90b286\") " pod="openstack/ovsdbserver-sb-0" Apr 02 13:58:27 crc kubenswrapper[4732]: I0402 13:58:27.089693 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f84d20f6-82ec-45d6-8487-4ed2ed90b286-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"f84d20f6-82ec-45d6-8487-4ed2ed90b286\") " pod="openstack/ovsdbserver-sb-0" Apr 02 13:58:27 crc kubenswrapper[4732]: I0402 13:58:27.090803 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sj84\" (UniqueName: \"kubernetes.io/projected/f84d20f6-82ec-45d6-8487-4ed2ed90b286-kube-api-access-5sj84\") pod \"ovsdbserver-sb-0\" (UID: \"f84d20f6-82ec-45d6-8487-4ed2ed90b286\") " pod="openstack/ovsdbserver-sb-0" Apr 02 13:58:27 crc kubenswrapper[4732]: I0402 13:58:27.100433 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"f84d20f6-82ec-45d6-8487-4ed2ed90b286\") " pod="openstack/ovsdbserver-sb-0" Apr 02 13:58:27 crc kubenswrapper[4732]: I0402 13:58:27.197953 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Apr 02 13:58:28 crc kubenswrapper[4732]: E0402 13:58:28.275594 4732 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Apr 02 13:58:28 crc kubenswrapper[4732]: E0402 13:58:28.276270 4732 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fd4rd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFi
lesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-rf4w6_openstack(543e9524-dc5c-4f94-8577-3fa8493c61d4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Apr 02 13:58:28 crc kubenswrapper[4732]: E0402 13:58:28.277565 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-rf4w6" podUID="543e9524-dc5c-4f94-8577-3fa8493c61d4" Apr 02 13:58:28 crc kubenswrapper[4732]: E0402 13:58:28.297909 4732 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Apr 02 13:58:28 crc kubenswrapper[4732]: E0402 13:58:28.298135 4732 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bhhpn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-8pjqd_openstack(81c6acdb-a950-4cc4-a77d-dff56acc00c9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Apr 02 13:58:28 crc kubenswrapper[4732]: E0402 13:58:28.299969 4732 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-8pjqd" podUID="81c6acdb-a950-4cc4-a77d-dff56acc00c9" Apr 02 13:58:28 crc kubenswrapper[4732]: I0402 13:58:28.633157 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-8pjqd" Apr 02 13:58:28 crc kubenswrapper[4732]: I0402 13:58:28.712448 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhhpn\" (UniqueName: \"kubernetes.io/projected/81c6acdb-a950-4cc4-a77d-dff56acc00c9-kube-api-access-bhhpn\") pod \"81c6acdb-a950-4cc4-a77d-dff56acc00c9\" (UID: \"81c6acdb-a950-4cc4-a77d-dff56acc00c9\") " Apr 02 13:58:28 crc kubenswrapper[4732]: I0402 13:58:28.712525 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81c6acdb-a950-4cc4-a77d-dff56acc00c9-config\") pod \"81c6acdb-a950-4cc4-a77d-dff56acc00c9\" (UID: \"81c6acdb-a950-4cc4-a77d-dff56acc00c9\") " Apr 02 13:58:28 crc kubenswrapper[4732]: I0402 13:58:28.712577 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/81c6acdb-a950-4cc4-a77d-dff56acc00c9-dns-svc\") pod \"81c6acdb-a950-4cc4-a77d-dff56acc00c9\" (UID: \"81c6acdb-a950-4cc4-a77d-dff56acc00c9\") " Apr 02 13:58:28 crc kubenswrapper[4732]: I0402 13:58:28.713483 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81c6acdb-a950-4cc4-a77d-dff56acc00c9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "81c6acdb-a950-4cc4-a77d-dff56acc00c9" (UID: "81c6acdb-a950-4cc4-a77d-dff56acc00c9"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:58:28 crc kubenswrapper[4732]: I0402 13:58:28.718265 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81c6acdb-a950-4cc4-a77d-dff56acc00c9-config" (OuterVolumeSpecName: "config") pod "81c6acdb-a950-4cc4-a77d-dff56acc00c9" (UID: "81c6acdb-a950-4cc4-a77d-dff56acc00c9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:58:28 crc kubenswrapper[4732]: I0402 13:58:28.731092 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81c6acdb-a950-4cc4-a77d-dff56acc00c9-kube-api-access-bhhpn" (OuterVolumeSpecName: "kube-api-access-bhhpn") pod "81c6acdb-a950-4cc4-a77d-dff56acc00c9" (UID: "81c6acdb-a950-4cc4-a77d-dff56acc00c9"). InnerVolumeSpecName "kube-api-access-bhhpn". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:58:28 crc kubenswrapper[4732]: I0402 13:58:28.807194 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-rf4w6" Apr 02 13:58:28 crc kubenswrapper[4732]: I0402 13:58:28.818883 4732 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/81c6acdb-a950-4cc4-a77d-dff56acc00c9-dns-svc\") on node \"crc\" DevicePath \"\"" Apr 02 13:58:28 crc kubenswrapper[4732]: I0402 13:58:28.818919 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhhpn\" (UniqueName: \"kubernetes.io/projected/81c6acdb-a950-4cc4-a77d-dff56acc00c9-kube-api-access-bhhpn\") on node \"crc\" DevicePath \"\"" Apr 02 13:58:28 crc kubenswrapper[4732]: I0402 13:58:28.818955 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81c6acdb-a950-4cc4-a77d-dff56acc00c9-config\") on node \"crc\" DevicePath \"\"" Apr 02 13:58:28 crc kubenswrapper[4732]: I0402 13:58:28.921178 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/543e9524-dc5c-4f94-8577-3fa8493c61d4-config\") pod \"543e9524-dc5c-4f94-8577-3fa8493c61d4\" (UID: \"543e9524-dc5c-4f94-8577-3fa8493c61d4\") " Apr 02 13:58:28 crc kubenswrapper[4732]: I0402 13:58:28.921668 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/543e9524-dc5c-4f94-8577-3fa8493c61d4-config" (OuterVolumeSpecName: "config") pod "543e9524-dc5c-4f94-8577-3fa8493c61d4" (UID: "543e9524-dc5c-4f94-8577-3fa8493c61d4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:58:28 crc kubenswrapper[4732]: I0402 13:58:28.921806 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fd4rd\" (UniqueName: \"kubernetes.io/projected/543e9524-dc5c-4f94-8577-3fa8493c61d4-kube-api-access-fd4rd\") pod \"543e9524-dc5c-4f94-8577-3fa8493c61d4\" (UID: \"543e9524-dc5c-4f94-8577-3fa8493c61d4\") " Apr 02 13:58:28 crc kubenswrapper[4732]: I0402 13:58:28.922161 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/543e9524-dc5c-4f94-8577-3fa8493c61d4-config\") on node \"crc\" DevicePath \"\"" Apr 02 13:58:28 crc kubenswrapper[4732]: I0402 13:58:28.930826 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/543e9524-dc5c-4f94-8577-3fa8493c61d4-kube-api-access-fd4rd" (OuterVolumeSpecName: "kube-api-access-fd4rd") pod "543e9524-dc5c-4f94-8577-3fa8493c61d4" (UID: "543e9524-dc5c-4f94-8577-3fa8493c61d4"). InnerVolumeSpecName "kube-api-access-fd4rd". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:58:29 crc kubenswrapper[4732]: I0402 13:58:29.023403 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fd4rd\" (UniqueName: \"kubernetes.io/projected/543e9524-dc5c-4f94-8577-3fa8493c61d4-kube-api-access-fd4rd\") on node \"crc\" DevicePath \"\"" Apr 02 13:58:29 crc kubenswrapper[4732]: I0402 13:58:29.201544 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Apr 02 13:58:29 crc kubenswrapper[4732]: I0402 13:58:29.350238 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-rf4w6" Apr 02 13:58:29 crc kubenswrapper[4732]: I0402 13:58:29.350256 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-rf4w6" event={"ID":"543e9524-dc5c-4f94-8577-3fa8493c61d4","Type":"ContainerDied","Data":"64eb3a43004c2166defb609e0bafcd2069de40ea6582f67a3b40535bc5348538"} Apr 02 13:58:29 crc kubenswrapper[4732]: I0402 13:58:29.351635 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"56762f05-a513-4f47-8cf7-5d19bb58c5bd","Type":"ContainerStarted","Data":"078cba21939a3f946508dd32dd3c808e4ad7cc6a198efcc2e69a66c6c08a1410"} Apr 02 13:58:29 crc kubenswrapper[4732]: I0402 13:58:29.353046 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-8pjqd" event={"ID":"81c6acdb-a950-4cc4-a77d-dff56acc00c9","Type":"ContainerDied","Data":"d1f114bde1305d650651f11ceef64292411176f8d0c4539e57e115e41386016c"} Apr 02 13:58:29 crc kubenswrapper[4732]: I0402 13:58:29.353135 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-8pjqd" Apr 02 13:58:29 crc kubenswrapper[4732]: I0402 13:58:29.356155 4732 generic.go:334] "Generic (PLEG): container finished" podID="38a12349-b9ff-4123-ba3b-96edc0cf2bc6" containerID="3f135ea198a03663ed9d5efdf74058e2dfddec3f91057f0ffe0f8948dd604ea1" exitCode=0 Apr 02 13:58:29 crc kubenswrapper[4732]: I0402 13:58:29.356262 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-rjf6j" event={"ID":"38a12349-b9ff-4123-ba3b-96edc0cf2bc6","Type":"ContainerDied","Data":"3f135ea198a03663ed9d5efdf74058e2dfddec3f91057f0ffe0f8948dd604ea1"} Apr 02 13:58:29 crc kubenswrapper[4732]: I0402 13:58:29.360434 4732 generic.go:334] "Generic (PLEG): container finished" podID="97c8b10a-f931-4b1d-967b-3c3eeac5ca4c" containerID="da8e382ce98c4e454830499385061e203ffd5fa0880fed8514d5c3b0335ea7b7" exitCode=0 Apr 02 13:58:29 crc kubenswrapper[4732]: I0402 13:58:29.360490 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-285k7" event={"ID":"97c8b10a-f931-4b1d-967b-3c3eeac5ca4c","Type":"ContainerDied","Data":"da8e382ce98c4e454830499385061e203ffd5fa0880fed8514d5c3b0335ea7b7"} Apr 02 13:58:29 crc kubenswrapper[4732]: I0402 13:58:29.384257 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Apr 02 13:58:29 crc kubenswrapper[4732]: W0402 13:58:29.413151 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f87d2b1_82d0_4126_aeae_46aa84ba3d1f.slice/crio-29359fe52b43b321af529d3f1af8c0090c729b0398091a22608b87d8d6a2e575 WatchSource:0}: Error finding container 29359fe52b43b321af529d3f1af8c0090c729b0398091a22608b87d8d6a2e575: Status 404 returned error can't find the container with id 29359fe52b43b321af529d3f1af8c0090c729b0398091a22608b87d8d6a2e575 Apr 02 13:58:29 crc kubenswrapper[4732]: I0402 13:58:29.446570 4732 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/rabbitmq-cell1-server-0"] Apr 02 13:58:29 crc kubenswrapper[4732]: W0402 13:58:29.457461 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fbe66fb_6f02_432d_8acf_50fec5339d96.slice/crio-1dfc883834da3efc11f78e82eb06f9e64e43244ff068e2020b06df774895a9e9 WatchSource:0}: Error finding container 1dfc883834da3efc11f78e82eb06f9e64e43244ff068e2020b06df774895a9e9: Status 404 returned error can't find the container with id 1dfc883834da3efc11f78e82eb06f9e64e43244ff068e2020b06df774895a9e9 Apr 02 13:58:29 crc kubenswrapper[4732]: I0402 13:58:29.461600 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Apr 02 13:58:29 crc kubenswrapper[4732]: I0402 13:58:29.470381 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Apr 02 13:58:29 crc kubenswrapper[4732]: W0402 13:58:29.484233 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0bb93d2_9da7_4667_9079_b403332d31e0.slice/crio-7c3447dab6d87e81460146e62ebcd33bb9652039825e72e010c68e5b1702fec7 WatchSource:0}: Error finding container 7c3447dab6d87e81460146e62ebcd33bb9652039825e72e010c68e5b1702fec7: Status 404 returned error can't find the container with id 7c3447dab6d87e81460146e62ebcd33bb9652039825e72e010c68e5b1702fec7 Apr 02 13:58:29 crc kubenswrapper[4732]: I0402 13:58:29.485696 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Apr 02 13:58:29 crc kubenswrapper[4732]: I0402 13:58:29.496504 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-5222s"] Apr 02 13:58:29 crc kubenswrapper[4732]: I0402 13:58:29.512351 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-l4ttl"] Apr 02 13:58:29 crc kubenswrapper[4732]: W0402 13:58:29.521109 4732 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8af6391f_4f8b_4473_8e7c_186c9c838527.slice/crio-7a1fe9aafb6f32461aaa2449918857d3889ae31541ba0cf19957454ecdb42609 WatchSource:0}: Error finding container 7a1fe9aafb6f32461aaa2449918857d3889ae31541ba0cf19957454ecdb42609: Status 404 returned error can't find the container with id 7a1fe9aafb6f32461aaa2449918857d3889ae31541ba0cf19957454ecdb42609 Apr 02 13:58:29 crc kubenswrapper[4732]: W0402 13:58:29.527109 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5eba7503_ee7b_40ba_a0dc_e11fad40c2b7.slice/crio-ce1153f038b62b8290e2a5d561f27c36713a72f709f18f8dab14087d76cf4df2 WatchSource:0}: Error finding container ce1153f038b62b8290e2a5d561f27c36713a72f709f18f8dab14087d76cf4df2: Status 404 returned error can't find the container with id ce1153f038b62b8290e2a5d561f27c36713a72f709f18f8dab14087d76cf4df2 Apr 02 13:58:29 crc kubenswrapper[4732]: I0402 13:58:29.613115 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-rf4w6"] Apr 02 13:58:29 crc kubenswrapper[4732]: I0402 13:58:29.620576 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-rf4w6"] Apr 02 13:58:29 crc kubenswrapper[4732]: E0402 13:58:29.626802 4732 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod543e9524_dc5c_4f94_8577_3fa8493c61d4.slice\": RecentStats: unable to find data in memory cache]" Apr 02 13:58:29 crc kubenswrapper[4732]: I0402 13:58:29.633030 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-8pjqd"] Apr 02 13:58:29 crc kubenswrapper[4732]: I0402 13:58:29.637989 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-78dd6ddcc-8pjqd"] Apr 02 13:58:29 crc kubenswrapper[4732]: I0402 13:58:29.643457 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Apr 02 13:58:30 crc kubenswrapper[4732]: I0402 13:58:30.143823 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Apr 02 13:58:30 crc kubenswrapper[4732]: W0402 13:58:30.149922 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8dff454_a625_4309_92b6_8ab92d4bd60a.slice/crio-2aac7412b392473daab18fa323c702444bcc5b7d5341f10090b5fecfac8e6e07 WatchSource:0}: Error finding container 2aac7412b392473daab18fa323c702444bcc5b7d5341f10090b5fecfac8e6e07: Status 404 returned error can't find the container with id 2aac7412b392473daab18fa323c702444bcc5b7d5341f10090b5fecfac8e6e07 Apr 02 13:58:30 crc kubenswrapper[4732]: I0402 13:58:30.373359 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-285k7" event={"ID":"97c8b10a-f931-4b1d-967b-3c3eeac5ca4c","Type":"ContainerStarted","Data":"45d157fae86f8fd3a767cf39dc92ea252cf9e8868c1e7261ea35aca56215a314"} Apr 02 13:58:30 crc kubenswrapper[4732]: I0402 13:58:30.373699 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-285k7" Apr 02 13:58:30 crc kubenswrapper[4732]: I0402 13:58:30.375889 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ac229e50-5412-4f18-be3c-4a364b95dcf2","Type":"ContainerStarted","Data":"4169df95bae865b1b2cfc81f7735b1271d2d0e0bdd8fdd18cd6f2115f95ff22e"} Apr 02 13:58:30 crc kubenswrapper[4732]: I0402 13:58:30.377993 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-l4ttl" 
event={"ID":"5eba7503-ee7b-40ba-a0dc-e11fad40c2b7","Type":"ContainerStarted","Data":"ce1153f038b62b8290e2a5d561f27c36713a72f709f18f8dab14087d76cf4df2"} Apr 02 13:58:30 crc kubenswrapper[4732]: I0402 13:58:30.380280 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-rjf6j" event={"ID":"38a12349-b9ff-4123-ba3b-96edc0cf2bc6","Type":"ContainerStarted","Data":"d6626f18c4d55241c0910c46f125ab90840fcd25e84f3ab515dea20bc6197255"} Apr 02 13:58:30 crc kubenswrapper[4732]: I0402 13:58:30.380387 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-rjf6j" Apr 02 13:58:30 crc kubenswrapper[4732]: I0402 13:58:30.382023 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"2fbe66fb-6f02-432d-8acf-50fec5339d96","Type":"ContainerStarted","Data":"1dfc883834da3efc11f78e82eb06f9e64e43244ff068e2020b06df774895a9e9"} Apr 02 13:58:30 crc kubenswrapper[4732]: I0402 13:58:30.383317 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"688bac91-aede-4c9f-a063-6469bb03db8c","Type":"ContainerStarted","Data":"03571f0f677cb86efbbef36c66d44febb4c4f8415317e082598c9c59cbfddf0f"} Apr 02 13:58:30 crc kubenswrapper[4732]: I0402 13:58:30.384363 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"f84d20f6-82ec-45d6-8487-4ed2ed90b286","Type":"ContainerStarted","Data":"a1ce80c06d0b55c417d29f690bf0e8f2aa5c197a81daa354accdd63d6d628f80"} Apr 02 13:58:30 crc kubenswrapper[4732]: I0402 13:58:30.385600 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d8dff454-a625-4309-92b6-8ab92d4bd60a","Type":"ContainerStarted","Data":"2aac7412b392473daab18fa323c702444bcc5b7d5341f10090b5fecfac8e6e07"} Apr 02 13:58:30 crc kubenswrapper[4732]: I0402 13:58:30.393333 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-5222s" event={"ID":"8af6391f-4f8b-4473-8e7c-186c9c838527","Type":"ContainerStarted","Data":"7a1fe9aafb6f32461aaa2449918857d3889ae31541ba0cf19957454ecdb42609"}
Apr 02 13:58:30 crc kubenswrapper[4732]: I0402 13:58:30.395520 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"4f87d2b1-82d0-4126-aeae-46aa84ba3d1f","Type":"ContainerStarted","Data":"29359fe52b43b321af529d3f1af8c0090c729b0398091a22608b87d8d6a2e575"}
Apr 02 13:58:30 crc kubenswrapper[4732]: I0402 13:58:30.395926 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-285k7" podStartSLOduration=6.695390151 podStartE2EDuration="17.395907796s" podCreationTimestamp="2026-04-02 13:58:13 +0000 UTC" firstStartedPulling="2026-04-02 13:58:17.877371491 +0000 UTC m=+1254.781779044" lastFinishedPulling="2026-04-02 13:58:28.577889136 +0000 UTC m=+1265.482296689" observedRunningTime="2026-04-02 13:58:30.387954182 +0000 UTC m=+1267.292361755" watchObservedRunningTime="2026-04-02 13:58:30.395907796 +0000 UTC m=+1267.300315349"
Apr 02 13:58:30 crc kubenswrapper[4732]: I0402 13:58:30.397466 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b0bb93d2-9da7-4667-9079-b403332d31e0","Type":"ContainerStarted","Data":"7c3447dab6d87e81460146e62ebcd33bb9652039825e72e010c68e5b1702fec7"}
Apr 02 13:58:30 crc kubenswrapper[4732]: I0402 13:58:30.412831 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-rjf6j" podStartSLOduration=3.295691272 podStartE2EDuration="17.41281034s" podCreationTimestamp="2026-04-02 13:58:13 +0000 UTC" firstStartedPulling="2026-04-02 13:58:14.463641875 +0000 UTC m=+1251.368049428" lastFinishedPulling="2026-04-02 13:58:28.580760943 +0000 UTC m=+1265.485168496" observedRunningTime="2026-04-02 13:58:30.408378131 +0000 UTC m=+1267.312785684" watchObservedRunningTime="2026-04-02 13:58:30.41281034 +0000 UTC m=+1267.317217893"
Apr 02 13:58:30 crc kubenswrapper[4732]: I0402 13:58:30.697343 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="543e9524-dc5c-4f94-8577-3fa8493c61d4" path="/var/lib/kubelet/pods/543e9524-dc5c-4f94-8577-3fa8493c61d4/volumes"
Apr 02 13:58:30 crc kubenswrapper[4732]: I0402 13:58:30.697723 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81c6acdb-a950-4cc4-a77d-dff56acc00c9" path="/var/lib/kubelet/pods/81c6acdb-a950-4cc4-a77d-dff56acc00c9/volumes"
Apr 02 13:58:34 crc kubenswrapper[4732]: I0402 13:58:34.079770 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-666b6646f7-285k7"
Apr 02 13:58:34 crc kubenswrapper[4732]: I0402 13:58:34.146501 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-rjf6j"
Apr 02 13:58:34 crc kubenswrapper[4732]: I0402 13:58:34.194270 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-285k7"]
Apr 02 13:58:34 crc kubenswrapper[4732]: I0402 13:58:34.437313 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-285k7" podUID="97c8b10a-f931-4b1d-967b-3c3eeac5ca4c" containerName="dnsmasq-dns" containerID="cri-o://45d157fae86f8fd3a767cf39dc92ea252cf9e8868c1e7261ea35aca56215a314" gracePeriod=10
Apr 02 13:58:35 crc kubenswrapper[4732]: I0402 13:58:35.449093 4732 generic.go:334] "Generic (PLEG): container finished" podID="97c8b10a-f931-4b1d-967b-3c3eeac5ca4c" containerID="45d157fae86f8fd3a767cf39dc92ea252cf9e8868c1e7261ea35aca56215a314" exitCode=0
Apr 02 13:58:35 crc kubenswrapper[4732]: I0402 13:58:35.449159 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-285k7" event={"ID":"97c8b10a-f931-4b1d-967b-3c3eeac5ca4c","Type":"ContainerDied","Data":"45d157fae86f8fd3a767cf39dc92ea252cf9e8868c1e7261ea35aca56215a314"}
Apr 02 13:58:38 crc kubenswrapper[4732]: I0402 13:58:38.432942 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-285k7"
Apr 02 13:58:38 crc kubenswrapper[4732]: I0402 13:58:38.476879 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-285k7" event={"ID":"97c8b10a-f931-4b1d-967b-3c3eeac5ca4c","Type":"ContainerDied","Data":"79d4b465040609920d0d6234d66a5ad8f0ed34392e4095355bf8fa341443985a"}
Apr 02 13:58:38 crc kubenswrapper[4732]: I0402 13:58:38.476948 4732 scope.go:117] "RemoveContainer" containerID="45d157fae86f8fd3a767cf39dc92ea252cf9e8868c1e7261ea35aca56215a314"
Apr 02 13:58:38 crc kubenswrapper[4732]: I0402 13:58:38.476948 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-285k7"
Apr 02 13:58:38 crc kubenswrapper[4732]: I0402 13:58:38.487108 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/97c8b10a-f931-4b1d-967b-3c3eeac5ca4c-dns-svc\") pod \"97c8b10a-f931-4b1d-967b-3c3eeac5ca4c\" (UID: \"97c8b10a-f931-4b1d-967b-3c3eeac5ca4c\") "
Apr 02 13:58:38 crc kubenswrapper[4732]: I0402 13:58:38.487172 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9rbj\" (UniqueName: \"kubernetes.io/projected/97c8b10a-f931-4b1d-967b-3c3eeac5ca4c-kube-api-access-s9rbj\") pod \"97c8b10a-f931-4b1d-967b-3c3eeac5ca4c\" (UID: \"97c8b10a-f931-4b1d-967b-3c3eeac5ca4c\") "
Apr 02 13:58:38 crc kubenswrapper[4732]: I0402 13:58:38.487251 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97c8b10a-f931-4b1d-967b-3c3eeac5ca4c-config\") pod \"97c8b10a-f931-4b1d-967b-3c3eeac5ca4c\" (UID: \"97c8b10a-f931-4b1d-967b-3c3eeac5ca4c\") "
Apr 02 13:58:38 crc kubenswrapper[4732]: I0402 13:58:38.491157 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97c8b10a-f931-4b1d-967b-3c3eeac5ca4c-kube-api-access-s9rbj" (OuterVolumeSpecName: "kube-api-access-s9rbj") pod "97c8b10a-f931-4b1d-967b-3c3eeac5ca4c" (UID: "97c8b10a-f931-4b1d-967b-3c3eeac5ca4c"). InnerVolumeSpecName "kube-api-access-s9rbj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 13:58:38 crc kubenswrapper[4732]: I0402 13:58:38.525575 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97c8b10a-f931-4b1d-967b-3c3eeac5ca4c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "97c8b10a-f931-4b1d-967b-3c3eeac5ca4c" (UID: "97c8b10a-f931-4b1d-967b-3c3eeac5ca4c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 13:58:38 crc kubenswrapper[4732]: I0402 13:58:38.529879 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97c8b10a-f931-4b1d-967b-3c3eeac5ca4c-config" (OuterVolumeSpecName: "config") pod "97c8b10a-f931-4b1d-967b-3c3eeac5ca4c" (UID: "97c8b10a-f931-4b1d-967b-3c3eeac5ca4c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 13:58:38 crc kubenswrapper[4732]: I0402 13:58:38.589469 4732 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/97c8b10a-f931-4b1d-967b-3c3eeac5ca4c-dns-svc\") on node \"crc\" DevicePath \"\""
Apr 02 13:58:38 crc kubenswrapper[4732]: I0402 13:58:38.589793 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9rbj\" (UniqueName: \"kubernetes.io/projected/97c8b10a-f931-4b1d-967b-3c3eeac5ca4c-kube-api-access-s9rbj\") on node \"crc\" DevicePath \"\""
Apr 02 13:58:38 crc kubenswrapper[4732]: I0402 13:58:38.589810 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97c8b10a-f931-4b1d-967b-3c3eeac5ca4c-config\") on node \"crc\" DevicePath \"\""
Apr 02 13:58:38 crc kubenswrapper[4732]: I0402 13:58:38.748453 4732 scope.go:117] "RemoveContainer" containerID="da8e382ce98c4e454830499385061e203ffd5fa0880fed8514d5c3b0335ea7b7"
Apr 02 13:58:38 crc kubenswrapper[4732]: I0402 13:58:38.803440 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-285k7"]
Apr 02 13:58:38 crc kubenswrapper[4732]: I0402 13:58:38.811399 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-285k7"]
Apr 02 13:58:39 crc kubenswrapper[4732]: I0402 13:58:39.490196 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d8dff454-a625-4309-92b6-8ab92d4bd60a","Type":"ContainerStarted","Data":"74db822b4ec204881756b4e754faaa887de15f4c5dcee64eb635f633b1514488"}
Apr 02 13:58:39 crc kubenswrapper[4732]: I0402 13:58:39.492438 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"2fbe66fb-6f02-432d-8acf-50fec5339d96","Type":"ContainerStarted","Data":"0ccea86247ddac6e1ac9c631b8a4c46a7837df562aa22cf084b08678bf103315"}
Apr 02 13:58:39 crc kubenswrapper[4732]: I0402 13:58:39.495332 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"4f87d2b1-82d0-4126-aeae-46aa84ba3d1f","Type":"ContainerStarted","Data":"0ba82d3361f03333cfe18e62d5e5ca0641a42de80fddefe7babb909cd0f21d18"}
Apr 02 13:58:39 crc kubenswrapper[4732]: I0402 13:58:39.495475 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0"
Apr 02 13:58:39 crc kubenswrapper[4732]: I0402 13:58:39.497960 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"f84d20f6-82ec-45d6-8487-4ed2ed90b286","Type":"ContainerStarted","Data":"762d15fdb1f2685e6d1523af822016473cf27e789f8c8d296ea71a2ba345fb25"}
Apr 02 13:58:39 crc kubenswrapper[4732]: I0402 13:58:39.502056 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-l4ttl" event={"ID":"5eba7503-ee7b-40ba-a0dc-e11fad40c2b7","Type":"ContainerStarted","Data":"2e430760f4ca77b9b316a552a8e8c01666d89f7598813d55ff8f3ee8ab65e5dd"}
Apr 02 13:58:39 crc kubenswrapper[4732]: I0402 13:58:39.537842 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=12.872841467 podStartE2EDuration="21.537821374s" podCreationTimestamp="2026-04-02 13:58:18 +0000 UTC" firstStartedPulling="2026-04-02 13:58:29.414950557 +0000 UTC m=+1266.319358110" lastFinishedPulling="2026-04-02 13:58:38.079930464 +0000 UTC m=+1274.984338017" observedRunningTime="2026-04-02 13:58:39.527117937 +0000 UTC m=+1276.431525500" watchObservedRunningTime="2026-04-02 13:58:39.537821374 +0000 UTC m=+1276.442228927"
Apr 02 13:58:40 crc kubenswrapper[4732]: I0402 13:58:40.512960 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-5222s" event={"ID":"8af6391f-4f8b-4473-8e7c-186c9c838527","Type":"ContainerStarted","Data":"0e6ab7ecfd52592e7b78ca9837017409218c7d35e7103eecf3d5ac7451d089a3"}
Apr 02 13:58:40 crc kubenswrapper[4732]: I0402 13:58:40.513354 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-5222s"
Apr 02 13:58:40 crc kubenswrapper[4732]: I0402 13:58:40.518832 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"688bac91-aede-4c9f-a063-6469bb03db8c","Type":"ContainerStarted","Data":"bc8c0032c2834befc4ff8e146fa89d52c5c4d16ca29ff52d8f4d3cc6c4728728"}
Apr 02 13:58:40 crc kubenswrapper[4732]: I0402 13:58:40.521165 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ac229e50-5412-4f18-be3c-4a364b95dcf2","Type":"ContainerStarted","Data":"07e46c22b4938ab8c21eaa09b4679a3900fc1e4c9b8afef0dd97cc3226fa06e8"}
Apr 02 13:58:40 crc kubenswrapper[4732]: I0402 13:58:40.521266 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Apr 02 13:58:40 crc kubenswrapper[4732]: I0402 13:58:40.523705 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b0bb93d2-9da7-4667-9079-b403332d31e0","Type":"ContainerStarted","Data":"991fee5892b181cbf9eaa8f0e526c1dca54ed5e2932b158ac3a0bf0139afeaf4"}
Apr 02 13:58:40 crc kubenswrapper[4732]: I0402 13:58:40.526190 4732 generic.go:334] "Generic (PLEG): container finished" podID="5eba7503-ee7b-40ba-a0dc-e11fad40c2b7" containerID="2e430760f4ca77b9b316a552a8e8c01666d89f7598813d55ff8f3ee8ab65e5dd" exitCode=0
Apr 02 13:58:40 crc kubenswrapper[4732]: I0402 13:58:40.526280 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-l4ttl" event={"ID":"5eba7503-ee7b-40ba-a0dc-e11fad40c2b7","Type":"ContainerDied","Data":"2e430760f4ca77b9b316a552a8e8c01666d89f7598813d55ff8f3ee8ab65e5dd"}
Apr 02 13:58:40 crc kubenswrapper[4732]: I0402 13:58:40.529030 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"56762f05-a513-4f47-8cf7-5d19bb58c5bd","Type":"ContainerStarted","Data":"a107fd73961c43c85df6b57282cf11a8acd3acae427dc6edddcade0f4ab33f9d"}
Apr 02 13:58:40 crc kubenswrapper[4732]: I0402 13:58:40.566190 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-5222s" podStartSLOduration=7.31041502 podStartE2EDuration="16.566171078s" podCreationTimestamp="2026-04-02 13:58:24 +0000 UTC" firstStartedPulling="2026-04-02 13:58:29.523352531 +0000 UTC m=+1266.427760084" lastFinishedPulling="2026-04-02 13:58:38.779108589 +0000 UTC m=+1275.683516142" observedRunningTime="2026-04-02 13:58:40.535203766 +0000 UTC m=+1277.439611319" watchObservedRunningTime="2026-04-02 13:58:40.566171078 +0000 UTC m=+1277.470578631"
Apr 02 13:58:40 crc kubenswrapper[4732]: I0402 13:58:40.658938 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=11.334380513 podStartE2EDuration="20.65891614s" podCreationTimestamp="2026-04-02 13:58:20 +0000 UTC" firstStartedPulling="2026-04-02 13:58:29.478146116 +0000 UTC m=+1266.382553669" lastFinishedPulling="2026-04-02 13:58:38.802681743 +0000 UTC m=+1275.707089296" observedRunningTime="2026-04-02 13:58:40.655302843 +0000 UTC m=+1277.559710406" watchObservedRunningTime="2026-04-02 13:58:40.65891614 +0000 UTC m=+1277.563323693"
Apr 02 13:58:40 crc kubenswrapper[4732]: I0402 13:58:40.692799 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97c8b10a-f931-4b1d-967b-3c3eeac5ca4c" path="/var/lib/kubelet/pods/97c8b10a-f931-4b1d-967b-3c3eeac5ca4c/volumes"
Apr 02 13:58:42 crc kubenswrapper[4732]: I0402 13:58:42.549019 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"f84d20f6-82ec-45d6-8487-4ed2ed90b286","Type":"ContainerStarted","Data":"8fa94fd6b6a691c3e2ccdbb5c7f425aa54482a4c649648fd70bb2779e61f498d"}
Apr 02 13:58:42 crc kubenswrapper[4732]: I0402 13:58:42.552253 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-l4ttl" event={"ID":"5eba7503-ee7b-40ba-a0dc-e11fad40c2b7","Type":"ContainerStarted","Data":"c74384e94990fe2827ac960b1d7a3b9b65866599a66b7575e7a3346f49853011"}
Apr 02 13:58:42 crc kubenswrapper[4732]: I0402 13:58:42.554511 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d8dff454-a625-4309-92b6-8ab92d4bd60a","Type":"ContainerStarted","Data":"72d73cca99a91eaff9c502f890b5c9747a6826a45d31292394de8b353a88215a"}
Apr 02 13:58:42 crc kubenswrapper[4732]: I0402 13:58:42.576240 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=5.065830595 podStartE2EDuration="17.576221441s" podCreationTimestamp="2026-04-02 13:58:25 +0000 UTC" firstStartedPulling="2026-04-02 13:58:29.633740729 +0000 UTC m=+1266.538148282" lastFinishedPulling="2026-04-02 13:58:42.144131575 +0000 UTC m=+1279.048539128" observedRunningTime="2026-04-02 13:58:42.568825732 +0000 UTC m=+1279.473233315" watchObservedRunningTime="2026-04-02 13:58:42.576221441 +0000 UTC m=+1279.480628994"
Apr 02 13:58:42 crc kubenswrapper[4732]: I0402 13:58:42.591312 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=6.616612559 podStartE2EDuration="18.591295016s" podCreationTimestamp="2026-04-02 13:58:24 +0000 UTC" firstStartedPulling="2026-04-02 13:58:30.152337188 +0000 UTC m=+1267.056744741" lastFinishedPulling="2026-04-02 13:58:42.127019615 +0000 UTC m=+1279.031427198" observedRunningTime="2026-04-02 13:58:42.586053005 +0000 UTC m=+1279.490460558" watchObservedRunningTime="2026-04-02 13:58:42.591295016 +0000 UTC m=+1279.495702569"
Apr 02 13:58:43 crc kubenswrapper[4732]: I0402 13:58:43.449797 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0"
Apr 02 13:58:43 crc kubenswrapper[4732]: I0402 13:58:43.565898 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-l4ttl" event={"ID":"5eba7503-ee7b-40ba-a0dc-e11fad40c2b7","Type":"ContainerStarted","Data":"074e1511e6f0d1ec6b5e558afc595b64ebca5ff72e6125d8f51feeb40a62d9a4"}
Apr 02 13:58:43 crc kubenswrapper[4732]: I0402 13:58:43.566807 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-l4ttl"
Apr 02 13:58:43 crc kubenswrapper[4732]: I0402 13:58:43.567049 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-l4ttl"
Apr 02 13:58:43 crc kubenswrapper[4732]: I0402 13:58:43.568599 4732 generic.go:334] "Generic (PLEG): container finished" podID="2fbe66fb-6f02-432d-8acf-50fec5339d96" containerID="0ccea86247ddac6e1ac9c631b8a4c46a7837df562aa22cf084b08678bf103315" exitCode=0
Apr 02 13:58:43 crc kubenswrapper[4732]: I0402 13:58:43.568702 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"2fbe66fb-6f02-432d-8acf-50fec5339d96","Type":"ContainerDied","Data":"0ccea86247ddac6e1ac9c631b8a4c46a7837df562aa22cf084b08678bf103315"}
Apr 02 13:58:43 crc kubenswrapper[4732]: I0402 13:58:43.570311 4732 generic.go:334] "Generic (PLEG): container finished" podID="688bac91-aede-4c9f-a063-6469bb03db8c" containerID="bc8c0032c2834befc4ff8e146fa89d52c5c4d16ca29ff52d8f4d3cc6c4728728" exitCode=0
Apr 02 13:58:43 crc kubenswrapper[4732]: I0402 13:58:43.570376 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"688bac91-aede-4c9f-a063-6469bb03db8c","Type":"ContainerDied","Data":"bc8c0032c2834befc4ff8e146fa89d52c5c4d16ca29ff52d8f4d3cc6c4728728"}
Apr 02 13:58:43 crc kubenswrapper[4732]: I0402 13:58:43.588992 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-l4ttl" podStartSLOduration=10.770629506 podStartE2EDuration="19.588977225s" podCreationTimestamp="2026-04-02 13:58:24 +0000 UTC" firstStartedPulling="2026-04-02 13:58:29.532931489 +0000 UTC m=+1266.437339042" lastFinishedPulling="2026-04-02 13:58:38.351279178 +0000 UTC m=+1275.255686761" observedRunningTime="2026-04-02 13:58:43.586312134 +0000 UTC m=+1280.490719687" watchObservedRunningTime="2026-04-02 13:58:43.588977225 +0000 UTC m=+1280.493384778"
Apr 02 13:58:43 crc kubenswrapper[4732]: I0402 13:58:43.665163 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0"
Apr 02 13:58:43 crc kubenswrapper[4732]: I0402 13:58:43.701325 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0"
Apr 02 13:58:44 crc kubenswrapper[4732]: I0402 13:58:44.579898 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"688bac91-aede-4c9f-a063-6469bb03db8c","Type":"ContainerStarted","Data":"9e5ab433ea863f5bc30e36adae846fad539ad96bc484ee381969429e80447673"}
Apr 02 13:58:44 crc kubenswrapper[4732]: I0402 13:58:44.582925 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"2fbe66fb-6f02-432d-8acf-50fec5339d96","Type":"ContainerStarted","Data":"fcdec3ff877b29c3524fd7e7bdc22095f6ee27fc3dc1e85954d3c491acf5b20f"}
Apr 02 13:58:44 crc kubenswrapper[4732]: I0402 13:58:44.582961 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0"
Apr 02 13:58:44 crc kubenswrapper[4732]: I0402 13:58:44.603261 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=19.295469033 podStartE2EDuration="28.603245659s" podCreationTimestamp="2026-04-02 13:58:16 +0000 UTC" firstStartedPulling="2026-04-02 13:58:29.470236103 +0000 UTC m=+1266.374643656" lastFinishedPulling="2026-04-02 13:58:38.778012729 +0000 UTC m=+1275.682420282" observedRunningTime="2026-04-02 13:58:44.598731578 +0000 UTC m=+1281.503139131" watchObservedRunningTime="2026-04-02 13:58:44.603245659 +0000 UTC m=+1281.507653212"
Apr 02 13:58:44 crc kubenswrapper[4732]: I0402 13:58:44.623028 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=20.344699436 podStartE2EDuration="29.62300192s" podCreationTimestamp="2026-04-02 13:58:15 +0000 UTC" firstStartedPulling="2026-04-02 13:58:29.470347456 +0000 UTC m=+1266.374755009" lastFinishedPulling="2026-04-02 13:58:38.74864994 +0000 UTC m=+1275.653057493" observedRunningTime="2026-04-02 13:58:44.620813212 +0000 UTC m=+1281.525220765" watchObservedRunningTime="2026-04-02 13:58:44.62300192 +0000 UTC m=+1281.527409483"
Apr 02 13:58:44 crc kubenswrapper[4732]: I0402 13:58:44.626945 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0"
Apr 02 13:58:44 crc kubenswrapper[4732]: I0402 13:58:44.898501 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-2ck6j"]
Apr 02 13:58:44 crc kubenswrapper[4732]: E0402 13:58:44.898844 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97c8b10a-f931-4b1d-967b-3c3eeac5ca4c" containerName="dnsmasq-dns"
Apr 02 13:58:44 crc kubenswrapper[4732]: I0402 13:58:44.898865 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="97c8b10a-f931-4b1d-967b-3c3eeac5ca4c" containerName="dnsmasq-dns"
Apr 02 13:58:44 crc kubenswrapper[4732]: E0402 13:58:44.898888 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97c8b10a-f931-4b1d-967b-3c3eeac5ca4c" containerName="init"
Apr 02 13:58:44 crc kubenswrapper[4732]: I0402 13:58:44.898896 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="97c8b10a-f931-4b1d-967b-3c3eeac5ca4c" containerName="init"
Apr 02 13:58:44 crc kubenswrapper[4732]: I0402 13:58:44.899093 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="97c8b10a-f931-4b1d-967b-3c3eeac5ca4c" containerName="dnsmasq-dns"
Apr 02 13:58:44 crc kubenswrapper[4732]: I0402 13:58:44.899957 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-2ck6j"
Apr 02 13:58:44 crc kubenswrapper[4732]: I0402 13:58:44.906337 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb"
Apr 02 13:58:44 crc kubenswrapper[4732]: I0402 13:58:44.922120 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-2ck6j"]
Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.001723 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-7swjs"]
Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.007596 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-7swjs"
Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.008307 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hhrv\" (UniqueName: \"kubernetes.io/projected/a3e7c7b6-f808-484a-9304-de2f0e619f56-kube-api-access-8hhrv\") pod \"dnsmasq-dns-5bf47b49b7-2ck6j\" (UID: \"a3e7c7b6-f808-484a-9304-de2f0e619f56\") " pod="openstack/dnsmasq-dns-5bf47b49b7-2ck6j"
Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.008405 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3e7c7b6-f808-484a-9304-de2f0e619f56-config\") pod \"dnsmasq-dns-5bf47b49b7-2ck6j\" (UID: \"a3e7c7b6-f808-484a-9304-de2f0e619f56\") " pod="openstack/dnsmasq-dns-5bf47b49b7-2ck6j"
Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.008428 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3e7c7b6-f808-484a-9304-de2f0e619f56-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-2ck6j\" (UID: \"a3e7c7b6-f808-484a-9304-de2f0e619f56\") " pod="openstack/dnsmasq-dns-5bf47b49b7-2ck6j"
Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.008510 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3e7c7b6-f808-484a-9304-de2f0e619f56-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-2ck6j\" (UID: \"a3e7c7b6-f808-484a-9304-de2f0e619f56\") " pod="openstack/dnsmasq-dns-5bf47b49b7-2ck6j"
Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.012653 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config"
Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.020470 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-7swjs"]
Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.110603 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/731d113e-365b-4d68-a0e9-402bb8a8e9b7-combined-ca-bundle\") pod \"ovn-controller-metrics-7swjs\" (UID: \"731d113e-365b-4d68-a0e9-402bb8a8e9b7\") " pod="openstack/ovn-controller-metrics-7swjs"
Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.110684 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hhrv\" (UniqueName: \"kubernetes.io/projected/a3e7c7b6-f808-484a-9304-de2f0e619f56-kube-api-access-8hhrv\") pod \"dnsmasq-dns-5bf47b49b7-2ck6j\" (UID: \"a3e7c7b6-f808-484a-9304-de2f0e619f56\") " pod="openstack/dnsmasq-dns-5bf47b49b7-2ck6j"
Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.110723 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3e7c7b6-f808-484a-9304-de2f0e619f56-config\") pod \"dnsmasq-dns-5bf47b49b7-2ck6j\" (UID: \"a3e7c7b6-f808-484a-9304-de2f0e619f56\") " pod="openstack/dnsmasq-dns-5bf47b49b7-2ck6j"
Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.110792 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3e7c7b6-f808-484a-9304-de2f0e619f56-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-2ck6j\" (UID: \"a3e7c7b6-f808-484a-9304-de2f0e619f56\") " pod="openstack/dnsmasq-dns-5bf47b49b7-2ck6j"
Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.110932 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3e7c7b6-f808-484a-9304-de2f0e619f56-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-2ck6j\" (UID: \"a3e7c7b6-f808-484a-9304-de2f0e619f56\") " pod="openstack/dnsmasq-dns-5bf47b49b7-2ck6j"
Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.110984 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh9p5\" (UniqueName: \"kubernetes.io/projected/731d113e-365b-4d68-a0e9-402bb8a8e9b7-kube-api-access-nh9p5\") pod \"ovn-controller-metrics-7swjs\" (UID: \"731d113e-365b-4d68-a0e9-402bb8a8e9b7\") " pod="openstack/ovn-controller-metrics-7swjs"
Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.111013 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/731d113e-365b-4d68-a0e9-402bb8a8e9b7-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-7swjs\" (UID: \"731d113e-365b-4d68-a0e9-402bb8a8e9b7\") " pod="openstack/ovn-controller-metrics-7swjs"
Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.111046 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/731d113e-365b-4d68-a0e9-402bb8a8e9b7-config\") pod \"ovn-controller-metrics-7swjs\" (UID: \"731d113e-365b-4d68-a0e9-402bb8a8e9b7\") " pod="openstack/ovn-controller-metrics-7swjs"
Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.111170 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/731d113e-365b-4d68-a0e9-402bb8a8e9b7-ovn-rundir\") pod \"ovn-controller-metrics-7swjs\" (UID: \"731d113e-365b-4d68-a0e9-402bb8a8e9b7\") " pod="openstack/ovn-controller-metrics-7swjs"
Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.111254 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/731d113e-365b-4d68-a0e9-402bb8a8e9b7-ovs-rundir\") pod \"ovn-controller-metrics-7swjs\" (UID: \"731d113e-365b-4d68-a0e9-402bb8a8e9b7\") " pod="openstack/ovn-controller-metrics-7swjs"
Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.111836 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3e7c7b6-f808-484a-9304-de2f0e619f56-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-2ck6j\" (UID: \"a3e7c7b6-f808-484a-9304-de2f0e619f56\") " pod="openstack/dnsmasq-dns-5bf47b49b7-2ck6j"
Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.111860 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3e7c7b6-f808-484a-9304-de2f0e619f56-config\") pod \"dnsmasq-dns-5bf47b49b7-2ck6j\" (UID: \"a3e7c7b6-f808-484a-9304-de2f0e619f56\") " pod="openstack/dnsmasq-dns-5bf47b49b7-2ck6j"
Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.111859 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3e7c7b6-f808-484a-9304-de2f0e619f56-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-2ck6j\" (UID: \"a3e7c7b6-f808-484a-9304-de2f0e619f56\") " pod="openstack/dnsmasq-dns-5bf47b49b7-2ck6j"
Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.130308 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hhrv\" (UniqueName: \"kubernetes.io/projected/a3e7c7b6-f808-484a-9304-de2f0e619f56-kube-api-access-8hhrv\") pod \"dnsmasq-dns-5bf47b49b7-2ck6j\" (UID: \"a3e7c7b6-f808-484a-9304-de2f0e619f56\") " pod="openstack/dnsmasq-dns-5bf47b49b7-2ck6j"
Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.199250 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.212709 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/731d113e-365b-4d68-a0e9-402bb8a8e9b7-ovs-rundir\") pod \"ovn-controller-metrics-7swjs\" (UID: \"731d113e-365b-4d68-a0e9-402bb8a8e9b7\") " pod="openstack/ovn-controller-metrics-7swjs"
Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.212824 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/731d113e-365b-4d68-a0e9-402bb8a8e9b7-combined-ca-bundle\") pod \"ovn-controller-metrics-7swjs\" (UID: \"731d113e-365b-4d68-a0e9-402bb8a8e9b7\") " pod="openstack/ovn-controller-metrics-7swjs"
Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.212901 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nh9p5\" (UniqueName: \"kubernetes.io/projected/731d113e-365b-4d68-a0e9-402bb8a8e9b7-kube-api-access-nh9p5\") pod \"ovn-controller-metrics-7swjs\" (UID: \"731d113e-365b-4d68-a0e9-402bb8a8e9b7\") " pod="openstack/ovn-controller-metrics-7swjs"
Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.212923 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/731d113e-365b-4d68-a0e9-402bb8a8e9b7-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-7swjs\" (UID: \"731d113e-365b-4d68-a0e9-402bb8a8e9b7\") " pod="openstack/ovn-controller-metrics-7swjs"
Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.212951 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/731d113e-365b-4d68-a0e9-402bb8a8e9b7-config\") pod \"ovn-controller-metrics-7swjs\" (UID: \"731d113e-365b-4d68-a0e9-402bb8a8e9b7\") " pod="openstack/ovn-controller-metrics-7swjs"
Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.212972 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/731d113e-365b-4d68-a0e9-402bb8a8e9b7-ovn-rundir\") pod \"ovn-controller-metrics-7swjs\" (UID: \"731d113e-365b-4d68-a0e9-402bb8a8e9b7\") " pod="openstack/ovn-controller-metrics-7swjs"
Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.213674 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/731d113e-365b-4d68-a0e9-402bb8a8e9b7-ovs-rundir\") pod \"ovn-controller-metrics-7swjs\" (UID: \"731d113e-365b-4d68-a0e9-402bb8a8e9b7\") " pod="openstack/ovn-controller-metrics-7swjs"
Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.215501 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/731d113e-365b-4d68-a0e9-402bb8a8e9b7-config\") pod \"ovn-controller-metrics-7swjs\" (UID: \"731d113e-365b-4d68-a0e9-402bb8a8e9b7\") " pod="openstack/ovn-controller-metrics-7swjs"
Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.216332 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/731d113e-365b-4d68-a0e9-402bb8a8e9b7-ovn-rundir\") pod \"ovn-controller-metrics-7swjs\" (UID: \"731d113e-365b-4d68-a0e9-402bb8a8e9b7\") " pod="openstack/ovn-controller-metrics-7swjs"
Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.218214 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-2ck6j"
Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.222199 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/731d113e-365b-4d68-a0e9-402bb8a8e9b7-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-7swjs\" (UID: \"731d113e-365b-4d68-a0e9-402bb8a8e9b7\") " pod="openstack/ovn-controller-metrics-7swjs"
Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.222338 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/731d113e-365b-4d68-a0e9-402bb8a8e9b7-combined-ca-bundle\") pod \"ovn-controller-metrics-7swjs\" (UID: \"731d113e-365b-4d68-a0e9-402bb8a8e9b7\") " pod="openstack/ovn-controller-metrics-7swjs"
Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.229317 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh9p5\" (UniqueName: \"kubernetes.io/projected/731d113e-365b-4d68-a0e9-402bb8a8e9b7-kube-api-access-nh9p5\") pod \"ovn-controller-metrics-7swjs\" (UID: \"731d113e-365b-4d68-a0e9-402bb8a8e9b7\") " pod="openstack/ovn-controller-metrics-7swjs"
Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.243880 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.323870 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-7swjs"
Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.328104 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-2ck6j"]
Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.352344 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-s2x8q"]
Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.363631 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-s2x8q"
Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.374363 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb"
Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.391006 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-s2x8q"]
Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.431937 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad33310c-134f-4e39-a05b-9077f23159fc-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-s2x8q\" (UID: \"ad33310c-134f-4e39-a05b-9077f23159fc\") " pod="openstack/dnsmasq-dns-8554648995-s2x8q"
Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.432232 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad33310c-134f-4e39-a05b-9077f23159fc-dns-svc\") pod \"dnsmasq-dns-8554648995-s2x8q\" (UID: \"ad33310c-134f-4e39-a05b-9077f23159fc\") " pod="openstack/dnsmasq-dns-8554648995-s2x8q"
Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.432259 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7bq6\" (UniqueName: \"kubernetes.io/projected/ad33310c-134f-4e39-a05b-9077f23159fc-kube-api-access-n7bq6\") pod \"dnsmasq-dns-8554648995-s2x8q\" (UID: \"ad33310c-134f-4e39-a05b-9077f23159fc\") " pod="openstack/dnsmasq-dns-8554648995-s2x8q"
Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.432289 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad33310c-134f-4e39-a05b-9077f23159fc-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-s2x8q\" (UID: \"ad33310c-134f-4e39-a05b-9077f23159fc\") " pod="openstack/dnsmasq-dns-8554648995-s2x8q"
Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.432504 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad33310c-134f-4e39-a05b-9077f23159fc-config\") pod \"dnsmasq-dns-8554648995-s2x8q\" (UID: \"ad33310c-134f-4e39-a05b-9077f23159fc\") " pod="openstack/dnsmasq-dns-8554648995-s2x8q"
Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.534015 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad33310c-134f-4e39-a05b-9077f23159fc-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-s2x8q\" (UID: \"ad33310c-134f-4e39-a05b-9077f23159fc\") " pod="openstack/dnsmasq-dns-8554648995-s2x8q"
Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.534061 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad33310c-134f-4e39-a05b-9077f23159fc-dns-svc\") pod \"dnsmasq-dns-8554648995-s2x8q\" (UID: \"ad33310c-134f-4e39-a05b-9077f23159fc\") " pod="openstack/dnsmasq-dns-8554648995-s2x8q"
Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.534087 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7bq6\" (UniqueName: \"kubernetes.io/projected/ad33310c-134f-4e39-a05b-9077f23159fc-kube-api-access-n7bq6\") pod
\"dnsmasq-dns-8554648995-s2x8q\" (UID: \"ad33310c-134f-4e39-a05b-9077f23159fc\") " pod="openstack/dnsmasq-dns-8554648995-s2x8q" Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.534120 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad33310c-134f-4e39-a05b-9077f23159fc-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-s2x8q\" (UID: \"ad33310c-134f-4e39-a05b-9077f23159fc\") " pod="openstack/dnsmasq-dns-8554648995-s2x8q" Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.534159 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad33310c-134f-4e39-a05b-9077f23159fc-config\") pod \"dnsmasq-dns-8554648995-s2x8q\" (UID: \"ad33310c-134f-4e39-a05b-9077f23159fc\") " pod="openstack/dnsmasq-dns-8554648995-s2x8q" Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.535131 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad33310c-134f-4e39-a05b-9077f23159fc-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-s2x8q\" (UID: \"ad33310c-134f-4e39-a05b-9077f23159fc\") " pod="openstack/dnsmasq-dns-8554648995-s2x8q" Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.536928 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad33310c-134f-4e39-a05b-9077f23159fc-dns-svc\") pod \"dnsmasq-dns-8554648995-s2x8q\" (UID: \"ad33310c-134f-4e39-a05b-9077f23159fc\") " pod="openstack/dnsmasq-dns-8554648995-s2x8q" Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.537292 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad33310c-134f-4e39-a05b-9077f23159fc-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-s2x8q\" (UID: \"ad33310c-134f-4e39-a05b-9077f23159fc\") " 
pod="openstack/dnsmasq-dns-8554648995-s2x8q" Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.538140 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad33310c-134f-4e39-a05b-9077f23159fc-config\") pod \"dnsmasq-dns-8554648995-s2x8q\" (UID: \"ad33310c-134f-4e39-a05b-9077f23159fc\") " pod="openstack/dnsmasq-dns-8554648995-s2x8q" Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.555200 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7bq6\" (UniqueName: \"kubernetes.io/projected/ad33310c-134f-4e39-a05b-9077f23159fc-kube-api-access-n7bq6\") pod \"dnsmasq-dns-8554648995-s2x8q\" (UID: \"ad33310c-134f-4e39-a05b-9077f23159fc\") " pod="openstack/dnsmasq-dns-8554648995-s2x8q" Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.592549 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.633734 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.688511 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-s2x8q" Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.724009 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-2ck6j"] Apr 02 13:58:45 crc kubenswrapper[4732]: W0402 13:58:45.730830 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3e7c7b6_f808_484a_9304_de2f0e619f56.slice/crio-afdb60b5d6bc896c7d6a242d061a23a60f5b4f2a7c7fab1c3f224c469a313cc3 WatchSource:0}: Error finding container afdb60b5d6bc896c7d6a242d061a23a60f5b4f2a7c7fab1c3f224c469a313cc3: Status 404 returned error can't find the container with id afdb60b5d6bc896c7d6a242d061a23a60f5b4f2a7c7fab1c3f224c469a313cc3 Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.828904 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.830942 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.835780 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-wbttj" Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.835813 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.835940 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.836055 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.855024 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.857918 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3da22737-6be3-4ffc-afff-b5d7fb20a283-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"3da22737-6be3-4ffc-afff-b5d7fb20a283\") " pod="openstack/ovn-northd-0" Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.858002 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3da22737-6be3-4ffc-afff-b5d7fb20a283-scripts\") pod \"ovn-northd-0\" (UID: \"3da22737-6be3-4ffc-afff-b5d7fb20a283\") " pod="openstack/ovn-northd-0" Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.858065 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/3da22737-6be3-4ffc-afff-b5d7fb20a283-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"3da22737-6be3-4ffc-afff-b5d7fb20a283\") " pod="openstack/ovn-northd-0" Apr 02 
13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.858102 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5brhg\" (UniqueName: \"kubernetes.io/projected/3da22737-6be3-4ffc-afff-b5d7fb20a283-kube-api-access-5brhg\") pod \"ovn-northd-0\" (UID: \"3da22737-6be3-4ffc-afff-b5d7fb20a283\") " pod="openstack/ovn-northd-0" Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.858121 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3da22737-6be3-4ffc-afff-b5d7fb20a283-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"3da22737-6be3-4ffc-afff-b5d7fb20a283\") " pod="openstack/ovn-northd-0" Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.858146 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3da22737-6be3-4ffc-afff-b5d7fb20a283-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"3da22737-6be3-4ffc-afff-b5d7fb20a283\") " pod="openstack/ovn-northd-0" Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.858172 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3da22737-6be3-4ffc-afff-b5d7fb20a283-config\") pod \"ovn-northd-0\" (UID: \"3da22737-6be3-4ffc-afff-b5d7fb20a283\") " pod="openstack/ovn-northd-0" Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.867106 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-7swjs"] Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.959370 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3da22737-6be3-4ffc-afff-b5d7fb20a283-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"3da22737-6be3-4ffc-afff-b5d7fb20a283\") " 
pod="openstack/ovn-northd-0" Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.959465 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3da22737-6be3-4ffc-afff-b5d7fb20a283-scripts\") pod \"ovn-northd-0\" (UID: \"3da22737-6be3-4ffc-afff-b5d7fb20a283\") " pod="openstack/ovn-northd-0" Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.959519 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/3da22737-6be3-4ffc-afff-b5d7fb20a283-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"3da22737-6be3-4ffc-afff-b5d7fb20a283\") " pod="openstack/ovn-northd-0" Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.959555 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5brhg\" (UniqueName: \"kubernetes.io/projected/3da22737-6be3-4ffc-afff-b5d7fb20a283-kube-api-access-5brhg\") pod \"ovn-northd-0\" (UID: \"3da22737-6be3-4ffc-afff-b5d7fb20a283\") " pod="openstack/ovn-northd-0" Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.959580 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3da22737-6be3-4ffc-afff-b5d7fb20a283-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"3da22737-6be3-4ffc-afff-b5d7fb20a283\") " pod="openstack/ovn-northd-0" Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.959601 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3da22737-6be3-4ffc-afff-b5d7fb20a283-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"3da22737-6be3-4ffc-afff-b5d7fb20a283\") " pod="openstack/ovn-northd-0" Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.959640 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/3da22737-6be3-4ffc-afff-b5d7fb20a283-config\") pod \"ovn-northd-0\" (UID: \"3da22737-6be3-4ffc-afff-b5d7fb20a283\") " pod="openstack/ovn-northd-0" Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.960496 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3da22737-6be3-4ffc-afff-b5d7fb20a283-config\") pod \"ovn-northd-0\" (UID: \"3da22737-6be3-4ffc-afff-b5d7fb20a283\") " pod="openstack/ovn-northd-0" Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.961141 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3da22737-6be3-4ffc-afff-b5d7fb20a283-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"3da22737-6be3-4ffc-afff-b5d7fb20a283\") " pod="openstack/ovn-northd-0" Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.961641 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3da22737-6be3-4ffc-afff-b5d7fb20a283-scripts\") pod \"ovn-northd-0\" (UID: \"3da22737-6be3-4ffc-afff-b5d7fb20a283\") " pod="openstack/ovn-northd-0" Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.970339 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3da22737-6be3-4ffc-afff-b5d7fb20a283-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"3da22737-6be3-4ffc-afff-b5d7fb20a283\") " pod="openstack/ovn-northd-0" Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.970372 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3da22737-6be3-4ffc-afff-b5d7fb20a283-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"3da22737-6be3-4ffc-afff-b5d7fb20a283\") " pod="openstack/ovn-northd-0" Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.983349 4732 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/3da22737-6be3-4ffc-afff-b5d7fb20a283-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"3da22737-6be3-4ffc-afff-b5d7fb20a283\") " pod="openstack/ovn-northd-0" Apr 02 13:58:45 crc kubenswrapper[4732]: I0402 13:58:45.991213 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5brhg\" (UniqueName: \"kubernetes.io/projected/3da22737-6be3-4ffc-afff-b5d7fb20a283-kube-api-access-5brhg\") pod \"ovn-northd-0\" (UID: \"3da22737-6be3-4ffc-afff-b5d7fb20a283\") " pod="openstack/ovn-northd-0" Apr 02 13:58:46 crc kubenswrapper[4732]: I0402 13:58:46.150006 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-s2x8q"] Apr 02 13:58:46 crc kubenswrapper[4732]: I0402 13:58:46.172270 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Apr 02 13:58:46 crc kubenswrapper[4732]: I0402 13:58:46.188116 4732 scope.go:117] "RemoveContainer" containerID="0e8e27dfabd47f779aa28ec5352a03bfe37b30c693763287ea923bfc71287f19" Apr 02 13:58:46 crc kubenswrapper[4732]: I0402 13:58:46.593684 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Apr 02 13:58:46 crc kubenswrapper[4732]: I0402 13:58:46.607577 4732 generic.go:334] "Generic (PLEG): container finished" podID="ad33310c-134f-4e39-a05b-9077f23159fc" containerID="f70edb85190d9731aba57c61eea26c4041c19134786fa53765d35ee7cca08add" exitCode=0 Apr 02 13:58:46 crc kubenswrapper[4732]: I0402 13:58:46.607688 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-s2x8q" event={"ID":"ad33310c-134f-4e39-a05b-9077f23159fc","Type":"ContainerDied","Data":"f70edb85190d9731aba57c61eea26c4041c19134786fa53765d35ee7cca08add"} Apr 02 13:58:46 crc kubenswrapper[4732]: I0402 13:58:46.607711 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-8554648995-s2x8q" event={"ID":"ad33310c-134f-4e39-a05b-9077f23159fc","Type":"ContainerStarted","Data":"fc762bad93d14b56b2a30ee9c96c4128ee82b914aca3688ae5af23dd84360630"} Apr 02 13:58:46 crc kubenswrapper[4732]: I0402 13:58:46.609823 4732 generic.go:334] "Generic (PLEG): container finished" podID="a3e7c7b6-f808-484a-9304-de2f0e619f56" containerID="01b474435a2ea6dc78521580bdbd0a0bf56d4eee922e4fe32d149306946d0280" exitCode=0 Apr 02 13:58:46 crc kubenswrapper[4732]: I0402 13:58:46.609858 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-2ck6j" event={"ID":"a3e7c7b6-f808-484a-9304-de2f0e619f56","Type":"ContainerDied","Data":"01b474435a2ea6dc78521580bdbd0a0bf56d4eee922e4fe32d149306946d0280"} Apr 02 13:58:46 crc kubenswrapper[4732]: I0402 13:58:46.609887 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-2ck6j" event={"ID":"a3e7c7b6-f808-484a-9304-de2f0e619f56","Type":"ContainerStarted","Data":"afdb60b5d6bc896c7d6a242d061a23a60f5b4f2a7c7fab1c3f224c469a313cc3"} Apr 02 13:58:46 crc kubenswrapper[4732]: I0402 13:58:46.613873 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-7swjs" event={"ID":"731d113e-365b-4d68-a0e9-402bb8a8e9b7","Type":"ContainerStarted","Data":"3752924762d5ec4fce20a1814a90bae5c21e261a52588377e416083aa466e190"} Apr 02 13:58:46 crc kubenswrapper[4732]: I0402 13:58:46.613952 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-7swjs" event={"ID":"731d113e-365b-4d68-a0e9-402bb8a8e9b7","Type":"ContainerStarted","Data":"68e557cf9ee3e988d0a94290e0cc221b8bdaf6cb3768cdd142cacfde6953d28a"} Apr 02 13:58:46 crc kubenswrapper[4732]: I0402 13:58:46.662883 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-7swjs" podStartSLOduration=2.662858955 podStartE2EDuration="2.662858955s" podCreationTimestamp="2026-04-02 13:58:44 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:58:46.654265664 +0000 UTC m=+1283.558673237" watchObservedRunningTime="2026-04-02 13:58:46.662858955 +0000 UTC m=+1283.567266508" Apr 02 13:58:46 crc kubenswrapper[4732]: I0402 13:58:46.666819 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Apr 02 13:58:46 crc kubenswrapper[4732]: I0402 13:58:46.666873 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Apr 02 13:58:46 crc kubenswrapper[4732]: I0402 13:58:46.882135 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-2ck6j" Apr 02 13:58:46 crc kubenswrapper[4732]: I0402 13:58:46.982224 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hhrv\" (UniqueName: \"kubernetes.io/projected/a3e7c7b6-f808-484a-9304-de2f0e619f56-kube-api-access-8hhrv\") pod \"a3e7c7b6-f808-484a-9304-de2f0e619f56\" (UID: \"a3e7c7b6-f808-484a-9304-de2f0e619f56\") " Apr 02 13:58:46 crc kubenswrapper[4732]: I0402 13:58:46.982321 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3e7c7b6-f808-484a-9304-de2f0e619f56-ovsdbserver-nb\") pod \"a3e7c7b6-f808-484a-9304-de2f0e619f56\" (UID: \"a3e7c7b6-f808-484a-9304-de2f0e619f56\") " Apr 02 13:58:46 crc kubenswrapper[4732]: I0402 13:58:46.982382 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3e7c7b6-f808-484a-9304-de2f0e619f56-config\") pod \"a3e7c7b6-f808-484a-9304-de2f0e619f56\" (UID: \"a3e7c7b6-f808-484a-9304-de2f0e619f56\") " Apr 02 13:58:46 crc kubenswrapper[4732]: I0402 13:58:46.982522 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3e7c7b6-f808-484a-9304-de2f0e619f56-dns-svc\") pod \"a3e7c7b6-f808-484a-9304-de2f0e619f56\" (UID: \"a3e7c7b6-f808-484a-9304-de2f0e619f56\") " Apr 02 13:58:46 crc kubenswrapper[4732]: I0402 13:58:46.990335 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3e7c7b6-f808-484a-9304-de2f0e619f56-kube-api-access-8hhrv" (OuterVolumeSpecName: "kube-api-access-8hhrv") pod "a3e7c7b6-f808-484a-9304-de2f0e619f56" (UID: "a3e7c7b6-f808-484a-9304-de2f0e619f56"). InnerVolumeSpecName "kube-api-access-8hhrv". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:58:47 crc kubenswrapper[4732]: I0402 13:58:47.002752 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3e7c7b6-f808-484a-9304-de2f0e619f56-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a3e7c7b6-f808-484a-9304-de2f0e619f56" (UID: "a3e7c7b6-f808-484a-9304-de2f0e619f56"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:58:47 crc kubenswrapper[4732]: I0402 13:58:47.003267 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3e7c7b6-f808-484a-9304-de2f0e619f56-config" (OuterVolumeSpecName: "config") pod "a3e7c7b6-f808-484a-9304-de2f0e619f56" (UID: "a3e7c7b6-f808-484a-9304-de2f0e619f56"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:58:47 crc kubenswrapper[4732]: I0402 13:58:47.003896 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3e7c7b6-f808-484a-9304-de2f0e619f56-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a3e7c7b6-f808-484a-9304-de2f0e619f56" (UID: "a3e7c7b6-f808-484a-9304-de2f0e619f56"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:58:47 crc kubenswrapper[4732]: I0402 13:58:47.084020 4732 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3e7c7b6-f808-484a-9304-de2f0e619f56-dns-svc\") on node \"crc\" DevicePath \"\"" Apr 02 13:58:47 crc kubenswrapper[4732]: I0402 13:58:47.084061 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hhrv\" (UniqueName: \"kubernetes.io/projected/a3e7c7b6-f808-484a-9304-de2f0e619f56-kube-api-access-8hhrv\") on node \"crc\" DevicePath \"\"" Apr 02 13:58:47 crc kubenswrapper[4732]: I0402 13:58:47.084077 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3e7c7b6-f808-484a-9304-de2f0e619f56-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Apr 02 13:58:47 crc kubenswrapper[4732]: I0402 13:58:47.084089 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3e7c7b6-f808-484a-9304-de2f0e619f56-config\") on node \"crc\" DevicePath \"\"" Apr 02 13:58:47 crc kubenswrapper[4732]: I0402 13:58:47.625914 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"3da22737-6be3-4ffc-afff-b5d7fb20a283","Type":"ContainerStarted","Data":"fe46e8574270edf9c8b02301d432f9b3c9df7925eff5dd16708e3b5061f20449"} Apr 02 13:58:47 crc kubenswrapper[4732]: I0402 13:58:47.629276 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-s2x8q" event={"ID":"ad33310c-134f-4e39-a05b-9077f23159fc","Type":"ContainerStarted","Data":"6b2c1cb8e13d6fc8ea87c8211820e6008a2e7bf3154abfbd32743459e580d826"} Apr 02 13:58:47 crc kubenswrapper[4732]: I0402 13:58:47.629762 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8554648995-s2x8q" Apr 02 13:58:47 crc kubenswrapper[4732]: I0402 13:58:47.638080 4732 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-2ck6j" event={"ID":"a3e7c7b6-f808-484a-9304-de2f0e619f56","Type":"ContainerDied","Data":"afdb60b5d6bc896c7d6a242d061a23a60f5b4f2a7c7fab1c3f224c469a313cc3"} Apr 02 13:58:47 crc kubenswrapper[4732]: I0402 13:58:47.638107 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-2ck6j" Apr 02 13:58:47 crc kubenswrapper[4732]: I0402 13:58:47.638140 4732 scope.go:117] "RemoveContainer" containerID="01b474435a2ea6dc78521580bdbd0a0bf56d4eee922e4fe32d149306946d0280" Apr 02 13:58:47 crc kubenswrapper[4732]: I0402 13:58:47.654754 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8554648995-s2x8q" podStartSLOduration=2.6547348790000003 podStartE2EDuration="2.654734879s" podCreationTimestamp="2026-04-02 13:58:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:58:47.645651144 +0000 UTC m=+1284.550058717" watchObservedRunningTime="2026-04-02 13:58:47.654734879 +0000 UTC m=+1284.559142432" Apr 02 13:58:47 crc kubenswrapper[4732]: I0402 13:58:47.747536 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-2ck6j"] Apr 02 13:58:47 crc kubenswrapper[4732]: I0402 13:58:47.757115 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-2ck6j"] Apr 02 13:58:48 crc kubenswrapper[4732]: I0402 13:58:48.140070 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Apr 02 13:58:48 crc kubenswrapper[4732]: I0402 13:58:48.140405 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Apr 02 13:58:48 crc kubenswrapper[4732]: I0402 13:58:48.649693 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" 
event={"ID":"3da22737-6be3-4ffc-afff-b5d7fb20a283","Type":"ContainerStarted","Data":"e56281a92d9be649958f8048179ea8cbbb2dd396dcd81bbb7990b3749a92c460"} Apr 02 13:58:48 crc kubenswrapper[4732]: I0402 13:58:48.649758 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"3da22737-6be3-4ffc-afff-b5d7fb20a283","Type":"ContainerStarted","Data":"402702042f94da0d4e569198523792fffacbb1cc031d71605dcfd4057e5cd01f"} Apr 02 13:58:48 crc kubenswrapper[4732]: I0402 13:58:48.651277 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Apr 02 13:58:48 crc kubenswrapper[4732]: I0402 13:58:48.669044 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.822973099 podStartE2EDuration="3.669026423s" podCreationTimestamp="2026-04-02 13:58:45 +0000 UTC" firstStartedPulling="2026-04-02 13:58:46.636779604 +0000 UTC m=+1283.541187167" lastFinishedPulling="2026-04-02 13:58:47.482832938 +0000 UTC m=+1284.387240491" observedRunningTime="2026-04-02 13:58:48.666893046 +0000 UTC m=+1285.571300609" watchObservedRunningTime="2026-04-02 13:58:48.669026423 +0000 UTC m=+1285.573433986" Apr 02 13:58:48 crc kubenswrapper[4732]: I0402 13:58:48.690391 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3e7c7b6-f808-484a-9304-de2f0e619f56" path="/var/lib/kubelet/pods/a3e7c7b6-f808-484a-9304-de2f0e619f56/volumes" Apr 02 13:58:48 crc kubenswrapper[4732]: I0402 13:58:48.737949 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Apr 02 13:58:48 crc kubenswrapper[4732]: I0402 13:58:48.807371 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Apr 02 13:58:50 crc kubenswrapper[4732]: I0402 13:58:50.475908 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-s2x8q"] Apr 02 13:58:50 crc 
kubenswrapper[4732]: I0402 13:58:50.477533 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8554648995-s2x8q" podUID="ad33310c-134f-4e39-a05b-9077f23159fc" containerName="dnsmasq-dns" containerID="cri-o://6b2c1cb8e13d6fc8ea87c8211820e6008a2e7bf3154abfbd32743459e580d826" gracePeriod=10
Apr 02 13:58:50 crc kubenswrapper[4732]: I0402 13:58:50.506329 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-6xf79"]
Apr 02 13:58:50 crc kubenswrapper[4732]: E0402 13:58:50.506837 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3e7c7b6-f808-484a-9304-de2f0e619f56" containerName="init"
Apr 02 13:58:50 crc kubenswrapper[4732]: I0402 13:58:50.506929 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3e7c7b6-f808-484a-9304-de2f0e619f56" containerName="init"
Apr 02 13:58:50 crc kubenswrapper[4732]: I0402 13:58:50.507147 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3e7c7b6-f808-484a-9304-de2f0e619f56" containerName="init"
Apr 02 13:58:50 crc kubenswrapper[4732]: I0402 13:58:50.508035 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-6xf79"
Apr 02 13:58:50 crc kubenswrapper[4732]: I0402 13:58:50.527860 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-6xf79"]
Apr 02 13:58:50 crc kubenswrapper[4732]: I0402 13:58:50.611585 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Apr 02 13:58:50 crc kubenswrapper[4732]: I0402 13:58:50.640238 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/56a9e53f-9667-48a5-8065-ad30fd550a7d-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-6xf79\" (UID: \"56a9e53f-9667-48a5-8065-ad30fd550a7d\") " pod="openstack/dnsmasq-dns-b8fbc5445-6xf79"
Apr 02 13:58:50 crc kubenswrapper[4732]: I0402 13:58:50.640324 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhx7s\" (UniqueName: \"kubernetes.io/projected/56a9e53f-9667-48a5-8065-ad30fd550a7d-kube-api-access-vhx7s\") pod \"dnsmasq-dns-b8fbc5445-6xf79\" (UID: \"56a9e53f-9667-48a5-8065-ad30fd550a7d\") " pod="openstack/dnsmasq-dns-b8fbc5445-6xf79"
Apr 02 13:58:50 crc kubenswrapper[4732]: I0402 13:58:50.640420 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/56a9e53f-9667-48a5-8065-ad30fd550a7d-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-6xf79\" (UID: \"56a9e53f-9667-48a5-8065-ad30fd550a7d\") " pod="openstack/dnsmasq-dns-b8fbc5445-6xf79"
Apr 02 13:58:50 crc kubenswrapper[4732]: I0402 13:58:50.640469 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56a9e53f-9667-48a5-8065-ad30fd550a7d-config\") pod \"dnsmasq-dns-b8fbc5445-6xf79\" (UID: \"56a9e53f-9667-48a5-8065-ad30fd550a7d\") " pod="openstack/dnsmasq-dns-b8fbc5445-6xf79"
Apr 02 13:58:50 crc kubenswrapper[4732]: I0402 13:58:50.640511 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56a9e53f-9667-48a5-8065-ad30fd550a7d-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-6xf79\" (UID: \"56a9e53f-9667-48a5-8065-ad30fd550a7d\") " pod="openstack/dnsmasq-dns-b8fbc5445-6xf79"
Apr 02 13:58:50 crc kubenswrapper[4732]: I0402 13:58:50.686404 4732 generic.go:334] "Generic (PLEG): container finished" podID="ad33310c-134f-4e39-a05b-9077f23159fc" containerID="6b2c1cb8e13d6fc8ea87c8211820e6008a2e7bf3154abfbd32743459e580d826" exitCode=0
Apr 02 13:58:50 crc kubenswrapper[4732]: I0402 13:58:50.698902 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-s2x8q" event={"ID":"ad33310c-134f-4e39-a05b-9077f23159fc","Type":"ContainerDied","Data":"6b2c1cb8e13d6fc8ea87c8211820e6008a2e7bf3154abfbd32743459e580d826"}
Apr 02 13:58:50 crc kubenswrapper[4732]: I0402 13:58:50.741887 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/56a9e53f-9667-48a5-8065-ad30fd550a7d-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-6xf79\" (UID: \"56a9e53f-9667-48a5-8065-ad30fd550a7d\") " pod="openstack/dnsmasq-dns-b8fbc5445-6xf79"
Apr 02 13:58:50 crc kubenswrapper[4732]: I0402 13:58:50.741948 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56a9e53f-9667-48a5-8065-ad30fd550a7d-config\") pod \"dnsmasq-dns-b8fbc5445-6xf79\" (UID: \"56a9e53f-9667-48a5-8065-ad30fd550a7d\") " pod="openstack/dnsmasq-dns-b8fbc5445-6xf79"
Apr 02 13:58:50 crc kubenswrapper[4732]: I0402 13:58:50.741989 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56a9e53f-9667-48a5-8065-ad30fd550a7d-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-6xf79\" (UID: \"56a9e53f-9667-48a5-8065-ad30fd550a7d\") " pod="openstack/dnsmasq-dns-b8fbc5445-6xf79"
Apr 02 13:58:50 crc kubenswrapper[4732]: I0402 13:58:50.742087 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/56a9e53f-9667-48a5-8065-ad30fd550a7d-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-6xf79\" (UID: \"56a9e53f-9667-48a5-8065-ad30fd550a7d\") " pod="openstack/dnsmasq-dns-b8fbc5445-6xf79"
Apr 02 13:58:50 crc kubenswrapper[4732]: I0402 13:58:50.742134 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhx7s\" (UniqueName: \"kubernetes.io/projected/56a9e53f-9667-48a5-8065-ad30fd550a7d-kube-api-access-vhx7s\") pod \"dnsmasq-dns-b8fbc5445-6xf79\" (UID: \"56a9e53f-9667-48a5-8065-ad30fd550a7d\") " pod="openstack/dnsmasq-dns-b8fbc5445-6xf79"
Apr 02 13:58:50 crc kubenswrapper[4732]: I0402 13:58:50.743094 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56a9e53f-9667-48a5-8065-ad30fd550a7d-config\") pod \"dnsmasq-dns-b8fbc5445-6xf79\" (UID: \"56a9e53f-9667-48a5-8065-ad30fd550a7d\") " pod="openstack/dnsmasq-dns-b8fbc5445-6xf79"
Apr 02 13:58:50 crc kubenswrapper[4732]: I0402 13:58:50.743161 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/56a9e53f-9667-48a5-8065-ad30fd550a7d-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-6xf79\" (UID: \"56a9e53f-9667-48a5-8065-ad30fd550a7d\") " pod="openstack/dnsmasq-dns-b8fbc5445-6xf79"
Apr 02 13:58:50 crc kubenswrapper[4732]: I0402 13:58:50.743308 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56a9e53f-9667-48a5-8065-ad30fd550a7d-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-6xf79\" (UID: \"56a9e53f-9667-48a5-8065-ad30fd550a7d\") " pod="openstack/dnsmasq-dns-b8fbc5445-6xf79"
Apr 02 13:58:50 crc kubenswrapper[4732]: I0402 13:58:50.743933 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/56a9e53f-9667-48a5-8065-ad30fd550a7d-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-6xf79\" (UID: \"56a9e53f-9667-48a5-8065-ad30fd550a7d\") " pod="openstack/dnsmasq-dns-b8fbc5445-6xf79"
Apr 02 13:58:50 crc kubenswrapper[4732]: I0402 13:58:50.759430 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0"
Apr 02 13:58:50 crc kubenswrapper[4732]: I0402 13:58:50.775081 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhx7s\" (UniqueName: \"kubernetes.io/projected/56a9e53f-9667-48a5-8065-ad30fd550a7d-kube-api-access-vhx7s\") pod \"dnsmasq-dns-b8fbc5445-6xf79\" (UID: \"56a9e53f-9667-48a5-8065-ad30fd550a7d\") " pod="openstack/dnsmasq-dns-b8fbc5445-6xf79"
Apr 02 13:58:51 crc kubenswrapper[4732]: I0402 13:58:50.843753 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-6xf79"
Apr 02 13:58:51 crc kubenswrapper[4732]: I0402 13:58:50.897277 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0"
Apr 02 13:58:51 crc kubenswrapper[4732]: I0402 13:58:50.997583 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-s2x8q"
Apr 02 13:58:51 crc kubenswrapper[4732]: I0402 13:58:51.046381 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad33310c-134f-4e39-a05b-9077f23159fc-dns-svc\") pod \"ad33310c-134f-4e39-a05b-9077f23159fc\" (UID: \"ad33310c-134f-4e39-a05b-9077f23159fc\") "
Apr 02 13:58:51 crc kubenswrapper[4732]: I0402 13:58:51.046476 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad33310c-134f-4e39-a05b-9077f23159fc-ovsdbserver-sb\") pod \"ad33310c-134f-4e39-a05b-9077f23159fc\" (UID: \"ad33310c-134f-4e39-a05b-9077f23159fc\") "
Apr 02 13:58:51 crc kubenswrapper[4732]: I0402 13:58:51.046574 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad33310c-134f-4e39-a05b-9077f23159fc-ovsdbserver-nb\") pod \"ad33310c-134f-4e39-a05b-9077f23159fc\" (UID: \"ad33310c-134f-4e39-a05b-9077f23159fc\") "
Apr 02 13:58:51 crc kubenswrapper[4732]: I0402 13:58:51.046602 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7bq6\" (UniqueName: \"kubernetes.io/projected/ad33310c-134f-4e39-a05b-9077f23159fc-kube-api-access-n7bq6\") pod \"ad33310c-134f-4e39-a05b-9077f23159fc\" (UID: \"ad33310c-134f-4e39-a05b-9077f23159fc\") "
Apr 02 13:58:51 crc kubenswrapper[4732]: I0402 13:58:51.046645 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad33310c-134f-4e39-a05b-9077f23159fc-config\") pod \"ad33310c-134f-4e39-a05b-9077f23159fc\" (UID: \"ad33310c-134f-4e39-a05b-9077f23159fc\") "
Apr 02 13:58:51 crc kubenswrapper[4732]: I0402 13:58:51.051640 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad33310c-134f-4e39-a05b-9077f23159fc-kube-api-access-n7bq6" (OuterVolumeSpecName: "kube-api-access-n7bq6") pod "ad33310c-134f-4e39-a05b-9077f23159fc" (UID: "ad33310c-134f-4e39-a05b-9077f23159fc"). InnerVolumeSpecName "kube-api-access-n7bq6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 13:58:51 crc kubenswrapper[4732]: I0402 13:58:51.080420 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad33310c-134f-4e39-a05b-9077f23159fc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ad33310c-134f-4e39-a05b-9077f23159fc" (UID: "ad33310c-134f-4e39-a05b-9077f23159fc"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 13:58:51 crc kubenswrapper[4732]: I0402 13:58:51.080697 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad33310c-134f-4e39-a05b-9077f23159fc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ad33310c-134f-4e39-a05b-9077f23159fc" (UID: "ad33310c-134f-4e39-a05b-9077f23159fc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 13:58:51 crc kubenswrapper[4732]: I0402 13:58:51.084272 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad33310c-134f-4e39-a05b-9077f23159fc-config" (OuterVolumeSpecName: "config") pod "ad33310c-134f-4e39-a05b-9077f23159fc" (UID: "ad33310c-134f-4e39-a05b-9077f23159fc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 13:58:51 crc kubenswrapper[4732]: I0402 13:58:51.085022 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad33310c-134f-4e39-a05b-9077f23159fc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ad33310c-134f-4e39-a05b-9077f23159fc" (UID: "ad33310c-134f-4e39-a05b-9077f23159fc"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 13:58:51 crc kubenswrapper[4732]: I0402 13:58:51.148444 4732 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad33310c-134f-4e39-a05b-9077f23159fc-dns-svc\") on node \"crc\" DevicePath \"\""
Apr 02 13:58:51 crc kubenswrapper[4732]: I0402 13:58:51.148476 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad33310c-134f-4e39-a05b-9077f23159fc-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Apr 02 13:58:51 crc kubenswrapper[4732]: I0402 13:58:51.148487 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad33310c-134f-4e39-a05b-9077f23159fc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Apr 02 13:58:51 crc kubenswrapper[4732]: I0402 13:58:51.148499 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7bq6\" (UniqueName: \"kubernetes.io/projected/ad33310c-134f-4e39-a05b-9077f23159fc-kube-api-access-n7bq6\") on node \"crc\" DevicePath \"\""
Apr 02 13:58:51 crc kubenswrapper[4732]: I0402 13:58:51.148507 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad33310c-134f-4e39-a05b-9077f23159fc-config\") on node \"crc\" DevicePath \"\""
Apr 02 13:58:51 crc kubenswrapper[4732]: I0402 13:58:51.696998 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-s2x8q" event={"ID":"ad33310c-134f-4e39-a05b-9077f23159fc","Type":"ContainerDied","Data":"fc762bad93d14b56b2a30ee9c96c4128ee82b914aca3688ae5af23dd84360630"}
Apr 02 13:58:51 crc kubenswrapper[4732]: I0402 13:58:51.697041 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-s2x8q"
Apr 02 13:58:51 crc kubenswrapper[4732]: I0402 13:58:51.697553 4732 scope.go:117] "RemoveContainer" containerID="6b2c1cb8e13d6fc8ea87c8211820e6008a2e7bf3154abfbd32743459e580d826"
Apr 02 13:58:51 crc kubenswrapper[4732]: I0402 13:58:51.721287 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"]
Apr 02 13:58:51 crc kubenswrapper[4732]: E0402 13:58:51.721647 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad33310c-134f-4e39-a05b-9077f23159fc" containerName="init"
Apr 02 13:58:51 crc kubenswrapper[4732]: I0402 13:58:51.721659 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad33310c-134f-4e39-a05b-9077f23159fc" containerName="init"
Apr 02 13:58:51 crc kubenswrapper[4732]: E0402 13:58:51.721687 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad33310c-134f-4e39-a05b-9077f23159fc" containerName="dnsmasq-dns"
Apr 02 13:58:51 crc kubenswrapper[4732]: I0402 13:58:51.721696 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad33310c-134f-4e39-a05b-9077f23159fc" containerName="dnsmasq-dns"
Apr 02 13:58:51 crc kubenswrapper[4732]: I0402 13:58:51.721847 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad33310c-134f-4e39-a05b-9077f23159fc" containerName="dnsmasq-dns"
Apr 02 13:58:51 crc kubenswrapper[4732]: I0402 13:58:51.727648 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Apr 02 13:58:51 crc kubenswrapper[4732]: I0402 13:58:51.749822 4732 scope.go:117] "RemoveContainer" containerID="f70edb85190d9731aba57c61eea26c4041c19134786fa53765d35ee7cca08add"
Apr 02 13:58:51 crc kubenswrapper[4732]: I0402 13:58:51.753330 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data"
Apr 02 13:58:51 crc kubenswrapper[4732]: I0402 13:58:51.753532 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf"
Apr 02 13:58:51 crc kubenswrapper[4732]: I0402 13:58:51.753699 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-8xv4h"
Apr 02 13:58:51 crc kubenswrapper[4732]: I0402 13:58:51.753742 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files"
Apr 02 13:58:51 crc kubenswrapper[4732]: I0402 13:58:51.773254 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Apr 02 13:58:51 crc kubenswrapper[4732]: I0402 13:58:51.831950 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-s2x8q"]
Apr 02 13:58:51 crc kubenswrapper[4732]: I0402 13:58:51.841595 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-s2x8q"]
Apr 02 13:58:51 crc kubenswrapper[4732]: I0402 13:58:51.864651 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/529a332e-d2c3-49c5-86d5-e672811d00cd-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"529a332e-d2c3-49c5-86d5-e672811d00cd\") " pod="openstack/swift-storage-0"
Apr 02 13:58:51 crc kubenswrapper[4732]: I0402 13:58:51.864699 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/529a332e-d2c3-49c5-86d5-e672811d00cd-etc-swift\") pod \"swift-storage-0\" (UID: \"529a332e-d2c3-49c5-86d5-e672811d00cd\") " pod="openstack/swift-storage-0"
Apr 02 13:58:51 crc kubenswrapper[4732]: I0402 13:58:51.864791 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"529a332e-d2c3-49c5-86d5-e672811d00cd\") " pod="openstack/swift-storage-0"
Apr 02 13:58:51 crc kubenswrapper[4732]: I0402 13:58:51.864830 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/529a332e-d2c3-49c5-86d5-e672811d00cd-cache\") pod \"swift-storage-0\" (UID: \"529a332e-d2c3-49c5-86d5-e672811d00cd\") " pod="openstack/swift-storage-0"
Apr 02 13:58:51 crc kubenswrapper[4732]: I0402 13:58:51.864860 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tr49\" (UniqueName: \"kubernetes.io/projected/529a332e-d2c3-49c5-86d5-e672811d00cd-kube-api-access-8tr49\") pod \"swift-storage-0\" (UID: \"529a332e-d2c3-49c5-86d5-e672811d00cd\") " pod="openstack/swift-storage-0"
Apr 02 13:58:51 crc kubenswrapper[4732]: I0402 13:58:51.864903 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/529a332e-d2c3-49c5-86d5-e672811d00cd-lock\") pod \"swift-storage-0\" (UID: \"529a332e-d2c3-49c5-86d5-e672811d00cd\") " pod="openstack/swift-storage-0"
Apr 02 13:58:51 crc kubenswrapper[4732]: I0402 13:58:51.966496 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/529a332e-d2c3-49c5-86d5-e672811d00cd-lock\") pod \"swift-storage-0\" (UID: \"529a332e-d2c3-49c5-86d5-e672811d00cd\") " pod="openstack/swift-storage-0"
Apr 02 13:58:51 crc kubenswrapper[4732]: I0402 13:58:51.966604 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/529a332e-d2c3-49c5-86d5-e672811d00cd-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"529a332e-d2c3-49c5-86d5-e672811d00cd\") " pod="openstack/swift-storage-0"
Apr 02 13:58:51 crc kubenswrapper[4732]: I0402 13:58:51.966646 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/529a332e-d2c3-49c5-86d5-e672811d00cd-etc-swift\") pod \"swift-storage-0\" (UID: \"529a332e-d2c3-49c5-86d5-e672811d00cd\") " pod="openstack/swift-storage-0"
Apr 02 13:58:51 crc kubenswrapper[4732]: I0402 13:58:51.966715 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"529a332e-d2c3-49c5-86d5-e672811d00cd\") " pod="openstack/swift-storage-0"
Apr 02 13:58:51 crc kubenswrapper[4732]: I0402 13:58:51.966752 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/529a332e-d2c3-49c5-86d5-e672811d00cd-cache\") pod \"swift-storage-0\" (UID: \"529a332e-d2c3-49c5-86d5-e672811d00cd\") " pod="openstack/swift-storage-0"
Apr 02 13:58:51 crc kubenswrapper[4732]: I0402 13:58:51.966783 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tr49\" (UniqueName: \"kubernetes.io/projected/529a332e-d2c3-49c5-86d5-e672811d00cd-kube-api-access-8tr49\") pod \"swift-storage-0\" (UID: \"529a332e-d2c3-49c5-86d5-e672811d00cd\") " pod="openstack/swift-storage-0"
Apr 02 13:58:51 crc kubenswrapper[4732]: I0402 13:58:51.967026 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/529a332e-d2c3-49c5-86d5-e672811d00cd-lock\") pod \"swift-storage-0\" (UID: \"529a332e-d2c3-49c5-86d5-e672811d00cd\") " pod="openstack/swift-storage-0"
Apr 02 13:58:51 crc kubenswrapper[4732]: I0402 13:58:51.967053 4732 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"529a332e-d2c3-49c5-86d5-e672811d00cd\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/swift-storage-0"
Apr 02 13:58:51 crc kubenswrapper[4732]: E0402 13:58:51.967273 4732 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Apr 02 13:58:51 crc kubenswrapper[4732]: E0402 13:58:51.967296 4732 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Apr 02 13:58:51 crc kubenswrapper[4732]: I0402 13:58:51.967361 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/529a332e-d2c3-49c5-86d5-e672811d00cd-cache\") pod \"swift-storage-0\" (UID: \"529a332e-d2c3-49c5-86d5-e672811d00cd\") " pod="openstack/swift-storage-0"
Apr 02 13:58:51 crc kubenswrapper[4732]: E0402 13:58:51.967367 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/529a332e-d2c3-49c5-86d5-e672811d00cd-etc-swift podName:529a332e-d2c3-49c5-86d5-e672811d00cd nodeName:}" failed. No retries permitted until 2026-04-02 13:58:52.467348807 +0000 UTC m=+1289.371756360 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/529a332e-d2c3-49c5-86d5-e672811d00cd-etc-swift") pod "swift-storage-0" (UID: "529a332e-d2c3-49c5-86d5-e672811d00cd") : configmap "swift-ring-files" not found
Apr 02 13:58:51 crc kubenswrapper[4732]: I0402 13:58:51.972648 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/529a332e-d2c3-49c5-86d5-e672811d00cd-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"529a332e-d2c3-49c5-86d5-e672811d00cd\") " pod="openstack/swift-storage-0"
Apr 02 13:58:51 crc kubenswrapper[4732]: I0402 13:58:51.984303 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tr49\" (UniqueName: \"kubernetes.io/projected/529a332e-d2c3-49c5-86d5-e672811d00cd-kube-api-access-8tr49\") pod \"swift-storage-0\" (UID: \"529a332e-d2c3-49c5-86d5-e672811d00cd\") " pod="openstack/swift-storage-0"
Apr 02 13:58:51 crc kubenswrapper[4732]: I0402 13:58:51.990568 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"529a332e-d2c3-49c5-86d5-e672811d00cd\") " pod="openstack/swift-storage-0"
Apr 02 13:58:52 crc kubenswrapper[4732]: I0402 13:58:52.074087 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-6xf79"]
Apr 02 13:58:52 crc kubenswrapper[4732]: W0402 13:58:52.076813 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod56a9e53f_9667_48a5_8065_ad30fd550a7d.slice/crio-3813acbe76f48b2cb59927fb517525777a650f410f579e46c170142412be33c7 WatchSource:0}: Error finding container 3813acbe76f48b2cb59927fb517525777a650f410f579e46c170142412be33c7: Status 404 returned error can't find the container with id 3813acbe76f48b2cb59927fb517525777a650f410f579e46c170142412be33c7
Apr 02 13:58:52 crc kubenswrapper[4732]: I0402 13:58:52.476047 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/529a332e-d2c3-49c5-86d5-e672811d00cd-etc-swift\") pod \"swift-storage-0\" (UID: \"529a332e-d2c3-49c5-86d5-e672811d00cd\") " pod="openstack/swift-storage-0"
Apr 02 13:58:52 crc kubenswrapper[4732]: E0402 13:58:52.476342 4732 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Apr 02 13:58:52 crc kubenswrapper[4732]: E0402 13:58:52.476371 4732 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Apr 02 13:58:52 crc kubenswrapper[4732]: E0402 13:58:52.476426 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/529a332e-d2c3-49c5-86d5-e672811d00cd-etc-swift podName:529a332e-d2c3-49c5-86d5-e672811d00cd nodeName:}" failed. No retries permitted until 2026-04-02 13:58:53.476406771 +0000 UTC m=+1290.380814324 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/529a332e-d2c3-49c5-86d5-e672811d00cd-etc-swift") pod "swift-storage-0" (UID: "529a332e-d2c3-49c5-86d5-e672811d00cd") : configmap "swift-ring-files" not found
Apr 02 13:58:52 crc kubenswrapper[4732]: I0402 13:58:52.694003 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad33310c-134f-4e39-a05b-9077f23159fc" path="/var/lib/kubelet/pods/ad33310c-134f-4e39-a05b-9077f23159fc/volumes"
Apr 02 13:58:52 crc kubenswrapper[4732]: I0402 13:58:52.707936 4732 generic.go:334] "Generic (PLEG): container finished" podID="56a9e53f-9667-48a5-8065-ad30fd550a7d" containerID="26ff11c0f727ee2880bda54c6c49e3983795e6a07857649f772b06a7d389756e" exitCode=0
Apr 02 13:58:52 crc kubenswrapper[4732]: I0402 13:58:52.707985 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-6xf79" event={"ID":"56a9e53f-9667-48a5-8065-ad30fd550a7d","Type":"ContainerDied","Data":"26ff11c0f727ee2880bda54c6c49e3983795e6a07857649f772b06a7d389756e"}
Apr 02 13:58:52 crc kubenswrapper[4732]: I0402 13:58:52.708011 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-6xf79" event={"ID":"56a9e53f-9667-48a5-8065-ad30fd550a7d","Type":"ContainerStarted","Data":"3813acbe76f48b2cb59927fb517525777a650f410f579e46c170142412be33c7"}
Apr 02 13:58:53 crc kubenswrapper[4732]: I0402 13:58:53.492431 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/529a332e-d2c3-49c5-86d5-e672811d00cd-etc-swift\") pod \"swift-storage-0\" (UID: \"529a332e-d2c3-49c5-86d5-e672811d00cd\") " pod="openstack/swift-storage-0"
Apr 02 13:58:53 crc kubenswrapper[4732]: E0402 13:58:53.492721 4732 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Apr 02 13:58:53 crc kubenswrapper[4732]: E0402 13:58:53.492921 4732 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Apr 02 13:58:53 crc kubenswrapper[4732]: E0402 13:58:53.492981 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/529a332e-d2c3-49c5-86d5-e672811d00cd-etc-swift podName:529a332e-d2c3-49c5-86d5-e672811d00cd nodeName:}" failed. No retries permitted until 2026-04-02 13:58:55.492966328 +0000 UTC m=+1292.397373881 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/529a332e-d2c3-49c5-86d5-e672811d00cd-etc-swift") pod "swift-storage-0" (UID: "529a332e-d2c3-49c5-86d5-e672811d00cd") : configmap "swift-ring-files" not found
Apr 02 13:58:53 crc kubenswrapper[4732]: I0402 13:58:53.684749 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-8a7f-account-create-update-rcjkk"]
Apr 02 13:58:53 crc kubenswrapper[4732]: I0402 13:58:53.686014 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-8a7f-account-create-update-rcjkk"
Apr 02 13:58:53 crc kubenswrapper[4732]: I0402 13:58:53.688857 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret"
Apr 02 13:58:53 crc kubenswrapper[4732]: I0402 13:58:53.692239 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-8a7f-account-create-update-rcjkk"]
Apr 02 13:58:53 crc kubenswrapper[4732]: I0402 13:58:53.717430 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-6xf79" event={"ID":"56a9e53f-9667-48a5-8065-ad30fd550a7d","Type":"ContainerStarted","Data":"8e2522f8812673996dc5982ccee1bd3544e320ed8d8a1a5e82ab36b4111f0d61"}
Apr 02 13:58:53 crc kubenswrapper[4732]: I0402 13:58:53.717590 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-6xf79"
Apr 02 13:58:53 crc kubenswrapper[4732]: I0402 13:58:53.734550 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-9sdcc"]
Apr 02 13:58:53 crc kubenswrapper[4732]: I0402 13:58:53.735566 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-9sdcc"
Apr 02 13:58:53 crc kubenswrapper[4732]: I0402 13:58:53.751672 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-9sdcc"]
Apr 02 13:58:53 crc kubenswrapper[4732]: I0402 13:58:53.754259 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-6xf79" podStartSLOduration=3.754244871 podStartE2EDuration="3.754244871s" podCreationTimestamp="2026-04-02 13:58:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:58:53.735295062 +0000 UTC m=+1290.639702635" watchObservedRunningTime="2026-04-02 13:58:53.754244871 +0000 UTC m=+1290.658652424"
Apr 02 13:58:53 crc kubenswrapper[4732]: I0402 13:58:53.797636 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b57899ef-a548-4a0d-a677-529a876d7d68-operator-scripts\") pod \"glance-8a7f-account-create-update-rcjkk\" (UID: \"b57899ef-a548-4a0d-a677-529a876d7d68\") " pod="openstack/glance-8a7f-account-create-update-rcjkk"
Apr 02 13:58:53 crc kubenswrapper[4732]: I0402 13:58:53.797710 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9nfw\" (UniqueName: \"kubernetes.io/projected/b57899ef-a548-4a0d-a677-529a876d7d68-kube-api-access-c9nfw\") pod \"glance-8a7f-account-create-update-rcjkk\" (UID: \"b57899ef-a548-4a0d-a677-529a876d7d68\") " pod="openstack/glance-8a7f-account-create-update-rcjkk"
Apr 02 13:58:53 crc kubenswrapper[4732]: I0402 13:58:53.898741 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b57899ef-a548-4a0d-a677-529a876d7d68-operator-scripts\") pod \"glance-8a7f-account-create-update-rcjkk\" (UID: \"b57899ef-a548-4a0d-a677-529a876d7d68\") " pod="openstack/glance-8a7f-account-create-update-rcjkk"
Apr 02 13:58:53 crc kubenswrapper[4732]: I0402 13:58:53.898801 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9nfw\" (UniqueName: \"kubernetes.io/projected/b57899ef-a548-4a0d-a677-529a876d7d68-kube-api-access-c9nfw\") pod \"glance-8a7f-account-create-update-rcjkk\" (UID: \"b57899ef-a548-4a0d-a677-529a876d7d68\") " pod="openstack/glance-8a7f-account-create-update-rcjkk"
Apr 02 13:58:53 crc kubenswrapper[4732]: I0402 13:58:53.898849 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36a80dbf-30b0-40f2-b37f-01912226bb43-operator-scripts\") pod \"glance-db-create-9sdcc\" (UID: \"36a80dbf-30b0-40f2-b37f-01912226bb43\") " pod="openstack/glance-db-create-9sdcc"
Apr 02 13:58:53 crc kubenswrapper[4732]: I0402 13:58:53.898879 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26v4p\" (UniqueName: \"kubernetes.io/projected/36a80dbf-30b0-40f2-b37f-01912226bb43-kube-api-access-26v4p\") pod \"glance-db-create-9sdcc\" (UID: \"36a80dbf-30b0-40f2-b37f-01912226bb43\") " pod="openstack/glance-db-create-9sdcc"
Apr 02 13:58:53 crc kubenswrapper[4732]: I0402 13:58:53.899519 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b57899ef-a548-4a0d-a677-529a876d7d68-operator-scripts\") pod \"glance-8a7f-account-create-update-rcjkk\" (UID: \"b57899ef-a548-4a0d-a677-529a876d7d68\") " pod="openstack/glance-8a7f-account-create-update-rcjkk"
Apr 02 13:58:53 crc kubenswrapper[4732]: I0402 13:58:53.921003 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9nfw\" (UniqueName: \"kubernetes.io/projected/b57899ef-a548-4a0d-a677-529a876d7d68-kube-api-access-c9nfw\") pod \"glance-8a7f-account-create-update-rcjkk\" (UID: \"b57899ef-a548-4a0d-a677-529a876d7d68\") " pod="openstack/glance-8a7f-account-create-update-rcjkk"
Apr 02 13:58:54 crc kubenswrapper[4732]: I0402 13:58:54.000097 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36a80dbf-30b0-40f2-b37f-01912226bb43-operator-scripts\") pod \"glance-db-create-9sdcc\" (UID: \"36a80dbf-30b0-40f2-b37f-01912226bb43\") " pod="openstack/glance-db-create-9sdcc"
Apr 02 13:58:54 crc kubenswrapper[4732]: I0402 13:58:54.000192 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26v4p\" (UniqueName: \"kubernetes.io/projected/36a80dbf-30b0-40f2-b37f-01912226bb43-kube-api-access-26v4p\") pod \"glance-db-create-9sdcc\" (UID: \"36a80dbf-30b0-40f2-b37f-01912226bb43\") " pod="openstack/glance-db-create-9sdcc"
Apr 02 13:58:54 crc kubenswrapper[4732]: I0402 13:58:54.000914 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36a80dbf-30b0-40f2-b37f-01912226bb43-operator-scripts\") pod \"glance-db-create-9sdcc\" (UID: \"36a80dbf-30b0-40f2-b37f-01912226bb43\") " pod="openstack/glance-db-create-9sdcc"
Apr 02 13:58:54 crc kubenswrapper[4732]: I0402 13:58:54.009321 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-8a7f-account-create-update-rcjkk"
Apr 02 13:58:54 crc kubenswrapper[4732]: I0402 13:58:54.034235 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26v4p\" (UniqueName: \"kubernetes.io/projected/36a80dbf-30b0-40f2-b37f-01912226bb43-kube-api-access-26v4p\") pod \"glance-db-create-9sdcc\" (UID: \"36a80dbf-30b0-40f2-b37f-01912226bb43\") " pod="openstack/glance-db-create-9sdcc"
Apr 02 13:58:54 crc kubenswrapper[4732]: I0402 13:58:54.056103 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-9sdcc"
Apr 02 13:58:54 crc kubenswrapper[4732]: I0402 13:58:54.588836 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-8a7f-account-create-update-rcjkk"]
Apr 02 13:58:54 crc kubenswrapper[4732]: W0402 13:58:54.591799 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb57899ef_a548_4a0d_a677_529a876d7d68.slice/crio-b6688f0651f02c988c6cca5cedde77a74b53e76914edf37f2896fb2f41b91e64 WatchSource:0}: Error finding container b6688f0651f02c988c6cca5cedde77a74b53e76914edf37f2896fb2f41b91e64: Status 404 returned error can't find the container with id b6688f0651f02c988c6cca5cedde77a74b53e76914edf37f2896fb2f41b91e64
Apr 02 13:58:54 crc kubenswrapper[4732]: I0402 13:58:54.671876 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-9sdcc"]
Apr 02 13:58:54 crc kubenswrapper[4732]: W0402 13:58:54.676344 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36a80dbf_30b0_40f2_b37f_01912226bb43.slice/crio-19a1d81946c81d4faace8519304f5879588c1572114c02ba3d642200ad0d010d WatchSource:0}: Error finding container 19a1d81946c81d4faace8519304f5879588c1572114c02ba3d642200ad0d010d: Status 404 returned error can't find the container with id 19a1d81946c81d4faace8519304f5879588c1572114c02ba3d642200ad0d010d
Apr 02 13:58:54 crc kubenswrapper[4732]: I0402 13:58:54.736778 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8a7f-account-create-update-rcjkk" event={"ID":"b57899ef-a548-4a0d-a677-529a876d7d68","Type":"ContainerStarted","Data":"b6688f0651f02c988c6cca5cedde77a74b53e76914edf37f2896fb2f41b91e64"}
Apr 02 13:58:54 crc kubenswrapper[4732]: I0402 13:58:54.737961 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-9sdcc" event={"ID":"36a80dbf-30b0-40f2-b37f-01912226bb43","Type":"ContainerStarted","Data":"19a1d81946c81d4faace8519304f5879588c1572114c02ba3d642200ad0d010d"}
Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.351158 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-77mrb"]
Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.352415 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-77mrb"
Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.355328 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.362767 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-77mrb"]
Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.438497 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59006d63-1671-4062-92b6-8d7f5e36c778-operator-scripts\") pod \"root-account-create-update-77mrb\" (UID: \"59006d63-1671-4062-92b6-8d7f5e36c778\") " pod="openstack/root-account-create-update-77mrb"
Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.438595 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume
\"kube-api-access-zvxpm\" (UniqueName: \"kubernetes.io/projected/59006d63-1671-4062-92b6-8d7f5e36c778-kube-api-access-zvxpm\") pod \"root-account-create-update-77mrb\" (UID: \"59006d63-1671-4062-92b6-8d7f5e36c778\") " pod="openstack/root-account-create-update-77mrb" Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.540175 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59006d63-1671-4062-92b6-8d7f5e36c778-operator-scripts\") pod \"root-account-create-update-77mrb\" (UID: \"59006d63-1671-4062-92b6-8d7f5e36c778\") " pod="openstack/root-account-create-update-77mrb" Apr 02 13:58:55 crc kubenswrapper[4732]: E0402 13:58:55.540538 4732 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Apr 02 13:58:55 crc kubenswrapper[4732]: E0402 13:58:55.540563 4732 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Apr 02 13:58:55 crc kubenswrapper[4732]: E0402 13:58:55.540637 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/529a332e-d2c3-49c5-86d5-e672811d00cd-etc-swift podName:529a332e-d2c3-49c5-86d5-e672811d00cd nodeName:}" failed. No retries permitted until 2026-04-02 13:58:59.540601911 +0000 UTC m=+1296.445009464 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/529a332e-d2c3-49c5-86d5-e672811d00cd-etc-swift") pod "swift-storage-0" (UID: "529a332e-d2c3-49c5-86d5-e672811d00cd") : configmap "swift-ring-files" not found Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.540550 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/529a332e-d2c3-49c5-86d5-e672811d00cd-etc-swift\") pod \"swift-storage-0\" (UID: \"529a332e-d2c3-49c5-86d5-e672811d00cd\") " pod="openstack/swift-storage-0" Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.540896 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59006d63-1671-4062-92b6-8d7f5e36c778-operator-scripts\") pod \"root-account-create-update-77mrb\" (UID: \"59006d63-1671-4062-92b6-8d7f5e36c778\") " pod="openstack/root-account-create-update-77mrb" Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.541003 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvxpm\" (UniqueName: \"kubernetes.io/projected/59006d63-1671-4062-92b6-8d7f5e36c778-kube-api-access-zvxpm\") pod \"root-account-create-update-77mrb\" (UID: \"59006d63-1671-4062-92b6-8d7f5e36c778\") " pod="openstack/root-account-create-update-77mrb" Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.560340 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvxpm\" (UniqueName: \"kubernetes.io/projected/59006d63-1671-4062-92b6-8d7f5e36c778-kube-api-access-zvxpm\") pod \"root-account-create-update-77mrb\" (UID: \"59006d63-1671-4062-92b6-8d7f5e36c778\") " pod="openstack/root-account-create-update-77mrb" Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.606584 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-pzkkp"] Apr 02 13:58:55 crc 
kubenswrapper[4732]: I0402 13:58:55.607866 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-pzkkp" Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.609910 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.609933 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.610064 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.645058 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-pzkkp"] Apr 02 13:58:55 crc kubenswrapper[4732]: E0402 13:58:55.645496 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-ls8v8 ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-ls8v8 ring-data-devices scripts swiftconf]: context canceled" pod="openstack/swift-ring-rebalance-pzkkp" podUID="ba58d0d3-ba5f-413e-b551-cd4825cf8214" Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.653073 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-wsskk"] Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.654421 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-wsskk" Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.678110 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-77mrb" Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.684741 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-wsskk"] Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.692128 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-pzkkp"] Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.746417 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/81138fff-8b7c-4cf3-8aa5-2582d80483e1-swiftconf\") pod \"swift-ring-rebalance-wsskk\" (UID: \"81138fff-8b7c-4cf3-8aa5-2582d80483e1\") " pod="openstack/swift-ring-rebalance-wsskk" Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.746470 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/81138fff-8b7c-4cf3-8aa5-2582d80483e1-scripts\") pod \"swift-ring-rebalance-wsskk\" (UID: \"81138fff-8b7c-4cf3-8aa5-2582d80483e1\") " pod="openstack/swift-ring-rebalance-wsskk" Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.746495 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/81138fff-8b7c-4cf3-8aa5-2582d80483e1-etc-swift\") pod \"swift-ring-rebalance-wsskk\" (UID: \"81138fff-8b7c-4cf3-8aa5-2582d80483e1\") " pod="openstack/swift-ring-rebalance-wsskk" Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.746526 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba58d0d3-ba5f-413e-b551-cd4825cf8214-combined-ca-bundle\") pod \"swift-ring-rebalance-pzkkp\" (UID: \"ba58d0d3-ba5f-413e-b551-cd4825cf8214\") " pod="openstack/swift-ring-rebalance-pzkkp" Apr 02 13:58:55 
crc kubenswrapper[4732]: I0402 13:58:55.746556 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ls8v8\" (UniqueName: \"kubernetes.io/projected/ba58d0d3-ba5f-413e-b551-cd4825cf8214-kube-api-access-ls8v8\") pod \"swift-ring-rebalance-pzkkp\" (UID: \"ba58d0d3-ba5f-413e-b551-cd4825cf8214\") " pod="openstack/swift-ring-rebalance-pzkkp" Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.746603 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/81138fff-8b7c-4cf3-8aa5-2582d80483e1-ring-data-devices\") pod \"swift-ring-rebalance-wsskk\" (UID: \"81138fff-8b7c-4cf3-8aa5-2582d80483e1\") " pod="openstack/swift-ring-rebalance-wsskk" Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.746647 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ba58d0d3-ba5f-413e-b551-cd4825cf8214-swiftconf\") pod \"swift-ring-rebalance-pzkkp\" (UID: \"ba58d0d3-ba5f-413e-b551-cd4825cf8214\") " pod="openstack/swift-ring-rebalance-pzkkp" Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.746671 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ba58d0d3-ba5f-413e-b551-cd4825cf8214-dispersionconf\") pod \"swift-ring-rebalance-pzkkp\" (UID: \"ba58d0d3-ba5f-413e-b551-cd4825cf8214\") " pod="openstack/swift-ring-rebalance-pzkkp" Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.746699 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/81138fff-8b7c-4cf3-8aa5-2582d80483e1-dispersionconf\") pod \"swift-ring-rebalance-wsskk\" (UID: \"81138fff-8b7c-4cf3-8aa5-2582d80483e1\") " 
pod="openstack/swift-ring-rebalance-wsskk" Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.746723 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkzc9\" (UniqueName: \"kubernetes.io/projected/81138fff-8b7c-4cf3-8aa5-2582d80483e1-kube-api-access-fkzc9\") pod \"swift-ring-rebalance-wsskk\" (UID: \"81138fff-8b7c-4cf3-8aa5-2582d80483e1\") " pod="openstack/swift-ring-rebalance-wsskk" Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.746991 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81138fff-8b7c-4cf3-8aa5-2582d80483e1-combined-ca-bundle\") pod \"swift-ring-rebalance-wsskk\" (UID: \"81138fff-8b7c-4cf3-8aa5-2582d80483e1\") " pod="openstack/swift-ring-rebalance-wsskk" Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.747009 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ba58d0d3-ba5f-413e-b551-cd4825cf8214-etc-swift\") pod \"swift-ring-rebalance-pzkkp\" (UID: \"ba58d0d3-ba5f-413e-b551-cd4825cf8214\") " pod="openstack/swift-ring-rebalance-pzkkp" Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.747036 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ba58d0d3-ba5f-413e-b551-cd4825cf8214-ring-data-devices\") pod \"swift-ring-rebalance-pzkkp\" (UID: \"ba58d0d3-ba5f-413e-b551-cd4825cf8214\") " pod="openstack/swift-ring-rebalance-pzkkp" Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.747126 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ba58d0d3-ba5f-413e-b551-cd4825cf8214-scripts\") pod \"swift-ring-rebalance-pzkkp\" (UID: 
\"ba58d0d3-ba5f-413e-b551-cd4825cf8214\") " pod="openstack/swift-ring-rebalance-pzkkp" Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.747937 4732 generic.go:334] "Generic (PLEG): container finished" podID="b57899ef-a548-4a0d-a677-529a876d7d68" containerID="d1d0b2dec54ab79b1b90f5a344b79ec87fdccf2c932b5d875a946933032fd5a5" exitCode=0 Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.748077 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8a7f-account-create-update-rcjkk" event={"ID":"b57899ef-a548-4a0d-a677-529a876d7d68","Type":"ContainerDied","Data":"d1d0b2dec54ab79b1b90f5a344b79ec87fdccf2c932b5d875a946933032fd5a5"} Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.752519 4732 generic.go:334] "Generic (PLEG): container finished" podID="36a80dbf-30b0-40f2-b37f-01912226bb43" containerID="4b0d9f22e57ea5804e170db48b53d2c96be751433714eeedd5de906812ae372a" exitCode=0 Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.752602 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-pzkkp" Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.752799 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-9sdcc" event={"ID":"36a80dbf-30b0-40f2-b37f-01912226bb43","Type":"ContainerDied","Data":"4b0d9f22e57ea5804e170db48b53d2c96be751433714eeedd5de906812ae372a"} Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.800762 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-pzkkp" Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.857796 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/81138fff-8b7c-4cf3-8aa5-2582d80483e1-dispersionconf\") pod \"swift-ring-rebalance-wsskk\" (UID: \"81138fff-8b7c-4cf3-8aa5-2582d80483e1\") " pod="openstack/swift-ring-rebalance-wsskk" Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.857856 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkzc9\" (UniqueName: \"kubernetes.io/projected/81138fff-8b7c-4cf3-8aa5-2582d80483e1-kube-api-access-fkzc9\") pod \"swift-ring-rebalance-wsskk\" (UID: \"81138fff-8b7c-4cf3-8aa5-2582d80483e1\") " pod="openstack/swift-ring-rebalance-wsskk" Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.857912 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81138fff-8b7c-4cf3-8aa5-2582d80483e1-combined-ca-bundle\") pod \"swift-ring-rebalance-wsskk\" (UID: \"81138fff-8b7c-4cf3-8aa5-2582d80483e1\") " pod="openstack/swift-ring-rebalance-wsskk" Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.857935 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ba58d0d3-ba5f-413e-b551-cd4825cf8214-etc-swift\") pod \"swift-ring-rebalance-pzkkp\" (UID: \"ba58d0d3-ba5f-413e-b551-cd4825cf8214\") " pod="openstack/swift-ring-rebalance-pzkkp" Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.858003 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ba58d0d3-ba5f-413e-b551-cd4825cf8214-ring-data-devices\") pod \"swift-ring-rebalance-pzkkp\" (UID: \"ba58d0d3-ba5f-413e-b551-cd4825cf8214\") " 
pod="openstack/swift-ring-rebalance-pzkkp" Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.858118 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ba58d0d3-ba5f-413e-b551-cd4825cf8214-scripts\") pod \"swift-ring-rebalance-pzkkp\" (UID: \"ba58d0d3-ba5f-413e-b551-cd4825cf8214\") " pod="openstack/swift-ring-rebalance-pzkkp" Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.858147 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/81138fff-8b7c-4cf3-8aa5-2582d80483e1-swiftconf\") pod \"swift-ring-rebalance-wsskk\" (UID: \"81138fff-8b7c-4cf3-8aa5-2582d80483e1\") " pod="openstack/swift-ring-rebalance-wsskk" Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.858206 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/81138fff-8b7c-4cf3-8aa5-2582d80483e1-scripts\") pod \"swift-ring-rebalance-wsskk\" (UID: \"81138fff-8b7c-4cf3-8aa5-2582d80483e1\") " pod="openstack/swift-ring-rebalance-wsskk" Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.858231 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/81138fff-8b7c-4cf3-8aa5-2582d80483e1-etc-swift\") pod \"swift-ring-rebalance-wsskk\" (UID: \"81138fff-8b7c-4cf3-8aa5-2582d80483e1\") " pod="openstack/swift-ring-rebalance-wsskk" Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.858259 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba58d0d3-ba5f-413e-b551-cd4825cf8214-combined-ca-bundle\") pod \"swift-ring-rebalance-pzkkp\" (UID: \"ba58d0d3-ba5f-413e-b551-cd4825cf8214\") " pod="openstack/swift-ring-rebalance-pzkkp" Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.858288 4732 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ls8v8\" (UniqueName: \"kubernetes.io/projected/ba58d0d3-ba5f-413e-b551-cd4825cf8214-kube-api-access-ls8v8\") pod \"swift-ring-rebalance-pzkkp\" (UID: \"ba58d0d3-ba5f-413e-b551-cd4825cf8214\") " pod="openstack/swift-ring-rebalance-pzkkp" Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.858342 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/81138fff-8b7c-4cf3-8aa5-2582d80483e1-ring-data-devices\") pod \"swift-ring-rebalance-wsskk\" (UID: \"81138fff-8b7c-4cf3-8aa5-2582d80483e1\") " pod="openstack/swift-ring-rebalance-wsskk" Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.858367 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ba58d0d3-ba5f-413e-b551-cd4825cf8214-swiftconf\") pod \"swift-ring-rebalance-pzkkp\" (UID: \"ba58d0d3-ba5f-413e-b551-cd4825cf8214\") " pod="openstack/swift-ring-rebalance-pzkkp" Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.858397 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ba58d0d3-ba5f-413e-b551-cd4825cf8214-dispersionconf\") pod \"swift-ring-rebalance-pzkkp\" (UID: \"ba58d0d3-ba5f-413e-b551-cd4825cf8214\") " pod="openstack/swift-ring-rebalance-pzkkp" Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.859863 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ba58d0d3-ba5f-413e-b551-cd4825cf8214-ring-data-devices\") pod \"swift-ring-rebalance-pzkkp\" (UID: \"ba58d0d3-ba5f-413e-b551-cd4825cf8214\") " pod="openstack/swift-ring-rebalance-pzkkp" Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.861991 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ba58d0d3-ba5f-413e-b551-cd4825cf8214-etc-swift\") pod \"swift-ring-rebalance-pzkkp\" (UID: \"ba58d0d3-ba5f-413e-b551-cd4825cf8214\") " pod="openstack/swift-ring-rebalance-pzkkp" Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.862340 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/81138fff-8b7c-4cf3-8aa5-2582d80483e1-scripts\") pod \"swift-ring-rebalance-wsskk\" (UID: \"81138fff-8b7c-4cf3-8aa5-2582d80483e1\") " pod="openstack/swift-ring-rebalance-wsskk" Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.862595 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ba58d0d3-ba5f-413e-b551-cd4825cf8214-dispersionconf\") pod \"swift-ring-rebalance-pzkkp\" (UID: \"ba58d0d3-ba5f-413e-b551-cd4825cf8214\") " pod="openstack/swift-ring-rebalance-pzkkp" Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.862673 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/81138fff-8b7c-4cf3-8aa5-2582d80483e1-ring-data-devices\") pod \"swift-ring-rebalance-wsskk\" (UID: \"81138fff-8b7c-4cf3-8aa5-2582d80483e1\") " pod="openstack/swift-ring-rebalance-wsskk" Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.863311 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/81138fff-8b7c-4cf3-8aa5-2582d80483e1-etc-swift\") pod \"swift-ring-rebalance-wsskk\" (UID: \"81138fff-8b7c-4cf3-8aa5-2582d80483e1\") " pod="openstack/swift-ring-rebalance-wsskk" Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.865241 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/81138fff-8b7c-4cf3-8aa5-2582d80483e1-dispersionconf\") pod \"swift-ring-rebalance-wsskk\" 
(UID: \"81138fff-8b7c-4cf3-8aa5-2582d80483e1\") " pod="openstack/swift-ring-rebalance-wsskk" Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.865424 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba58d0d3-ba5f-413e-b551-cd4825cf8214-combined-ca-bundle\") pod \"swift-ring-rebalance-pzkkp\" (UID: \"ba58d0d3-ba5f-413e-b551-cd4825cf8214\") " pod="openstack/swift-ring-rebalance-pzkkp" Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.865599 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81138fff-8b7c-4cf3-8aa5-2582d80483e1-combined-ca-bundle\") pod \"swift-ring-rebalance-wsskk\" (UID: \"81138fff-8b7c-4cf3-8aa5-2582d80483e1\") " pod="openstack/swift-ring-rebalance-wsskk" Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.871106 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ba58d0d3-ba5f-413e-b551-cd4825cf8214-scripts\") pod \"swift-ring-rebalance-pzkkp\" (UID: \"ba58d0d3-ba5f-413e-b551-cd4825cf8214\") " pod="openstack/swift-ring-rebalance-pzkkp" Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.877494 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/81138fff-8b7c-4cf3-8aa5-2582d80483e1-swiftconf\") pod \"swift-ring-rebalance-wsskk\" (UID: \"81138fff-8b7c-4cf3-8aa5-2582d80483e1\") " pod="openstack/swift-ring-rebalance-wsskk" Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.878968 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ba58d0d3-ba5f-413e-b551-cd4825cf8214-swiftconf\") pod \"swift-ring-rebalance-pzkkp\" (UID: \"ba58d0d3-ba5f-413e-b551-cd4825cf8214\") " pod="openstack/swift-ring-rebalance-pzkkp" Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 
13:58:55.884441 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkzc9\" (UniqueName: \"kubernetes.io/projected/81138fff-8b7c-4cf3-8aa5-2582d80483e1-kube-api-access-fkzc9\") pod \"swift-ring-rebalance-wsskk\" (UID: \"81138fff-8b7c-4cf3-8aa5-2582d80483e1\") " pod="openstack/swift-ring-rebalance-wsskk" Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.884736 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ls8v8\" (UniqueName: \"kubernetes.io/projected/ba58d0d3-ba5f-413e-b551-cd4825cf8214-kube-api-access-ls8v8\") pod \"swift-ring-rebalance-pzkkp\" (UID: \"ba58d0d3-ba5f-413e-b551-cd4825cf8214\") " pod="openstack/swift-ring-rebalance-pzkkp" Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.959677 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ba58d0d3-ba5f-413e-b551-cd4825cf8214-dispersionconf\") pod \"ba58d0d3-ba5f-413e-b551-cd4825cf8214\" (UID: \"ba58d0d3-ba5f-413e-b551-cd4825cf8214\") " Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.959834 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ba58d0d3-ba5f-413e-b551-cd4825cf8214-scripts\") pod \"ba58d0d3-ba5f-413e-b551-cd4825cf8214\" (UID: \"ba58d0d3-ba5f-413e-b551-cd4825cf8214\") " Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.959861 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ba58d0d3-ba5f-413e-b551-cd4825cf8214-etc-swift\") pod \"ba58d0d3-ba5f-413e-b551-cd4825cf8214\" (UID: \"ba58d0d3-ba5f-413e-b551-cd4825cf8214\") " Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.959943 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ls8v8\" (UniqueName: 
\"kubernetes.io/projected/ba58d0d3-ba5f-413e-b551-cd4825cf8214-kube-api-access-ls8v8\") pod \"ba58d0d3-ba5f-413e-b551-cd4825cf8214\" (UID: \"ba58d0d3-ba5f-413e-b551-cd4825cf8214\") " Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.959978 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ba58d0d3-ba5f-413e-b551-cd4825cf8214-ring-data-devices\") pod \"ba58d0d3-ba5f-413e-b551-cd4825cf8214\" (UID: \"ba58d0d3-ba5f-413e-b551-cd4825cf8214\") " Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.960015 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ba58d0d3-ba5f-413e-b551-cd4825cf8214-swiftconf\") pod \"ba58d0d3-ba5f-413e-b551-cd4825cf8214\" (UID: \"ba58d0d3-ba5f-413e-b551-cd4825cf8214\") " Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.960084 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba58d0d3-ba5f-413e-b551-cd4825cf8214-combined-ca-bundle\") pod \"ba58d0d3-ba5f-413e-b551-cd4825cf8214\" (UID: \"ba58d0d3-ba5f-413e-b551-cd4825cf8214\") " Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.960545 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba58d0d3-ba5f-413e-b551-cd4825cf8214-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "ba58d0d3-ba5f-413e-b551-cd4825cf8214" (UID: "ba58d0d3-ba5f-413e-b551-cd4825cf8214"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.960757 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba58d0d3-ba5f-413e-b551-cd4825cf8214-scripts" (OuterVolumeSpecName: "scripts") pod "ba58d0d3-ba5f-413e-b551-cd4825cf8214" (UID: "ba58d0d3-ba5f-413e-b551-cd4825cf8214"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.960991 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba58d0d3-ba5f-413e-b551-cd4825cf8214-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "ba58d0d3-ba5f-413e-b551-cd4825cf8214" (UID: "ba58d0d3-ba5f-413e-b551-cd4825cf8214"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.963165 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba58d0d3-ba5f-413e-b551-cd4825cf8214-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "ba58d0d3-ba5f-413e-b551-cd4825cf8214" (UID: "ba58d0d3-ba5f-413e-b551-cd4825cf8214"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.963459 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ba58d0d3-ba5f-413e-b551-cd4825cf8214-scripts\") on node \"crc\" DevicePath \"\"" Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.963493 4732 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ba58d0d3-ba5f-413e-b551-cd4825cf8214-etc-swift\") on node \"crc\" DevicePath \"\"" Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.963507 4732 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ba58d0d3-ba5f-413e-b551-cd4825cf8214-ring-data-devices\") on node \"crc\" DevicePath \"\"" Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.963519 4732 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ba58d0d3-ba5f-413e-b551-cd4825cf8214-dispersionconf\") on node \"crc\" DevicePath \"\"" Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.964163 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba58d0d3-ba5f-413e-b551-cd4825cf8214-kube-api-access-ls8v8" (OuterVolumeSpecName: "kube-api-access-ls8v8") pod "ba58d0d3-ba5f-413e-b551-cd4825cf8214" (UID: "ba58d0d3-ba5f-413e-b551-cd4825cf8214"). InnerVolumeSpecName "kube-api-access-ls8v8". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.964468 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba58d0d3-ba5f-413e-b551-cd4825cf8214-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "ba58d0d3-ba5f-413e-b551-cd4825cf8214" (UID: "ba58d0d3-ba5f-413e-b551-cd4825cf8214"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.964492 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba58d0d3-ba5f-413e-b551-cd4825cf8214-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ba58d0d3-ba5f-413e-b551-cd4825cf8214" (UID: "ba58d0d3-ba5f-413e-b551-cd4825cf8214"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 13:58:55 crc kubenswrapper[4732]: I0402 13:58:55.973129 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-wsskk" Apr 02 13:58:56 crc kubenswrapper[4732]: I0402 13:58:56.066581 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ls8v8\" (UniqueName: \"kubernetes.io/projected/ba58d0d3-ba5f-413e-b551-cd4825cf8214-kube-api-access-ls8v8\") on node \"crc\" DevicePath \"\"" Apr 02 13:58:56 crc kubenswrapper[4732]: I0402 13:58:56.066600 4732 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ba58d0d3-ba5f-413e-b551-cd4825cf8214-swiftconf\") on node \"crc\" DevicePath \"\"" Apr 02 13:58:56 crc kubenswrapper[4732]: I0402 13:58:56.066625 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba58d0d3-ba5f-413e-b551-cd4825cf8214-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 02 13:58:56 crc kubenswrapper[4732]: I0402 13:58:56.136942 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-77mrb"] Apr 02 13:58:56 crc kubenswrapper[4732]: W0402 13:58:56.152177 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59006d63_1671_4062_92b6_8d7f5e36c778.slice/crio-265cd8bce995e5afaf4d48982e4ad056e213ea54e4218c80266b97604cde8bb0 WatchSource:0}: Error 
finding container 265cd8bce995e5afaf4d48982e4ad056e213ea54e4218c80266b97604cde8bb0: Status 404 returned error can't find the container with id 265cd8bce995e5afaf4d48982e4ad056e213ea54e4218c80266b97604cde8bb0 Apr 02 13:58:56 crc kubenswrapper[4732]: I0402 13:58:56.406861 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-wsskk"] Apr 02 13:58:56 crc kubenswrapper[4732]: W0402 13:58:56.414892 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81138fff_8b7c_4cf3_8aa5_2582d80483e1.slice/crio-85894cf01fad8b1a6c5ba1a9a6e4318ca380444c5ecac39f962576034fcbc3b8 WatchSource:0}: Error finding container 85894cf01fad8b1a6c5ba1a9a6e4318ca380444c5ecac39f962576034fcbc3b8: Status 404 returned error can't find the container with id 85894cf01fad8b1a6c5ba1a9a6e4318ca380444c5ecac39f962576034fcbc3b8 Apr 02 13:58:56 crc kubenswrapper[4732]: I0402 13:58:56.760759 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-wsskk" event={"ID":"81138fff-8b7c-4cf3-8aa5-2582d80483e1","Type":"ContainerStarted","Data":"85894cf01fad8b1a6c5ba1a9a6e4318ca380444c5ecac39f962576034fcbc3b8"} Apr 02 13:58:56 crc kubenswrapper[4732]: I0402 13:58:56.762473 4732 generic.go:334] "Generic (PLEG): container finished" podID="59006d63-1671-4062-92b6-8d7f5e36c778" containerID="5bdcee330725e133a8edcdc53f64fb04d483a11824993f03fbddd3ec3984ced2" exitCode=0 Apr 02 13:58:56 crc kubenswrapper[4732]: I0402 13:58:56.762584 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-77mrb" event={"ID":"59006d63-1671-4062-92b6-8d7f5e36c778","Type":"ContainerDied","Data":"5bdcee330725e133a8edcdc53f64fb04d483a11824993f03fbddd3ec3984ced2"} Apr 02 13:58:56 crc kubenswrapper[4732]: I0402 13:58:56.762653 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-77mrb" 
event={"ID":"59006d63-1671-4062-92b6-8d7f5e36c778","Type":"ContainerStarted","Data":"265cd8bce995e5afaf4d48982e4ad056e213ea54e4218c80266b97604cde8bb0"} Apr 02 13:58:56 crc kubenswrapper[4732]: I0402 13:58:56.762696 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-pzkkp" Apr 02 13:58:56 crc kubenswrapper[4732]: I0402 13:58:56.854554 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-pzkkp"] Apr 02 13:58:56 crc kubenswrapper[4732]: I0402 13:58:56.861324 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-pzkkp"] Apr 02 13:58:57 crc kubenswrapper[4732]: I0402 13:58:57.228166 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-8a7f-account-create-update-rcjkk" Apr 02 13:58:57 crc kubenswrapper[4732]: I0402 13:58:57.238415 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-9sdcc" Apr 02 13:58:57 crc kubenswrapper[4732]: I0402 13:58:57.393063 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9nfw\" (UniqueName: \"kubernetes.io/projected/b57899ef-a548-4a0d-a677-529a876d7d68-kube-api-access-c9nfw\") pod \"b57899ef-a548-4a0d-a677-529a876d7d68\" (UID: \"b57899ef-a548-4a0d-a677-529a876d7d68\") " Apr 02 13:58:57 crc kubenswrapper[4732]: I0402 13:58:57.393219 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26v4p\" (UniqueName: \"kubernetes.io/projected/36a80dbf-30b0-40f2-b37f-01912226bb43-kube-api-access-26v4p\") pod \"36a80dbf-30b0-40f2-b37f-01912226bb43\" (UID: \"36a80dbf-30b0-40f2-b37f-01912226bb43\") " Apr 02 13:58:57 crc kubenswrapper[4732]: I0402 13:58:57.393360 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/36a80dbf-30b0-40f2-b37f-01912226bb43-operator-scripts\") pod \"36a80dbf-30b0-40f2-b37f-01912226bb43\" (UID: \"36a80dbf-30b0-40f2-b37f-01912226bb43\") " Apr 02 13:58:57 crc kubenswrapper[4732]: I0402 13:58:57.393390 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b57899ef-a548-4a0d-a677-529a876d7d68-operator-scripts\") pod \"b57899ef-a548-4a0d-a677-529a876d7d68\" (UID: \"b57899ef-a548-4a0d-a677-529a876d7d68\") " Apr 02 13:58:57 crc kubenswrapper[4732]: I0402 13:58:57.393961 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36a80dbf-30b0-40f2-b37f-01912226bb43-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "36a80dbf-30b0-40f2-b37f-01912226bb43" (UID: "36a80dbf-30b0-40f2-b37f-01912226bb43"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:58:57 crc kubenswrapper[4732]: I0402 13:58:57.394025 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b57899ef-a548-4a0d-a677-529a876d7d68-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b57899ef-a548-4a0d-a677-529a876d7d68" (UID: "b57899ef-a548-4a0d-a677-529a876d7d68"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:58:57 crc kubenswrapper[4732]: I0402 13:58:57.394299 4732 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36a80dbf-30b0-40f2-b37f-01912226bb43-operator-scripts\") on node \"crc\" DevicePath \"\"" Apr 02 13:58:57 crc kubenswrapper[4732]: I0402 13:58:57.394316 4732 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b57899ef-a548-4a0d-a677-529a876d7d68-operator-scripts\") on node \"crc\" DevicePath \"\"" Apr 02 13:58:57 crc kubenswrapper[4732]: I0402 13:58:57.398726 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b57899ef-a548-4a0d-a677-529a876d7d68-kube-api-access-c9nfw" (OuterVolumeSpecName: "kube-api-access-c9nfw") pod "b57899ef-a548-4a0d-a677-529a876d7d68" (UID: "b57899ef-a548-4a0d-a677-529a876d7d68"). InnerVolumeSpecName "kube-api-access-c9nfw". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:58:57 crc kubenswrapper[4732]: I0402 13:58:57.410963 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36a80dbf-30b0-40f2-b37f-01912226bb43-kube-api-access-26v4p" (OuterVolumeSpecName: "kube-api-access-26v4p") pod "36a80dbf-30b0-40f2-b37f-01912226bb43" (UID: "36a80dbf-30b0-40f2-b37f-01912226bb43"). InnerVolumeSpecName "kube-api-access-26v4p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:58:57 crc kubenswrapper[4732]: I0402 13:58:57.495543 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26v4p\" (UniqueName: \"kubernetes.io/projected/36a80dbf-30b0-40f2-b37f-01912226bb43-kube-api-access-26v4p\") on node \"crc\" DevicePath \"\"" Apr 02 13:58:57 crc kubenswrapper[4732]: I0402 13:58:57.495808 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9nfw\" (UniqueName: \"kubernetes.io/projected/b57899ef-a548-4a0d-a677-529a876d7d68-kube-api-access-c9nfw\") on node \"crc\" DevicePath \"\"" Apr 02 13:58:57 crc kubenswrapper[4732]: I0402 13:58:57.772082 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8a7f-account-create-update-rcjkk" event={"ID":"b57899ef-a548-4a0d-a677-529a876d7d68","Type":"ContainerDied","Data":"b6688f0651f02c988c6cca5cedde77a74b53e76914edf37f2896fb2f41b91e64"} Apr 02 13:58:57 crc kubenswrapper[4732]: I0402 13:58:57.772131 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6688f0651f02c988c6cca5cedde77a74b53e76914edf37f2896fb2f41b91e64" Apr 02 13:58:57 crc kubenswrapper[4732]: I0402 13:58:57.772194 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-8a7f-account-create-update-rcjkk" Apr 02 13:58:57 crc kubenswrapper[4732]: I0402 13:58:57.782582 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-9sdcc" event={"ID":"36a80dbf-30b0-40f2-b37f-01912226bb43","Type":"ContainerDied","Data":"19a1d81946c81d4faace8519304f5879588c1572114c02ba3d642200ad0d010d"} Apr 02 13:58:57 crc kubenswrapper[4732]: I0402 13:58:57.782606 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-9sdcc" Apr 02 13:58:57 crc kubenswrapper[4732]: I0402 13:58:57.782636 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19a1d81946c81d4faace8519304f5879588c1572114c02ba3d642200ad0d010d" Apr 02 13:58:58 crc kubenswrapper[4732]: I0402 13:58:58.700268 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba58d0d3-ba5f-413e-b551-cd4825cf8214" path="/var/lib/kubelet/pods/ba58d0d3-ba5f-413e-b551-cd4825cf8214/volumes" Apr 02 13:58:58 crc kubenswrapper[4732]: I0402 13:58:58.930864 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-cs4gt"] Apr 02 13:58:58 crc kubenswrapper[4732]: E0402 13:58:58.931178 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b57899ef-a548-4a0d-a677-529a876d7d68" containerName="mariadb-account-create-update" Apr 02 13:58:58 crc kubenswrapper[4732]: I0402 13:58:58.931191 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="b57899ef-a548-4a0d-a677-529a876d7d68" containerName="mariadb-account-create-update" Apr 02 13:58:58 crc kubenswrapper[4732]: E0402 13:58:58.931208 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36a80dbf-30b0-40f2-b37f-01912226bb43" containerName="mariadb-database-create" Apr 02 13:58:58 crc kubenswrapper[4732]: I0402 13:58:58.931216 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="36a80dbf-30b0-40f2-b37f-01912226bb43" containerName="mariadb-database-create" Apr 02 13:58:58 crc kubenswrapper[4732]: I0402 13:58:58.931384 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="36a80dbf-30b0-40f2-b37f-01912226bb43" containerName="mariadb-database-create" Apr 02 13:58:58 crc kubenswrapper[4732]: I0402 13:58:58.931394 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="b57899ef-a548-4a0d-a677-529a876d7d68" containerName="mariadb-account-create-update" Apr 02 13:58:58 crc kubenswrapper[4732]: I0402 
13:58:58.933369 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-cs4gt" Apr 02 13:58:58 crc kubenswrapper[4732]: I0402 13:58:58.935525 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-vk67c" Apr 02 13:58:58 crc kubenswrapper[4732]: I0402 13:58:58.935526 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Apr 02 13:58:58 crc kubenswrapper[4732]: I0402 13:58:58.946165 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-cs4gt"] Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.023987 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6gnf\" (UniqueName: \"kubernetes.io/projected/e3eee308-f9e6-4475-a2a4-2116af760963-kube-api-access-s6gnf\") pod \"glance-db-sync-cs4gt\" (UID: \"e3eee308-f9e6-4475-a2a4-2116af760963\") " pod="openstack/glance-db-sync-cs4gt" Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.024322 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3eee308-f9e6-4475-a2a4-2116af760963-config-data\") pod \"glance-db-sync-cs4gt\" (UID: \"e3eee308-f9e6-4475-a2a4-2116af760963\") " pod="openstack/glance-db-sync-cs4gt" Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.024407 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3eee308-f9e6-4475-a2a4-2116af760963-combined-ca-bundle\") pod \"glance-db-sync-cs4gt\" (UID: \"e3eee308-f9e6-4475-a2a4-2116af760963\") " pod="openstack/glance-db-sync-cs4gt" Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.024687 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" 
(UniqueName: \"kubernetes.io/secret/e3eee308-f9e6-4475-a2a4-2116af760963-db-sync-config-data\") pod \"glance-db-sync-cs4gt\" (UID: \"e3eee308-f9e6-4475-a2a4-2116af760963\") " pod="openstack/glance-db-sync-cs4gt" Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.134825 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3eee308-f9e6-4475-a2a4-2116af760963-combined-ca-bundle\") pod \"glance-db-sync-cs4gt\" (UID: \"e3eee308-f9e6-4475-a2a4-2116af760963\") " pod="openstack/glance-db-sync-cs4gt" Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.134927 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e3eee308-f9e6-4475-a2a4-2116af760963-db-sync-config-data\") pod \"glance-db-sync-cs4gt\" (UID: \"e3eee308-f9e6-4475-a2a4-2116af760963\") " pod="openstack/glance-db-sync-cs4gt" Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.134983 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6gnf\" (UniqueName: \"kubernetes.io/projected/e3eee308-f9e6-4475-a2a4-2116af760963-kube-api-access-s6gnf\") pod \"glance-db-sync-cs4gt\" (UID: \"e3eee308-f9e6-4475-a2a4-2116af760963\") " pod="openstack/glance-db-sync-cs4gt" Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.135003 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3eee308-f9e6-4475-a2a4-2116af760963-config-data\") pod \"glance-db-sync-cs4gt\" (UID: \"e3eee308-f9e6-4475-a2a4-2116af760963\") " pod="openstack/glance-db-sync-cs4gt" Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.139894 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3eee308-f9e6-4475-a2a4-2116af760963-combined-ca-bundle\") pod 
\"glance-db-sync-cs4gt\" (UID: \"e3eee308-f9e6-4475-a2a4-2116af760963\") " pod="openstack/glance-db-sync-cs4gt" Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.140543 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e3eee308-f9e6-4475-a2a4-2116af760963-db-sync-config-data\") pod \"glance-db-sync-cs4gt\" (UID: \"e3eee308-f9e6-4475-a2a4-2116af760963\") " pod="openstack/glance-db-sync-cs4gt" Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.147042 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3eee308-f9e6-4475-a2a4-2116af760963-config-data\") pod \"glance-db-sync-cs4gt\" (UID: \"e3eee308-f9e6-4475-a2a4-2116af760963\") " pod="openstack/glance-db-sync-cs4gt" Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.157797 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6gnf\" (UniqueName: \"kubernetes.io/projected/e3eee308-f9e6-4475-a2a4-2116af760963-kube-api-access-s6gnf\") pod \"glance-db-sync-cs4gt\" (UID: \"e3eee308-f9e6-4475-a2a4-2116af760963\") " pod="openstack/glance-db-sync-cs4gt" Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.266280 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-cs4gt" Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.277548 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-77mrb" Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.432643 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-lmkhl"] Apr 02 13:58:59 crc kubenswrapper[4732]: E0402 13:58:59.433361 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59006d63-1671-4062-92b6-8d7f5e36c778" containerName="mariadb-account-create-update" Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.433377 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="59006d63-1671-4062-92b6-8d7f5e36c778" containerName="mariadb-account-create-update" Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.433733 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="59006d63-1671-4062-92b6-8d7f5e36c778" containerName="mariadb-account-create-update" Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.434386 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-lmkhl" Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.438925 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59006d63-1671-4062-92b6-8d7f5e36c778-operator-scripts\") pod \"59006d63-1671-4062-92b6-8d7f5e36c778\" (UID: \"59006d63-1671-4062-92b6-8d7f5e36c778\") " Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.438985 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvxpm\" (UniqueName: \"kubernetes.io/projected/59006d63-1671-4062-92b6-8d7f5e36c778-kube-api-access-zvxpm\") pod \"59006d63-1671-4062-92b6-8d7f5e36c778\" (UID: \"59006d63-1671-4062-92b6-8d7f5e36c778\") " Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.439843 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59006d63-1671-4062-92b6-8d7f5e36c778-operator-scripts" 
(OuterVolumeSpecName: "operator-scripts") pod "59006d63-1671-4062-92b6-8d7f5e36c778" (UID: "59006d63-1671-4062-92b6-8d7f5e36c778"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.440057 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-lmkhl"] Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.450550 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59006d63-1671-4062-92b6-8d7f5e36c778-kube-api-access-zvxpm" (OuterVolumeSpecName: "kube-api-access-zvxpm") pod "59006d63-1671-4062-92b6-8d7f5e36c778" (UID: "59006d63-1671-4062-92b6-8d7f5e36c778"). InnerVolumeSpecName "kube-api-access-zvxpm". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.538907 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-56c3-account-create-update-xg9gr"] Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.540474 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b406167-98d8-46ae-8319-5734de01a2e0-operator-scripts\") pod \"keystone-db-create-lmkhl\" (UID: \"6b406167-98d8-46ae-8319-5734de01a2e0\") " pod="openstack/keystone-db-create-lmkhl" Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.540572 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twp2z\" (UniqueName: \"kubernetes.io/projected/6b406167-98d8-46ae-8319-5734de01a2e0-kube-api-access-twp2z\") pod \"keystone-db-create-lmkhl\" (UID: \"6b406167-98d8-46ae-8319-5734de01a2e0\") " pod="openstack/keystone-db-create-lmkhl" Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.540495 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-56c3-account-create-update-xg9gr" Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.540710 4732 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59006d63-1671-4062-92b6-8d7f5e36c778-operator-scripts\") on node \"crc\" DevicePath \"\"" Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.540747 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvxpm\" (UniqueName: \"kubernetes.io/projected/59006d63-1671-4062-92b6-8d7f5e36c778-kube-api-access-zvxpm\") on node \"crc\" DevicePath \"\"" Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.543289 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.562221 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-56c3-account-create-update-xg9gr"] Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.641829 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90e2fd4b-3c18-4414-93ea-6433d6c52f80-operator-scripts\") pod \"keystone-56c3-account-create-update-xg9gr\" (UID: \"90e2fd4b-3c18-4414-93ea-6433d6c52f80\") " pod="openstack/keystone-56c3-account-create-update-xg9gr" Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.641937 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpzm5\" (UniqueName: \"kubernetes.io/projected/90e2fd4b-3c18-4414-93ea-6433d6c52f80-kube-api-access-mpzm5\") pod \"keystone-56c3-account-create-update-xg9gr\" (UID: \"90e2fd4b-3c18-4414-93ea-6433d6c52f80\") " pod="openstack/keystone-56c3-account-create-update-xg9gr" Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.641983 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b406167-98d8-46ae-8319-5734de01a2e0-operator-scripts\") pod \"keystone-db-create-lmkhl\" (UID: \"6b406167-98d8-46ae-8319-5734de01a2e0\") " pod="openstack/keystone-db-create-lmkhl" Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.642098 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/529a332e-d2c3-49c5-86d5-e672811d00cd-etc-swift\") pod \"swift-storage-0\" (UID: \"529a332e-d2c3-49c5-86d5-e672811d00cd\") " pod="openstack/swift-storage-0" Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.642134 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twp2z\" (UniqueName: \"kubernetes.io/projected/6b406167-98d8-46ae-8319-5734de01a2e0-kube-api-access-twp2z\") pod \"keystone-db-create-lmkhl\" (UID: \"6b406167-98d8-46ae-8319-5734de01a2e0\") " pod="openstack/keystone-db-create-lmkhl" Apr 02 13:58:59 crc kubenswrapper[4732]: E0402 13:58:59.642465 4732 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Apr 02 13:58:59 crc kubenswrapper[4732]: E0402 13:58:59.642503 4732 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Apr 02 13:58:59 crc kubenswrapper[4732]: E0402 13:58:59.642561 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/529a332e-d2c3-49c5-86d5-e672811d00cd-etc-swift podName:529a332e-d2c3-49c5-86d5-e672811d00cd nodeName:}" failed. No retries permitted until 2026-04-02 13:59:07.642538267 +0000 UTC m=+1304.546945880 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/529a332e-d2c3-49c5-86d5-e672811d00cd-etc-swift") pod "swift-storage-0" (UID: "529a332e-d2c3-49c5-86d5-e672811d00cd") : configmap "swift-ring-files" not found Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.643253 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b406167-98d8-46ae-8319-5734de01a2e0-operator-scripts\") pod \"keystone-db-create-lmkhl\" (UID: \"6b406167-98d8-46ae-8319-5734de01a2e0\") " pod="openstack/keystone-db-create-lmkhl" Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.658650 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-sjkqb"] Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.659939 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-sjkqb" Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.661629 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twp2z\" (UniqueName: \"kubernetes.io/projected/6b406167-98d8-46ae-8319-5734de01a2e0-kube-api-access-twp2z\") pod \"keystone-db-create-lmkhl\" (UID: \"6b406167-98d8-46ae-8319-5734de01a2e0\") " pod="openstack/keystone-db-create-lmkhl" Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.665735 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-sjkqb"] Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.736092 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-da70-account-create-update-pl545"] Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.737160 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-da70-account-create-update-pl545" Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.739518 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.744142 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90e2fd4b-3c18-4414-93ea-6433d6c52f80-operator-scripts\") pod \"keystone-56c3-account-create-update-xg9gr\" (UID: \"90e2fd4b-3c18-4414-93ea-6433d6c52f80\") " pod="openstack/keystone-56c3-account-create-update-xg9gr" Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.744205 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d495f3e-102b-45f0-88e4-fe3777c11b97-operator-scripts\") pod \"placement-db-create-sjkqb\" (UID: \"1d495f3e-102b-45f0-88e4-fe3777c11b97\") " pod="openstack/placement-db-create-sjkqb" Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.744224 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8fkm\" (UniqueName: \"kubernetes.io/projected/1d495f3e-102b-45f0-88e4-fe3777c11b97-kube-api-access-m8fkm\") pod \"placement-db-create-sjkqb\" (UID: \"1d495f3e-102b-45f0-88e4-fe3777c11b97\") " pod="openstack/placement-db-create-sjkqb" Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.744269 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpzm5\" (UniqueName: \"kubernetes.io/projected/90e2fd4b-3c18-4414-93ea-6433d6c52f80-kube-api-access-mpzm5\") pod \"keystone-56c3-account-create-update-xg9gr\" (UID: \"90e2fd4b-3c18-4414-93ea-6433d6c52f80\") " pod="openstack/keystone-56c3-account-create-update-xg9gr" Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.745281 4732 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90e2fd4b-3c18-4414-93ea-6433d6c52f80-operator-scripts\") pod \"keystone-56c3-account-create-update-xg9gr\" (UID: \"90e2fd4b-3c18-4414-93ea-6433d6c52f80\") " pod="openstack/keystone-56c3-account-create-update-xg9gr" Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.753015 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-da70-account-create-update-pl545"] Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.757312 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-lmkhl" Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.767481 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpzm5\" (UniqueName: \"kubernetes.io/projected/90e2fd4b-3c18-4414-93ea-6433d6c52f80-kube-api-access-mpzm5\") pod \"keystone-56c3-account-create-update-xg9gr\" (UID: \"90e2fd4b-3c18-4414-93ea-6433d6c52f80\") " pod="openstack/keystone-56c3-account-create-update-xg9gr" Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.799269 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-wsskk" event={"ID":"81138fff-8b7c-4cf3-8aa5-2582d80483e1","Type":"ContainerStarted","Data":"22fb6eb8d107d107c9cbc9c086c306a92277eebd8dc426055926610f504f12e6"} Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.803360 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-77mrb" event={"ID":"59006d63-1671-4062-92b6-8d7f5e36c778","Type":"ContainerDied","Data":"265cd8bce995e5afaf4d48982e4ad056e213ea54e4218c80266b97604cde8bb0"} Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.803403 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="265cd8bce995e5afaf4d48982e4ad056e213ea54e4218c80266b97604cde8bb0" Apr 02 13:58:59 crc 
kubenswrapper[4732]: I0402 13:58:59.803479 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-77mrb" Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.829728 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-wsskk" podStartSLOduration=1.964664092 podStartE2EDuration="4.829701598s" podCreationTimestamp="2026-04-02 13:58:55 +0000 UTC" firstStartedPulling="2026-04-02 13:58:56.417404601 +0000 UTC m=+1293.321812154" lastFinishedPulling="2026-04-02 13:58:59.282442107 +0000 UTC m=+1296.186849660" observedRunningTime="2026-04-02 13:58:59.824530429 +0000 UTC m=+1296.728938002" watchObservedRunningTime="2026-04-02 13:58:59.829701598 +0000 UTC m=+1296.734109151" Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.846029 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7907b3e6-ba65-4bc8-bf30-dd79cab266b5-operator-scripts\") pod \"placement-da70-account-create-update-pl545\" (UID: \"7907b3e6-ba65-4bc8-bf30-dd79cab266b5\") " pod="openstack/placement-da70-account-create-update-pl545" Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.846729 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d495f3e-102b-45f0-88e4-fe3777c11b97-operator-scripts\") pod \"placement-db-create-sjkqb\" (UID: \"1d495f3e-102b-45f0-88e4-fe3777c11b97\") " pod="openstack/placement-db-create-sjkqb" Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.846767 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8fkm\" (UniqueName: \"kubernetes.io/projected/1d495f3e-102b-45f0-88e4-fe3777c11b97-kube-api-access-m8fkm\") pod \"placement-db-create-sjkqb\" (UID: \"1d495f3e-102b-45f0-88e4-fe3777c11b97\") " 
pod="openstack/placement-db-create-sjkqb" Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.846792 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7d49\" (UniqueName: \"kubernetes.io/projected/7907b3e6-ba65-4bc8-bf30-dd79cab266b5-kube-api-access-d7d49\") pod \"placement-da70-account-create-update-pl545\" (UID: \"7907b3e6-ba65-4bc8-bf30-dd79cab266b5\") " pod="openstack/placement-da70-account-create-update-pl545" Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.847831 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d495f3e-102b-45f0-88e4-fe3777c11b97-operator-scripts\") pod \"placement-db-create-sjkqb\" (UID: \"1d495f3e-102b-45f0-88e4-fe3777c11b97\") " pod="openstack/placement-db-create-sjkqb" Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.853505 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-cs4gt"] Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.868573 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-56c3-account-create-update-xg9gr" Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.869127 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8fkm\" (UniqueName: \"kubernetes.io/projected/1d495f3e-102b-45f0-88e4-fe3777c11b97-kube-api-access-m8fkm\") pod \"placement-db-create-sjkqb\" (UID: \"1d495f3e-102b-45f0-88e4-fe3777c11b97\") " pod="openstack/placement-db-create-sjkqb" Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.948768 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7907b3e6-ba65-4bc8-bf30-dd79cab266b5-operator-scripts\") pod \"placement-da70-account-create-update-pl545\" (UID: \"7907b3e6-ba65-4bc8-bf30-dd79cab266b5\") " pod="openstack/placement-da70-account-create-update-pl545" Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.948907 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7d49\" (UniqueName: \"kubernetes.io/projected/7907b3e6-ba65-4bc8-bf30-dd79cab266b5-kube-api-access-d7d49\") pod \"placement-da70-account-create-update-pl545\" (UID: \"7907b3e6-ba65-4bc8-bf30-dd79cab266b5\") " pod="openstack/placement-da70-account-create-update-pl545" Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.950605 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7907b3e6-ba65-4bc8-bf30-dd79cab266b5-operator-scripts\") pod \"placement-da70-account-create-update-pl545\" (UID: \"7907b3e6-ba65-4bc8-bf30-dd79cab266b5\") " pod="openstack/placement-da70-account-create-update-pl545" Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.971324 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7d49\" (UniqueName: \"kubernetes.io/projected/7907b3e6-ba65-4bc8-bf30-dd79cab266b5-kube-api-access-d7d49\") 
pod \"placement-da70-account-create-update-pl545\" (UID: \"7907b3e6-ba65-4bc8-bf30-dd79cab266b5\") " pod="openstack/placement-da70-account-create-update-pl545" Apr 02 13:58:59 crc kubenswrapper[4732]: I0402 13:58:59.977977 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-sjkqb" Apr 02 13:59:00 crc kubenswrapper[4732]: I0402 13:59:00.054549 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-da70-account-create-update-pl545" Apr 02 13:59:00 crc kubenswrapper[4732]: I0402 13:59:00.230347 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-lmkhl"] Apr 02 13:59:00 crc kubenswrapper[4732]: I0402 13:59:00.354306 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-56c3-account-create-update-xg9gr"] Apr 02 13:59:00 crc kubenswrapper[4732]: I0402 13:59:00.509295 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-sjkqb"] Apr 02 13:59:00 crc kubenswrapper[4732]: W0402 13:59:00.526477 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d495f3e_102b_45f0_88e4_fe3777c11b97.slice/crio-6364b56e479eaca0e359e0227d48dec3e1935b5bef53673b05f6b8c268ac08f6 WatchSource:0}: Error finding container 6364b56e479eaca0e359e0227d48dec3e1935b5bef53673b05f6b8c268ac08f6: Status 404 returned error can't find the container with id 6364b56e479eaca0e359e0227d48dec3e1935b5bef53673b05f6b8c268ac08f6 Apr 02 13:59:00 crc kubenswrapper[4732]: I0402 13:59:00.635281 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-da70-account-create-update-pl545"] Apr 02 13:59:00 crc kubenswrapper[4732]: W0402 13:59:00.645770 4732 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7907b3e6_ba65_4bc8_bf30_dd79cab266b5.slice/crio-086132d0d3617ed68d798ad9549736d50cca8b260a7b0c3fa34a9d6cb7ad9ca7 WatchSource:0}: Error finding container 086132d0d3617ed68d798ad9549736d50cca8b260a7b0c3fa34a9d6cb7ad9ca7: Status 404 returned error can't find the container with id 086132d0d3617ed68d798ad9549736d50cca8b260a7b0c3fa34a9d6cb7ad9ca7 Apr 02 13:59:00 crc kubenswrapper[4732]: I0402 13:59:00.814201 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-56c3-account-create-update-xg9gr" event={"ID":"90e2fd4b-3c18-4414-93ea-6433d6c52f80","Type":"ContainerStarted","Data":"f49b0d66e2113c8b4a9bfd08a0ad2f5202af6eb438aeb031770c1c903d8ee3b9"} Apr 02 13:59:00 crc kubenswrapper[4732]: I0402 13:59:00.814253 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-56c3-account-create-update-xg9gr" event={"ID":"90e2fd4b-3c18-4414-93ea-6433d6c52f80","Type":"ContainerStarted","Data":"d0b11520f05ce2216b4d53a5fb0d4eef75f1640848f1e09e40c663b613766002"} Apr 02 13:59:00 crc kubenswrapper[4732]: I0402 13:59:00.817137 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-sjkqb" event={"ID":"1d495f3e-102b-45f0-88e4-fe3777c11b97","Type":"ContainerStarted","Data":"90301a5b3770820faf4773b54429c8ae265f19bd4a26fa5f5d0b97834e08f405"} Apr 02 13:59:00 crc kubenswrapper[4732]: I0402 13:59:00.817204 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-sjkqb" event={"ID":"1d495f3e-102b-45f0-88e4-fe3777c11b97","Type":"ContainerStarted","Data":"6364b56e479eaca0e359e0227d48dec3e1935b5bef53673b05f6b8c268ac08f6"} Apr 02 13:59:00 crc kubenswrapper[4732]: I0402 13:59:00.819436 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-lmkhl" 
event={"ID":"6b406167-98d8-46ae-8319-5734de01a2e0","Type":"ContainerStarted","Data":"16ed54a1e0a3aae64c8783d62ca5fc2d1642d99087645ba7ab6ae31450c56103"} Apr 02 13:59:00 crc kubenswrapper[4732]: I0402 13:59:00.819467 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-lmkhl" event={"ID":"6b406167-98d8-46ae-8319-5734de01a2e0","Type":"ContainerStarted","Data":"b3bc675fb21d789c6cbda294883250d13a00436ea63f6da46108786da035e8c7"} Apr 02 13:59:00 crc kubenswrapper[4732]: I0402 13:59:00.824491 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-da70-account-create-update-pl545" event={"ID":"7907b3e6-ba65-4bc8-bf30-dd79cab266b5","Type":"ContainerStarted","Data":"086132d0d3617ed68d798ad9549736d50cca8b260a7b0c3fa34a9d6cb7ad9ca7"} Apr 02 13:59:00 crc kubenswrapper[4732]: I0402 13:59:00.828218 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-cs4gt" event={"ID":"e3eee308-f9e6-4475-a2a4-2116af760963","Type":"ContainerStarted","Data":"8b32680cbed87d1485954a4f0720f3b315fb4a4b1c7699193cf12fdcd544d36d"} Apr 02 13:59:00 crc kubenswrapper[4732]: I0402 13:59:00.840743 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-lmkhl" podStartSLOduration=1.8407206170000001 podStartE2EDuration="1.840720617s" podCreationTimestamp="2026-04-02 13:58:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:59:00.836790041 +0000 UTC m=+1297.741197614" watchObservedRunningTime="2026-04-02 13:59:00.840720617 +0000 UTC m=+1297.745128170" Apr 02 13:59:00 crc kubenswrapper[4732]: I0402 13:59:00.844846 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-6xf79" Apr 02 13:59:00 crc kubenswrapper[4732]: I0402 13:59:00.950519 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-57d769cc4f-rjf6j"] Apr 02 13:59:00 crc kubenswrapper[4732]: I0402 13:59:00.951111 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-rjf6j" podUID="38a12349-b9ff-4123-ba3b-96edc0cf2bc6" containerName="dnsmasq-dns" containerID="cri-o://d6626f18c4d55241c0910c46f125ab90840fcd25e84f3ab515dea20bc6197255" gracePeriod=10 Apr 02 13:59:01 crc kubenswrapper[4732]: I0402 13:59:01.777096 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-77mrb"] Apr 02 13:59:01 crc kubenswrapper[4732]: I0402 13:59:01.784202 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-77mrb"] Apr 02 13:59:01 crc kubenswrapper[4732]: I0402 13:59:01.836963 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-da70-account-create-update-pl545" event={"ID":"7907b3e6-ba65-4bc8-bf30-dd79cab266b5","Type":"ContainerStarted","Data":"96ef04e158bb2a83cd3bf3c9ac26655c113bf9e195ec6e34cfd8766d4564a311"} Apr 02 13:59:01 crc kubenswrapper[4732]: I0402 13:59:01.840339 4732 generic.go:334] "Generic (PLEG): container finished" podID="38a12349-b9ff-4123-ba3b-96edc0cf2bc6" containerID="d6626f18c4d55241c0910c46f125ab90840fcd25e84f3ab515dea20bc6197255" exitCode=0 Apr 02 13:59:01 crc kubenswrapper[4732]: I0402 13:59:01.840495 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-rjf6j" event={"ID":"38a12349-b9ff-4123-ba3b-96edc0cf2bc6","Type":"ContainerDied","Data":"d6626f18c4d55241c0910c46f125ab90840fcd25e84f3ab515dea20bc6197255"} Apr 02 13:59:01 crc kubenswrapper[4732]: I0402 13:59:01.853947 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-da70-account-create-update-pl545" podStartSLOduration=2.853931046 podStartE2EDuration="2.853931046s" podCreationTimestamp="2026-04-02 13:58:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:59:01.852259841 +0000 UTC m=+1298.756667414" watchObservedRunningTime="2026-04-02 13:59:01.853931046 +0000 UTC m=+1298.758338599" Apr 02 13:59:01 crc kubenswrapper[4732]: I0402 13:59:01.872541 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-56c3-account-create-update-xg9gr" podStartSLOduration=2.8725230059999998 podStartE2EDuration="2.872523006s" podCreationTimestamp="2026-04-02 13:58:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:59:01.86447313 +0000 UTC m=+1298.768880683" watchObservedRunningTime="2026-04-02 13:59:01.872523006 +0000 UTC m=+1298.776930559" Apr 02 13:59:01 crc kubenswrapper[4732]: I0402 13:59:01.905000 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-sjkqb" podStartSLOduration=2.904976261 podStartE2EDuration="2.904976261s" podCreationTimestamp="2026-04-02 13:58:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:59:01.886049251 +0000 UTC m=+1298.790456834" watchObservedRunningTime="2026-04-02 13:59:01.904976261 +0000 UTC m=+1298.809383824" Apr 02 13:59:02 crc kubenswrapper[4732]: I0402 13:59:02.064077 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-rjf6j" Apr 02 13:59:02 crc kubenswrapper[4732]: I0402 13:59:02.194332 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38a12349-b9ff-4123-ba3b-96edc0cf2bc6-config\") pod \"38a12349-b9ff-4123-ba3b-96edc0cf2bc6\" (UID: \"38a12349-b9ff-4123-ba3b-96edc0cf2bc6\") " Apr 02 13:59:02 crc kubenswrapper[4732]: I0402 13:59:02.194386 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38a12349-b9ff-4123-ba3b-96edc0cf2bc6-dns-svc\") pod \"38a12349-b9ff-4123-ba3b-96edc0cf2bc6\" (UID: \"38a12349-b9ff-4123-ba3b-96edc0cf2bc6\") " Apr 02 13:59:02 crc kubenswrapper[4732]: I0402 13:59:02.194559 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gh982\" (UniqueName: \"kubernetes.io/projected/38a12349-b9ff-4123-ba3b-96edc0cf2bc6-kube-api-access-gh982\") pod \"38a12349-b9ff-4123-ba3b-96edc0cf2bc6\" (UID: \"38a12349-b9ff-4123-ba3b-96edc0cf2bc6\") " Apr 02 13:59:02 crc kubenswrapper[4732]: I0402 13:59:02.200359 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38a12349-b9ff-4123-ba3b-96edc0cf2bc6-kube-api-access-gh982" (OuterVolumeSpecName: "kube-api-access-gh982") pod "38a12349-b9ff-4123-ba3b-96edc0cf2bc6" (UID: "38a12349-b9ff-4123-ba3b-96edc0cf2bc6"). InnerVolumeSpecName "kube-api-access-gh982". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:59:02 crc kubenswrapper[4732]: I0402 13:59:02.239577 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38a12349-b9ff-4123-ba3b-96edc0cf2bc6-config" (OuterVolumeSpecName: "config") pod "38a12349-b9ff-4123-ba3b-96edc0cf2bc6" (UID: "38a12349-b9ff-4123-ba3b-96edc0cf2bc6"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:59:02 crc kubenswrapper[4732]: I0402 13:59:02.241335 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38a12349-b9ff-4123-ba3b-96edc0cf2bc6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "38a12349-b9ff-4123-ba3b-96edc0cf2bc6" (UID: "38a12349-b9ff-4123-ba3b-96edc0cf2bc6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:59:02 crc kubenswrapper[4732]: I0402 13:59:02.297221 4732 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38a12349-b9ff-4123-ba3b-96edc0cf2bc6-dns-svc\") on node \"crc\" DevicePath \"\"" Apr 02 13:59:02 crc kubenswrapper[4732]: I0402 13:59:02.297278 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38a12349-b9ff-4123-ba3b-96edc0cf2bc6-config\") on node \"crc\" DevicePath \"\"" Apr 02 13:59:02 crc kubenswrapper[4732]: I0402 13:59:02.297313 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gh982\" (UniqueName: \"kubernetes.io/projected/38a12349-b9ff-4123-ba3b-96edc0cf2bc6-kube-api-access-gh982\") on node \"crc\" DevicePath \"\"" Apr 02 13:59:02 crc kubenswrapper[4732]: I0402 13:59:02.693986 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59006d63-1671-4062-92b6-8d7f5e36c778" path="/var/lib/kubelet/pods/59006d63-1671-4062-92b6-8d7f5e36c778/volumes" Apr 02 13:59:03 crc kubenswrapper[4732]: I0402 13:59:03.486821 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-rjf6j" Apr 02 13:59:03 crc kubenswrapper[4732]: I0402 13:59:03.486871 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-rjf6j" event={"ID":"38a12349-b9ff-4123-ba3b-96edc0cf2bc6","Type":"ContainerDied","Data":"3e787762a53d8a1b1175df1be8e3c94d07093c61c7dc9dfe099962875a02710e"} Apr 02 13:59:03 crc kubenswrapper[4732]: I0402 13:59:03.486914 4732 scope.go:117] "RemoveContainer" containerID="d6626f18c4d55241c0910c46f125ab90840fcd25e84f3ab515dea20bc6197255" Apr 02 13:59:03 crc kubenswrapper[4732]: I0402 13:59:03.513097 4732 scope.go:117] "RemoveContainer" containerID="3f135ea198a03663ed9d5efdf74058e2dfddec3f91057f0ffe0f8948dd604ea1" Apr 02 13:59:03 crc kubenswrapper[4732]: I0402 13:59:03.514298 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-rjf6j"] Apr 02 13:59:03 crc kubenswrapper[4732]: I0402 13:59:03.523532 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-rjf6j"] Apr 02 13:59:04 crc kubenswrapper[4732]: I0402 13:59:04.697346 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38a12349-b9ff-4123-ba3b-96edc0cf2bc6" path="/var/lib/kubelet/pods/38a12349-b9ff-4123-ba3b-96edc0cf2bc6/volumes" Apr 02 13:59:05 crc kubenswrapper[4732]: I0402 13:59:05.503420 4732 generic.go:334] "Generic (PLEG): container finished" podID="6b406167-98d8-46ae-8319-5734de01a2e0" containerID="16ed54a1e0a3aae64c8783d62ca5fc2d1642d99087645ba7ab6ae31450c56103" exitCode=0 Apr 02 13:59:05 crc kubenswrapper[4732]: I0402 13:59:05.503506 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-lmkhl" event={"ID":"6b406167-98d8-46ae-8319-5734de01a2e0","Type":"ContainerDied","Data":"16ed54a1e0a3aae64c8783d62ca5fc2d1642d99087645ba7ab6ae31450c56103"} Apr 02 13:59:06 crc kubenswrapper[4732]: I0402 13:59:06.233166 4732 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/ovn-northd-0" Apr 02 13:59:06 crc kubenswrapper[4732]: I0402 13:59:06.515666 4732 generic.go:334] "Generic (PLEG): container finished" podID="1d495f3e-102b-45f0-88e4-fe3777c11b97" containerID="90301a5b3770820faf4773b54429c8ae265f19bd4a26fa5f5d0b97834e08f405" exitCode=0 Apr 02 13:59:06 crc kubenswrapper[4732]: I0402 13:59:06.515764 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-sjkqb" event={"ID":"1d495f3e-102b-45f0-88e4-fe3777c11b97","Type":"ContainerDied","Data":"90301a5b3770820faf4773b54429c8ae265f19bd4a26fa5f5d0b97834e08f405"} Apr 02 13:59:06 crc kubenswrapper[4732]: I0402 13:59:06.793990 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-kbmg5"] Apr 02 13:59:06 crc kubenswrapper[4732]: E0402 13:59:06.794689 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38a12349-b9ff-4123-ba3b-96edc0cf2bc6" containerName="dnsmasq-dns" Apr 02 13:59:06 crc kubenswrapper[4732]: I0402 13:59:06.794712 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="38a12349-b9ff-4123-ba3b-96edc0cf2bc6" containerName="dnsmasq-dns" Apr 02 13:59:06 crc kubenswrapper[4732]: E0402 13:59:06.794745 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38a12349-b9ff-4123-ba3b-96edc0cf2bc6" containerName="init" Apr 02 13:59:06 crc kubenswrapper[4732]: I0402 13:59:06.794754 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="38a12349-b9ff-4123-ba3b-96edc0cf2bc6" containerName="init" Apr 02 13:59:06 crc kubenswrapper[4732]: I0402 13:59:06.794949 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="38a12349-b9ff-4123-ba3b-96edc0cf2bc6" containerName="dnsmasq-dns" Apr 02 13:59:06 crc kubenswrapper[4732]: I0402 13:59:06.795434 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-kbmg5" Apr 02 13:59:06 crc kubenswrapper[4732]: I0402 13:59:06.799462 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Apr 02 13:59:06 crc kubenswrapper[4732]: I0402 13:59:06.803369 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-kbmg5"] Apr 02 13:59:06 crc kubenswrapper[4732]: I0402 13:59:06.825697 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4shbk\" (UniqueName: \"kubernetes.io/projected/e2e69ec9-8296-4cbf-a039-82cd34ed3d72-kube-api-access-4shbk\") pod \"root-account-create-update-kbmg5\" (UID: \"e2e69ec9-8296-4cbf-a039-82cd34ed3d72\") " pod="openstack/root-account-create-update-kbmg5" Apr 02 13:59:06 crc kubenswrapper[4732]: I0402 13:59:06.825829 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e2e69ec9-8296-4cbf-a039-82cd34ed3d72-operator-scripts\") pod \"root-account-create-update-kbmg5\" (UID: \"e2e69ec9-8296-4cbf-a039-82cd34ed3d72\") " pod="openstack/root-account-create-update-kbmg5" Apr 02 13:59:06 crc kubenswrapper[4732]: I0402 13:59:06.927226 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4shbk\" (UniqueName: \"kubernetes.io/projected/e2e69ec9-8296-4cbf-a039-82cd34ed3d72-kube-api-access-4shbk\") pod \"root-account-create-update-kbmg5\" (UID: \"e2e69ec9-8296-4cbf-a039-82cd34ed3d72\") " pod="openstack/root-account-create-update-kbmg5" Apr 02 13:59:06 crc kubenswrapper[4732]: I0402 13:59:06.927331 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e2e69ec9-8296-4cbf-a039-82cd34ed3d72-operator-scripts\") pod \"root-account-create-update-kbmg5\" (UID: 
\"e2e69ec9-8296-4cbf-a039-82cd34ed3d72\") " pod="openstack/root-account-create-update-kbmg5" Apr 02 13:59:06 crc kubenswrapper[4732]: I0402 13:59:06.928652 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e2e69ec9-8296-4cbf-a039-82cd34ed3d72-operator-scripts\") pod \"root-account-create-update-kbmg5\" (UID: \"e2e69ec9-8296-4cbf-a039-82cd34ed3d72\") " pod="openstack/root-account-create-update-kbmg5" Apr 02 13:59:06 crc kubenswrapper[4732]: I0402 13:59:06.946476 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4shbk\" (UniqueName: \"kubernetes.io/projected/e2e69ec9-8296-4cbf-a039-82cd34ed3d72-kube-api-access-4shbk\") pod \"root-account-create-update-kbmg5\" (UID: \"e2e69ec9-8296-4cbf-a039-82cd34ed3d72\") " pod="openstack/root-account-create-update-kbmg5" Apr 02 13:59:07 crc kubenswrapper[4732]: I0402 13:59:07.125223 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-kbmg5" Apr 02 13:59:07 crc kubenswrapper[4732]: I0402 13:59:07.526161 4732 generic.go:334] "Generic (PLEG): container finished" podID="7907b3e6-ba65-4bc8-bf30-dd79cab266b5" containerID="96ef04e158bb2a83cd3bf3c9ac26655c113bf9e195ec6e34cfd8766d4564a311" exitCode=0 Apr 02 13:59:07 crc kubenswrapper[4732]: I0402 13:59:07.526271 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-da70-account-create-update-pl545" event={"ID":"7907b3e6-ba65-4bc8-bf30-dd79cab266b5","Type":"ContainerDied","Data":"96ef04e158bb2a83cd3bf3c9ac26655c113bf9e195ec6e34cfd8766d4564a311"} Apr 02 13:59:07 crc kubenswrapper[4732]: I0402 13:59:07.528842 4732 generic.go:334] "Generic (PLEG): container finished" podID="90e2fd4b-3c18-4414-93ea-6433d6c52f80" containerID="f49b0d66e2113c8b4a9bfd08a0ad2f5202af6eb438aeb031770c1c903d8ee3b9" exitCode=0 Apr 02 13:59:07 crc kubenswrapper[4732]: I0402 13:59:07.528873 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-56c3-account-create-update-xg9gr" event={"ID":"90e2fd4b-3c18-4414-93ea-6433d6c52f80","Type":"ContainerDied","Data":"f49b0d66e2113c8b4a9bfd08a0ad2f5202af6eb438aeb031770c1c903d8ee3b9"} Apr 02 13:59:07 crc kubenswrapper[4732]: I0402 13:59:07.741946 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/529a332e-d2c3-49c5-86d5-e672811d00cd-etc-swift\") pod \"swift-storage-0\" (UID: \"529a332e-d2c3-49c5-86d5-e672811d00cd\") " pod="openstack/swift-storage-0" Apr 02 13:59:07 crc kubenswrapper[4732]: E0402 13:59:07.742064 4732 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Apr 02 13:59:07 crc kubenswrapper[4732]: E0402 13:59:07.742429 4732 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Apr 02 13:59:07 crc 
kubenswrapper[4732]: E0402 13:59:07.742503 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/529a332e-d2c3-49c5-86d5-e672811d00cd-etc-swift podName:529a332e-d2c3-49c5-86d5-e672811d00cd nodeName:}" failed. No retries permitted until 2026-04-02 13:59:23.742478506 +0000 UTC m=+1320.646886119 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/529a332e-d2c3-49c5-86d5-e672811d00cd-etc-swift") pod "swift-storage-0" (UID: "529a332e-d2c3-49c5-86d5-e672811d00cd") : configmap "swift-ring-files" not found Apr 02 13:59:09 crc kubenswrapper[4732]: I0402 13:59:09.439851 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-5222s" podUID="8af6391f-4f8b-4473-8e7c-186c9c838527" containerName="ovn-controller" probeResult="failure" output=< Apr 02 13:59:09 crc kubenswrapper[4732]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Apr 02 13:59:09 crc kubenswrapper[4732]: > Apr 02 13:59:10 crc kubenswrapper[4732]: I0402 13:59:10.570869 4732 generic.go:334] "Generic (PLEG): container finished" podID="81138fff-8b7c-4cf3-8aa5-2582d80483e1" containerID="22fb6eb8d107d107c9cbc9c086c306a92277eebd8dc426055926610f504f12e6" exitCode=0 Apr 02 13:59:10 crc kubenswrapper[4732]: I0402 13:59:10.570956 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-wsskk" event={"ID":"81138fff-8b7c-4cf3-8aa5-2582d80483e1","Type":"ContainerDied","Data":"22fb6eb8d107d107c9cbc9c086c306a92277eebd8dc426055926610f504f12e6"} Apr 02 13:59:11 crc kubenswrapper[4732]: I0402 13:59:11.579325 4732 generic.go:334] "Generic (PLEG): container finished" podID="b0bb93d2-9da7-4667-9079-b403332d31e0" containerID="991fee5892b181cbf9eaa8f0e526c1dca54ed5e2932b158ac3a0bf0139afeaf4" exitCode=0 Apr 02 13:59:11 crc kubenswrapper[4732]: I0402 13:59:11.579870 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b0bb93d2-9da7-4667-9079-b403332d31e0","Type":"ContainerDied","Data":"991fee5892b181cbf9eaa8f0e526c1dca54ed5e2932b158ac3a0bf0139afeaf4"} Apr 02 13:59:11 crc kubenswrapper[4732]: I0402 13:59:11.582750 4732 generic.go:334] "Generic (PLEG): container finished" podID="56762f05-a513-4f47-8cf7-5d19bb58c5bd" containerID="a107fd73961c43c85df6b57282cf11a8acd3acae427dc6edddcade0f4ab33f9d" exitCode=0 Apr 02 13:59:11 crc kubenswrapper[4732]: I0402 13:59:11.582938 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"56762f05-a513-4f47-8cf7-5d19bb58c5bd","Type":"ContainerDied","Data":"a107fd73961c43c85df6b57282cf11a8acd3acae427dc6edddcade0f4ab33f9d"} Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.194341 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-da70-account-create-update-pl545" Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.220426 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-lmkhl" Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.242057 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-56c3-account-create-update-xg9gr" Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.249165 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-wsskk" Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.254147 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-sjkqb" Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.332210 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twp2z\" (UniqueName: \"kubernetes.io/projected/6b406167-98d8-46ae-8319-5734de01a2e0-kube-api-access-twp2z\") pod \"6b406167-98d8-46ae-8319-5734de01a2e0\" (UID: \"6b406167-98d8-46ae-8319-5734de01a2e0\") " Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.332261 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b406167-98d8-46ae-8319-5734de01a2e0-operator-scripts\") pod \"6b406167-98d8-46ae-8319-5734de01a2e0\" (UID: \"6b406167-98d8-46ae-8319-5734de01a2e0\") " Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.332291 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7d49\" (UniqueName: \"kubernetes.io/projected/7907b3e6-ba65-4bc8-bf30-dd79cab266b5-kube-api-access-d7d49\") pod \"7907b3e6-ba65-4bc8-bf30-dd79cab266b5\" (UID: \"7907b3e6-ba65-4bc8-bf30-dd79cab266b5\") " Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.332319 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81138fff-8b7c-4cf3-8aa5-2582d80483e1-combined-ca-bundle\") pod \"81138fff-8b7c-4cf3-8aa5-2582d80483e1\" (UID: \"81138fff-8b7c-4cf3-8aa5-2582d80483e1\") " Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.332337 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d495f3e-102b-45f0-88e4-fe3777c11b97-operator-scripts\") pod \"1d495f3e-102b-45f0-88e4-fe3777c11b97\" (UID: \"1d495f3e-102b-45f0-88e4-fe3777c11b97\") " Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.332359 4732 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/81138fff-8b7c-4cf3-8aa5-2582d80483e1-dispersionconf\") pod \"81138fff-8b7c-4cf3-8aa5-2582d80483e1\" (UID: \"81138fff-8b7c-4cf3-8aa5-2582d80483e1\") " Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.332424 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpzm5\" (UniqueName: \"kubernetes.io/projected/90e2fd4b-3c18-4414-93ea-6433d6c52f80-kube-api-access-mpzm5\") pod \"90e2fd4b-3c18-4414-93ea-6433d6c52f80\" (UID: \"90e2fd4b-3c18-4414-93ea-6433d6c52f80\") " Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.332447 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/81138fff-8b7c-4cf3-8aa5-2582d80483e1-etc-swift\") pod \"81138fff-8b7c-4cf3-8aa5-2582d80483e1\" (UID: \"81138fff-8b7c-4cf3-8aa5-2582d80483e1\") " Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.332483 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/81138fff-8b7c-4cf3-8aa5-2582d80483e1-scripts\") pod \"81138fff-8b7c-4cf3-8aa5-2582d80483e1\" (UID: \"81138fff-8b7c-4cf3-8aa5-2582d80483e1\") " Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.332510 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8fkm\" (UniqueName: \"kubernetes.io/projected/1d495f3e-102b-45f0-88e4-fe3777c11b97-kube-api-access-m8fkm\") pod \"1d495f3e-102b-45f0-88e4-fe3777c11b97\" (UID: \"1d495f3e-102b-45f0-88e4-fe3777c11b97\") " Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.332532 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7907b3e6-ba65-4bc8-bf30-dd79cab266b5-operator-scripts\") pod \"7907b3e6-ba65-4bc8-bf30-dd79cab266b5\" (UID: 
\"7907b3e6-ba65-4bc8-bf30-dd79cab266b5\") " Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.332575 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/81138fff-8b7c-4cf3-8aa5-2582d80483e1-ring-data-devices\") pod \"81138fff-8b7c-4cf3-8aa5-2582d80483e1\" (UID: \"81138fff-8b7c-4cf3-8aa5-2582d80483e1\") " Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.332597 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/81138fff-8b7c-4cf3-8aa5-2582d80483e1-swiftconf\") pod \"81138fff-8b7c-4cf3-8aa5-2582d80483e1\" (UID: \"81138fff-8b7c-4cf3-8aa5-2582d80483e1\") " Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.332626 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90e2fd4b-3c18-4414-93ea-6433d6c52f80-operator-scripts\") pod \"90e2fd4b-3c18-4414-93ea-6433d6c52f80\" (UID: \"90e2fd4b-3c18-4414-93ea-6433d6c52f80\") " Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.332656 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkzc9\" (UniqueName: \"kubernetes.io/projected/81138fff-8b7c-4cf3-8aa5-2582d80483e1-kube-api-access-fkzc9\") pod \"81138fff-8b7c-4cf3-8aa5-2582d80483e1\" (UID: \"81138fff-8b7c-4cf3-8aa5-2582d80483e1\") " Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.333809 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7907b3e6-ba65-4bc8-bf30-dd79cab266b5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7907b3e6-ba65-4bc8-bf30-dd79cab266b5" (UID: "7907b3e6-ba65-4bc8-bf30-dd79cab266b5"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.334431 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81138fff-8b7c-4cf3-8aa5-2582d80483e1-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "81138fff-8b7c-4cf3-8aa5-2582d80483e1" (UID: "81138fff-8b7c-4cf3-8aa5-2582d80483e1"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.335083 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d495f3e-102b-45f0-88e4-fe3777c11b97-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1d495f3e-102b-45f0-88e4-fe3777c11b97" (UID: "1d495f3e-102b-45f0-88e4-fe3777c11b97"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.338094 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81138fff-8b7c-4cf3-8aa5-2582d80483e1-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "81138fff-8b7c-4cf3-8aa5-2582d80483e1" (UID: "81138fff-8b7c-4cf3-8aa5-2582d80483e1"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.338655 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90e2fd4b-3c18-4414-93ea-6433d6c52f80-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "90e2fd4b-3c18-4414-93ea-6433d6c52f80" (UID: "90e2fd4b-3c18-4414-93ea-6433d6c52f80"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.338673 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b406167-98d8-46ae-8319-5734de01a2e0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6b406167-98d8-46ae-8319-5734de01a2e0" (UID: "6b406167-98d8-46ae-8319-5734de01a2e0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.340056 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7907b3e6-ba65-4bc8-bf30-dd79cab266b5-kube-api-access-d7d49" (OuterVolumeSpecName: "kube-api-access-d7d49") pod "7907b3e6-ba65-4bc8-bf30-dd79cab266b5" (UID: "7907b3e6-ba65-4bc8-bf30-dd79cab266b5"). InnerVolumeSpecName "kube-api-access-d7d49". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.343014 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81138fff-8b7c-4cf3-8aa5-2582d80483e1-kube-api-access-fkzc9" (OuterVolumeSpecName: "kube-api-access-fkzc9") pod "81138fff-8b7c-4cf3-8aa5-2582d80483e1" (UID: "81138fff-8b7c-4cf3-8aa5-2582d80483e1"). InnerVolumeSpecName "kube-api-access-fkzc9". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.343575 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90e2fd4b-3c18-4414-93ea-6433d6c52f80-kube-api-access-mpzm5" (OuterVolumeSpecName: "kube-api-access-mpzm5") pod "90e2fd4b-3c18-4414-93ea-6433d6c52f80" (UID: "90e2fd4b-3c18-4414-93ea-6433d6c52f80"). InnerVolumeSpecName "kube-api-access-mpzm5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.344031 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d495f3e-102b-45f0-88e4-fe3777c11b97-kube-api-access-m8fkm" (OuterVolumeSpecName: "kube-api-access-m8fkm") pod "1d495f3e-102b-45f0-88e4-fe3777c11b97" (UID: "1d495f3e-102b-45f0-88e4-fe3777c11b97"). InnerVolumeSpecName "kube-api-access-m8fkm". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.344802 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b406167-98d8-46ae-8319-5734de01a2e0-kube-api-access-twp2z" (OuterVolumeSpecName: "kube-api-access-twp2z") pod "6b406167-98d8-46ae-8319-5734de01a2e0" (UID: "6b406167-98d8-46ae-8319-5734de01a2e0"). InnerVolumeSpecName "kube-api-access-twp2z". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.347299 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81138fff-8b7c-4cf3-8aa5-2582d80483e1-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "81138fff-8b7c-4cf3-8aa5-2582d80483e1" (UID: "81138fff-8b7c-4cf3-8aa5-2582d80483e1"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.355517 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81138fff-8b7c-4cf3-8aa5-2582d80483e1-scripts" (OuterVolumeSpecName: "scripts") pod "81138fff-8b7c-4cf3-8aa5-2582d80483e1" (UID: "81138fff-8b7c-4cf3-8aa5-2582d80483e1"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.365817 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81138fff-8b7c-4cf3-8aa5-2582d80483e1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "81138fff-8b7c-4cf3-8aa5-2582d80483e1" (UID: "81138fff-8b7c-4cf3-8aa5-2582d80483e1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.383394 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81138fff-8b7c-4cf3-8aa5-2582d80483e1-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "81138fff-8b7c-4cf3-8aa5-2582d80483e1" (UID: "81138fff-8b7c-4cf3-8aa5-2582d80483e1"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.434467 4732 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/81138fff-8b7c-4cf3-8aa5-2582d80483e1-ring-data-devices\") on node \"crc\" DevicePath \"\"" Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.434505 4732 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/81138fff-8b7c-4cf3-8aa5-2582d80483e1-swiftconf\") on node \"crc\" DevicePath \"\"" Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.434518 4732 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90e2fd4b-3c18-4414-93ea-6433d6c52f80-operator-scripts\") on node \"crc\" DevicePath \"\"" Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.434532 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fkzc9\" (UniqueName: \"kubernetes.io/projected/81138fff-8b7c-4cf3-8aa5-2582d80483e1-kube-api-access-fkzc9\") on node \"crc\" 
DevicePath \"\"" Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.434548 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twp2z\" (UniqueName: \"kubernetes.io/projected/6b406167-98d8-46ae-8319-5734de01a2e0-kube-api-access-twp2z\") on node \"crc\" DevicePath \"\"" Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.434559 4732 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b406167-98d8-46ae-8319-5734de01a2e0-operator-scripts\") on node \"crc\" DevicePath \"\"" Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.434571 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7d49\" (UniqueName: \"kubernetes.io/projected/7907b3e6-ba65-4bc8-bf30-dd79cab266b5-kube-api-access-d7d49\") on node \"crc\" DevicePath \"\"" Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.434584 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81138fff-8b7c-4cf3-8aa5-2582d80483e1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.434596 4732 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d495f3e-102b-45f0-88e4-fe3777c11b97-operator-scripts\") on node \"crc\" DevicePath \"\"" Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.434604 4732 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/81138fff-8b7c-4cf3-8aa5-2582d80483e1-dispersionconf\") on node \"crc\" DevicePath \"\"" Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.434628 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mpzm5\" (UniqueName: \"kubernetes.io/projected/90e2fd4b-3c18-4414-93ea-6433d6c52f80-kube-api-access-mpzm5\") on node \"crc\" DevicePath \"\"" Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 
13:59:13.434638 4732 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/81138fff-8b7c-4cf3-8aa5-2582d80483e1-etc-swift\") on node \"crc\" DevicePath \"\"" Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.434650 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/81138fff-8b7c-4cf3-8aa5-2582d80483e1-scripts\") on node \"crc\" DevicePath \"\"" Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.434661 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8fkm\" (UniqueName: \"kubernetes.io/projected/1d495f3e-102b-45f0-88e4-fe3777c11b97-kube-api-access-m8fkm\") on node \"crc\" DevicePath \"\"" Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.434672 4732 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7907b3e6-ba65-4bc8-bf30-dd79cab266b5-operator-scripts\") on node \"crc\" DevicePath \"\"" Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.475697 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-kbmg5"] Apr 02 13:59:13 crc kubenswrapper[4732]: W0402 13:59:13.477931 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2e69ec9_8296_4cbf_a039_82cd34ed3d72.slice/crio-437697c35697ce874cf1b66b2d97e49cf54f4286e4d368ec02fe72bbba88d6c1 WatchSource:0}: Error finding container 437697c35697ce874cf1b66b2d97e49cf54f4286e4d368ec02fe72bbba88d6c1: Status 404 returned error can't find the container with id 437697c35697ce874cf1b66b2d97e49cf54f4286e4d368ec02fe72bbba88d6c1 Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.604842 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-sjkqb" Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.604870 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-sjkqb" event={"ID":"1d495f3e-102b-45f0-88e4-fe3777c11b97","Type":"ContainerDied","Data":"6364b56e479eaca0e359e0227d48dec3e1935b5bef53673b05f6b8c268ac08f6"} Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.605278 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6364b56e479eaca0e359e0227d48dec3e1935b5bef53673b05f6b8c268ac08f6" Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.608692 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-lmkhl" event={"ID":"6b406167-98d8-46ae-8319-5734de01a2e0","Type":"ContainerDied","Data":"b3bc675fb21d789c6cbda294883250d13a00436ea63f6da46108786da035e8c7"} Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.608746 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3bc675fb21d789c6cbda294883250d13a00436ea63f6da46108786da035e8c7" Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.608817 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-lmkhl" Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.616634 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b0bb93d2-9da7-4667-9079-b403332d31e0","Type":"ContainerStarted","Data":"9717b2925f56b229efa61c94052925ad34c23f5599b515eb928ba046da74a28f"} Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.617715 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.627464 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-kbmg5" event={"ID":"e2e69ec9-8296-4cbf-a039-82cd34ed3d72","Type":"ContainerStarted","Data":"437697c35697ce874cf1b66b2d97e49cf54f4286e4d368ec02fe72bbba88d6c1"} Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.634338 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-da70-account-create-update-pl545" event={"ID":"7907b3e6-ba65-4bc8-bf30-dd79cab266b5","Type":"ContainerDied","Data":"086132d0d3617ed68d798ad9549736d50cca8b260a7b0c3fa34a9d6cb7ad9ca7"} Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.634383 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="086132d0d3617ed68d798ad9549736d50cca8b260a7b0c3fa34a9d6cb7ad9ca7" Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.634452 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-da70-account-create-update-pl545" Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.642246 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-wsskk" Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.642880 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-wsskk" event={"ID":"81138fff-8b7c-4cf3-8aa5-2582d80483e1","Type":"ContainerDied","Data":"85894cf01fad8b1a6c5ba1a9a6e4318ca380444c5ecac39f962576034fcbc3b8"} Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.642910 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85894cf01fad8b1a6c5ba1a9a6e4318ca380444c5ecac39f962576034fcbc3b8" Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.647297 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=51.380173342 podStartE2EDuration="1m0.647276045s" podCreationTimestamp="2026-04-02 13:58:13 +0000 UTC" firstStartedPulling="2026-04-02 13:58:29.511474132 +0000 UTC m=+1266.415881685" lastFinishedPulling="2026-04-02 13:58:38.778576835 +0000 UTC m=+1275.682984388" observedRunningTime="2026-04-02 13:59:13.642381573 +0000 UTC m=+1310.546789146" watchObservedRunningTime="2026-04-02 13:59:13.647276045 +0000 UTC m=+1310.551683628" Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.654268 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-56c3-account-create-update-xg9gr" Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.654355 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-56c3-account-create-update-xg9gr" event={"ID":"90e2fd4b-3c18-4414-93ea-6433d6c52f80","Type":"ContainerDied","Data":"d0b11520f05ce2216b4d53a5fb0d4eef75f1640848f1e09e40c663b613766002"} Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.654389 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0b11520f05ce2216b4d53a5fb0d4eef75f1640848f1e09e40c663b613766002" Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.661665 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"56762f05-a513-4f47-8cf7-5d19bb58c5bd","Type":"ContainerStarted","Data":"c59797a594162dd89aca027cf2a8334e52ae126e765ce2f5b5f5aca4eab7131a"} Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.661988 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.670881 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-kbmg5" podStartSLOduration=7.670859621 podStartE2EDuration="7.670859621s" podCreationTimestamp="2026-04-02 13:59:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:59:13.660406559 +0000 UTC m=+1310.564814122" watchObservedRunningTime="2026-04-02 13:59:13.670859621 +0000 UTC m=+1310.575267174" Apr 02 13:59:13 crc kubenswrapper[4732]: I0402 13:59:13.693343 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=51.584523188 podStartE2EDuration="1m0.693324916s" podCreationTimestamp="2026-04-02 13:58:13 +0000 UTC" firstStartedPulling="2026-04-02 
13:58:29.242419919 +0000 UTC m=+1266.146827472" lastFinishedPulling="2026-04-02 13:58:38.351221647 +0000 UTC m=+1275.255629200" observedRunningTime="2026-04-02 13:59:13.692288688 +0000 UTC m=+1310.596696261" watchObservedRunningTime="2026-04-02 13:59:13.693324916 +0000 UTC m=+1310.597732469" Apr 02 13:59:14 crc kubenswrapper[4732]: I0402 13:59:14.453673 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-5222s" podUID="8af6391f-4f8b-4473-8e7c-186c9c838527" containerName="ovn-controller" probeResult="failure" output=< Apr 02 13:59:14 crc kubenswrapper[4732]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Apr 02 13:59:14 crc kubenswrapper[4732]: > Apr 02 13:59:14 crc kubenswrapper[4732]: I0402 13:59:14.462464 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-l4ttl" Apr 02 13:59:14 crc kubenswrapper[4732]: I0402 13:59:14.465498 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-l4ttl" Apr 02 13:59:14 crc kubenswrapper[4732]: I0402 13:59:14.671629 4732 generic.go:334] "Generic (PLEG): container finished" podID="e2e69ec9-8296-4cbf-a039-82cd34ed3d72" containerID="b3657c45bbcc5ec422c65b30f15d71eaea3a630e99da5c680870af5be98a1057" exitCode=0 Apr 02 13:59:14 crc kubenswrapper[4732]: I0402 13:59:14.671771 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-kbmg5" event={"ID":"e2e69ec9-8296-4cbf-a039-82cd34ed3d72","Type":"ContainerDied","Data":"b3657c45bbcc5ec422c65b30f15d71eaea3a630e99da5c680870af5be98a1057"} Apr 02 13:59:14 crc kubenswrapper[4732]: I0402 13:59:14.674166 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-cs4gt" event={"ID":"e3eee308-f9e6-4475-a2a4-2116af760963","Type":"ContainerStarted","Data":"0a0032734f5fd52e26c5e0604d22f129133838bce5b2865c9a8ace616d48d77d"} Apr 02 13:59:14 crc 
kubenswrapper[4732]: I0402 13:59:14.711956 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-5222s-config-9rjtq"] Apr 02 13:59:14 crc kubenswrapper[4732]: E0402 13:59:14.712354 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d495f3e-102b-45f0-88e4-fe3777c11b97" containerName="mariadb-database-create" Apr 02 13:59:14 crc kubenswrapper[4732]: I0402 13:59:14.712376 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d495f3e-102b-45f0-88e4-fe3777c11b97" containerName="mariadb-database-create" Apr 02 13:59:14 crc kubenswrapper[4732]: E0402 13:59:14.712394 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81138fff-8b7c-4cf3-8aa5-2582d80483e1" containerName="swift-ring-rebalance" Apr 02 13:59:14 crc kubenswrapper[4732]: I0402 13:59:14.712401 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="81138fff-8b7c-4cf3-8aa5-2582d80483e1" containerName="swift-ring-rebalance" Apr 02 13:59:14 crc kubenswrapper[4732]: E0402 13:59:14.712415 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7907b3e6-ba65-4bc8-bf30-dd79cab266b5" containerName="mariadb-account-create-update" Apr 02 13:59:14 crc kubenswrapper[4732]: I0402 13:59:14.712421 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="7907b3e6-ba65-4bc8-bf30-dd79cab266b5" containerName="mariadb-account-create-update" Apr 02 13:59:14 crc kubenswrapper[4732]: E0402 13:59:14.712433 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90e2fd4b-3c18-4414-93ea-6433d6c52f80" containerName="mariadb-account-create-update" Apr 02 13:59:14 crc kubenswrapper[4732]: I0402 13:59:14.712439 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="90e2fd4b-3c18-4414-93ea-6433d6c52f80" containerName="mariadb-account-create-update" Apr 02 13:59:14 crc kubenswrapper[4732]: E0402 13:59:14.712452 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b406167-98d8-46ae-8319-5734de01a2e0" 
containerName="mariadb-database-create" Apr 02 13:59:14 crc kubenswrapper[4732]: I0402 13:59:14.712460 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b406167-98d8-46ae-8319-5734de01a2e0" containerName="mariadb-database-create" Apr 02 13:59:14 crc kubenswrapper[4732]: I0402 13:59:14.712689 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d495f3e-102b-45f0-88e4-fe3777c11b97" containerName="mariadb-database-create" Apr 02 13:59:14 crc kubenswrapper[4732]: I0402 13:59:14.712711 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="81138fff-8b7c-4cf3-8aa5-2582d80483e1" containerName="swift-ring-rebalance" Apr 02 13:59:14 crc kubenswrapper[4732]: I0402 13:59:14.712718 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b406167-98d8-46ae-8319-5734de01a2e0" containerName="mariadb-database-create" Apr 02 13:59:14 crc kubenswrapper[4732]: I0402 13:59:14.712744 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="90e2fd4b-3c18-4414-93ea-6433d6c52f80" containerName="mariadb-account-create-update" Apr 02 13:59:14 crc kubenswrapper[4732]: I0402 13:59:14.712754 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="7907b3e6-ba65-4bc8-bf30-dd79cab266b5" containerName="mariadb-account-create-update" Apr 02 13:59:14 crc kubenswrapper[4732]: I0402 13:59:14.713348 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-5222s-config-9rjtq" Apr 02 13:59:14 crc kubenswrapper[4732]: I0402 13:59:14.714893 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Apr 02 13:59:14 crc kubenswrapper[4732]: I0402 13:59:14.722257 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-5222s-config-9rjtq"] Apr 02 13:59:14 crc kubenswrapper[4732]: I0402 13:59:14.724715 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-cs4gt" podStartSLOduration=3.485996832 podStartE2EDuration="16.724693844s" podCreationTimestamp="2026-04-02 13:58:58 +0000 UTC" firstStartedPulling="2026-04-02 13:58:59.848087902 +0000 UTC m=+1296.752495455" lastFinishedPulling="2026-04-02 13:59:13.086784914 +0000 UTC m=+1309.991192467" observedRunningTime="2026-04-02 13:59:14.714099958 +0000 UTC m=+1311.618507531" watchObservedRunningTime="2026-04-02 13:59:14.724693844 +0000 UTC m=+1311.629101397" Apr 02 13:59:14 crc kubenswrapper[4732]: I0402 13:59:14.760009 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/748727d6-c157-4a27-9e1d-a1acaaa57e1d-additional-scripts\") pod \"ovn-controller-5222s-config-9rjtq\" (UID: \"748727d6-c157-4a27-9e1d-a1acaaa57e1d\") " pod="openstack/ovn-controller-5222s-config-9rjtq" Apr 02 13:59:14 crc kubenswrapper[4732]: I0402 13:59:14.760123 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/748727d6-c157-4a27-9e1d-a1acaaa57e1d-var-log-ovn\") pod \"ovn-controller-5222s-config-9rjtq\" (UID: \"748727d6-c157-4a27-9e1d-a1acaaa57e1d\") " pod="openstack/ovn-controller-5222s-config-9rjtq" Apr 02 13:59:14 crc kubenswrapper[4732]: I0402 13:59:14.760155 4732 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/748727d6-c157-4a27-9e1d-a1acaaa57e1d-var-run-ovn\") pod \"ovn-controller-5222s-config-9rjtq\" (UID: \"748727d6-c157-4a27-9e1d-a1acaaa57e1d\") " pod="openstack/ovn-controller-5222s-config-9rjtq" Apr 02 13:59:14 crc kubenswrapper[4732]: I0402 13:59:14.760223 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/748727d6-c157-4a27-9e1d-a1acaaa57e1d-scripts\") pod \"ovn-controller-5222s-config-9rjtq\" (UID: \"748727d6-c157-4a27-9e1d-a1acaaa57e1d\") " pod="openstack/ovn-controller-5222s-config-9rjtq" Apr 02 13:59:14 crc kubenswrapper[4732]: I0402 13:59:14.760363 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fd8xx\" (UniqueName: \"kubernetes.io/projected/748727d6-c157-4a27-9e1d-a1acaaa57e1d-kube-api-access-fd8xx\") pod \"ovn-controller-5222s-config-9rjtq\" (UID: \"748727d6-c157-4a27-9e1d-a1acaaa57e1d\") " pod="openstack/ovn-controller-5222s-config-9rjtq" Apr 02 13:59:14 crc kubenswrapper[4732]: I0402 13:59:14.760399 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/748727d6-c157-4a27-9e1d-a1acaaa57e1d-var-run\") pod \"ovn-controller-5222s-config-9rjtq\" (UID: \"748727d6-c157-4a27-9e1d-a1acaaa57e1d\") " pod="openstack/ovn-controller-5222s-config-9rjtq" Apr 02 13:59:14 crc kubenswrapper[4732]: I0402 13:59:14.861507 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/748727d6-c157-4a27-9e1d-a1acaaa57e1d-scripts\") pod \"ovn-controller-5222s-config-9rjtq\" (UID: \"748727d6-c157-4a27-9e1d-a1acaaa57e1d\") " pod="openstack/ovn-controller-5222s-config-9rjtq" Apr 02 13:59:14 crc kubenswrapper[4732]: I0402 
13:59:14.861819 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fd8xx\" (UniqueName: \"kubernetes.io/projected/748727d6-c157-4a27-9e1d-a1acaaa57e1d-kube-api-access-fd8xx\") pod \"ovn-controller-5222s-config-9rjtq\" (UID: \"748727d6-c157-4a27-9e1d-a1acaaa57e1d\") " pod="openstack/ovn-controller-5222s-config-9rjtq"
Apr 02 13:59:14 crc kubenswrapper[4732]: I0402 13:59:14.861911 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/748727d6-c157-4a27-9e1d-a1acaaa57e1d-var-run\") pod \"ovn-controller-5222s-config-9rjtq\" (UID: \"748727d6-c157-4a27-9e1d-a1acaaa57e1d\") " pod="openstack/ovn-controller-5222s-config-9rjtq"
Apr 02 13:59:14 crc kubenswrapper[4732]: I0402 13:59:14.862096 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/748727d6-c157-4a27-9e1d-a1acaaa57e1d-additional-scripts\") pod \"ovn-controller-5222s-config-9rjtq\" (UID: \"748727d6-c157-4a27-9e1d-a1acaaa57e1d\") " pod="openstack/ovn-controller-5222s-config-9rjtq"
Apr 02 13:59:14 crc kubenswrapper[4732]: I0402 13:59:14.862214 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/748727d6-c157-4a27-9e1d-a1acaaa57e1d-var-log-ovn\") pod \"ovn-controller-5222s-config-9rjtq\" (UID: \"748727d6-c157-4a27-9e1d-a1acaaa57e1d\") " pod="openstack/ovn-controller-5222s-config-9rjtq"
Apr 02 13:59:14 crc kubenswrapper[4732]: I0402 13:59:14.862334 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/748727d6-c157-4a27-9e1d-a1acaaa57e1d-var-run-ovn\") pod \"ovn-controller-5222s-config-9rjtq\" (UID: \"748727d6-c157-4a27-9e1d-a1acaaa57e1d\") " pod="openstack/ovn-controller-5222s-config-9rjtq"
Apr 02 13:59:14 crc kubenswrapper[4732]: I0402 13:59:14.862285 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/748727d6-c157-4a27-9e1d-a1acaaa57e1d-var-run\") pod \"ovn-controller-5222s-config-9rjtq\" (UID: \"748727d6-c157-4a27-9e1d-a1acaaa57e1d\") " pod="openstack/ovn-controller-5222s-config-9rjtq"
Apr 02 13:59:14 crc kubenswrapper[4732]: I0402 13:59:14.862285 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/748727d6-c157-4a27-9e1d-a1acaaa57e1d-var-log-ovn\") pod \"ovn-controller-5222s-config-9rjtq\" (UID: \"748727d6-c157-4a27-9e1d-a1acaaa57e1d\") " pod="openstack/ovn-controller-5222s-config-9rjtq"
Apr 02 13:59:14 crc kubenswrapper[4732]: I0402 13:59:14.862956 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/748727d6-c157-4a27-9e1d-a1acaaa57e1d-additional-scripts\") pod \"ovn-controller-5222s-config-9rjtq\" (UID: \"748727d6-c157-4a27-9e1d-a1acaaa57e1d\") " pod="openstack/ovn-controller-5222s-config-9rjtq"
Apr 02 13:59:14 crc kubenswrapper[4732]: I0402 13:59:14.863013 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/748727d6-c157-4a27-9e1d-a1acaaa57e1d-var-run-ovn\") pod \"ovn-controller-5222s-config-9rjtq\" (UID: \"748727d6-c157-4a27-9e1d-a1acaaa57e1d\") " pod="openstack/ovn-controller-5222s-config-9rjtq"
Apr 02 13:59:14 crc kubenswrapper[4732]: I0402 13:59:14.864114 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/748727d6-c157-4a27-9e1d-a1acaaa57e1d-scripts\") pod \"ovn-controller-5222s-config-9rjtq\" (UID: \"748727d6-c157-4a27-9e1d-a1acaaa57e1d\") " pod="openstack/ovn-controller-5222s-config-9rjtq"
Apr 02 13:59:14 crc kubenswrapper[4732]: I0402 13:59:14.887508 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fd8xx\" (UniqueName: \"kubernetes.io/projected/748727d6-c157-4a27-9e1d-a1acaaa57e1d-kube-api-access-fd8xx\") pod \"ovn-controller-5222s-config-9rjtq\" (UID: \"748727d6-c157-4a27-9e1d-a1acaaa57e1d\") " pod="openstack/ovn-controller-5222s-config-9rjtq"
Apr 02 13:59:15 crc kubenswrapper[4732]: I0402 13:59:15.032388 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-5222s-config-9rjtq"
Apr 02 13:59:15 crc kubenswrapper[4732]: I0402 13:59:15.455202 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-5222s-config-9rjtq"]
Apr 02 13:59:15 crc kubenswrapper[4732]: W0402 13:59:15.464402 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod748727d6_c157_4a27_9e1d_a1acaaa57e1d.slice/crio-04628f4bae4ee2828b88f1639718d70e22822b47e7e23b743428f9a2f2f061ce WatchSource:0}: Error finding container 04628f4bae4ee2828b88f1639718d70e22822b47e7e23b743428f9a2f2f061ce: Status 404 returned error can't find the container with id 04628f4bae4ee2828b88f1639718d70e22822b47e7e23b743428f9a2f2f061ce
Apr 02 13:59:15 crc kubenswrapper[4732]: I0402 13:59:15.687714 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-5222s-config-9rjtq" event={"ID":"748727d6-c157-4a27-9e1d-a1acaaa57e1d","Type":"ContainerStarted","Data":"04628f4bae4ee2828b88f1639718d70e22822b47e7e23b743428f9a2f2f061ce"}
Apr 02 13:59:16 crc kubenswrapper[4732]: I0402 13:59:16.026387 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-kbmg5"
Apr 02 13:59:16 crc kubenswrapper[4732]: I0402 13:59:16.081238 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4shbk\" (UniqueName: \"kubernetes.io/projected/e2e69ec9-8296-4cbf-a039-82cd34ed3d72-kube-api-access-4shbk\") pod \"e2e69ec9-8296-4cbf-a039-82cd34ed3d72\" (UID: \"e2e69ec9-8296-4cbf-a039-82cd34ed3d72\") "
Apr 02 13:59:16 crc kubenswrapper[4732]: I0402 13:59:16.081501 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e2e69ec9-8296-4cbf-a039-82cd34ed3d72-operator-scripts\") pod \"e2e69ec9-8296-4cbf-a039-82cd34ed3d72\" (UID: \"e2e69ec9-8296-4cbf-a039-82cd34ed3d72\") "
Apr 02 13:59:16 crc kubenswrapper[4732]: I0402 13:59:16.082323 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2e69ec9-8296-4cbf-a039-82cd34ed3d72-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e2e69ec9-8296-4cbf-a039-82cd34ed3d72" (UID: "e2e69ec9-8296-4cbf-a039-82cd34ed3d72"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 13:59:16 crc kubenswrapper[4732]: I0402 13:59:16.091071 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2e69ec9-8296-4cbf-a039-82cd34ed3d72-kube-api-access-4shbk" (OuterVolumeSpecName: "kube-api-access-4shbk") pod "e2e69ec9-8296-4cbf-a039-82cd34ed3d72" (UID: "e2e69ec9-8296-4cbf-a039-82cd34ed3d72"). InnerVolumeSpecName "kube-api-access-4shbk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 13:59:16 crc kubenswrapper[4732]: I0402 13:59:16.183860 4732 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e2e69ec9-8296-4cbf-a039-82cd34ed3d72-operator-scripts\") on node \"crc\" DevicePath \"\""
Apr 02 13:59:16 crc kubenswrapper[4732]: I0402 13:59:16.183906 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4shbk\" (UniqueName: \"kubernetes.io/projected/e2e69ec9-8296-4cbf-a039-82cd34ed3d72-kube-api-access-4shbk\") on node \"crc\" DevicePath \"\""
Apr 02 13:59:16 crc kubenswrapper[4732]: I0402 13:59:16.696340 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-kbmg5"
Apr 02 13:59:16 crc kubenswrapper[4732]: I0402 13:59:16.696389 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-kbmg5" event={"ID":"e2e69ec9-8296-4cbf-a039-82cd34ed3d72","Type":"ContainerDied","Data":"437697c35697ce874cf1b66b2d97e49cf54f4286e4d368ec02fe72bbba88d6c1"}
Apr 02 13:59:16 crc kubenswrapper[4732]: I0402 13:59:16.696772 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="437697c35697ce874cf1b66b2d97e49cf54f4286e4d368ec02fe72bbba88d6c1"
Apr 02 13:59:16 crc kubenswrapper[4732]: I0402 13:59:16.697490 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-5222s-config-9rjtq" event={"ID":"748727d6-c157-4a27-9e1d-a1acaaa57e1d","Type":"ContainerStarted","Data":"1f06a6a01b70864db610c9b5a52f02e68dca501f9a004dbc96b89f1f376d64ba"}
Apr 02 13:59:17 crc kubenswrapper[4732]: I0402 13:59:17.707215 4732 generic.go:334] "Generic (PLEG): container finished" podID="748727d6-c157-4a27-9e1d-a1acaaa57e1d" containerID="1f06a6a01b70864db610c9b5a52f02e68dca501f9a004dbc96b89f1f376d64ba" exitCode=0
Apr 02 13:59:17 crc kubenswrapper[4732]: I0402 13:59:17.707284 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-5222s-config-9rjtq" event={"ID":"748727d6-c157-4a27-9e1d-a1acaaa57e1d","Type":"ContainerDied","Data":"1f06a6a01b70864db610c9b5a52f02e68dca501f9a004dbc96b89f1f376d64ba"}
Apr 02 13:59:19 crc kubenswrapper[4732]: I0402 13:59:19.023009 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-5222s-config-9rjtq"
Apr 02 13:59:19 crc kubenswrapper[4732]: I0402 13:59:19.132762 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/748727d6-c157-4a27-9e1d-a1acaaa57e1d-var-log-ovn\") pod \"748727d6-c157-4a27-9e1d-a1acaaa57e1d\" (UID: \"748727d6-c157-4a27-9e1d-a1acaaa57e1d\") "
Apr 02 13:59:19 crc kubenswrapper[4732]: I0402 13:59:19.132994 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/748727d6-c157-4a27-9e1d-a1acaaa57e1d-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "748727d6-c157-4a27-9e1d-a1acaaa57e1d" (UID: "748727d6-c157-4a27-9e1d-a1acaaa57e1d"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Apr 02 13:59:19 crc kubenswrapper[4732]: I0402 13:59:19.133841 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/748727d6-c157-4a27-9e1d-a1acaaa57e1d-additional-scripts\") pod \"748727d6-c157-4a27-9e1d-a1acaaa57e1d\" (UID: \"748727d6-c157-4a27-9e1d-a1acaaa57e1d\") "
Apr 02 13:59:19 crc kubenswrapper[4732]: I0402 13:59:19.134072 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/748727d6-c157-4a27-9e1d-a1acaaa57e1d-scripts\") pod \"748727d6-c157-4a27-9e1d-a1acaaa57e1d\" (UID: \"748727d6-c157-4a27-9e1d-a1acaaa57e1d\") "
Apr 02 13:59:19 crc kubenswrapper[4732]: I0402 13:59:19.134119 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/748727d6-c157-4a27-9e1d-a1acaaa57e1d-var-run\") pod \"748727d6-c157-4a27-9e1d-a1acaaa57e1d\" (UID: \"748727d6-c157-4a27-9e1d-a1acaaa57e1d\") "
Apr 02 13:59:19 crc kubenswrapper[4732]: I0402 13:59:19.134189 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/748727d6-c157-4a27-9e1d-a1acaaa57e1d-var-run-ovn\") pod \"748727d6-c157-4a27-9e1d-a1acaaa57e1d\" (UID: \"748727d6-c157-4a27-9e1d-a1acaaa57e1d\") "
Apr 02 13:59:19 crc kubenswrapper[4732]: I0402 13:59:19.134264 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/748727d6-c157-4a27-9e1d-a1acaaa57e1d-var-run" (OuterVolumeSpecName: "var-run") pod "748727d6-c157-4a27-9e1d-a1acaaa57e1d" (UID: "748727d6-c157-4a27-9e1d-a1acaaa57e1d"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Apr 02 13:59:19 crc kubenswrapper[4732]: I0402 13:59:19.134332 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fd8xx\" (UniqueName: \"kubernetes.io/projected/748727d6-c157-4a27-9e1d-a1acaaa57e1d-kube-api-access-fd8xx\") pod \"748727d6-c157-4a27-9e1d-a1acaaa57e1d\" (UID: \"748727d6-c157-4a27-9e1d-a1acaaa57e1d\") "
Apr 02 13:59:19 crc kubenswrapper[4732]: I0402 13:59:19.134361 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/748727d6-c157-4a27-9e1d-a1acaaa57e1d-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "748727d6-c157-4a27-9e1d-a1acaaa57e1d" (UID: "748727d6-c157-4a27-9e1d-a1acaaa57e1d"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Apr 02 13:59:19 crc kubenswrapper[4732]: I0402 13:59:19.134630 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/748727d6-c157-4a27-9e1d-a1acaaa57e1d-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "748727d6-c157-4a27-9e1d-a1acaaa57e1d" (UID: "748727d6-c157-4a27-9e1d-a1acaaa57e1d"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 13:59:19 crc kubenswrapper[4732]: I0402 13:59:19.135004 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/748727d6-c157-4a27-9e1d-a1acaaa57e1d-scripts" (OuterVolumeSpecName: "scripts") pod "748727d6-c157-4a27-9e1d-a1acaaa57e1d" (UID: "748727d6-c157-4a27-9e1d-a1acaaa57e1d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 13:59:19 crc kubenswrapper[4732]: I0402 13:59:19.135361 4732 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/748727d6-c157-4a27-9e1d-a1acaaa57e1d-additional-scripts\") on node \"crc\" DevicePath \"\""
Apr 02 13:59:19 crc kubenswrapper[4732]: I0402 13:59:19.135387 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/748727d6-c157-4a27-9e1d-a1acaaa57e1d-scripts\") on node \"crc\" DevicePath \"\""
Apr 02 13:59:19 crc kubenswrapper[4732]: I0402 13:59:19.135397 4732 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/748727d6-c157-4a27-9e1d-a1acaaa57e1d-var-run\") on node \"crc\" DevicePath \"\""
Apr 02 13:59:19 crc kubenswrapper[4732]: I0402 13:59:19.135406 4732 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/748727d6-c157-4a27-9e1d-a1acaaa57e1d-var-run-ovn\") on node \"crc\" DevicePath \"\""
Apr 02 13:59:19 crc kubenswrapper[4732]: I0402 13:59:19.135416 4732 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/748727d6-c157-4a27-9e1d-a1acaaa57e1d-var-log-ovn\") on node \"crc\" DevicePath \"\""
Apr 02 13:59:19 crc kubenswrapper[4732]: I0402 13:59:19.145425 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/748727d6-c157-4a27-9e1d-a1acaaa57e1d-kube-api-access-fd8xx" (OuterVolumeSpecName: "kube-api-access-fd8xx") pod "748727d6-c157-4a27-9e1d-a1acaaa57e1d" (UID: "748727d6-c157-4a27-9e1d-a1acaaa57e1d"). InnerVolumeSpecName "kube-api-access-fd8xx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 13:59:19 crc kubenswrapper[4732]: I0402 13:59:19.237327 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fd8xx\" (UniqueName: \"kubernetes.io/projected/748727d6-c157-4a27-9e1d-a1acaaa57e1d-kube-api-access-fd8xx\") on node \"crc\" DevicePath \"\""
Apr 02 13:59:19 crc kubenswrapper[4732]: I0402 13:59:19.447496 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-5222s"
Apr 02 13:59:19 crc kubenswrapper[4732]: I0402 13:59:19.724607 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-5222s-config-9rjtq" event={"ID":"748727d6-c157-4a27-9e1d-a1acaaa57e1d","Type":"ContainerDied","Data":"04628f4bae4ee2828b88f1639718d70e22822b47e7e23b743428f9a2f2f061ce"}
Apr 02 13:59:19 crc kubenswrapper[4732]: I0402 13:59:19.724672 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-5222s-config-9rjtq"
Apr 02 13:59:19 crc kubenswrapper[4732]: I0402 13:59:19.724681 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04628f4bae4ee2828b88f1639718d70e22822b47e7e23b743428f9a2f2f061ce"
Apr 02 13:59:19 crc kubenswrapper[4732]: I0402 13:59:19.846370 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-5222s-config-9rjtq"]
Apr 02 13:59:19 crc kubenswrapper[4732]: I0402 13:59:19.854127 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-5222s-config-9rjtq"]
Apr 02 13:59:20 crc kubenswrapper[4732]: I0402 13:59:20.691341 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="748727d6-c157-4a27-9e1d-a1acaaa57e1d" path="/var/lib/kubelet/pods/748727d6-c157-4a27-9e1d-a1acaaa57e1d/volumes"
Apr 02 13:59:21 crc kubenswrapper[4732]: I0402 13:59:21.750523 4732 generic.go:334] "Generic (PLEG): container finished" podID="e3eee308-f9e6-4475-a2a4-2116af760963" containerID="0a0032734f5fd52e26c5e0604d22f129133838bce5b2865c9a8ace616d48d77d" exitCode=0
Apr 02 13:59:21 crc kubenswrapper[4732]: I0402 13:59:21.750594 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-cs4gt" event={"ID":"e3eee308-f9e6-4475-a2a4-2116af760963","Type":"ContainerDied","Data":"0a0032734f5fd52e26c5e0604d22f129133838bce5b2865c9a8ace616d48d77d"}
Apr 02 13:59:23 crc kubenswrapper[4732]: I0402 13:59:23.148299 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-cs4gt"
Apr 02 13:59:23 crc kubenswrapper[4732]: I0402 13:59:23.206353 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3eee308-f9e6-4475-a2a4-2116af760963-config-data\") pod \"e3eee308-f9e6-4475-a2a4-2116af760963\" (UID: \"e3eee308-f9e6-4475-a2a4-2116af760963\") "
Apr 02 13:59:23 crc kubenswrapper[4732]: I0402 13:59:23.206556 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e3eee308-f9e6-4475-a2a4-2116af760963-db-sync-config-data\") pod \"e3eee308-f9e6-4475-a2a4-2116af760963\" (UID: \"e3eee308-f9e6-4475-a2a4-2116af760963\") "
Apr 02 13:59:23 crc kubenswrapper[4732]: I0402 13:59:23.206602 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3eee308-f9e6-4475-a2a4-2116af760963-combined-ca-bundle\") pod \"e3eee308-f9e6-4475-a2a4-2116af760963\" (UID: \"e3eee308-f9e6-4475-a2a4-2116af760963\") "
Apr 02 13:59:23 crc kubenswrapper[4732]: I0402 13:59:23.206641 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6gnf\" (UniqueName: \"kubernetes.io/projected/e3eee308-f9e6-4475-a2a4-2116af760963-kube-api-access-s6gnf\") pod \"e3eee308-f9e6-4475-a2a4-2116af760963\" (UID: \"e3eee308-f9e6-4475-a2a4-2116af760963\") "
Apr 02 13:59:23 crc kubenswrapper[4732]: I0402 13:59:23.211717 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3eee308-f9e6-4475-a2a4-2116af760963-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "e3eee308-f9e6-4475-a2a4-2116af760963" (UID: "e3eee308-f9e6-4475-a2a4-2116af760963"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 13:59:23 crc kubenswrapper[4732]: I0402 13:59:23.211934 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3eee308-f9e6-4475-a2a4-2116af760963-kube-api-access-s6gnf" (OuterVolumeSpecName: "kube-api-access-s6gnf") pod "e3eee308-f9e6-4475-a2a4-2116af760963" (UID: "e3eee308-f9e6-4475-a2a4-2116af760963"). InnerVolumeSpecName "kube-api-access-s6gnf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 13:59:23 crc kubenswrapper[4732]: I0402 13:59:23.229811 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3eee308-f9e6-4475-a2a4-2116af760963-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e3eee308-f9e6-4475-a2a4-2116af760963" (UID: "e3eee308-f9e6-4475-a2a4-2116af760963"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 13:59:23 crc kubenswrapper[4732]: I0402 13:59:23.252109 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3eee308-f9e6-4475-a2a4-2116af760963-config-data" (OuterVolumeSpecName: "config-data") pod "e3eee308-f9e6-4475-a2a4-2116af760963" (UID: "e3eee308-f9e6-4475-a2a4-2116af760963"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 13:59:23 crc kubenswrapper[4732]: I0402 13:59:23.308336 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6gnf\" (UniqueName: \"kubernetes.io/projected/e3eee308-f9e6-4475-a2a4-2116af760963-kube-api-access-s6gnf\") on node \"crc\" DevicePath \"\""
Apr 02 13:59:23 crc kubenswrapper[4732]: I0402 13:59:23.308361 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3eee308-f9e6-4475-a2a4-2116af760963-config-data\") on node \"crc\" DevicePath \"\""
Apr 02 13:59:23 crc kubenswrapper[4732]: I0402 13:59:23.308371 4732 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e3eee308-f9e6-4475-a2a4-2116af760963-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Apr 02 13:59:23 crc kubenswrapper[4732]: I0402 13:59:23.308381 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3eee308-f9e6-4475-a2a4-2116af760963-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Apr 02 13:59:23 crc kubenswrapper[4732]: I0402 13:59:23.769322 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-cs4gt" event={"ID":"e3eee308-f9e6-4475-a2a4-2116af760963","Type":"ContainerDied","Data":"8b32680cbed87d1485954a4f0720f3b315fb4a4b1c7699193cf12fdcd544d36d"}
Apr 02 13:59:23 crc kubenswrapper[4732]: I0402 13:59:23.769601 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b32680cbed87d1485954a4f0720f3b315fb4a4b1c7699193cf12fdcd544d36d"
Apr 02 13:59:23 crc kubenswrapper[4732]: I0402 13:59:23.769366 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-cs4gt"
Apr 02 13:59:23 crc kubenswrapper[4732]: I0402 13:59:23.816667 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/529a332e-d2c3-49c5-86d5-e672811d00cd-etc-swift\") pod \"swift-storage-0\" (UID: \"529a332e-d2c3-49c5-86d5-e672811d00cd\") " pod="openstack/swift-storage-0"
Apr 02 13:59:23 crc kubenswrapper[4732]: I0402 13:59:23.820867 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/529a332e-d2c3-49c5-86d5-e672811d00cd-etc-swift\") pod \"swift-storage-0\" (UID: \"529a332e-d2c3-49c5-86d5-e672811d00cd\") " pod="openstack/swift-storage-0"
Apr 02 13:59:23 crc kubenswrapper[4732]: I0402 13:59:23.928025 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Apr 02 13:59:24 crc kubenswrapper[4732]: I0402 13:59:24.215048 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74dc88fc-z6mt2"]
Apr 02 13:59:24 crc kubenswrapper[4732]: E0402 13:59:24.215850 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3eee308-f9e6-4475-a2a4-2116af760963" containerName="glance-db-sync"
Apr 02 13:59:24 crc kubenswrapper[4732]: I0402 13:59:24.215889 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3eee308-f9e6-4475-a2a4-2116af760963" containerName="glance-db-sync"
Apr 02 13:59:24 crc kubenswrapper[4732]: E0402 13:59:24.215902 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2e69ec9-8296-4cbf-a039-82cd34ed3d72" containerName="mariadb-account-create-update"
Apr 02 13:59:24 crc kubenswrapper[4732]: I0402 13:59:24.215910 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2e69ec9-8296-4cbf-a039-82cd34ed3d72" containerName="mariadb-account-create-update"
Apr 02 13:59:24 crc kubenswrapper[4732]: E0402 13:59:24.215920 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="748727d6-c157-4a27-9e1d-a1acaaa57e1d" containerName="ovn-config"
Apr 02 13:59:24 crc kubenswrapper[4732]: I0402 13:59:24.215926 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="748727d6-c157-4a27-9e1d-a1acaaa57e1d" containerName="ovn-config"
Apr 02 13:59:24 crc kubenswrapper[4732]: I0402 13:59:24.216216 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="748727d6-c157-4a27-9e1d-a1acaaa57e1d" containerName="ovn-config"
Apr 02 13:59:24 crc kubenswrapper[4732]: I0402 13:59:24.216243 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2e69ec9-8296-4cbf-a039-82cd34ed3d72" containerName="mariadb-account-create-update"
Apr 02 13:59:24 crc kubenswrapper[4732]: I0402 13:59:24.216256 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3eee308-f9e6-4475-a2a4-2116af760963" containerName="glance-db-sync"
Apr 02 13:59:24 crc kubenswrapper[4732]: I0402 13:59:24.217473 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74dc88fc-z6mt2"
Apr 02 13:59:24 crc kubenswrapper[4732]: I0402 13:59:24.247924 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74dc88fc-z6mt2"]
Apr 02 13:59:24 crc kubenswrapper[4732]: I0402 13:59:24.326131 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2bac9092-5574-4351-954b-b63659d28b2e-ovsdbserver-nb\") pod \"dnsmasq-dns-74dc88fc-z6mt2\" (UID: \"2bac9092-5574-4351-954b-b63659d28b2e\") " pod="openstack/dnsmasq-dns-74dc88fc-z6mt2"
Apr 02 13:59:24 crc kubenswrapper[4732]: I0402 13:59:24.326203 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bac9092-5574-4351-954b-b63659d28b2e-config\") pod \"dnsmasq-dns-74dc88fc-z6mt2\" (UID: \"2bac9092-5574-4351-954b-b63659d28b2e\") " pod="openstack/dnsmasq-dns-74dc88fc-z6mt2"
Apr 02 13:59:24 crc kubenswrapper[4732]: I0402 13:59:24.326266 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2bac9092-5574-4351-954b-b63659d28b2e-dns-svc\") pod \"dnsmasq-dns-74dc88fc-z6mt2\" (UID: \"2bac9092-5574-4351-954b-b63659d28b2e\") " pod="openstack/dnsmasq-dns-74dc88fc-z6mt2"
Apr 02 13:59:24 crc kubenswrapper[4732]: I0402 13:59:24.326298 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2bac9092-5574-4351-954b-b63659d28b2e-ovsdbserver-sb\") pod \"dnsmasq-dns-74dc88fc-z6mt2\" (UID: \"2bac9092-5574-4351-954b-b63659d28b2e\") " pod="openstack/dnsmasq-dns-74dc88fc-z6mt2"
Apr 02 13:59:24 crc kubenswrapper[4732]: I0402 13:59:24.326443 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s68rk\" (UniqueName: \"kubernetes.io/projected/2bac9092-5574-4351-954b-b63659d28b2e-kube-api-access-s68rk\") pod \"dnsmasq-dns-74dc88fc-z6mt2\" (UID: \"2bac9092-5574-4351-954b-b63659d28b2e\") " pod="openstack/dnsmasq-dns-74dc88fc-z6mt2"
Apr 02 13:59:24 crc kubenswrapper[4732]: I0402 13:59:24.427579 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s68rk\" (UniqueName: \"kubernetes.io/projected/2bac9092-5574-4351-954b-b63659d28b2e-kube-api-access-s68rk\") pod \"dnsmasq-dns-74dc88fc-z6mt2\" (UID: \"2bac9092-5574-4351-954b-b63659d28b2e\") " pod="openstack/dnsmasq-dns-74dc88fc-z6mt2"
Apr 02 13:59:24 crc kubenswrapper[4732]: I0402 13:59:24.427652 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2bac9092-5574-4351-954b-b63659d28b2e-ovsdbserver-nb\") pod \"dnsmasq-dns-74dc88fc-z6mt2\" (UID: \"2bac9092-5574-4351-954b-b63659d28b2e\") " pod="openstack/dnsmasq-dns-74dc88fc-z6mt2"
Apr 02 13:59:24 crc kubenswrapper[4732]: I0402 13:59:24.427687 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bac9092-5574-4351-954b-b63659d28b2e-config\") pod \"dnsmasq-dns-74dc88fc-z6mt2\" (UID: \"2bac9092-5574-4351-954b-b63659d28b2e\") " pod="openstack/dnsmasq-dns-74dc88fc-z6mt2"
Apr 02 13:59:24 crc kubenswrapper[4732]: I0402 13:59:24.427730 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2bac9092-5574-4351-954b-b63659d28b2e-dns-svc\") pod \"dnsmasq-dns-74dc88fc-z6mt2\" (UID: \"2bac9092-5574-4351-954b-b63659d28b2e\") " pod="openstack/dnsmasq-dns-74dc88fc-z6mt2"
Apr 02 13:59:24 crc kubenswrapper[4732]: I0402 13:59:24.427759 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2bac9092-5574-4351-954b-b63659d28b2e-ovsdbserver-sb\") pod \"dnsmasq-dns-74dc88fc-z6mt2\" (UID: \"2bac9092-5574-4351-954b-b63659d28b2e\") " pod="openstack/dnsmasq-dns-74dc88fc-z6mt2"
Apr 02 13:59:24 crc kubenswrapper[4732]: I0402 13:59:24.429276 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2bac9092-5574-4351-954b-b63659d28b2e-ovsdbserver-sb\") pod \"dnsmasq-dns-74dc88fc-z6mt2\" (UID: \"2bac9092-5574-4351-954b-b63659d28b2e\") " pod="openstack/dnsmasq-dns-74dc88fc-z6mt2"
Apr 02 13:59:24 crc kubenswrapper[4732]: I0402 13:59:24.430236 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2bac9092-5574-4351-954b-b63659d28b2e-ovsdbserver-nb\") pod \"dnsmasq-dns-74dc88fc-z6mt2\" (UID: \"2bac9092-5574-4351-954b-b63659d28b2e\") " pod="openstack/dnsmasq-dns-74dc88fc-z6mt2"
Apr 02 13:59:24 crc kubenswrapper[4732]: I0402 13:59:24.435205 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bac9092-5574-4351-954b-b63659d28b2e-config\") pod \"dnsmasq-dns-74dc88fc-z6mt2\" (UID: \"2bac9092-5574-4351-954b-b63659d28b2e\") " pod="openstack/dnsmasq-dns-74dc88fc-z6mt2"
Apr 02 13:59:24 crc kubenswrapper[4732]: I0402 13:59:24.435223 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2bac9092-5574-4351-954b-b63659d28b2e-dns-svc\") pod \"dnsmasq-dns-74dc88fc-z6mt2\" (UID: \"2bac9092-5574-4351-954b-b63659d28b2e\") " pod="openstack/dnsmasq-dns-74dc88fc-z6mt2"
Apr 02 13:59:24 crc kubenswrapper[4732]: I0402 13:59:24.458094 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s68rk\" (UniqueName: \"kubernetes.io/projected/2bac9092-5574-4351-954b-b63659d28b2e-kube-api-access-s68rk\") pod \"dnsmasq-dns-74dc88fc-z6mt2\" (UID: \"2bac9092-5574-4351-954b-b63659d28b2e\") " pod="openstack/dnsmasq-dns-74dc88fc-z6mt2"
Apr 02 13:59:24 crc kubenswrapper[4732]: I0402 13:59:24.555972 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74dc88fc-z6mt2"
Apr 02 13:59:24 crc kubenswrapper[4732]: I0402 13:59:24.604030 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Apr 02 13:59:24 crc kubenswrapper[4732]: I0402 13:59:24.771688 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="56762f05-a513-4f47-8cf7-5d19bb58c5bd" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.95:5671: connect: connection refused"
Apr 02 13:59:24 crc kubenswrapper[4732]: I0402 13:59:24.777535 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"529a332e-d2c3-49c5-86d5-e672811d00cd","Type":"ContainerStarted","Data":"598b226e229b0488b120c5ec7a69432d52b1886e004276020fd6f860c2ea9b81"}
Apr 02 13:59:25 crc kubenswrapper[4732]: I0402 13:59:25.019472 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74dc88fc-z6mt2"]
Apr 02 13:59:25 crc kubenswrapper[4732]: I0402 13:59:25.098841 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Apr 02 13:59:25 crc kubenswrapper[4732]: I0402 13:59:25.790717 4732 generic.go:334] "Generic (PLEG): container finished" podID="2bac9092-5574-4351-954b-b63659d28b2e" containerID="653cd427f0f645bd1f6210b42716a3dc12741571345aba90ba5dfe36a9a55ab6" exitCode=0
Apr 02 13:59:25 crc kubenswrapper[4732]: I0402 13:59:25.791250 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dc88fc-z6mt2" event={"ID":"2bac9092-5574-4351-954b-b63659d28b2e","Type":"ContainerDied","Data":"653cd427f0f645bd1f6210b42716a3dc12741571345aba90ba5dfe36a9a55ab6"}
Apr 02 13:59:25 crc kubenswrapper[4732]: I0402 13:59:25.791279 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dc88fc-z6mt2" event={"ID":"2bac9092-5574-4351-954b-b63659d28b2e","Type":"ContainerStarted","Data":"3695f9a19a3e2888e79be267f3868a9a0717d0496c26fd14f79367d77d2d3edd"}
Apr 02 13:59:26 crc kubenswrapper[4732]: I0402 13:59:26.800648 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"529a332e-d2c3-49c5-86d5-e672811d00cd","Type":"ContainerStarted","Data":"b6bcc3c958a7a732fa19c768bdedc54a1c40be95fa3a4ae1b2fd2dbdeba0b8f1"}
Apr 02 13:59:26 crc kubenswrapper[4732]: I0402 13:59:26.801042 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"529a332e-d2c3-49c5-86d5-e672811d00cd","Type":"ContainerStarted","Data":"245524d7e22b99428b0053fbc0525222227385fc5d3dc773ec75d1530ce37c43"}
Apr 02 13:59:26 crc kubenswrapper[4732]: I0402 13:59:26.801059 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"529a332e-d2c3-49c5-86d5-e672811d00cd","Type":"ContainerStarted","Data":"5137452dc3d3256fe9f0790519ebd0c13371f556cad3cb4c40a20e91db5a0e15"}
Apr 02 13:59:26 crc kubenswrapper[4732]: I0402 13:59:26.801068 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"529a332e-d2c3-49c5-86d5-e672811d00cd","Type":"ContainerStarted","Data":"4b0ec31d834a54db409c75b4665f8c7bb350a8081e8cad2f4ed48ac5daefd711"}
Apr 02 13:59:26 crc kubenswrapper[4732]: I0402 13:59:26.802378 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dc88fc-z6mt2" event={"ID":"2bac9092-5574-4351-954b-b63659d28b2e","Type":"ContainerStarted","Data":"b4395fba8e8e7e36aa9b94239e1295839384a286adc587510e0fb0d41b8a15ac"}
Apr 02 13:59:26 crc kubenswrapper[4732]: I0402 13:59:26.802534 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74dc88fc-z6mt2"
Apr 02 13:59:26 crc kubenswrapper[4732]: I0402 13:59:26.823571 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74dc88fc-z6mt2" podStartSLOduration=2.823553066 podStartE2EDuration="2.823553066s" podCreationTimestamp="2026-04-02 13:59:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:59:26.81667108 +0000 UTC m=+1323.721078653" watchObservedRunningTime="2026-04-02 13:59:26.823553066 +0000 UTC m=+1323.727960609"
Apr 02 13:59:28 crc kubenswrapper[4732]: I0402 13:59:28.826474 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"529a332e-d2c3-49c5-86d5-e672811d00cd","Type":"ContainerStarted","Data":"cd4baa4e358261c05d962ff5da98c4d3cc7b5ad67b7df266e94123f9f740a45d"}
Apr 02 13:59:29 crc kubenswrapper[4732]: I0402 13:59:29.838322 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"529a332e-d2c3-49c5-86d5-e672811d00cd","Type":"ContainerStarted","Data":"4c526f73a1ed56d169093d159a8a3a54b6265f60c9769eb164bab7551ed0d6a0"}
Apr 02 13:59:29 crc kubenswrapper[4732]: I0402 13:59:29.838651 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"529a332e-d2c3-49c5-86d5-e672811d00cd","Type":"ContainerStarted","Data":"0f8a5b5ba26dce20bd6a643fe4def6fe0c475d580c886488e3806b040590175f"}
Apr 02 13:59:29 crc kubenswrapper[4732]: I0402 13:59:29.838666 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"529a332e-d2c3-49c5-86d5-e672811d00cd","Type":"ContainerStarted","Data":"29199cc946b4c9f8fe918404daf1e6e9998f8d3d751f11120466e751b302d1ed"}
Apr 02 13:59:31 crc kubenswrapper[4732]: I0402 13:59:31.861758 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"529a332e-d2c3-49c5-86d5-e672811d00cd","Type":"ContainerStarted","Data":"dc0b3878dd1d8719731f10598af3ad7ec781fe50607006d9da09ba66a2c07aba"}
Apr 02 13:59:31 crc kubenswrapper[4732]: I0402 13:59:31.862992 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"529a332e-d2c3-49c5-86d5-e672811d00cd","Type":"ContainerStarted","Data":"9020b61a5d908ecd987f789eb0c94e6fe4a625b837f7a924f617b7cbb736c0b3"}
Apr 02 13:59:31 crc kubenswrapper[4732]: I0402 13:59:31.863007 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"529a332e-d2c3-49c5-86d5-e672811d00cd","Type":"ContainerStarted","Data":"322332d3848a5db450795d0e1a0b08909e20c2af221b3347f99483cf4e8211be"}
Apr 02 13:59:31 crc kubenswrapper[4732]: I0402 13:59:31.925045 4732 patch_prober.go:28] interesting pod/machine-config-daemon-6vtmw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Apr 02 13:59:31 crc kubenswrapper[4732]: I0402 13:59:31.925103 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Apr 02 13:59:32 crc kubenswrapper[4732]: I0402 13:59:32.876362 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"529a332e-d2c3-49c5-86d5-e672811d00cd","Type":"ContainerStarted","Data":"2df5c8806e090fcd8921be29ac6e73076df45059b1db28f1feee1fc802ac4659"}
Apr 02 13:59:32 crc kubenswrapper[4732]: I0402 13:59:32.876739 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0"
event={"ID":"529a332e-d2c3-49c5-86d5-e672811d00cd","Type":"ContainerStarted","Data":"c851da6de4a121958c76892cddd55e756639d16c7d6ca09b7c0600989f736d1f"} Apr 02 13:59:32 crc kubenswrapper[4732]: I0402 13:59:32.876754 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"529a332e-d2c3-49c5-86d5-e672811d00cd","Type":"ContainerStarted","Data":"ad8151f7d6c56fbd5b08cea1343241657215a67fd1912cd8f3fe458d8cdc34b9"} Apr 02 13:59:32 crc kubenswrapper[4732]: I0402 13:59:32.876764 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"529a332e-d2c3-49c5-86d5-e672811d00cd","Type":"ContainerStarted","Data":"04c718a8d305d3ea89fa137f1ed18fa514f9cfbe057677fa80d12f61c1981915"} Apr 02 13:59:32 crc kubenswrapper[4732]: I0402 13:59:32.924970 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=36.716602204 podStartE2EDuration="42.924948931s" podCreationTimestamp="2026-04-02 13:58:50 +0000 UTC" firstStartedPulling="2026-04-02 13:59:24.63942464 +0000 UTC m=+1321.543832193" lastFinishedPulling="2026-04-02 13:59:30.847771367 +0000 UTC m=+1327.752178920" observedRunningTime="2026-04-02 13:59:32.923257836 +0000 UTC m=+1329.827665409" watchObservedRunningTime="2026-04-02 13:59:32.924948931 +0000 UTC m=+1329.829356484" Apr 02 13:59:33 crc kubenswrapper[4732]: I0402 13:59:33.198777 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74dc88fc-z6mt2"] Apr 02 13:59:33 crc kubenswrapper[4732]: I0402 13:59:33.198972 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74dc88fc-z6mt2" podUID="2bac9092-5574-4351-954b-b63659d28b2e" containerName="dnsmasq-dns" containerID="cri-o://b4395fba8e8e7e36aa9b94239e1295839384a286adc587510e0fb0d41b8a15ac" gracePeriod=10 Apr 02 13:59:33 crc kubenswrapper[4732]: I0402 13:59:33.200255 4732 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/dnsmasq-dns-74dc88fc-z6mt2" Apr 02 13:59:33 crc kubenswrapper[4732]: I0402 13:59:33.235020 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-ftbdg"] Apr 02 13:59:33 crc kubenswrapper[4732]: I0402 13:59:33.236291 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-895cf5cf-ftbdg" Apr 02 13:59:33 crc kubenswrapper[4732]: I0402 13:59:33.248971 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Apr 02 13:59:33 crc kubenswrapper[4732]: I0402 13:59:33.256065 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-ftbdg"] Apr 02 13:59:33 crc kubenswrapper[4732]: I0402 13:59:33.376740 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e48613ce-8967-4bcb-b928-eb2b2c662c0d-dns-swift-storage-0\") pod \"dnsmasq-dns-895cf5cf-ftbdg\" (UID: \"e48613ce-8967-4bcb-b928-eb2b2c662c0d\") " pod="openstack/dnsmasq-dns-895cf5cf-ftbdg" Apr 02 13:59:33 crc kubenswrapper[4732]: I0402 13:59:33.376792 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e48613ce-8967-4bcb-b928-eb2b2c662c0d-config\") pod \"dnsmasq-dns-895cf5cf-ftbdg\" (UID: \"e48613ce-8967-4bcb-b928-eb2b2c662c0d\") " pod="openstack/dnsmasq-dns-895cf5cf-ftbdg" Apr 02 13:59:33 crc kubenswrapper[4732]: I0402 13:59:33.376819 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e48613ce-8967-4bcb-b928-eb2b2c662c0d-ovsdbserver-sb\") pod \"dnsmasq-dns-895cf5cf-ftbdg\" (UID: \"e48613ce-8967-4bcb-b928-eb2b2c662c0d\") " pod="openstack/dnsmasq-dns-895cf5cf-ftbdg" Apr 02 13:59:33 crc kubenswrapper[4732]: I0402 
13:59:33.377034 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x265x\" (UniqueName: \"kubernetes.io/projected/e48613ce-8967-4bcb-b928-eb2b2c662c0d-kube-api-access-x265x\") pod \"dnsmasq-dns-895cf5cf-ftbdg\" (UID: \"e48613ce-8967-4bcb-b928-eb2b2c662c0d\") " pod="openstack/dnsmasq-dns-895cf5cf-ftbdg" Apr 02 13:59:33 crc kubenswrapper[4732]: I0402 13:59:33.377221 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e48613ce-8967-4bcb-b928-eb2b2c662c0d-dns-svc\") pod \"dnsmasq-dns-895cf5cf-ftbdg\" (UID: \"e48613ce-8967-4bcb-b928-eb2b2c662c0d\") " pod="openstack/dnsmasq-dns-895cf5cf-ftbdg" Apr 02 13:59:33 crc kubenswrapper[4732]: I0402 13:59:33.377278 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e48613ce-8967-4bcb-b928-eb2b2c662c0d-ovsdbserver-nb\") pod \"dnsmasq-dns-895cf5cf-ftbdg\" (UID: \"e48613ce-8967-4bcb-b928-eb2b2c662c0d\") " pod="openstack/dnsmasq-dns-895cf5cf-ftbdg" Apr 02 13:59:33 crc kubenswrapper[4732]: I0402 13:59:33.478683 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x265x\" (UniqueName: \"kubernetes.io/projected/e48613ce-8967-4bcb-b928-eb2b2c662c0d-kube-api-access-x265x\") pod \"dnsmasq-dns-895cf5cf-ftbdg\" (UID: \"e48613ce-8967-4bcb-b928-eb2b2c662c0d\") " pod="openstack/dnsmasq-dns-895cf5cf-ftbdg" Apr 02 13:59:33 crc kubenswrapper[4732]: I0402 13:59:33.478757 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e48613ce-8967-4bcb-b928-eb2b2c662c0d-dns-svc\") pod \"dnsmasq-dns-895cf5cf-ftbdg\" (UID: \"e48613ce-8967-4bcb-b928-eb2b2c662c0d\") " pod="openstack/dnsmasq-dns-895cf5cf-ftbdg" Apr 02 13:59:33 crc kubenswrapper[4732]: I0402 13:59:33.478790 
4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e48613ce-8967-4bcb-b928-eb2b2c662c0d-ovsdbserver-nb\") pod \"dnsmasq-dns-895cf5cf-ftbdg\" (UID: \"e48613ce-8967-4bcb-b928-eb2b2c662c0d\") " pod="openstack/dnsmasq-dns-895cf5cf-ftbdg" Apr 02 13:59:33 crc kubenswrapper[4732]: I0402 13:59:33.478877 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e48613ce-8967-4bcb-b928-eb2b2c662c0d-dns-swift-storage-0\") pod \"dnsmasq-dns-895cf5cf-ftbdg\" (UID: \"e48613ce-8967-4bcb-b928-eb2b2c662c0d\") " pod="openstack/dnsmasq-dns-895cf5cf-ftbdg" Apr 02 13:59:33 crc kubenswrapper[4732]: I0402 13:59:33.478923 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e48613ce-8967-4bcb-b928-eb2b2c662c0d-config\") pod \"dnsmasq-dns-895cf5cf-ftbdg\" (UID: \"e48613ce-8967-4bcb-b928-eb2b2c662c0d\") " pod="openstack/dnsmasq-dns-895cf5cf-ftbdg" Apr 02 13:59:33 crc kubenswrapper[4732]: I0402 13:59:33.478958 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e48613ce-8967-4bcb-b928-eb2b2c662c0d-ovsdbserver-sb\") pod \"dnsmasq-dns-895cf5cf-ftbdg\" (UID: \"e48613ce-8967-4bcb-b928-eb2b2c662c0d\") " pod="openstack/dnsmasq-dns-895cf5cf-ftbdg" Apr 02 13:59:33 crc kubenswrapper[4732]: I0402 13:59:33.479816 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e48613ce-8967-4bcb-b928-eb2b2c662c0d-ovsdbserver-sb\") pod \"dnsmasq-dns-895cf5cf-ftbdg\" (UID: \"e48613ce-8967-4bcb-b928-eb2b2c662c0d\") " pod="openstack/dnsmasq-dns-895cf5cf-ftbdg" Apr 02 13:59:33 crc kubenswrapper[4732]: I0402 13:59:33.479865 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e48613ce-8967-4bcb-b928-eb2b2c662c0d-ovsdbserver-nb\") pod \"dnsmasq-dns-895cf5cf-ftbdg\" (UID: \"e48613ce-8967-4bcb-b928-eb2b2c662c0d\") " pod="openstack/dnsmasq-dns-895cf5cf-ftbdg" Apr 02 13:59:33 crc kubenswrapper[4732]: I0402 13:59:33.479914 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e48613ce-8967-4bcb-b928-eb2b2c662c0d-config\") pod \"dnsmasq-dns-895cf5cf-ftbdg\" (UID: \"e48613ce-8967-4bcb-b928-eb2b2c662c0d\") " pod="openstack/dnsmasq-dns-895cf5cf-ftbdg" Apr 02 13:59:33 crc kubenswrapper[4732]: I0402 13:59:33.481421 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e48613ce-8967-4bcb-b928-eb2b2c662c0d-dns-swift-storage-0\") pod \"dnsmasq-dns-895cf5cf-ftbdg\" (UID: \"e48613ce-8967-4bcb-b928-eb2b2c662c0d\") " pod="openstack/dnsmasq-dns-895cf5cf-ftbdg" Apr 02 13:59:33 crc kubenswrapper[4732]: I0402 13:59:33.482212 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e48613ce-8967-4bcb-b928-eb2b2c662c0d-dns-svc\") pod \"dnsmasq-dns-895cf5cf-ftbdg\" (UID: \"e48613ce-8967-4bcb-b928-eb2b2c662c0d\") " pod="openstack/dnsmasq-dns-895cf5cf-ftbdg" Apr 02 13:59:33 crc kubenswrapper[4732]: I0402 13:59:33.500787 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x265x\" (UniqueName: \"kubernetes.io/projected/e48613ce-8967-4bcb-b928-eb2b2c662c0d-kube-api-access-x265x\") pod \"dnsmasq-dns-895cf5cf-ftbdg\" (UID: \"e48613ce-8967-4bcb-b928-eb2b2c662c0d\") " pod="openstack/dnsmasq-dns-895cf5cf-ftbdg" Apr 02 13:59:33 crc kubenswrapper[4732]: I0402 13:59:33.585281 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-895cf5cf-ftbdg" Apr 02 13:59:33 crc kubenswrapper[4732]: I0402 13:59:33.639078 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74dc88fc-z6mt2" Apr 02 13:59:33 crc kubenswrapper[4732]: I0402 13:59:33.783827 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bac9092-5574-4351-954b-b63659d28b2e-config\") pod \"2bac9092-5574-4351-954b-b63659d28b2e\" (UID: \"2bac9092-5574-4351-954b-b63659d28b2e\") " Apr 02 13:59:33 crc kubenswrapper[4732]: I0402 13:59:33.783911 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2bac9092-5574-4351-954b-b63659d28b2e-dns-svc\") pod \"2bac9092-5574-4351-954b-b63659d28b2e\" (UID: \"2bac9092-5574-4351-954b-b63659d28b2e\") " Apr 02 13:59:33 crc kubenswrapper[4732]: I0402 13:59:33.783999 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2bac9092-5574-4351-954b-b63659d28b2e-ovsdbserver-nb\") pod \"2bac9092-5574-4351-954b-b63659d28b2e\" (UID: \"2bac9092-5574-4351-954b-b63659d28b2e\") " Apr 02 13:59:33 crc kubenswrapper[4732]: I0402 13:59:33.784128 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2bac9092-5574-4351-954b-b63659d28b2e-ovsdbserver-sb\") pod \"2bac9092-5574-4351-954b-b63659d28b2e\" (UID: \"2bac9092-5574-4351-954b-b63659d28b2e\") " Apr 02 13:59:33 crc kubenswrapper[4732]: I0402 13:59:33.784215 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s68rk\" (UniqueName: \"kubernetes.io/projected/2bac9092-5574-4351-954b-b63659d28b2e-kube-api-access-s68rk\") pod \"2bac9092-5574-4351-954b-b63659d28b2e\" (UID: 
\"2bac9092-5574-4351-954b-b63659d28b2e\") " Apr 02 13:59:33 crc kubenswrapper[4732]: I0402 13:59:33.792499 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bac9092-5574-4351-954b-b63659d28b2e-kube-api-access-s68rk" (OuterVolumeSpecName: "kube-api-access-s68rk") pod "2bac9092-5574-4351-954b-b63659d28b2e" (UID: "2bac9092-5574-4351-954b-b63659d28b2e"). InnerVolumeSpecName "kube-api-access-s68rk". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 13:59:33 crc kubenswrapper[4732]: I0402 13:59:33.826706 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bac9092-5574-4351-954b-b63659d28b2e-config" (OuterVolumeSpecName: "config") pod "2bac9092-5574-4351-954b-b63659d28b2e" (UID: "2bac9092-5574-4351-954b-b63659d28b2e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:59:33 crc kubenswrapper[4732]: I0402 13:59:33.830106 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bac9092-5574-4351-954b-b63659d28b2e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2bac9092-5574-4351-954b-b63659d28b2e" (UID: "2bac9092-5574-4351-954b-b63659d28b2e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:59:33 crc kubenswrapper[4732]: I0402 13:59:33.832660 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bac9092-5574-4351-954b-b63659d28b2e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2bac9092-5574-4351-954b-b63659d28b2e" (UID: "2bac9092-5574-4351-954b-b63659d28b2e"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:59:33 crc kubenswrapper[4732]: I0402 13:59:33.843728 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bac9092-5574-4351-954b-b63659d28b2e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2bac9092-5574-4351-954b-b63659d28b2e" (UID: "2bac9092-5574-4351-954b-b63659d28b2e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 13:59:33 crc kubenswrapper[4732]: I0402 13:59:33.887186 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s68rk\" (UniqueName: \"kubernetes.io/projected/2bac9092-5574-4351-954b-b63659d28b2e-kube-api-access-s68rk\") on node \"crc\" DevicePath \"\"" Apr 02 13:59:33 crc kubenswrapper[4732]: I0402 13:59:33.887227 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bac9092-5574-4351-954b-b63659d28b2e-config\") on node \"crc\" DevicePath \"\"" Apr 02 13:59:33 crc kubenswrapper[4732]: I0402 13:59:33.887238 4732 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2bac9092-5574-4351-954b-b63659d28b2e-dns-svc\") on node \"crc\" DevicePath \"\"" Apr 02 13:59:33 crc kubenswrapper[4732]: I0402 13:59:33.887251 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2bac9092-5574-4351-954b-b63659d28b2e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Apr 02 13:59:33 crc kubenswrapper[4732]: I0402 13:59:33.887263 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2bac9092-5574-4351-954b-b63659d28b2e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Apr 02 13:59:33 crc kubenswrapper[4732]: I0402 13:59:33.890561 4732 generic.go:334] "Generic (PLEG): container finished" podID="2bac9092-5574-4351-954b-b63659d28b2e" 
containerID="b4395fba8e8e7e36aa9b94239e1295839384a286adc587510e0fb0d41b8a15ac" exitCode=0 Apr 02 13:59:33 crc kubenswrapper[4732]: I0402 13:59:33.890646 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74dc88fc-z6mt2" Apr 02 13:59:33 crc kubenswrapper[4732]: I0402 13:59:33.890645 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dc88fc-z6mt2" event={"ID":"2bac9092-5574-4351-954b-b63659d28b2e","Type":"ContainerDied","Data":"b4395fba8e8e7e36aa9b94239e1295839384a286adc587510e0fb0d41b8a15ac"} Apr 02 13:59:33 crc kubenswrapper[4732]: I0402 13:59:33.890778 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dc88fc-z6mt2" event={"ID":"2bac9092-5574-4351-954b-b63659d28b2e","Type":"ContainerDied","Data":"3695f9a19a3e2888e79be267f3868a9a0717d0496c26fd14f79367d77d2d3edd"} Apr 02 13:59:33 crc kubenswrapper[4732]: I0402 13:59:33.890803 4732 scope.go:117] "RemoveContainer" containerID="b4395fba8e8e7e36aa9b94239e1295839384a286adc587510e0fb0d41b8a15ac" Apr 02 13:59:33 crc kubenswrapper[4732]: I0402 13:59:33.911144 4732 scope.go:117] "RemoveContainer" containerID="653cd427f0f645bd1f6210b42716a3dc12741571345aba90ba5dfe36a9a55ab6" Apr 02 13:59:33 crc kubenswrapper[4732]: I0402 13:59:33.924041 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74dc88fc-z6mt2"] Apr 02 13:59:33 crc kubenswrapper[4732]: I0402 13:59:33.932545 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74dc88fc-z6mt2"] Apr 02 13:59:33 crc kubenswrapper[4732]: I0402 13:59:33.943429 4732 scope.go:117] "RemoveContainer" containerID="b4395fba8e8e7e36aa9b94239e1295839384a286adc587510e0fb0d41b8a15ac" Apr 02 13:59:33 crc kubenswrapper[4732]: E0402 13:59:33.943842 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b4395fba8e8e7e36aa9b94239e1295839384a286adc587510e0fb0d41b8a15ac\": container with ID starting with b4395fba8e8e7e36aa9b94239e1295839384a286adc587510e0fb0d41b8a15ac not found: ID does not exist" containerID="b4395fba8e8e7e36aa9b94239e1295839384a286adc587510e0fb0d41b8a15ac" Apr 02 13:59:33 crc kubenswrapper[4732]: I0402 13:59:33.943877 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4395fba8e8e7e36aa9b94239e1295839384a286adc587510e0fb0d41b8a15ac"} err="failed to get container status \"b4395fba8e8e7e36aa9b94239e1295839384a286adc587510e0fb0d41b8a15ac\": rpc error: code = NotFound desc = could not find container \"b4395fba8e8e7e36aa9b94239e1295839384a286adc587510e0fb0d41b8a15ac\": container with ID starting with b4395fba8e8e7e36aa9b94239e1295839384a286adc587510e0fb0d41b8a15ac not found: ID does not exist" Apr 02 13:59:33 crc kubenswrapper[4732]: I0402 13:59:33.943897 4732 scope.go:117] "RemoveContainer" containerID="653cd427f0f645bd1f6210b42716a3dc12741571345aba90ba5dfe36a9a55ab6" Apr 02 13:59:33 crc kubenswrapper[4732]: E0402 13:59:33.944180 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"653cd427f0f645bd1f6210b42716a3dc12741571345aba90ba5dfe36a9a55ab6\": container with ID starting with 653cd427f0f645bd1f6210b42716a3dc12741571345aba90ba5dfe36a9a55ab6 not found: ID does not exist" containerID="653cd427f0f645bd1f6210b42716a3dc12741571345aba90ba5dfe36a9a55ab6" Apr 02 13:59:33 crc kubenswrapper[4732]: I0402 13:59:33.944207 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"653cd427f0f645bd1f6210b42716a3dc12741571345aba90ba5dfe36a9a55ab6"} err="failed to get container status \"653cd427f0f645bd1f6210b42716a3dc12741571345aba90ba5dfe36a9a55ab6\": rpc error: code = NotFound desc = could not find container \"653cd427f0f645bd1f6210b42716a3dc12741571345aba90ba5dfe36a9a55ab6\": container with ID 
starting with 653cd427f0f645bd1f6210b42716a3dc12741571345aba90ba5dfe36a9a55ab6 not found: ID does not exist" Apr 02 13:59:34 crc kubenswrapper[4732]: I0402 13:59:34.075841 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-ftbdg"] Apr 02 13:59:34 crc kubenswrapper[4732]: W0402 13:59:34.081099 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode48613ce_8967_4bcb_b928_eb2b2c662c0d.slice/crio-645f2e48aba1dd4aee1325a0560d67b95e7b92a88ba28a4464846b43b1e3d883 WatchSource:0}: Error finding container 645f2e48aba1dd4aee1325a0560d67b95e7b92a88ba28a4464846b43b1e3d883: Status 404 returned error can't find the container with id 645f2e48aba1dd4aee1325a0560d67b95e7b92a88ba28a4464846b43b1e3d883 Apr 02 13:59:34 crc kubenswrapper[4732]: I0402 13:59:34.702835 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bac9092-5574-4351-954b-b63659d28b2e" path="/var/lib/kubelet/pods/2bac9092-5574-4351-954b-b63659d28b2e/volumes" Apr 02 13:59:34 crc kubenswrapper[4732]: I0402 13:59:34.767558 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Apr 02 13:59:34 crc kubenswrapper[4732]: I0402 13:59:34.909062 4732 generic.go:334] "Generic (PLEG): container finished" podID="e48613ce-8967-4bcb-b928-eb2b2c662c0d" containerID="efd3c9b6a794d607a7b118465d06e0041af8c6c5969b81d4c611982fc16c04f5" exitCode=0 Apr 02 13:59:34 crc kubenswrapper[4732]: I0402 13:59:34.909117 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-ftbdg" event={"ID":"e48613ce-8967-4bcb-b928-eb2b2c662c0d","Type":"ContainerDied","Data":"efd3c9b6a794d607a7b118465d06e0041af8c6c5969b81d4c611982fc16c04f5"} Apr 02 13:59:34 crc kubenswrapper[4732]: I0402 13:59:34.909155 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-ftbdg" 
event={"ID":"e48613ce-8967-4bcb-b928-eb2b2c662c0d","Type":"ContainerStarted","Data":"645f2e48aba1dd4aee1325a0560d67b95e7b92a88ba28a4464846b43b1e3d883"} Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.123832 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-lc9bc"] Apr 02 13:59:35 crc kubenswrapper[4732]: E0402 13:59:35.124408 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bac9092-5574-4351-954b-b63659d28b2e" containerName="dnsmasq-dns" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.124422 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bac9092-5574-4351-954b-b63659d28b2e" containerName="dnsmasq-dns" Apr 02 13:59:35 crc kubenswrapper[4732]: E0402 13:59:35.124453 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bac9092-5574-4351-954b-b63659d28b2e" containerName="init" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.124458 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bac9092-5574-4351-954b-b63659d28b2e" containerName="init" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.124623 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bac9092-5574-4351-954b-b63659d28b2e" containerName="dnsmasq-dns" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.125133 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-lc9bc" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.140193 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-lc9bc"] Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.209014 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqqhf\" (UniqueName: \"kubernetes.io/projected/7ed7061f-5bb0-4113-851f-45cb0af3e77d-kube-api-access-fqqhf\") pod \"cinder-db-create-lc9bc\" (UID: \"7ed7061f-5bb0-4113-851f-45cb0af3e77d\") " pod="openstack/cinder-db-create-lc9bc" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.209096 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ed7061f-5bb0-4113-851f-45cb0af3e77d-operator-scripts\") pod \"cinder-db-create-lc9bc\" (UID: \"7ed7061f-5bb0-4113-851f-45cb0af3e77d\") " pod="openstack/cinder-db-create-lc9bc" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.227662 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-5e24-account-create-update-rbhwq"] Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.228956 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-5e24-account-create-update-rbhwq" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.231105 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.240069 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-5e24-account-create-update-rbhwq"] Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.310663 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ed7061f-5bb0-4113-851f-45cb0af3e77d-operator-scripts\") pod \"cinder-db-create-lc9bc\" (UID: \"7ed7061f-5bb0-4113-851f-45cb0af3e77d\") " pod="openstack/cinder-db-create-lc9bc" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.310791 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61ff3e0e-f82e-4130-afa8-739689043221-operator-scripts\") pod \"cinder-5e24-account-create-update-rbhwq\" (UID: \"61ff3e0e-f82e-4130-afa8-739689043221\") " pod="openstack/cinder-5e24-account-create-update-rbhwq" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.310831 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vn87p\" (UniqueName: \"kubernetes.io/projected/61ff3e0e-f82e-4130-afa8-739689043221-kube-api-access-vn87p\") pod \"cinder-5e24-account-create-update-rbhwq\" (UID: \"61ff3e0e-f82e-4130-afa8-739689043221\") " pod="openstack/cinder-5e24-account-create-update-rbhwq" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.310988 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqqhf\" (UniqueName: \"kubernetes.io/projected/7ed7061f-5bb0-4113-851f-45cb0af3e77d-kube-api-access-fqqhf\") pod \"cinder-db-create-lc9bc\" (UID: 
\"7ed7061f-5bb0-4113-851f-45cb0af3e77d\") " pod="openstack/cinder-db-create-lc9bc" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.311436 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ed7061f-5bb0-4113-851f-45cb0af3e77d-operator-scripts\") pod \"cinder-db-create-lc9bc\" (UID: \"7ed7061f-5bb0-4113-851f-45cb0af3e77d\") " pod="openstack/cinder-db-create-lc9bc" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.327245 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqqhf\" (UniqueName: \"kubernetes.io/projected/7ed7061f-5bb0-4113-851f-45cb0af3e77d-kube-api-access-fqqhf\") pod \"cinder-db-create-lc9bc\" (UID: \"7ed7061f-5bb0-4113-851f-45cb0af3e77d\") " pod="openstack/cinder-db-create-lc9bc" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.407281 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-blk7m"] Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.408683 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-blk7m" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.412472 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61ff3e0e-f82e-4130-afa8-739689043221-operator-scripts\") pod \"cinder-5e24-account-create-update-rbhwq\" (UID: \"61ff3e0e-f82e-4130-afa8-739689043221\") " pod="openstack/cinder-5e24-account-create-update-rbhwq" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.412570 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vn87p\" (UniqueName: \"kubernetes.io/projected/61ff3e0e-f82e-4130-afa8-739689043221-kube-api-access-vn87p\") pod \"cinder-5e24-account-create-update-rbhwq\" (UID: \"61ff3e0e-f82e-4130-afa8-739689043221\") " pod="openstack/cinder-5e24-account-create-update-rbhwq" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.413688 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61ff3e0e-f82e-4130-afa8-739689043221-operator-scripts\") pod \"cinder-5e24-account-create-update-rbhwq\" (UID: \"61ff3e0e-f82e-4130-afa8-739689043221\") " pod="openstack/cinder-5e24-account-create-update-rbhwq" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.417746 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-4269-account-create-update-nv66q"] Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.418794 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-4269-account-create-update-nv66q" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.420933 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.426145 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-blk7m"] Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.440387 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-lc9bc" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.441095 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-4269-account-create-update-nv66q"] Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.445970 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vn87p\" (UniqueName: \"kubernetes.io/projected/61ff3e0e-f82e-4130-afa8-739689043221-kube-api-access-vn87p\") pod \"cinder-5e24-account-create-update-rbhwq\" (UID: \"61ff3e0e-f82e-4130-afa8-739689043221\") " pod="openstack/cinder-5e24-account-create-update-rbhwq" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.483758 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-qgp9c"] Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.485073 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-qgp9c" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.488372 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.488712 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.496657 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-scsz6" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.499452 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.510749 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-qgp9c"] Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.513890 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fz4h9\" (UniqueName: \"kubernetes.io/projected/f7e9d6ea-5e58-46fb-a241-6db82d0abd15-kube-api-access-fz4h9\") pod \"barbican-4269-account-create-update-nv66q\" (UID: \"f7e9d6ea-5e58-46fb-a241-6db82d0abd15\") " pod="openstack/barbican-4269-account-create-update-nv66q" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.513950 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7e9d6ea-5e58-46fb-a241-6db82d0abd15-operator-scripts\") pod \"barbican-4269-account-create-update-nv66q\" (UID: \"f7e9d6ea-5e58-46fb-a241-6db82d0abd15\") " pod="openstack/barbican-4269-account-create-update-nv66q" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.514007 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/50683249-0922-48bc-9bea-f6ce81e3d192-operator-scripts\") pod \"barbican-db-create-blk7m\" (UID: \"50683249-0922-48bc-9bea-f6ce81e3d192\") " pod="openstack/barbican-db-create-blk7m" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.514040 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jtv2\" (UniqueName: \"kubernetes.io/projected/50683249-0922-48bc-9bea-f6ce81e3d192-kube-api-access-9jtv2\") pod \"barbican-db-create-blk7m\" (UID: \"50683249-0922-48bc-9bea-f6ce81e3d192\") " pod="openstack/barbican-db-create-blk7m" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.542916 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-5e24-account-create-update-rbhwq" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.616002 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5znkx\" (UniqueName: \"kubernetes.io/projected/756f0330-2838-4d2f-a92f-739ed4acab76-kube-api-access-5znkx\") pod \"keystone-db-sync-qgp9c\" (UID: \"756f0330-2838-4d2f-a92f-739ed4acab76\") " pod="openstack/keystone-db-sync-qgp9c" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.616074 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50683249-0922-48bc-9bea-f6ce81e3d192-operator-scripts\") pod \"barbican-db-create-blk7m\" (UID: \"50683249-0922-48bc-9bea-f6ce81e3d192\") " pod="openstack/barbican-db-create-blk7m" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.616132 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jtv2\" (UniqueName: \"kubernetes.io/projected/50683249-0922-48bc-9bea-f6ce81e3d192-kube-api-access-9jtv2\") pod \"barbican-db-create-blk7m\" (UID: \"50683249-0922-48bc-9bea-f6ce81e3d192\") " 
pod="openstack/barbican-db-create-blk7m" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.616189 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/756f0330-2838-4d2f-a92f-739ed4acab76-config-data\") pod \"keystone-db-sync-qgp9c\" (UID: \"756f0330-2838-4d2f-a92f-739ed4acab76\") " pod="openstack/keystone-db-sync-qgp9c" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.616233 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/756f0330-2838-4d2f-a92f-739ed4acab76-combined-ca-bundle\") pod \"keystone-db-sync-qgp9c\" (UID: \"756f0330-2838-4d2f-a92f-739ed4acab76\") " pod="openstack/keystone-db-sync-qgp9c" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.616294 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fz4h9\" (UniqueName: \"kubernetes.io/projected/f7e9d6ea-5e58-46fb-a241-6db82d0abd15-kube-api-access-fz4h9\") pod \"barbican-4269-account-create-update-nv66q\" (UID: \"f7e9d6ea-5e58-46fb-a241-6db82d0abd15\") " pod="openstack/barbican-4269-account-create-update-nv66q" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.616351 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7e9d6ea-5e58-46fb-a241-6db82d0abd15-operator-scripts\") pod \"barbican-4269-account-create-update-nv66q\" (UID: \"f7e9d6ea-5e58-46fb-a241-6db82d0abd15\") " pod="openstack/barbican-4269-account-create-update-nv66q" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.617262 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7e9d6ea-5e58-46fb-a241-6db82d0abd15-operator-scripts\") pod \"barbican-4269-account-create-update-nv66q\" (UID: 
\"f7e9d6ea-5e58-46fb-a241-6db82d0abd15\") " pod="openstack/barbican-4269-account-create-update-nv66q" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.618669 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-rvpl9"] Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.618844 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50683249-0922-48bc-9bea-f6ce81e3d192-operator-scripts\") pod \"barbican-db-create-blk7m\" (UID: \"50683249-0922-48bc-9bea-f6ce81e3d192\") " pod="openstack/barbican-db-create-blk7m" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.619927 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-rvpl9" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.627739 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-de5a-account-create-update-x9fbv"] Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.631090 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-de5a-account-create-update-x9fbv" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.633965 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.639161 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jtv2\" (UniqueName: \"kubernetes.io/projected/50683249-0922-48bc-9bea-f6ce81e3d192-kube-api-access-9jtv2\") pod \"barbican-db-create-blk7m\" (UID: \"50683249-0922-48bc-9bea-f6ce81e3d192\") " pod="openstack/barbican-db-create-blk7m" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.640469 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fz4h9\" (UniqueName: \"kubernetes.io/projected/f7e9d6ea-5e58-46fb-a241-6db82d0abd15-kube-api-access-fz4h9\") pod \"barbican-4269-account-create-update-nv66q\" (UID: \"f7e9d6ea-5e58-46fb-a241-6db82d0abd15\") " pod="openstack/barbican-4269-account-create-update-nv66q" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.647051 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-rvpl9"] Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.665120 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-de5a-account-create-update-x9fbv"] Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.719938 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/756f0330-2838-4d2f-a92f-739ed4acab76-config-data\") pod \"keystone-db-sync-qgp9c\" (UID: \"756f0330-2838-4d2f-a92f-739ed4acab76\") " pod="openstack/keystone-db-sync-qgp9c" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.719999 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/2055195c-7029-4ba4-b6b1-7e717991cbb3-operator-scripts\") pod \"neutron-de5a-account-create-update-x9fbv\" (UID: \"2055195c-7029-4ba4-b6b1-7e717991cbb3\") " pod="openstack/neutron-de5a-account-create-update-x9fbv" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.720027 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/756f0330-2838-4d2f-a92f-739ed4acab76-combined-ca-bundle\") pod \"keystone-db-sync-qgp9c\" (UID: \"756f0330-2838-4d2f-a92f-739ed4acab76\") " pod="openstack/keystone-db-sync-qgp9c" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.720082 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9pvt\" (UniqueName: \"kubernetes.io/projected/354dc245-41f8-48ca-8fef-ef66ea015690-kube-api-access-f9pvt\") pod \"neutron-db-create-rvpl9\" (UID: \"354dc245-41f8-48ca-8fef-ef66ea015690\") " pod="openstack/neutron-db-create-rvpl9" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.720098 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nd4x\" (UniqueName: \"kubernetes.io/projected/2055195c-7029-4ba4-b6b1-7e717991cbb3-kube-api-access-2nd4x\") pod \"neutron-de5a-account-create-update-x9fbv\" (UID: \"2055195c-7029-4ba4-b6b1-7e717991cbb3\") " pod="openstack/neutron-de5a-account-create-update-x9fbv" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.720123 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/354dc245-41f8-48ca-8fef-ef66ea015690-operator-scripts\") pod \"neutron-db-create-rvpl9\" (UID: \"354dc245-41f8-48ca-8fef-ef66ea015690\") " pod="openstack/neutron-db-create-rvpl9" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.720147 4732 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5znkx\" (UniqueName: \"kubernetes.io/projected/756f0330-2838-4d2f-a92f-739ed4acab76-kube-api-access-5znkx\") pod \"keystone-db-sync-qgp9c\" (UID: \"756f0330-2838-4d2f-a92f-739ed4acab76\") " pod="openstack/keystone-db-sync-qgp9c" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.727666 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/756f0330-2838-4d2f-a92f-739ed4acab76-config-data\") pod \"keystone-db-sync-qgp9c\" (UID: \"756f0330-2838-4d2f-a92f-739ed4acab76\") " pod="openstack/keystone-db-sync-qgp9c" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.728977 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-blk7m" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.730033 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/756f0330-2838-4d2f-a92f-739ed4acab76-combined-ca-bundle\") pod \"keystone-db-sync-qgp9c\" (UID: \"756f0330-2838-4d2f-a92f-739ed4acab76\") " pod="openstack/keystone-db-sync-qgp9c" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.740116 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5znkx\" (UniqueName: \"kubernetes.io/projected/756f0330-2838-4d2f-a92f-739ed4acab76-kube-api-access-5znkx\") pod \"keystone-db-sync-qgp9c\" (UID: \"756f0330-2838-4d2f-a92f-739ed4acab76\") " pod="openstack/keystone-db-sync-qgp9c" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.742156 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-4269-account-create-update-nv66q" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.821273 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-qgp9c" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.822485 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9pvt\" (UniqueName: \"kubernetes.io/projected/354dc245-41f8-48ca-8fef-ef66ea015690-kube-api-access-f9pvt\") pod \"neutron-db-create-rvpl9\" (UID: \"354dc245-41f8-48ca-8fef-ef66ea015690\") " pod="openstack/neutron-db-create-rvpl9" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.822543 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nd4x\" (UniqueName: \"kubernetes.io/projected/2055195c-7029-4ba4-b6b1-7e717991cbb3-kube-api-access-2nd4x\") pod \"neutron-de5a-account-create-update-x9fbv\" (UID: \"2055195c-7029-4ba4-b6b1-7e717991cbb3\") " pod="openstack/neutron-de5a-account-create-update-x9fbv" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.822582 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/354dc245-41f8-48ca-8fef-ef66ea015690-operator-scripts\") pod \"neutron-db-create-rvpl9\" (UID: \"354dc245-41f8-48ca-8fef-ef66ea015690\") " pod="openstack/neutron-db-create-rvpl9" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.822738 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2055195c-7029-4ba4-b6b1-7e717991cbb3-operator-scripts\") pod \"neutron-de5a-account-create-update-x9fbv\" (UID: \"2055195c-7029-4ba4-b6b1-7e717991cbb3\") " pod="openstack/neutron-de5a-account-create-update-x9fbv" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.823384 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/354dc245-41f8-48ca-8fef-ef66ea015690-operator-scripts\") pod \"neutron-db-create-rvpl9\" (UID: 
\"354dc245-41f8-48ca-8fef-ef66ea015690\") " pod="openstack/neutron-db-create-rvpl9" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.823423 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2055195c-7029-4ba4-b6b1-7e717991cbb3-operator-scripts\") pod \"neutron-de5a-account-create-update-x9fbv\" (UID: \"2055195c-7029-4ba4-b6b1-7e717991cbb3\") " pod="openstack/neutron-de5a-account-create-update-x9fbv" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.839860 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nd4x\" (UniqueName: \"kubernetes.io/projected/2055195c-7029-4ba4-b6b1-7e717991cbb3-kube-api-access-2nd4x\") pod \"neutron-de5a-account-create-update-x9fbv\" (UID: \"2055195c-7029-4ba4-b6b1-7e717991cbb3\") " pod="openstack/neutron-de5a-account-create-update-x9fbv" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.845848 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9pvt\" (UniqueName: \"kubernetes.io/projected/354dc245-41f8-48ca-8fef-ef66ea015690-kube-api-access-f9pvt\") pod \"neutron-db-create-rvpl9\" (UID: \"354dc245-41f8-48ca-8fef-ef66ea015690\") " pod="openstack/neutron-db-create-rvpl9" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.924759 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-ftbdg" event={"ID":"e48613ce-8967-4bcb-b928-eb2b2c662c0d","Type":"ContainerStarted","Data":"a498398685860509d8cf0661319803fbdbf100ec5d11457ba2baa28161a1acaa"} Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.924818 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-895cf5cf-ftbdg" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.947248 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-895cf5cf-ftbdg" podStartSLOduration=2.947230849 
podStartE2EDuration="2.947230849s" podCreationTimestamp="2026-04-02 13:59:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:59:35.940304442 +0000 UTC m=+1332.844712015" watchObservedRunningTime="2026-04-02 13:59:35.947230849 +0000 UTC m=+1332.851638402" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.953012 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-rvpl9" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.962096 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-de5a-account-create-update-x9fbv" Apr 02 13:59:35 crc kubenswrapper[4732]: I0402 13:59:35.992598 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-lc9bc"] Apr 02 13:59:36 crc kubenswrapper[4732]: W0402 13:59:36.013639 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ed7061f_5bb0_4113_851f_45cb0af3e77d.slice/crio-916788b60827bddf060e80066399ba8f464591ae4cfdd51b77c94606cc87484f WatchSource:0}: Error finding container 916788b60827bddf060e80066399ba8f464591ae4cfdd51b77c94606cc87484f: Status 404 returned error can't find the container with id 916788b60827bddf060e80066399ba8f464591ae4cfdd51b77c94606cc87484f Apr 02 13:59:36 crc kubenswrapper[4732]: I0402 13:59:36.120742 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-5e24-account-create-update-rbhwq"] Apr 02 13:59:36 crc kubenswrapper[4732]: W0402 13:59:36.154120 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod61ff3e0e_f82e_4130_afa8_739689043221.slice/crio-e268eba04defb61abab01c95030e6d3ce3bf14857bd47460c8988d402a38acbb WatchSource:0}: Error finding container 
e268eba04defb61abab01c95030e6d3ce3bf14857bd47460c8988d402a38acbb: Status 404 returned error can't find the container with id e268eba04defb61abab01c95030e6d3ce3bf14857bd47460c8988d402a38acbb Apr 02 13:59:36 crc kubenswrapper[4732]: I0402 13:59:36.267075 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-blk7m"] Apr 02 13:59:36 crc kubenswrapper[4732]: W0402 13:59:36.279412 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50683249_0922_48bc_9bea_f6ce81e3d192.slice/crio-a3e5db8a4d3be46420831afa98c6be260925d5a6de4a73546e2f09d2ceed7fe9 WatchSource:0}: Error finding container a3e5db8a4d3be46420831afa98c6be260925d5a6de4a73546e2f09d2ceed7fe9: Status 404 returned error can't find the container with id a3e5db8a4d3be46420831afa98c6be260925d5a6de4a73546e2f09d2ceed7fe9 Apr 02 13:59:36 crc kubenswrapper[4732]: I0402 13:59:36.336599 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-4269-account-create-update-nv66q"] Apr 02 13:59:36 crc kubenswrapper[4732]: W0402 13:59:36.339321 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7e9d6ea_5e58_46fb_a241_6db82d0abd15.slice/crio-817add7d7bf3d025f65e457e2dddd1cc91e1f7870f543d347024ffcf5b5c405d WatchSource:0}: Error finding container 817add7d7bf3d025f65e457e2dddd1cc91e1f7870f543d347024ffcf5b5c405d: Status 404 returned error can't find the container with id 817add7d7bf3d025f65e457e2dddd1cc91e1f7870f543d347024ffcf5b5c405d Apr 02 13:59:36 crc kubenswrapper[4732]: I0402 13:59:36.448653 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-qgp9c"] Apr 02 13:59:36 crc kubenswrapper[4732]: W0402 13:59:36.462909 4732 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod756f0330_2838_4d2f_a92f_739ed4acab76.slice/crio-b24be22dd1b093fd09676989ee6985873b9f5d2f4546ba0d0f275072ba46acab WatchSource:0}: Error finding container b24be22dd1b093fd09676989ee6985873b9f5d2f4546ba0d0f275072ba46acab: Status 404 returned error can't find the container with id b24be22dd1b093fd09676989ee6985873b9f5d2f4546ba0d0f275072ba46acab Apr 02 13:59:36 crc kubenswrapper[4732]: I0402 13:59:36.540131 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-rvpl9"] Apr 02 13:59:36 crc kubenswrapper[4732]: I0402 13:59:36.562536 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-de5a-account-create-update-x9fbv"] Apr 02 13:59:36 crc kubenswrapper[4732]: W0402 13:59:36.573910 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod354dc245_41f8_48ca_8fef_ef66ea015690.slice/crio-f0610ec27b93cff20993af7123b5ecd035de28ee08544099afe1e0723e821a8e WatchSource:0}: Error finding container f0610ec27b93cff20993af7123b5ecd035de28ee08544099afe1e0723e821a8e: Status 404 returned error can't find the container with id f0610ec27b93cff20993af7123b5ecd035de28ee08544099afe1e0723e821a8e Apr 02 13:59:36 crc kubenswrapper[4732]: W0402 13:59:36.574128 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2055195c_7029_4ba4_b6b1_7e717991cbb3.slice/crio-a36207ca1399c4bec841f0030db47863be8523394e4d80c75cad30496a764453 WatchSource:0}: Error finding container a36207ca1399c4bec841f0030db47863be8523394e4d80c75cad30496a764453: Status 404 returned error can't find the container with id a36207ca1399c4bec841f0030db47863be8523394e4d80c75cad30496a764453 Apr 02 13:59:36 crc kubenswrapper[4732]: I0402 13:59:36.931272 4732 generic.go:334] "Generic (PLEG): container finished" podID="61ff3e0e-f82e-4130-afa8-739689043221" 
containerID="07c87ac3e88714a95605541c92131c731ee3697e162d5ee5c1ba6b8627b54434" exitCode=0 Apr 02 13:59:36 crc kubenswrapper[4732]: I0402 13:59:36.931348 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-5e24-account-create-update-rbhwq" event={"ID":"61ff3e0e-f82e-4130-afa8-739689043221","Type":"ContainerDied","Data":"07c87ac3e88714a95605541c92131c731ee3697e162d5ee5c1ba6b8627b54434"} Apr 02 13:59:36 crc kubenswrapper[4732]: I0402 13:59:36.931384 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-5e24-account-create-update-rbhwq" event={"ID":"61ff3e0e-f82e-4130-afa8-739689043221","Type":"ContainerStarted","Data":"e268eba04defb61abab01c95030e6d3ce3bf14857bd47460c8988d402a38acbb"} Apr 02 13:59:36 crc kubenswrapper[4732]: I0402 13:59:36.932760 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-de5a-account-create-update-x9fbv" event={"ID":"2055195c-7029-4ba4-b6b1-7e717991cbb3","Type":"ContainerStarted","Data":"339e0eb5b86d64be8bcd69bc05ae280f085ab5cd74e7c3b14ac096e9e752d90c"} Apr 02 13:59:36 crc kubenswrapper[4732]: I0402 13:59:36.932780 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-de5a-account-create-update-x9fbv" event={"ID":"2055195c-7029-4ba4-b6b1-7e717991cbb3","Type":"ContainerStarted","Data":"a36207ca1399c4bec841f0030db47863be8523394e4d80c75cad30496a764453"} Apr 02 13:59:36 crc kubenswrapper[4732]: I0402 13:59:36.939103 4732 generic.go:334] "Generic (PLEG): container finished" podID="50683249-0922-48bc-9bea-f6ce81e3d192" containerID="acbd44376914a5a7cce475daf23bfe53fa628560b0bb8ce2c80bec24acc6ab85" exitCode=0 Apr 02 13:59:36 crc kubenswrapper[4732]: I0402 13:59:36.939177 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-blk7m" event={"ID":"50683249-0922-48bc-9bea-f6ce81e3d192","Type":"ContainerDied","Data":"acbd44376914a5a7cce475daf23bfe53fa628560b0bb8ce2c80bec24acc6ab85"} Apr 02 13:59:36 crc kubenswrapper[4732]: 
I0402 13:59:36.939198 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-blk7m" event={"ID":"50683249-0922-48bc-9bea-f6ce81e3d192","Type":"ContainerStarted","Data":"a3e5db8a4d3be46420831afa98c6be260925d5a6de4a73546e2f09d2ceed7fe9"} Apr 02 13:59:36 crc kubenswrapper[4732]: I0402 13:59:36.942152 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-rvpl9" event={"ID":"354dc245-41f8-48ca-8fef-ef66ea015690","Type":"ContainerStarted","Data":"bb8ea28ebf5e33ced4c2f8486f8dfe5524a7abce1a62b5e7fb5262ef8252196d"} Apr 02 13:59:36 crc kubenswrapper[4732]: I0402 13:59:36.942190 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-rvpl9" event={"ID":"354dc245-41f8-48ca-8fef-ef66ea015690","Type":"ContainerStarted","Data":"f0610ec27b93cff20993af7123b5ecd035de28ee08544099afe1e0723e821a8e"} Apr 02 13:59:36 crc kubenswrapper[4732]: I0402 13:59:36.945304 4732 generic.go:334] "Generic (PLEG): container finished" podID="7ed7061f-5bb0-4113-851f-45cb0af3e77d" containerID="74926c31f3a2ffe4338b1e5734feeaa65c342309cbe58b3350a039dbe94db109" exitCode=0 Apr 02 13:59:36 crc kubenswrapper[4732]: I0402 13:59:36.945422 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-lc9bc" event={"ID":"7ed7061f-5bb0-4113-851f-45cb0af3e77d","Type":"ContainerDied","Data":"74926c31f3a2ffe4338b1e5734feeaa65c342309cbe58b3350a039dbe94db109"} Apr 02 13:59:36 crc kubenswrapper[4732]: I0402 13:59:36.945464 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-lc9bc" event={"ID":"7ed7061f-5bb0-4113-851f-45cb0af3e77d","Type":"ContainerStarted","Data":"916788b60827bddf060e80066399ba8f464591ae4cfdd51b77c94606cc87484f"} Apr 02 13:59:36 crc kubenswrapper[4732]: I0402 13:59:36.948525 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-qgp9c" 
event={"ID":"756f0330-2838-4d2f-a92f-739ed4acab76","Type":"ContainerStarted","Data":"b24be22dd1b093fd09676989ee6985873b9f5d2f4546ba0d0f275072ba46acab"} Apr 02 13:59:36 crc kubenswrapper[4732]: I0402 13:59:36.950178 4732 generic.go:334] "Generic (PLEG): container finished" podID="f7e9d6ea-5e58-46fb-a241-6db82d0abd15" containerID="7bbca9f232721293aae826fe9b616df67da3b0c59428f8612b9b1bc2b8128d6b" exitCode=0 Apr 02 13:59:36 crc kubenswrapper[4732]: I0402 13:59:36.951141 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-4269-account-create-update-nv66q" event={"ID":"f7e9d6ea-5e58-46fb-a241-6db82d0abd15","Type":"ContainerDied","Data":"7bbca9f232721293aae826fe9b616df67da3b0c59428f8612b9b1bc2b8128d6b"} Apr 02 13:59:36 crc kubenswrapper[4732]: I0402 13:59:36.951167 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-4269-account-create-update-nv66q" event={"ID":"f7e9d6ea-5e58-46fb-a241-6db82d0abd15","Type":"ContainerStarted","Data":"817add7d7bf3d025f65e457e2dddd1cc91e1f7870f543d347024ffcf5b5c405d"} Apr 02 13:59:36 crc kubenswrapper[4732]: I0402 13:59:36.964838 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-de5a-account-create-update-x9fbv" podStartSLOduration=1.964812915 podStartE2EDuration="1.964812915s" podCreationTimestamp="2026-04-02 13:59:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:59:36.960022896 +0000 UTC m=+1333.864430459" watchObservedRunningTime="2026-04-02 13:59:36.964812915 +0000 UTC m=+1333.869220468" Apr 02 13:59:36 crc kubenswrapper[4732]: I0402 13:59:36.994886 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-rvpl9" podStartSLOduration=1.994864455 podStartE2EDuration="1.994864455s" podCreationTimestamp="2026-04-02 13:59:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:59:36.991016121 +0000 UTC m=+1333.895423704" watchObservedRunningTime="2026-04-02 13:59:36.994864455 +0000 UTC m=+1333.899272018"
Apr 02 13:59:37 crc kubenswrapper[4732]: I0402 13:59:37.962364 4732 generic.go:334] "Generic (PLEG): container finished" podID="2055195c-7029-4ba4-b6b1-7e717991cbb3" containerID="339e0eb5b86d64be8bcd69bc05ae280f085ab5cd74e7c3b14ac096e9e752d90c" exitCode=0
Apr 02 13:59:37 crc kubenswrapper[4732]: I0402 13:59:37.962744 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-de5a-account-create-update-x9fbv" event={"ID":"2055195c-7029-4ba4-b6b1-7e717991cbb3","Type":"ContainerDied","Data":"339e0eb5b86d64be8bcd69bc05ae280f085ab5cd74e7c3b14ac096e9e752d90c"}
Apr 02 13:59:37 crc kubenswrapper[4732]: I0402 13:59:37.972401 4732 generic.go:334] "Generic (PLEG): container finished" podID="354dc245-41f8-48ca-8fef-ef66ea015690" containerID="bb8ea28ebf5e33ced4c2f8486f8dfe5524a7abce1a62b5e7fb5262ef8252196d" exitCode=0
Apr 02 13:59:37 crc kubenswrapper[4732]: I0402 13:59:37.974515 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-rvpl9" event={"ID":"354dc245-41f8-48ca-8fef-ef66ea015690","Type":"ContainerDied","Data":"bb8ea28ebf5e33ced4c2f8486f8dfe5524a7abce1a62b5e7fb5262ef8252196d"}
Apr 02 13:59:40 crc kubenswrapper[4732]: I0402 13:59:40.531863 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-lc9bc"
Apr 02 13:59:40 crc kubenswrapper[4732]: I0402 13:59:40.537390 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-5e24-account-create-update-rbhwq"
Apr 02 13:59:40 crc kubenswrapper[4732]: I0402 13:59:40.541666 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-4269-account-create-update-nv66q"
Apr 02 13:59:40 crc kubenswrapper[4732]: I0402 13:59:40.549199 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-blk7m"
Apr 02 13:59:40 crc kubenswrapper[4732]: I0402 13:59:40.558896 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-rvpl9"
Apr 02 13:59:40 crc kubenswrapper[4732]: I0402 13:59:40.570151 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-de5a-account-create-update-x9fbv"
Apr 02 13:59:40 crc kubenswrapper[4732]: I0402 13:59:40.604665 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2055195c-7029-4ba4-b6b1-7e717991cbb3-operator-scripts\") pod \"2055195c-7029-4ba4-b6b1-7e717991cbb3\" (UID: \"2055195c-7029-4ba4-b6b1-7e717991cbb3\") "
Apr 02 13:59:40 crc kubenswrapper[4732]: I0402 13:59:40.605511 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50683249-0922-48bc-9bea-f6ce81e3d192-operator-scripts\") pod \"50683249-0922-48bc-9bea-f6ce81e3d192\" (UID: \"50683249-0922-48bc-9bea-f6ce81e3d192\") "
Apr 02 13:59:40 crc kubenswrapper[4732]: I0402 13:59:40.605032 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2055195c-7029-4ba4-b6b1-7e717991cbb3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2055195c-7029-4ba4-b6b1-7e717991cbb3" (UID: "2055195c-7029-4ba4-b6b1-7e717991cbb3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 13:59:40 crc kubenswrapper[4732]: I0402 13:59:40.606481 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50683249-0922-48bc-9bea-f6ce81e3d192-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "50683249-0922-48bc-9bea-f6ce81e3d192" (UID: "50683249-0922-48bc-9bea-f6ce81e3d192"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 13:59:40 crc kubenswrapper[4732]: I0402 13:59:40.607142 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ed7061f-5bb0-4113-851f-45cb0af3e77d-operator-scripts\") pod \"7ed7061f-5bb0-4113-851f-45cb0af3e77d\" (UID: \"7ed7061f-5bb0-4113-851f-45cb0af3e77d\") "
Apr 02 13:59:40 crc kubenswrapper[4732]: I0402 13:59:40.607872 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7e9d6ea-5e58-46fb-a241-6db82d0abd15-operator-scripts\") pod \"f7e9d6ea-5e58-46fb-a241-6db82d0abd15\" (UID: \"f7e9d6ea-5e58-46fb-a241-6db82d0abd15\") "
Apr 02 13:59:40 crc kubenswrapper[4732]: I0402 13:59:40.608036 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vn87p\" (UniqueName: \"kubernetes.io/projected/61ff3e0e-f82e-4130-afa8-739689043221-kube-api-access-vn87p\") pod \"61ff3e0e-f82e-4130-afa8-739689043221\" (UID: \"61ff3e0e-f82e-4130-afa8-739689043221\") "
Apr 02 13:59:40 crc kubenswrapper[4732]: I0402 13:59:40.608190 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/354dc245-41f8-48ca-8fef-ef66ea015690-operator-scripts\") pod \"354dc245-41f8-48ca-8fef-ef66ea015690\" (UID: \"354dc245-41f8-48ca-8fef-ef66ea015690\") "
Apr 02 13:59:40 crc kubenswrapper[4732]: I0402 13:59:40.608437 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9pvt\" (UniqueName: \"kubernetes.io/projected/354dc245-41f8-48ca-8fef-ef66ea015690-kube-api-access-f9pvt\") pod \"354dc245-41f8-48ca-8fef-ef66ea015690\" (UID: \"354dc245-41f8-48ca-8fef-ef66ea015690\") "
Apr 02 13:59:40 crc kubenswrapper[4732]: I0402 13:59:40.608538 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqqhf\" (UniqueName: \"kubernetes.io/projected/7ed7061f-5bb0-4113-851f-45cb0af3e77d-kube-api-access-fqqhf\") pod \"7ed7061f-5bb0-4113-851f-45cb0af3e77d\" (UID: \"7ed7061f-5bb0-4113-851f-45cb0af3e77d\") "
Apr 02 13:59:40 crc kubenswrapper[4732]: I0402 13:59:40.608669 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jtv2\" (UniqueName: \"kubernetes.io/projected/50683249-0922-48bc-9bea-f6ce81e3d192-kube-api-access-9jtv2\") pod \"50683249-0922-48bc-9bea-f6ce81e3d192\" (UID: \"50683249-0922-48bc-9bea-f6ce81e3d192\") "
Apr 02 13:59:40 crc kubenswrapper[4732]: I0402 13:59:40.608111 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ed7061f-5bb0-4113-851f-45cb0af3e77d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7ed7061f-5bb0-4113-851f-45cb0af3e77d" (UID: "7ed7061f-5bb0-4113-851f-45cb0af3e77d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 13:59:40 crc kubenswrapper[4732]: I0402 13:59:40.608486 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7e9d6ea-5e58-46fb-a241-6db82d0abd15-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f7e9d6ea-5e58-46fb-a241-6db82d0abd15" (UID: "f7e9d6ea-5e58-46fb-a241-6db82d0abd15"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 13:59:40 crc kubenswrapper[4732]: I0402 13:59:40.608784 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/354dc245-41f8-48ca-8fef-ef66ea015690-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "354dc245-41f8-48ca-8fef-ef66ea015690" (UID: "354dc245-41f8-48ca-8fef-ef66ea015690"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 13:59:40 crc kubenswrapper[4732]: I0402 13:59:40.608985 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61ff3e0e-f82e-4130-afa8-739689043221-operator-scripts\") pod \"61ff3e0e-f82e-4130-afa8-739689043221\" (UID: \"61ff3e0e-f82e-4130-afa8-739689043221\") "
Apr 02 13:59:40 crc kubenswrapper[4732]: I0402 13:59:40.609179 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2nd4x\" (UniqueName: \"kubernetes.io/projected/2055195c-7029-4ba4-b6b1-7e717991cbb3-kube-api-access-2nd4x\") pod \"2055195c-7029-4ba4-b6b1-7e717991cbb3\" (UID: \"2055195c-7029-4ba4-b6b1-7e717991cbb3\") "
Apr 02 13:59:40 crc kubenswrapper[4732]: I0402 13:59:40.609280 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fz4h9\" (UniqueName: \"kubernetes.io/projected/f7e9d6ea-5e58-46fb-a241-6db82d0abd15-kube-api-access-fz4h9\") pod \"f7e9d6ea-5e58-46fb-a241-6db82d0abd15\" (UID: \"f7e9d6ea-5e58-46fb-a241-6db82d0abd15\") "
Apr 02 13:59:40 crc kubenswrapper[4732]: I0402 13:59:40.610014 4732 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2055195c-7029-4ba4-b6b1-7e717991cbb3-operator-scripts\") on node \"crc\" DevicePath \"\""
Apr 02 13:59:40 crc kubenswrapper[4732]: I0402 13:59:40.610301 4732 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50683249-0922-48bc-9bea-f6ce81e3d192-operator-scripts\") on node \"crc\" DevicePath \"\""
Apr 02 13:59:40 crc kubenswrapper[4732]: I0402 13:59:40.610413 4732 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ed7061f-5bb0-4113-851f-45cb0af3e77d-operator-scripts\") on node \"crc\" DevicePath \"\""
Apr 02 13:59:40 crc kubenswrapper[4732]: I0402 13:59:40.610594 4732 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7e9d6ea-5e58-46fb-a241-6db82d0abd15-operator-scripts\") on node \"crc\" DevicePath \"\""
Apr 02 13:59:40 crc kubenswrapper[4732]: I0402 13:59:40.610852 4732 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/354dc245-41f8-48ca-8fef-ef66ea015690-operator-scripts\") on node \"crc\" DevicePath \"\""
Apr 02 13:59:40 crc kubenswrapper[4732]: I0402 13:59:40.616053 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2055195c-7029-4ba4-b6b1-7e717991cbb3-kube-api-access-2nd4x" (OuterVolumeSpecName: "kube-api-access-2nd4x") pod "2055195c-7029-4ba4-b6b1-7e717991cbb3" (UID: "2055195c-7029-4ba4-b6b1-7e717991cbb3"). InnerVolumeSpecName "kube-api-access-2nd4x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 13:59:40 crc kubenswrapper[4732]: I0402 13:59:40.616878 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7e9d6ea-5e58-46fb-a241-6db82d0abd15-kube-api-access-fz4h9" (OuterVolumeSpecName: "kube-api-access-fz4h9") pod "f7e9d6ea-5e58-46fb-a241-6db82d0abd15" (UID: "f7e9d6ea-5e58-46fb-a241-6db82d0abd15"). InnerVolumeSpecName "kube-api-access-fz4h9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 13:59:40 crc kubenswrapper[4732]: I0402 13:59:40.617578 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61ff3e0e-f82e-4130-afa8-739689043221-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "61ff3e0e-f82e-4130-afa8-739689043221" (UID: "61ff3e0e-f82e-4130-afa8-739689043221"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 13:59:40 crc kubenswrapper[4732]: I0402 13:59:40.622743 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50683249-0922-48bc-9bea-f6ce81e3d192-kube-api-access-9jtv2" (OuterVolumeSpecName: "kube-api-access-9jtv2") pod "50683249-0922-48bc-9bea-f6ce81e3d192" (UID: "50683249-0922-48bc-9bea-f6ce81e3d192"). InnerVolumeSpecName "kube-api-access-9jtv2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 13:59:40 crc kubenswrapper[4732]: I0402 13:59:40.622803 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/354dc245-41f8-48ca-8fef-ef66ea015690-kube-api-access-f9pvt" (OuterVolumeSpecName: "kube-api-access-f9pvt") pod "354dc245-41f8-48ca-8fef-ef66ea015690" (UID: "354dc245-41f8-48ca-8fef-ef66ea015690"). InnerVolumeSpecName "kube-api-access-f9pvt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 13:59:40 crc kubenswrapper[4732]: I0402 13:59:40.623375 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ed7061f-5bb0-4113-851f-45cb0af3e77d-kube-api-access-fqqhf" (OuterVolumeSpecName: "kube-api-access-fqqhf") pod "7ed7061f-5bb0-4113-851f-45cb0af3e77d" (UID: "7ed7061f-5bb0-4113-851f-45cb0af3e77d"). InnerVolumeSpecName "kube-api-access-fqqhf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 13:59:40 crc kubenswrapper[4732]: I0402 13:59:40.635812 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61ff3e0e-f82e-4130-afa8-739689043221-kube-api-access-vn87p" (OuterVolumeSpecName: "kube-api-access-vn87p") pod "61ff3e0e-f82e-4130-afa8-739689043221" (UID: "61ff3e0e-f82e-4130-afa8-739689043221"). InnerVolumeSpecName "kube-api-access-vn87p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 13:59:40 crc kubenswrapper[4732]: I0402 13:59:40.712313 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vn87p\" (UniqueName: \"kubernetes.io/projected/61ff3e0e-f82e-4130-afa8-739689043221-kube-api-access-vn87p\") on node \"crc\" DevicePath \"\""
Apr 02 13:59:40 crc kubenswrapper[4732]: I0402 13:59:40.712594 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9pvt\" (UniqueName: \"kubernetes.io/projected/354dc245-41f8-48ca-8fef-ef66ea015690-kube-api-access-f9pvt\") on node \"crc\" DevicePath \"\""
Apr 02 13:59:40 crc kubenswrapper[4732]: I0402 13:59:40.712603 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqqhf\" (UniqueName: \"kubernetes.io/projected/7ed7061f-5bb0-4113-851f-45cb0af3e77d-kube-api-access-fqqhf\") on node \"crc\" DevicePath \"\""
Apr 02 13:59:40 crc kubenswrapper[4732]: I0402 13:59:40.712654 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jtv2\" (UniqueName: \"kubernetes.io/projected/50683249-0922-48bc-9bea-f6ce81e3d192-kube-api-access-9jtv2\") on node \"crc\" DevicePath \"\""
Apr 02 13:59:40 crc kubenswrapper[4732]: I0402 13:59:40.712683 4732 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61ff3e0e-f82e-4130-afa8-739689043221-operator-scripts\") on node \"crc\" DevicePath \"\""
Apr 02 13:59:40 crc kubenswrapper[4732]: I0402 13:59:40.712691 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2nd4x\" (UniqueName: \"kubernetes.io/projected/2055195c-7029-4ba4-b6b1-7e717991cbb3-kube-api-access-2nd4x\") on node \"crc\" DevicePath \"\""
Apr 02 13:59:40 crc kubenswrapper[4732]: I0402 13:59:40.712700 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fz4h9\" (UniqueName: \"kubernetes.io/projected/f7e9d6ea-5e58-46fb-a241-6db82d0abd15-kube-api-access-fz4h9\") on node \"crc\" DevicePath \"\""
Apr 02 13:59:41 crc kubenswrapper[4732]: I0402 13:59:41.001203 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-rvpl9" event={"ID":"354dc245-41f8-48ca-8fef-ef66ea015690","Type":"ContainerDied","Data":"f0610ec27b93cff20993af7123b5ecd035de28ee08544099afe1e0723e821a8e"}
Apr 02 13:59:41 crc kubenswrapper[4732]: I0402 13:59:41.001249 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f0610ec27b93cff20993af7123b5ecd035de28ee08544099afe1e0723e821a8e"
Apr 02 13:59:41 crc kubenswrapper[4732]: I0402 13:59:41.001301 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-rvpl9"
Apr 02 13:59:41 crc kubenswrapper[4732]: I0402 13:59:41.003263 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-lc9bc"
Apr 02 13:59:41 crc kubenswrapper[4732]: I0402 13:59:41.003248 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-lc9bc" event={"ID":"7ed7061f-5bb0-4113-851f-45cb0af3e77d","Type":"ContainerDied","Data":"916788b60827bddf060e80066399ba8f464591ae4cfdd51b77c94606cc87484f"}
Apr 02 13:59:41 crc kubenswrapper[4732]: I0402 13:59:41.003306 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="916788b60827bddf060e80066399ba8f464591ae4cfdd51b77c94606cc87484f"
Apr 02 13:59:41 crc kubenswrapper[4732]: I0402 13:59:41.005002 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-4269-account-create-update-nv66q"
Apr 02 13:59:41 crc kubenswrapper[4732]: I0402 13:59:41.005105 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-4269-account-create-update-nv66q" event={"ID":"f7e9d6ea-5e58-46fb-a241-6db82d0abd15","Type":"ContainerDied","Data":"817add7d7bf3d025f65e457e2dddd1cc91e1f7870f543d347024ffcf5b5c405d"}
Apr 02 13:59:41 crc kubenswrapper[4732]: I0402 13:59:41.005222 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="817add7d7bf3d025f65e457e2dddd1cc91e1f7870f543d347024ffcf5b5c405d"
Apr 02 13:59:41 crc kubenswrapper[4732]: I0402 13:59:41.006893 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-5e24-account-create-update-rbhwq"
Apr 02 13:59:41 crc kubenswrapper[4732]: I0402 13:59:41.007017 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-5e24-account-create-update-rbhwq" event={"ID":"61ff3e0e-f82e-4130-afa8-739689043221","Type":"ContainerDied","Data":"e268eba04defb61abab01c95030e6d3ce3bf14857bd47460c8988d402a38acbb"}
Apr 02 13:59:41 crc kubenswrapper[4732]: I0402 13:59:41.007157 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e268eba04defb61abab01c95030e6d3ce3bf14857bd47460c8988d402a38acbb"
Apr 02 13:59:41 crc kubenswrapper[4732]: I0402 13:59:41.008540 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-de5a-account-create-update-x9fbv" event={"ID":"2055195c-7029-4ba4-b6b1-7e717991cbb3","Type":"ContainerDied","Data":"a36207ca1399c4bec841f0030db47863be8523394e4d80c75cad30496a764453"}
Apr 02 13:59:41 crc kubenswrapper[4732]: I0402 13:59:41.008660 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a36207ca1399c4bec841f0030db47863be8523394e4d80c75cad30496a764453"
Apr 02 13:59:41 crc kubenswrapper[4732]: I0402 13:59:41.008846 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-de5a-account-create-update-x9fbv"
Apr 02 13:59:41 crc kubenswrapper[4732]: I0402 13:59:41.010888 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-blk7m" event={"ID":"50683249-0922-48bc-9bea-f6ce81e3d192","Type":"ContainerDied","Data":"a3e5db8a4d3be46420831afa98c6be260925d5a6de4a73546e2f09d2ceed7fe9"}
Apr 02 13:59:41 crc kubenswrapper[4732]: I0402 13:59:41.010985 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-blk7m"
Apr 02 13:59:41 crc kubenswrapper[4732]: I0402 13:59:41.010994 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3e5db8a4d3be46420831afa98c6be260925d5a6de4a73546e2f09d2ceed7fe9"
Apr 02 13:59:42 crc kubenswrapper[4732]: I0402 13:59:42.020961 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-qgp9c" event={"ID":"756f0330-2838-4d2f-a92f-739ed4acab76","Type":"ContainerStarted","Data":"f056e082766505ebdab12624db3986b625ea9b0998b7eb20a1651ee5747de15d"}
Apr 02 13:59:42 crc kubenswrapper[4732]: I0402 13:59:42.044749 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-qgp9c" podStartSLOduration=2.345300885 podStartE2EDuration="7.044730699s" podCreationTimestamp="2026-04-02 13:59:35 +0000 UTC" firstStartedPulling="2026-04-02 13:59:36.466809247 +0000 UTC m=+1333.371216800" lastFinishedPulling="2026-04-02 13:59:41.166239051 +0000 UTC m=+1338.070646614" observedRunningTime="2026-04-02 13:59:42.038770059 +0000 UTC m=+1338.943177622" watchObservedRunningTime="2026-04-02 13:59:42.044730699 +0000 UTC m=+1338.949138252"
Apr 02 13:59:43 crc kubenswrapper[4732]: I0402 13:59:43.586836 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-895cf5cf-ftbdg"
Apr 02 13:59:43 crc kubenswrapper[4732]: I0402 13:59:43.672498 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-6xf79"]
Apr 02 13:59:43 crc kubenswrapper[4732]: I0402 13:59:43.672858 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-6xf79" podUID="56a9e53f-9667-48a5-8065-ad30fd550a7d" containerName="dnsmasq-dns" containerID="cri-o://8e2522f8812673996dc5982ccee1bd3544e320ed8d8a1a5e82ab36b4111f0d61" gracePeriod=10
Apr 02 13:59:44 crc kubenswrapper[4732]: I0402 13:59:44.041864 4732 generic.go:334] "Generic (PLEG): container finished" podID="56a9e53f-9667-48a5-8065-ad30fd550a7d" containerID="8e2522f8812673996dc5982ccee1bd3544e320ed8d8a1a5e82ab36b4111f0d61" exitCode=0
Apr 02 13:59:44 crc kubenswrapper[4732]: I0402 13:59:44.041981 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-6xf79" event={"ID":"56a9e53f-9667-48a5-8065-ad30fd550a7d","Type":"ContainerDied","Data":"8e2522f8812673996dc5982ccee1bd3544e320ed8d8a1a5e82ab36b4111f0d61"}
Apr 02 13:59:44 crc kubenswrapper[4732]: I0402 13:59:44.145189 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-6xf79"
Apr 02 13:59:44 crc kubenswrapper[4732]: I0402 13:59:44.272584 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/56a9e53f-9667-48a5-8065-ad30fd550a7d-ovsdbserver-sb\") pod \"56a9e53f-9667-48a5-8065-ad30fd550a7d\" (UID: \"56a9e53f-9667-48a5-8065-ad30fd550a7d\") "
Apr 02 13:59:44 crc kubenswrapper[4732]: I0402 13:59:44.272677 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56a9e53f-9667-48a5-8065-ad30fd550a7d-dns-svc\") pod \"56a9e53f-9667-48a5-8065-ad30fd550a7d\" (UID: \"56a9e53f-9667-48a5-8065-ad30fd550a7d\") "
Apr 02 13:59:44 crc kubenswrapper[4732]: I0402 13:59:44.272741 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhx7s\" (UniqueName: \"kubernetes.io/projected/56a9e53f-9667-48a5-8065-ad30fd550a7d-kube-api-access-vhx7s\") pod \"56a9e53f-9667-48a5-8065-ad30fd550a7d\" (UID: \"56a9e53f-9667-48a5-8065-ad30fd550a7d\") "
Apr 02 13:59:44 crc kubenswrapper[4732]: I0402 13:59:44.273650 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/56a9e53f-9667-48a5-8065-ad30fd550a7d-ovsdbserver-nb\") pod \"56a9e53f-9667-48a5-8065-ad30fd550a7d\" (UID: \"56a9e53f-9667-48a5-8065-ad30fd550a7d\") "
Apr 02 13:59:44 crc kubenswrapper[4732]: I0402 13:59:44.273766 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56a9e53f-9667-48a5-8065-ad30fd550a7d-config\") pod \"56a9e53f-9667-48a5-8065-ad30fd550a7d\" (UID: \"56a9e53f-9667-48a5-8065-ad30fd550a7d\") "
Apr 02 13:59:44 crc kubenswrapper[4732]: I0402 13:59:44.283047 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56a9e53f-9667-48a5-8065-ad30fd550a7d-kube-api-access-vhx7s" (OuterVolumeSpecName: "kube-api-access-vhx7s") pod "56a9e53f-9667-48a5-8065-ad30fd550a7d" (UID: "56a9e53f-9667-48a5-8065-ad30fd550a7d"). InnerVolumeSpecName "kube-api-access-vhx7s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 13:59:44 crc kubenswrapper[4732]: I0402 13:59:44.322364 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56a9e53f-9667-48a5-8065-ad30fd550a7d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "56a9e53f-9667-48a5-8065-ad30fd550a7d" (UID: "56a9e53f-9667-48a5-8065-ad30fd550a7d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 13:59:44 crc kubenswrapper[4732]: I0402 13:59:44.327598 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56a9e53f-9667-48a5-8065-ad30fd550a7d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "56a9e53f-9667-48a5-8065-ad30fd550a7d" (UID: "56a9e53f-9667-48a5-8065-ad30fd550a7d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 13:59:44 crc kubenswrapper[4732]: I0402 13:59:44.333975 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56a9e53f-9667-48a5-8065-ad30fd550a7d-config" (OuterVolumeSpecName: "config") pod "56a9e53f-9667-48a5-8065-ad30fd550a7d" (UID: "56a9e53f-9667-48a5-8065-ad30fd550a7d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 13:59:44 crc kubenswrapper[4732]: I0402 13:59:44.336032 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56a9e53f-9667-48a5-8065-ad30fd550a7d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "56a9e53f-9667-48a5-8065-ad30fd550a7d" (UID: "56a9e53f-9667-48a5-8065-ad30fd550a7d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 13:59:44 crc kubenswrapper[4732]: I0402 13:59:44.375769 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/56a9e53f-9667-48a5-8065-ad30fd550a7d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Apr 02 13:59:44 crc kubenswrapper[4732]: I0402 13:59:44.375815 4732 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56a9e53f-9667-48a5-8065-ad30fd550a7d-dns-svc\") on node \"crc\" DevicePath \"\""
Apr 02 13:59:44 crc kubenswrapper[4732]: I0402 13:59:44.375828 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhx7s\" (UniqueName: \"kubernetes.io/projected/56a9e53f-9667-48a5-8065-ad30fd550a7d-kube-api-access-vhx7s\") on node \"crc\" DevicePath \"\""
Apr 02 13:59:44 crc kubenswrapper[4732]: I0402 13:59:44.375844 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/56a9e53f-9667-48a5-8065-ad30fd550a7d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Apr 02 13:59:44 crc kubenswrapper[4732]: I0402 13:59:44.375855 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56a9e53f-9667-48a5-8065-ad30fd550a7d-config\") on node \"crc\" DevicePath \"\""
Apr 02 13:59:45 crc kubenswrapper[4732]: I0402 13:59:45.053365 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-6xf79" event={"ID":"56a9e53f-9667-48a5-8065-ad30fd550a7d","Type":"ContainerDied","Data":"3813acbe76f48b2cb59927fb517525777a650f410f579e46c170142412be33c7"}
Apr 02 13:59:45 crc kubenswrapper[4732]: I0402 13:59:45.055007 4732 scope.go:117] "RemoveContainer" containerID="8e2522f8812673996dc5982ccee1bd3544e320ed8d8a1a5e82ab36b4111f0d61"
Apr 02 13:59:45 crc kubenswrapper[4732]: I0402 13:59:45.053391 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-6xf79"
Apr 02 13:59:45 crc kubenswrapper[4732]: I0402 13:59:45.055249 4732 generic.go:334] "Generic (PLEG): container finished" podID="756f0330-2838-4d2f-a92f-739ed4acab76" containerID="f056e082766505ebdab12624db3986b625ea9b0998b7eb20a1651ee5747de15d" exitCode=0
Apr 02 13:59:45 crc kubenswrapper[4732]: I0402 13:59:45.055286 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-qgp9c" event={"ID":"756f0330-2838-4d2f-a92f-739ed4acab76","Type":"ContainerDied","Data":"f056e082766505ebdab12624db3986b625ea9b0998b7eb20a1651ee5747de15d"}
Apr 02 13:59:45 crc kubenswrapper[4732]: I0402 13:59:45.094976 4732 scope.go:117] "RemoveContainer" containerID="26ff11c0f727ee2880bda54c6c49e3983795e6a07857649f772b06a7d389756e"
Apr 02 13:59:45 crc kubenswrapper[4732]: I0402 13:59:45.102665 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-6xf79"]
Apr 02 13:59:45 crc kubenswrapper[4732]: I0402 13:59:45.109770 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-6xf79"]
Apr 02 13:59:46 crc kubenswrapper[4732]: I0402 13:59:46.379365 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-qgp9c"
Apr 02 13:59:46 crc kubenswrapper[4732]: I0402 13:59:46.505699 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/756f0330-2838-4d2f-a92f-739ed4acab76-combined-ca-bundle\") pod \"756f0330-2838-4d2f-a92f-739ed4acab76\" (UID: \"756f0330-2838-4d2f-a92f-739ed4acab76\") "
Apr 02 13:59:46 crc kubenswrapper[4732]: I0402 13:59:46.506986 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/756f0330-2838-4d2f-a92f-739ed4acab76-config-data\") pod \"756f0330-2838-4d2f-a92f-739ed4acab76\" (UID: \"756f0330-2838-4d2f-a92f-739ed4acab76\") "
Apr 02 13:59:46 crc kubenswrapper[4732]: I0402 13:59:46.507357 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5znkx\" (UniqueName: \"kubernetes.io/projected/756f0330-2838-4d2f-a92f-739ed4acab76-kube-api-access-5znkx\") pod \"756f0330-2838-4d2f-a92f-739ed4acab76\" (UID: \"756f0330-2838-4d2f-a92f-739ed4acab76\") "
Apr 02 13:59:46 crc kubenswrapper[4732]: I0402 13:59:46.512493 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/756f0330-2838-4d2f-a92f-739ed4acab76-kube-api-access-5znkx" (OuterVolumeSpecName: "kube-api-access-5znkx") pod "756f0330-2838-4d2f-a92f-739ed4acab76" (UID: "756f0330-2838-4d2f-a92f-739ed4acab76"). InnerVolumeSpecName "kube-api-access-5znkx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 13:59:46 crc kubenswrapper[4732]: I0402 13:59:46.530825 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/756f0330-2838-4d2f-a92f-739ed4acab76-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "756f0330-2838-4d2f-a92f-739ed4acab76" (UID: "756f0330-2838-4d2f-a92f-739ed4acab76"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 13:59:46 crc kubenswrapper[4732]: I0402 13:59:46.551884 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/756f0330-2838-4d2f-a92f-739ed4acab76-config-data" (OuterVolumeSpecName: "config-data") pod "756f0330-2838-4d2f-a92f-739ed4acab76" (UID: "756f0330-2838-4d2f-a92f-739ed4acab76"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 13:59:46 crc kubenswrapper[4732]: I0402 13:59:46.610426 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/756f0330-2838-4d2f-a92f-739ed4acab76-config-data\") on node \"crc\" DevicePath \"\""
Apr 02 13:59:46 crc kubenswrapper[4732]: I0402 13:59:46.610468 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5znkx\" (UniqueName: \"kubernetes.io/projected/756f0330-2838-4d2f-a92f-739ed4acab76-kube-api-access-5znkx\") on node \"crc\" DevicePath \"\""
Apr 02 13:59:46 crc kubenswrapper[4732]: I0402 13:59:46.610484 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/756f0330-2838-4d2f-a92f-739ed4acab76-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Apr 02 13:59:46 crc kubenswrapper[4732]: I0402 13:59:46.689773 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56a9e53f-9667-48a5-8065-ad30fd550a7d" path="/var/lib/kubelet/pods/56a9e53f-9667-48a5-8065-ad30fd550a7d/volumes"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.073507 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-qgp9c" event={"ID":"756f0330-2838-4d2f-a92f-739ed4acab76","Type":"ContainerDied","Data":"b24be22dd1b093fd09676989ee6985873b9f5d2f4546ba0d0f275072ba46acab"}
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.073545 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b24be22dd1b093fd09676989ee6985873b9f5d2f4546ba0d0f275072ba46acab"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.073582 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-qgp9c"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.270986 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-nfzft"]
Apr 02 13:59:47 crc kubenswrapper[4732]: E0402 13:59:47.271311 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56a9e53f-9667-48a5-8065-ad30fd550a7d" containerName="init"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.271326 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="56a9e53f-9667-48a5-8065-ad30fd550a7d" containerName="init"
Apr 02 13:59:47 crc kubenswrapper[4732]: E0402 13:59:47.271341 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ed7061f-5bb0-4113-851f-45cb0af3e77d" containerName="mariadb-database-create"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.271348 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ed7061f-5bb0-4113-851f-45cb0af3e77d" containerName="mariadb-database-create"
Apr 02 13:59:47 crc kubenswrapper[4732]: E0402 13:59:47.271359 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61ff3e0e-f82e-4130-afa8-739689043221" containerName="mariadb-account-create-update"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.271367 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="61ff3e0e-f82e-4130-afa8-739689043221" containerName="mariadb-account-create-update"
Apr 02 13:59:47 crc kubenswrapper[4732]: E0402 13:59:47.271380 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="354dc245-41f8-48ca-8fef-ef66ea015690" containerName="mariadb-database-create"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.271385 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="354dc245-41f8-48ca-8fef-ef66ea015690" containerName="mariadb-database-create"
Apr 02 13:59:47 crc kubenswrapper[4732]: E0402 13:59:47.271396 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56a9e53f-9667-48a5-8065-ad30fd550a7d" containerName="dnsmasq-dns"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.271401 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="56a9e53f-9667-48a5-8065-ad30fd550a7d" containerName="dnsmasq-dns"
Apr 02 13:59:47 crc kubenswrapper[4732]: E0402 13:59:47.271412 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="756f0330-2838-4d2f-a92f-739ed4acab76" containerName="keystone-db-sync"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.271418 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="756f0330-2838-4d2f-a92f-739ed4acab76" containerName="keystone-db-sync"
Apr 02 13:59:47 crc kubenswrapper[4732]: E0402 13:59:47.271428 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7e9d6ea-5e58-46fb-a241-6db82d0abd15" containerName="mariadb-account-create-update"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.271435 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7e9d6ea-5e58-46fb-a241-6db82d0abd15" containerName="mariadb-account-create-update"
Apr 02 13:59:47 crc kubenswrapper[4732]: E0402 13:59:47.271451 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50683249-0922-48bc-9bea-f6ce81e3d192" containerName="mariadb-database-create"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.271457 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="50683249-0922-48bc-9bea-f6ce81e3d192" containerName="mariadb-database-create"
Apr 02 13:59:47 crc kubenswrapper[4732]: E0402 13:59:47.271469 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2055195c-7029-4ba4-b6b1-7e717991cbb3" containerName="mariadb-account-create-update"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.271475 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="2055195c-7029-4ba4-b6b1-7e717991cbb3" containerName="mariadb-account-create-update"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.271640 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="354dc245-41f8-48ca-8fef-ef66ea015690" containerName="mariadb-database-create"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.271670 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="50683249-0922-48bc-9bea-f6ce81e3d192" containerName="mariadb-database-create"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.271687 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7e9d6ea-5e58-46fb-a241-6db82d0abd15" containerName="mariadb-account-create-update"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.271701 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ed7061f-5bb0-4113-851f-45cb0af3e77d" containerName="mariadb-database-create"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.271710 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="61ff3e0e-f82e-4130-afa8-739689043221" containerName="mariadb-account-create-update"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.271722 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="56a9e53f-9667-48a5-8065-ad30fd550a7d" containerName="dnsmasq-dns"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.271734 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="756f0330-2838-4d2f-a92f-739ed4acab76" containerName="keystone-db-sync"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.271746 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="2055195c-7029-4ba4-b6b1-7e717991cbb3" containerName="mariadb-account-create-update"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.272633 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c9c9f998c-nfzft"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.286744 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-nfzft"]
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.307692 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-j4fj8"]
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.308957 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-j4fj8"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.315259 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.315535 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-scsz6"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.315546 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.315706 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.315795 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.319083 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-j4fj8"]
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.320136 4732
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0fa7c556-fc74-4001-9066-55e84d9a4670-ovsdbserver-sb\") pod \"dnsmasq-dns-6c9c9f998c-nfzft\" (UID: \"0fa7c556-fc74-4001-9066-55e84d9a4670\") " pod="openstack/dnsmasq-dns-6c9c9f998c-nfzft" Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.320188 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0fa7c556-fc74-4001-9066-55e84d9a4670-config\") pod \"dnsmasq-dns-6c9c9f998c-nfzft\" (UID: \"0fa7c556-fc74-4001-9066-55e84d9a4670\") " pod="openstack/dnsmasq-dns-6c9c9f998c-nfzft" Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.320240 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0fa7c556-fc74-4001-9066-55e84d9a4670-dns-svc\") pod \"dnsmasq-dns-6c9c9f998c-nfzft\" (UID: \"0fa7c556-fc74-4001-9066-55e84d9a4670\") " pod="openstack/dnsmasq-dns-6c9c9f998c-nfzft" Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.320270 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0fa7c556-fc74-4001-9066-55e84d9a4670-dns-swift-storage-0\") pod \"dnsmasq-dns-6c9c9f998c-nfzft\" (UID: \"0fa7c556-fc74-4001-9066-55e84d9a4670\") " pod="openstack/dnsmasq-dns-6c9c9f998c-nfzft" Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.320465 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khdrl\" (UniqueName: \"kubernetes.io/projected/0fa7c556-fc74-4001-9066-55e84d9a4670-kube-api-access-khdrl\") pod \"dnsmasq-dns-6c9c9f998c-nfzft\" (UID: \"0fa7c556-fc74-4001-9066-55e84d9a4670\") " pod="openstack/dnsmasq-dns-6c9c9f998c-nfzft" Apr 02 13:59:47 crc 
kubenswrapper[4732]: I0402 13:59:47.320503 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0fa7c556-fc74-4001-9066-55e84d9a4670-ovsdbserver-nb\") pod \"dnsmasq-dns-6c9c9f998c-nfzft\" (UID: \"0fa7c556-fc74-4001-9066-55e84d9a4670\") " pod="openstack/dnsmasq-dns-6c9c9f998c-nfzft" Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.422528 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khdrl\" (UniqueName: \"kubernetes.io/projected/0fa7c556-fc74-4001-9066-55e84d9a4670-kube-api-access-khdrl\") pod \"dnsmasq-dns-6c9c9f998c-nfzft\" (UID: \"0fa7c556-fc74-4001-9066-55e84d9a4670\") " pod="openstack/dnsmasq-dns-6c9c9f998c-nfzft" Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.422625 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/28f87ba1-2996-40b9-8c0a-93545e49915e-credential-keys\") pod \"keystone-bootstrap-j4fj8\" (UID: \"28f87ba1-2996-40b9-8c0a-93545e49915e\") " pod="openstack/keystone-bootstrap-j4fj8" Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.422654 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28f87ba1-2996-40b9-8c0a-93545e49915e-combined-ca-bundle\") pod \"keystone-bootstrap-j4fj8\" (UID: \"28f87ba1-2996-40b9-8c0a-93545e49915e\") " pod="openstack/keystone-bootstrap-j4fj8" Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.422684 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0fa7c556-fc74-4001-9066-55e84d9a4670-ovsdbserver-nb\") pod \"dnsmasq-dns-6c9c9f998c-nfzft\" (UID: \"0fa7c556-fc74-4001-9066-55e84d9a4670\") " pod="openstack/dnsmasq-dns-6c9c9f998c-nfzft" Apr 02 13:59:47 
crc kubenswrapper[4732]: I0402 13:59:47.422724 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28f87ba1-2996-40b9-8c0a-93545e49915e-scripts\") pod \"keystone-bootstrap-j4fj8\" (UID: \"28f87ba1-2996-40b9-8c0a-93545e49915e\") " pod="openstack/keystone-bootstrap-j4fj8" Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.422886 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0fa7c556-fc74-4001-9066-55e84d9a4670-ovsdbserver-sb\") pod \"dnsmasq-dns-6c9c9f998c-nfzft\" (UID: \"0fa7c556-fc74-4001-9066-55e84d9a4670\") " pod="openstack/dnsmasq-dns-6c9c9f998c-nfzft" Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.422922 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0fa7c556-fc74-4001-9066-55e84d9a4670-config\") pod \"dnsmasq-dns-6c9c9f998c-nfzft\" (UID: \"0fa7c556-fc74-4001-9066-55e84d9a4670\") " pod="openstack/dnsmasq-dns-6c9c9f998c-nfzft" Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.422969 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0fa7c556-fc74-4001-9066-55e84d9a4670-dns-svc\") pod \"dnsmasq-dns-6c9c9f998c-nfzft\" (UID: \"0fa7c556-fc74-4001-9066-55e84d9a4670\") " pod="openstack/dnsmasq-dns-6c9c9f998c-nfzft" Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.422999 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlbs7\" (UniqueName: \"kubernetes.io/projected/28f87ba1-2996-40b9-8c0a-93545e49915e-kube-api-access-rlbs7\") pod \"keystone-bootstrap-j4fj8\" (UID: \"28f87ba1-2996-40b9-8c0a-93545e49915e\") " pod="openstack/keystone-bootstrap-j4fj8" Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.423027 4732 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0fa7c556-fc74-4001-9066-55e84d9a4670-dns-swift-storage-0\") pod \"dnsmasq-dns-6c9c9f998c-nfzft\" (UID: \"0fa7c556-fc74-4001-9066-55e84d9a4670\") " pod="openstack/dnsmasq-dns-6c9c9f998c-nfzft" Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.423094 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/28f87ba1-2996-40b9-8c0a-93545e49915e-fernet-keys\") pod \"keystone-bootstrap-j4fj8\" (UID: \"28f87ba1-2996-40b9-8c0a-93545e49915e\") " pod="openstack/keystone-bootstrap-j4fj8" Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.423134 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28f87ba1-2996-40b9-8c0a-93545e49915e-config-data\") pod \"keystone-bootstrap-j4fj8\" (UID: \"28f87ba1-2996-40b9-8c0a-93545e49915e\") " pod="openstack/keystone-bootstrap-j4fj8" Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.424112 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0fa7c556-fc74-4001-9066-55e84d9a4670-ovsdbserver-sb\") pod \"dnsmasq-dns-6c9c9f998c-nfzft\" (UID: \"0fa7c556-fc74-4001-9066-55e84d9a4670\") " pod="openstack/dnsmasq-dns-6c9c9f998c-nfzft" Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.424176 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0fa7c556-fc74-4001-9066-55e84d9a4670-config\") pod \"dnsmasq-dns-6c9c9f998c-nfzft\" (UID: \"0fa7c556-fc74-4001-9066-55e84d9a4670\") " pod="openstack/dnsmasq-dns-6c9c9f998c-nfzft" Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.424826 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0fa7c556-fc74-4001-9066-55e84d9a4670-ovsdbserver-nb\") pod \"dnsmasq-dns-6c9c9f998c-nfzft\" (UID: \"0fa7c556-fc74-4001-9066-55e84d9a4670\") " pod="openstack/dnsmasq-dns-6c9c9f998c-nfzft" Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.425471 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0fa7c556-fc74-4001-9066-55e84d9a4670-dns-svc\") pod \"dnsmasq-dns-6c9c9f998c-nfzft\" (UID: \"0fa7c556-fc74-4001-9066-55e84d9a4670\") " pod="openstack/dnsmasq-dns-6c9c9f998c-nfzft" Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.426125 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0fa7c556-fc74-4001-9066-55e84d9a4670-dns-swift-storage-0\") pod \"dnsmasq-dns-6c9c9f998c-nfzft\" (UID: \"0fa7c556-fc74-4001-9066-55e84d9a4670\") " pod="openstack/dnsmasq-dns-6c9c9f998c-nfzft" Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.439655 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-fd6957bc-tm5df"] Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.440916 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-fd6957bc-tm5df" Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.450408 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-t64mn" Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.450684 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.450845 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.450934 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.468797 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-fd6957bc-tm5df"] Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.473718 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khdrl\" (UniqueName: \"kubernetes.io/projected/0fa7c556-fc74-4001-9066-55e84d9a4670-kube-api-access-khdrl\") pod \"dnsmasq-dns-6c9c9f998c-nfzft\" (UID: \"0fa7c556-fc74-4001-9066-55e84d9a4670\") " pod="openstack/dnsmasq-dns-6c9c9f998c-nfzft" Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.524723 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/28f87ba1-2996-40b9-8c0a-93545e49915e-credential-keys\") pod \"keystone-bootstrap-j4fj8\" (UID: \"28f87ba1-2996-40b9-8c0a-93545e49915e\") " pod="openstack/keystone-bootstrap-j4fj8" Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.524773 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28f87ba1-2996-40b9-8c0a-93545e49915e-combined-ca-bundle\") pod \"keystone-bootstrap-j4fj8\" (UID: \"28f87ba1-2996-40b9-8c0a-93545e49915e\") " 
pod="openstack/keystone-bootstrap-j4fj8" Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.524805 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28f87ba1-2996-40b9-8c0a-93545e49915e-scripts\") pod \"keystone-bootstrap-j4fj8\" (UID: \"28f87ba1-2996-40b9-8c0a-93545e49915e\") " pod="openstack/keystone-bootstrap-j4fj8" Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.524841 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c623b208-43b3-4c30-a6a1-9d4849513dfa-config-data\") pod \"horizon-fd6957bc-tm5df\" (UID: \"c623b208-43b3-4c30-a6a1-9d4849513dfa\") " pod="openstack/horizon-fd6957bc-tm5df" Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.524869 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7b5c\" (UniqueName: \"kubernetes.io/projected/c623b208-43b3-4c30-a6a1-9d4849513dfa-kube-api-access-m7b5c\") pod \"horizon-fd6957bc-tm5df\" (UID: \"c623b208-43b3-4c30-a6a1-9d4849513dfa\") " pod="openstack/horizon-fd6957bc-tm5df" Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.524893 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c623b208-43b3-4c30-a6a1-9d4849513dfa-scripts\") pod \"horizon-fd6957bc-tm5df\" (UID: \"c623b208-43b3-4c30-a6a1-9d4849513dfa\") " pod="openstack/horizon-fd6957bc-tm5df" Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.524926 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c623b208-43b3-4c30-a6a1-9d4849513dfa-logs\") pod \"horizon-fd6957bc-tm5df\" (UID: \"c623b208-43b3-4c30-a6a1-9d4849513dfa\") " pod="openstack/horizon-fd6957bc-tm5df" Apr 02 13:59:47 crc 
kubenswrapper[4732]: I0402 13:59:47.524946 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlbs7\" (UniqueName: \"kubernetes.io/projected/28f87ba1-2996-40b9-8c0a-93545e49915e-kube-api-access-rlbs7\") pod \"keystone-bootstrap-j4fj8\" (UID: \"28f87ba1-2996-40b9-8c0a-93545e49915e\") " pod="openstack/keystone-bootstrap-j4fj8" Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.524993 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c623b208-43b3-4c30-a6a1-9d4849513dfa-horizon-secret-key\") pod \"horizon-fd6957bc-tm5df\" (UID: \"c623b208-43b3-4c30-a6a1-9d4849513dfa\") " pod="openstack/horizon-fd6957bc-tm5df" Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.525028 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/28f87ba1-2996-40b9-8c0a-93545e49915e-fernet-keys\") pod \"keystone-bootstrap-j4fj8\" (UID: \"28f87ba1-2996-40b9-8c0a-93545e49915e\") " pod="openstack/keystone-bootstrap-j4fj8" Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.525057 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28f87ba1-2996-40b9-8c0a-93545e49915e-config-data\") pod \"keystone-bootstrap-j4fj8\" (UID: \"28f87ba1-2996-40b9-8c0a-93545e49915e\") " pod="openstack/keystone-bootstrap-j4fj8" Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.533984 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28f87ba1-2996-40b9-8c0a-93545e49915e-scripts\") pod \"keystone-bootstrap-j4fj8\" (UID: \"28f87ba1-2996-40b9-8c0a-93545e49915e\") " pod="openstack/keystone-bootstrap-j4fj8" Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.548596 4732 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/28f87ba1-2996-40b9-8c0a-93545e49915e-fernet-keys\") pod \"keystone-bootstrap-j4fj8\" (UID: \"28f87ba1-2996-40b9-8c0a-93545e49915e\") " pod="openstack/keystone-bootstrap-j4fj8" Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.548884 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28f87ba1-2996-40b9-8c0a-93545e49915e-config-data\") pod \"keystone-bootstrap-j4fj8\" (UID: \"28f87ba1-2996-40b9-8c0a-93545e49915e\") " pod="openstack/keystone-bootstrap-j4fj8" Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.553105 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/28f87ba1-2996-40b9-8c0a-93545e49915e-credential-keys\") pod \"keystone-bootstrap-j4fj8\" (UID: \"28f87ba1-2996-40b9-8c0a-93545e49915e\") " pod="openstack/keystone-bootstrap-j4fj8" Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.565926 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28f87ba1-2996-40b9-8c0a-93545e49915e-combined-ca-bundle\") pod \"keystone-bootstrap-j4fj8\" (UID: \"28f87ba1-2996-40b9-8c0a-93545e49915e\") " pod="openstack/keystone-bootstrap-j4fj8" Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.569605 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-89fvh"] Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.570744 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-89fvh" Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.587835 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.588081 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.589055 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-7tnt2" Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.590321 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlbs7\" (UniqueName: \"kubernetes.io/projected/28f87ba1-2996-40b9-8c0a-93545e49915e-kube-api-access-rlbs7\") pod \"keystone-bootstrap-j4fj8\" (UID: \"28f87ba1-2996-40b9-8c0a-93545e49915e\") " pod="openstack/keystone-bootstrap-j4fj8" Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.592730 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c9c9f998c-nfzft" Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.608989 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-89fvh"] Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.631481 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c623b208-43b3-4c30-a6a1-9d4849513dfa-logs\") pod \"horizon-fd6957bc-tm5df\" (UID: \"c623b208-43b3-4c30-a6a1-9d4849513dfa\") " pod="openstack/horizon-fd6957bc-tm5df" Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.631540 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjnxk\" (UniqueName: \"kubernetes.io/projected/481f6e75-b423-4c6c-a1d6-b43674481fc1-kube-api-access-tjnxk\") pod \"neutron-db-sync-89fvh\" (UID: \"481f6e75-b423-4c6c-a1d6-b43674481fc1\") " pod="openstack/neutron-db-sync-89fvh" Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.631575 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c623b208-43b3-4c30-a6a1-9d4849513dfa-horizon-secret-key\") pod \"horizon-fd6957bc-tm5df\" (UID: \"c623b208-43b3-4c30-a6a1-9d4849513dfa\") " pod="openstack/horizon-fd6957bc-tm5df" Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.631598 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/481f6e75-b423-4c6c-a1d6-b43674481fc1-config\") pod \"neutron-db-sync-89fvh\" (UID: \"481f6e75-b423-4c6c-a1d6-b43674481fc1\") " pod="openstack/neutron-db-sync-89fvh" Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.631660 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/481f6e75-b423-4c6c-a1d6-b43674481fc1-combined-ca-bundle\") pod \"neutron-db-sync-89fvh\" (UID: \"481f6e75-b423-4c6c-a1d6-b43674481fc1\") " pod="openstack/neutron-db-sync-89fvh" Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.631704 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c623b208-43b3-4c30-a6a1-9d4849513dfa-config-data\") pod \"horizon-fd6957bc-tm5df\" (UID: \"c623b208-43b3-4c30-a6a1-9d4849513dfa\") " pod="openstack/horizon-fd6957bc-tm5df" Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.631727 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7b5c\" (UniqueName: \"kubernetes.io/projected/c623b208-43b3-4c30-a6a1-9d4849513dfa-kube-api-access-m7b5c\") pod \"horizon-fd6957bc-tm5df\" (UID: \"c623b208-43b3-4c30-a6a1-9d4849513dfa\") " pod="openstack/horizon-fd6957bc-tm5df" Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.631747 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c623b208-43b3-4c30-a6a1-9d4849513dfa-scripts\") pod \"horizon-fd6957bc-tm5df\" (UID: \"c623b208-43b3-4c30-a6a1-9d4849513dfa\") " pod="openstack/horizon-fd6957bc-tm5df" Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.632406 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c623b208-43b3-4c30-a6a1-9d4849513dfa-scripts\") pod \"horizon-fd6957bc-tm5df\" (UID: \"c623b208-43b3-4c30-a6a1-9d4849513dfa\") " pod="openstack/horizon-fd6957bc-tm5df" Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.632654 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c623b208-43b3-4c30-a6a1-9d4849513dfa-logs\") pod \"horizon-fd6957bc-tm5df\" (UID: \"c623b208-43b3-4c30-a6a1-9d4849513dfa\") " 
pod="openstack/horizon-fd6957bc-tm5df" Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.637739 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c623b208-43b3-4c30-a6a1-9d4849513dfa-config-data\") pod \"horizon-fd6957bc-tm5df\" (UID: \"c623b208-43b3-4c30-a6a1-9d4849513dfa\") " pod="openstack/horizon-fd6957bc-tm5df" Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.638301 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-j4fj8" Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.646245 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c623b208-43b3-4c30-a6a1-9d4849513dfa-horizon-secret-key\") pod \"horizon-fd6957bc-tm5df\" (UID: \"c623b208-43b3-4c30-a6a1-9d4849513dfa\") " pod="openstack/horizon-fd6957bc-tm5df" Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.687573 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7b5c\" (UniqueName: \"kubernetes.io/projected/c623b208-43b3-4c30-a6a1-9d4849513dfa-kube-api-access-m7b5c\") pod \"horizon-fd6957bc-tm5df\" (UID: \"c623b208-43b3-4c30-a6a1-9d4849513dfa\") " pod="openstack/horizon-fd6957bc-tm5df" Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.704709 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.717956 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.725358 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.725551 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.732834 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjnxk\" (UniqueName: \"kubernetes.io/projected/481f6e75-b423-4c6c-a1d6-b43674481fc1-kube-api-access-tjnxk\") pod \"neutron-db-sync-89fvh\" (UID: \"481f6e75-b423-4c6c-a1d6-b43674481fc1\") " pod="openstack/neutron-db-sync-89fvh" Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.732909 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/481f6e75-b423-4c6c-a1d6-b43674481fc1-config\") pod \"neutron-db-sync-89fvh\" (UID: \"481f6e75-b423-4c6c-a1d6-b43674481fc1\") " pod="openstack/neutron-db-sync-89fvh" Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.732973 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/481f6e75-b423-4c6c-a1d6-b43674481fc1-combined-ca-bundle\") pod \"neutron-db-sync-89fvh\" (UID: \"481f6e75-b423-4c6c-a1d6-b43674481fc1\") " pod="openstack/neutron-db-sync-89fvh" Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.742430 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.744170 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/481f6e75-b423-4c6c-a1d6-b43674481fc1-config\") pod \"neutron-db-sync-89fvh\" (UID: \"481f6e75-b423-4c6c-a1d6-b43674481fc1\") " 
pod="openstack/neutron-db-sync-89fvh"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.749094 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/481f6e75-b423-4c6c-a1d6-b43674481fc1-combined-ca-bundle\") pod \"neutron-db-sync-89fvh\" (UID: \"481f6e75-b423-4c6c-a1d6-b43674481fc1\") " pod="openstack/neutron-db-sync-89fvh"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.756108 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-d54q6"]
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.764988 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-jwc6l"]
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.766085 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-jwc6l"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.770635 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.771047 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.771083 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-d54q6"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.771221 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-mcjr4"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.773719 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-pxmzf"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.773986 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.784036 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-jwc6l"]
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.794046 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjnxk\" (UniqueName: \"kubernetes.io/projected/481f6e75-b423-4c6c-a1d6-b43674481fc1-kube-api-access-tjnxk\") pod \"neutron-db-sync-89fvh\" (UID: \"481f6e75-b423-4c6c-a1d6-b43674481fc1\") " pod="openstack/neutron-db-sync-89fvh"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.801078 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-d54q6"]
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.808892 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-nfzft"]
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.818174 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.819491 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.825666 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-fd6957bc-tm5df"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.827019 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.827428 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.827480 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-vk67c"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.827564 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.838820 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-dh5g6"]
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.866981 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10ba0697-529f-41d3-a1a8-55b50ed024a2-scripts\") pod \"cinder-db-sync-jwc6l\" (UID: \"10ba0697-529f-41d3-a1a8-55b50ed024a2\") " pod="openstack/cinder-db-sync-jwc6l"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.867051 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2-scripts\") pod \"ceilometer-0\" (UID: \"d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2\") " pod="openstack/ceilometer-0"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.867106 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34aed337-bbff-45a7-b95f-b26c95733c82-combined-ca-bundle\") pod \"barbican-db-sync-d54q6\" (UID: \"34aed337-bbff-45a7-b95f-b26c95733c82\") " pod="openstack/barbican-db-sync-d54q6"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.867164 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/10ba0697-529f-41d3-a1a8-55b50ed024a2-db-sync-config-data\") pod \"cinder-db-sync-jwc6l\" (UID: \"10ba0697-529f-41d3-a1a8-55b50ed024a2\") " pod="openstack/cinder-db-sync-jwc6l"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.867220 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jd7m5\" (UniqueName: \"kubernetes.io/projected/10ba0697-529f-41d3-a1a8-55b50ed024a2-kube-api-access-jd7m5\") pod \"cinder-db-sync-jwc6l\" (UID: \"10ba0697-529f-41d3-a1a8-55b50ed024a2\") " pod="openstack/cinder-db-sync-jwc6l"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.867251 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/34aed337-bbff-45a7-b95f-b26c95733c82-db-sync-config-data\") pod \"barbican-db-sync-d54q6\" (UID: \"34aed337-bbff-45a7-b95f-b26c95733c82\") " pod="openstack/barbican-db-sync-d54q6"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.867307 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2-run-httpd\") pod \"ceilometer-0\" (UID: \"d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2\") " pod="openstack/ceilometer-0"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.867340 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2-log-httpd\") pod \"ceilometer-0\" (UID: \"d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2\") " pod="openstack/ceilometer-0"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.867388 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10ba0697-529f-41d3-a1a8-55b50ed024a2-combined-ca-bundle\") pod \"cinder-db-sync-jwc6l\" (UID: \"10ba0697-529f-41d3-a1a8-55b50ed024a2\") " pod="openstack/cinder-db-sync-jwc6l"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.867440 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2\") " pod="openstack/ceilometer-0"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.867482 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/10ba0697-529f-41d3-a1a8-55b50ed024a2-etc-machine-id\") pod \"cinder-db-sync-jwc6l\" (UID: \"10ba0697-529f-41d3-a1a8-55b50ed024a2\") " pod="openstack/cinder-db-sync-jwc6l"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.867515 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2-config-data\") pod \"ceilometer-0\" (UID: \"d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2\") " pod="openstack/ceilometer-0"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.867591 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xb6j4\" (UniqueName: \"kubernetes.io/projected/34aed337-bbff-45a7-b95f-b26c95733c82-kube-api-access-xb6j4\") pod \"barbican-db-sync-d54q6\" (UID: \"34aed337-bbff-45a7-b95f-b26c95733c82\") " pod="openstack/barbican-db-sync-d54q6"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.868736 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-dh5g6"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.878778 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2\") " pod="openstack/ceilometer-0"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.878894 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10ba0697-529f-41d3-a1a8-55b50ed024a2-config-data\") pod \"cinder-db-sync-jwc6l\" (UID: \"10ba0697-529f-41d3-a1a8-55b50ed024a2\") " pod="openstack/cinder-db-sync-jwc6l"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.879072 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt6px\" (UniqueName: \"kubernetes.io/projected/d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2-kube-api-access-kt6px\") pod \"ceilometer-0\" (UID: \"d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2\") " pod="openstack/ceilometer-0"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.891184 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.894813 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-t4l5w"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.895117 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.897478 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.925141 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-dh5g6"]
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.986638 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2-run-httpd\") pod \"ceilometer-0\" (UID: \"d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2\") " pod="openstack/ceilometer-0"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.986687 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29n8v\" (UniqueName: \"kubernetes.io/projected/64f807d9-0af7-4723-98b2-dd3cbe55df99-kube-api-access-29n8v\") pod \"placement-db-sync-dh5g6\" (UID: \"64f807d9-0af7-4723-98b2-dd3cbe55df99\") " pod="openstack/placement-db-sync-dh5g6"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.986711 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2-log-httpd\") pod \"ceilometer-0\" (UID: \"d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2\") " pod="openstack/ceilometer-0"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.986734 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b\") " pod="openstack/glance-default-internal-api-0"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.986763 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10ba0697-529f-41d3-a1a8-55b50ed024a2-combined-ca-bundle\") pod \"cinder-db-sync-jwc6l\" (UID: \"10ba0697-529f-41d3-a1a8-55b50ed024a2\") " pod="openstack/cinder-db-sync-jwc6l"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.986787 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b\") " pod="openstack/glance-default-internal-api-0"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.986813 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2\") " pod="openstack/ceilometer-0"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.986839 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/10ba0697-529f-41d3-a1a8-55b50ed024a2-etc-machine-id\") pod \"cinder-db-sync-jwc6l\" (UID: \"10ba0697-529f-41d3-a1a8-55b50ed024a2\") " pod="openstack/cinder-db-sync-jwc6l"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.986853 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2-config-data\") pod \"ceilometer-0\" (UID: \"d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2\") " pod="openstack/ceilometer-0"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.986893 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b\") " pod="openstack/glance-default-internal-api-0"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.986922 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64f807d9-0af7-4723-98b2-dd3cbe55df99-config-data\") pod \"placement-db-sync-dh5g6\" (UID: \"64f807d9-0af7-4723-98b2-dd3cbe55df99\") " pod="openstack/placement-db-sync-dh5g6"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.986942 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xb6j4\" (UniqueName: \"kubernetes.io/projected/34aed337-bbff-45a7-b95f-b26c95733c82-kube-api-access-xb6j4\") pod \"barbican-db-sync-d54q6\" (UID: \"34aed337-bbff-45a7-b95f-b26c95733c82\") " pod="openstack/barbican-db-sync-d54q6"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.986960 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzz5f\" (UniqueName: \"kubernetes.io/projected/2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b-kube-api-access-jzz5f\") pod \"glance-default-internal-api-0\" (UID: \"2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b\") " pod="openstack/glance-default-internal-api-0"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.986978 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2\") " pod="openstack/ceilometer-0"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.987023 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10ba0697-529f-41d3-a1a8-55b50ed024a2-config-data\") pod \"cinder-db-sync-jwc6l\" (UID: \"10ba0697-529f-41d3-a1a8-55b50ed024a2\") " pod="openstack/cinder-db-sync-jwc6l"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.987038 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b\") " pod="openstack/glance-default-internal-api-0"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.987065 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64f807d9-0af7-4723-98b2-dd3cbe55df99-scripts\") pod \"placement-db-sync-dh5g6\" (UID: \"64f807d9-0af7-4723-98b2-dd3cbe55df99\") " pod="openstack/placement-db-sync-dh5g6"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.987088 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64f807d9-0af7-4723-98b2-dd3cbe55df99-combined-ca-bundle\") pod \"placement-db-sync-dh5g6\" (UID: \"64f807d9-0af7-4723-98b2-dd3cbe55df99\") " pod="openstack/placement-db-sync-dh5g6"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.987112 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64f807d9-0af7-4723-98b2-dd3cbe55df99-logs\") pod \"placement-db-sync-dh5g6\" (UID: \"64f807d9-0af7-4723-98b2-dd3cbe55df99\") " pod="openstack/placement-db-sync-dh5g6"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.987136 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kt6px\" (UniqueName: \"kubernetes.io/projected/d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2-kube-api-access-kt6px\") pod \"ceilometer-0\" (UID: \"d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2\") " pod="openstack/ceilometer-0"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.987160 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b\") " pod="openstack/glance-default-internal-api-0"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.987180 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b-logs\") pod \"glance-default-internal-api-0\" (UID: \"2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b\") " pod="openstack/glance-default-internal-api-0"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.987203 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10ba0697-529f-41d3-a1a8-55b50ed024a2-scripts\") pod \"cinder-db-sync-jwc6l\" (UID: \"10ba0697-529f-41d3-a1a8-55b50ed024a2\") " pod="openstack/cinder-db-sync-jwc6l"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.987222 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2-scripts\") pod \"ceilometer-0\" (UID: \"d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2\") " pod="openstack/ceilometer-0"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.987305 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b\") " pod="openstack/glance-default-internal-api-0"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.987342 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2-run-httpd\") pod \"ceilometer-0\" (UID: \"d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2\") " pod="openstack/ceilometer-0"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.987555 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2-log-httpd\") pod \"ceilometer-0\" (UID: \"d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2\") " pod="openstack/ceilometer-0"
Apr 02 13:59:47 crc kubenswrapper[4732]: I0402 13:59:47.988138 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/10ba0697-529f-41d3-a1a8-55b50ed024a2-etc-machine-id\") pod \"cinder-db-sync-jwc6l\" (UID: \"10ba0697-529f-41d3-a1a8-55b50ed024a2\") " pod="openstack/cinder-db-sync-jwc6l"
Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:47.998672 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10ba0697-529f-41d3-a1a8-55b50ed024a2-combined-ca-bundle\") pod \"cinder-db-sync-jwc6l\" (UID: \"10ba0697-529f-41d3-a1a8-55b50ed024a2\") " pod="openstack/cinder-db-sync-jwc6l"
Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:47.998967 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34aed337-bbff-45a7-b95f-b26c95733c82-combined-ca-bundle\") pod \"barbican-db-sync-d54q6\" (UID: \"34aed337-bbff-45a7-b95f-b26c95733c82\") " pod="openstack/barbican-db-sync-d54q6"
Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.000197 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2-config-data\") pod \"ceilometer-0\" (UID: \"d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2\") " pod="openstack/ceilometer-0"
Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.000764 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/10ba0697-529f-41d3-a1a8-55b50ed024a2-db-sync-config-data\") pod \"cinder-db-sync-jwc6l\" (UID: \"10ba0697-529f-41d3-a1a8-55b50ed024a2\") " pod="openstack/cinder-db-sync-jwc6l"
Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.001917 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jd7m5\" (UniqueName: \"kubernetes.io/projected/10ba0697-529f-41d3-a1a8-55b50ed024a2-kube-api-access-jd7m5\") pod \"cinder-db-sync-jwc6l\" (UID: \"10ba0697-529f-41d3-a1a8-55b50ed024a2\") " pod="openstack/cinder-db-sync-jwc6l"
Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.002190 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/34aed337-bbff-45a7-b95f-b26c95733c82-db-sync-config-data\") pod \"barbican-db-sync-d54q6\" (UID: \"34aed337-bbff-45a7-b95f-b26c95733c82\") " pod="openstack/barbican-db-sync-d54q6"
Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.002364 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2\") " pod="openstack/ceilometer-0"
Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.002489 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10ba0697-529f-41d3-a1a8-55b50ed024a2-scripts\") pod \"cinder-db-sync-jwc6l\" (UID: \"10ba0697-529f-41d3-a1a8-55b50ed024a2\") " pod="openstack/cinder-db-sync-jwc6l"
Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.003636 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2\") " pod="openstack/ceilometer-0"
Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.014162 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/10ba0697-529f-41d3-a1a8-55b50ed024a2-db-sync-config-data\") pod \"cinder-db-sync-jwc6l\" (UID: \"10ba0697-529f-41d3-a1a8-55b50ed024a2\") " pod="openstack/cinder-db-sync-jwc6l"
Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.016253 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34aed337-bbff-45a7-b95f-b26c95733c82-combined-ca-bundle\") pod \"barbican-db-sync-d54q6\" (UID: \"34aed337-bbff-45a7-b95f-b26c95733c82\") " pod="openstack/barbican-db-sync-d54q6"
Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.016432 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2-scripts\") pod \"ceilometer-0\" (UID: \"d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2\") " pod="openstack/ceilometer-0"
Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.023874 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/34aed337-bbff-45a7-b95f-b26c95733c82-db-sync-config-data\") pod \"barbican-db-sync-d54q6\" (UID: \"34aed337-bbff-45a7-b95f-b26c95733c82\") " pod="openstack/barbican-db-sync-d54q6"
Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.025834 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-l9kx4"]
Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.026428 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10ba0697-529f-41d3-a1a8-55b50ed024a2-config-data\") pod \"cinder-db-sync-jwc6l\" (UID: \"10ba0697-529f-41d3-a1a8-55b50ed024a2\") " pod="openstack/cinder-db-sync-jwc6l"
Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.028114 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-l9kx4"
Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.061275 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xb6j4\" (UniqueName: \"kubernetes.io/projected/34aed337-bbff-45a7-b95f-b26c95733c82-kube-api-access-xb6j4\") pod \"barbican-db-sync-d54q6\" (UID: \"34aed337-bbff-45a7-b95f-b26c95733c82\") " pod="openstack/barbican-db-sync-d54q6"
Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.061944 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-l9kx4"]
Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.063043 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-89fvh"
Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.075089 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7b485f6745-hxwxb"]
Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.080499 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jd7m5\" (UniqueName: \"kubernetes.io/projected/10ba0697-529f-41d3-a1a8-55b50ed024a2-kube-api-access-jd7m5\") pod \"cinder-db-sync-jwc6l\" (UID: \"10ba0697-529f-41d3-a1a8-55b50ed024a2\") " pod="openstack/cinder-db-sync-jwc6l"
Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.083498 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kt6px\" (UniqueName: \"kubernetes.io/projected/d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2-kube-api-access-kt6px\") pod \"ceilometer-0\" (UID: \"d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2\") " pod="openstack/ceilometer-0"
Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.083892 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-jwc6l"
Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.085961 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7b485f6745-hxwxb"]
Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.086059 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7b485f6745-hxwxb"
Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.088642 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.090183 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.112253 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.113691 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b\") " pod="openstack/glance-default-internal-api-0"
Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.113740 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38e262a1-5383-4dfb-8f89-537e4db5559c-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-l9kx4\" (UID: \"38e262a1-5383-4dfb-8f89-537e4db5559c\") " pod="openstack/dnsmasq-dns-57c957c4ff-l9kx4"
Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.113763 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38e262a1-5383-4dfb-8f89-537e4db5559c-config\") pod \"dnsmasq-dns-57c957c4ff-l9kx4\" (UID: \"38e262a1-5383-4dfb-8f89-537e4db5559c\") " pod="openstack/dnsmasq-dns-57c957c4ff-l9kx4"
Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.113783 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64f807d9-0af7-4723-98b2-dd3cbe55df99-config-data\") pod \"placement-db-sync-dh5g6\" (UID: \"64f807d9-0af7-4723-98b2-dd3cbe55df99\") " pod="openstack/placement-db-sync-dh5g6"
Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.113804 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzz5f\" (UniqueName: \"kubernetes.io/projected/2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b-kube-api-access-jzz5f\") pod \"glance-default-internal-api-0\" (UID: \"2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b\") " pod="openstack/glance-default-internal-api-0"
Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.113826 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/38e262a1-5383-4dfb-8f89-537e4db5559c-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-l9kx4\" (UID: \"38e262a1-5383-4dfb-8f89-537e4db5559c\") " pod="openstack/dnsmasq-dns-57c957c4ff-l9kx4"
Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.113847 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b\") " pod="openstack/glance-default-internal-api-0"
Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.113872 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64f807d9-0af7-4723-98b2-dd3cbe55df99-scripts\") pod \"placement-db-sync-dh5g6\" (UID: \"64f807d9-0af7-4723-98b2-dd3cbe55df99\") " pod="openstack/placement-db-sync-dh5g6"
Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.113893 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64f807d9-0af7-4723-98b2-dd3cbe55df99-combined-ca-bundle\") pod \"placement-db-sync-dh5g6\" (UID: \"64f807d9-0af7-4723-98b2-dd3cbe55df99\") " pod="openstack/placement-db-sync-dh5g6"
Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.113914 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64f807d9-0af7-4723-98b2-dd3cbe55df99-logs\") pod \"placement-db-sync-dh5g6\" (UID: \"64f807d9-0af7-4723-98b2-dd3cbe55df99\") " pod="openstack/placement-db-sync-dh5g6"
Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.113940 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b\") " pod="openstack/glance-default-internal-api-0"
Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.113959 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b-logs\") pod \"glance-default-internal-api-0\" (UID: \"2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b\") " pod="openstack/glance-default-internal-api-0"
Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.113981 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38e262a1-5383-4dfb-8f89-537e4db5559c-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-l9kx4\" (UID: \"38e262a1-5383-4dfb-8f89-537e4db5559c\") " pod="openstack/dnsmasq-dns-57c957c4ff-l9kx4"
Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.114001 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b\") " pod="openstack/glance-default-internal-api-0"
Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.114040 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29n8v\" (UniqueName: \"kubernetes.io/projected/64f807d9-0af7-4723-98b2-dd3cbe55df99-kube-api-access-29n8v\") pod \"placement-db-sync-dh5g6\" (UID: \"64f807d9-0af7-4723-98b2-dd3cbe55df99\") " pod="openstack/placement-db-sync-dh5g6"
Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.114062 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b\") " pod="openstack/glance-default-internal-api-0"
Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.114084 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b\") " pod="openstack/glance-default-internal-api-0"
Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.114102 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/38e262a1-5383-4dfb-8f89-537e4db5559c-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-l9kx4\" (UID: \"38e262a1-5383-4dfb-8f89-537e4db5559c\") " pod="openstack/dnsmasq-dns-57c957c4ff-l9kx4"
Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.114125 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sr66\" (UniqueName: \"kubernetes.io/projected/38e262a1-5383-4dfb-8f89-537e4db5559c-kube-api-access-6sr66\") pod \"dnsmasq-dns-57c957c4ff-l9kx4\" (UID: \"38e262a1-5383-4dfb-8f89-537e4db5559c\") " pod="openstack/dnsmasq-dns-57c957c4ff-l9kx4"
Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.114863 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64f807d9-0af7-4723-98b2-dd3cbe55df99-logs\") pod \"placement-db-sync-dh5g6\" (UID: \"64f807d9-0af7-4723-98b2-dd3cbe55df99\") " pod="openstack/placement-db-sync-dh5g6"
Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.115464 4732 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0"
Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.119094 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b\") " pod="openstack/glance-default-internal-api-0"
Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.119324 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b-logs\") pod \"glance-default-internal-api-0\" (UID: \"2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b\") " pod="openstack/glance-default-internal-api-0"
Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.127872 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.128099 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.130392 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64f807d9-0af7-4723-98b2-dd3cbe55df99-config-data\") pod \"placement-db-sync-dh5g6\" (UID: \"64f807d9-0af7-4723-98b2-dd3cbe55df99\") " pod="openstack/placement-db-sync-dh5g6"
Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.150514 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-d54q6"
Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.160457 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzz5f\" (UniqueName: \"kubernetes.io/projected/2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b-kube-api-access-jzz5f\") pod \"glance-default-internal-api-0\" (UID: \"2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b\") " pod="openstack/glance-default-internal-api-0"
Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.161831 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64f807d9-0af7-4723-98b2-dd3cbe55df99-scripts\") pod \"placement-db-sync-dh5g6\" (UID: \"64f807d9-0af7-4723-98b2-dd3cbe55df99\") " pod="openstack/placement-db-sync-dh5g6"
Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.162517 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b\") " pod="openstack/glance-default-internal-api-0"
Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.162770 4732
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64f807d9-0af7-4723-98b2-dd3cbe55df99-combined-ca-bundle\") pod \"placement-db-sync-dh5g6\" (UID: \"64f807d9-0af7-4723-98b2-dd3cbe55df99\") " pod="openstack/placement-db-sync-dh5g6" Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.166315 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b\") " pod="openstack/glance-default-internal-api-0" Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.173779 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b\") " pod="openstack/glance-default-internal-api-0" Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.186748 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b\") " pod="openstack/glance-default-internal-api-0" Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.188210 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b\") " pod="openstack/glance-default-internal-api-0" Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.202255 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29n8v\" (UniqueName: 
\"kubernetes.io/projected/64f807d9-0af7-4723-98b2-dd3cbe55df99-kube-api-access-29n8v\") pod \"placement-db-sync-dh5g6\" (UID: \"64f807d9-0af7-4723-98b2-dd3cbe55df99\") " pod="openstack/placement-db-sync-dh5g6" Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.210851 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-dh5g6" Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.215756 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/38e262a1-5383-4dfb-8f89-537e4db5559c-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-l9kx4\" (UID: \"38e262a1-5383-4dfb-8f89-537e4db5559c\") " pod="openstack/dnsmasq-dns-57c957c4ff-l9kx4" Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.215814 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6sr66\" (UniqueName: \"kubernetes.io/projected/38e262a1-5383-4dfb-8f89-537e4db5559c-kube-api-access-6sr66\") pod \"dnsmasq-dns-57c957c4ff-l9kx4\" (UID: \"38e262a1-5383-4dfb-8f89-537e4db5559c\") " pod="openstack/dnsmasq-dns-57c957c4ff-l9kx4" Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.215842 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cf737d2-a567-4afe-9490-b97b6c5d09e6-config-data\") pod \"glance-default-external-api-0\" (UID: \"2cf737d2-a567-4afe-9490-b97b6c5d09e6\") " pod="openstack/glance-default-external-api-0" Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.215875 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cf737d2-a567-4afe-9490-b97b6c5d09e6-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2cf737d2-a567-4afe-9490-b97b6c5d09e6\") " pod="openstack/glance-default-external-api-0" 
Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.215905 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cf737d2-a567-4afe-9490-b97b6c5d09e6-scripts\") pod \"glance-default-external-api-0\" (UID: \"2cf737d2-a567-4afe-9490-b97b6c5d09e6\") " pod="openstack/glance-default-external-api-0" Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.215919 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cf737d2-a567-4afe-9490-b97b6c5d09e6-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2cf737d2-a567-4afe-9490-b97b6c5d09e6\") " pod="openstack/glance-default-external-api-0" Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.215954 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38e262a1-5383-4dfb-8f89-537e4db5559c-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-l9kx4\" (UID: \"38e262a1-5383-4dfb-8f89-537e4db5559c\") " pod="openstack/dnsmasq-dns-57c957c4ff-l9kx4" Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.215971 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40deaea9-2a82-456d-802f-7829e6d12a9b-logs\") pod \"horizon-7b485f6745-hxwxb\" (UID: \"40deaea9-2a82-456d-802f-7829e6d12a9b\") " pod="openstack/horizon-7b485f6745-hxwxb" Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.215993 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38e262a1-5383-4dfb-8f89-537e4db5559c-config\") pod \"dnsmasq-dns-57c957c4ff-l9kx4\" (UID: \"38e262a1-5383-4dfb-8f89-537e4db5559c\") " pod="openstack/dnsmasq-dns-57c957c4ff-l9kx4" Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.216036 4732 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/38e262a1-5383-4dfb-8f89-537e4db5559c-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-l9kx4\" (UID: \"38e262a1-5383-4dfb-8f89-537e4db5559c\") " pod="openstack/dnsmasq-dns-57c957c4ff-l9kx4" Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.216054 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/40deaea9-2a82-456d-802f-7829e6d12a9b-horizon-secret-key\") pod \"horizon-7b485f6745-hxwxb\" (UID: \"40deaea9-2a82-456d-802f-7829e6d12a9b\") " pod="openstack/horizon-7b485f6745-hxwxb" Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.216074 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/40deaea9-2a82-456d-802f-7829e6d12a9b-scripts\") pod \"horizon-7b485f6745-hxwxb\" (UID: \"40deaea9-2a82-456d-802f-7829e6d12a9b\") " pod="openstack/horizon-7b485f6745-hxwxb" Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.216105 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2cf737d2-a567-4afe-9490-b97b6c5d09e6-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2cf737d2-a567-4afe-9490-b97b6c5d09e6\") " pod="openstack/glance-default-external-api-0" Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.216139 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/40deaea9-2a82-456d-802f-7829e6d12a9b-config-data\") pod \"horizon-7b485f6745-hxwxb\" (UID: \"40deaea9-2a82-456d-802f-7829e6d12a9b\") " pod="openstack/horizon-7b485f6745-hxwxb" Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.216178 4732 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"2cf737d2-a567-4afe-9490-b97b6c5d09e6\") " pod="openstack/glance-default-external-api-0" Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.216208 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38e262a1-5383-4dfb-8f89-537e4db5559c-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-l9kx4\" (UID: \"38e262a1-5383-4dfb-8f89-537e4db5559c\") " pod="openstack/dnsmasq-dns-57c957c4ff-l9kx4" Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.216229 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4zcf\" (UniqueName: \"kubernetes.io/projected/2cf737d2-a567-4afe-9490-b97b6c5d09e6-kube-api-access-q4zcf\") pod \"glance-default-external-api-0\" (UID: \"2cf737d2-a567-4afe-9490-b97b6c5d09e6\") " pod="openstack/glance-default-external-api-0" Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.216268 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7c8s\" (UniqueName: \"kubernetes.io/projected/40deaea9-2a82-456d-802f-7829e6d12a9b-kube-api-access-m7c8s\") pod \"horizon-7b485f6745-hxwxb\" (UID: \"40deaea9-2a82-456d-802f-7829e6d12a9b\") " pod="openstack/horizon-7b485f6745-hxwxb" Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.216303 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2cf737d2-a567-4afe-9490-b97b6c5d09e6-logs\") pod \"glance-default-external-api-0\" (UID: \"2cf737d2-a567-4afe-9490-b97b6c5d09e6\") " pod="openstack/glance-default-external-api-0" Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.218203 
4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/38e262a1-5383-4dfb-8f89-537e4db5559c-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-l9kx4\" (UID: \"38e262a1-5383-4dfb-8f89-537e4db5559c\") " pod="openstack/dnsmasq-dns-57c957c4ff-l9kx4" Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.218950 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/38e262a1-5383-4dfb-8f89-537e4db5559c-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-l9kx4\" (UID: \"38e262a1-5383-4dfb-8f89-537e4db5559c\") " pod="openstack/dnsmasq-dns-57c957c4ff-l9kx4" Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.219424 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38e262a1-5383-4dfb-8f89-537e4db5559c-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-l9kx4\" (UID: \"38e262a1-5383-4dfb-8f89-537e4db5559c\") " pod="openstack/dnsmasq-dns-57c957c4ff-l9kx4" Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.219643 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38e262a1-5383-4dfb-8f89-537e4db5559c-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-l9kx4\" (UID: \"38e262a1-5383-4dfb-8f89-537e4db5559c\") " pod="openstack/dnsmasq-dns-57c957c4ff-l9kx4" Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.225989 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38e262a1-5383-4dfb-8f89-537e4db5559c-config\") pod \"dnsmasq-dns-57c957c4ff-l9kx4\" (UID: \"38e262a1-5383-4dfb-8f89-537e4db5559c\") " pod="openstack/dnsmasq-dns-57c957c4ff-l9kx4" Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.252752 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sr66\" (UniqueName: 
\"kubernetes.io/projected/38e262a1-5383-4dfb-8f89-537e4db5559c-kube-api-access-6sr66\") pod \"dnsmasq-dns-57c957c4ff-l9kx4\" (UID: \"38e262a1-5383-4dfb-8f89-537e4db5559c\") " pod="openstack/dnsmasq-dns-57c957c4ff-l9kx4" Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.317417 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4zcf\" (UniqueName: \"kubernetes.io/projected/2cf737d2-a567-4afe-9490-b97b6c5d09e6-kube-api-access-q4zcf\") pod \"glance-default-external-api-0\" (UID: \"2cf737d2-a567-4afe-9490-b97b6c5d09e6\") " pod="openstack/glance-default-external-api-0" Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.317768 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7c8s\" (UniqueName: \"kubernetes.io/projected/40deaea9-2a82-456d-802f-7829e6d12a9b-kube-api-access-m7c8s\") pod \"horizon-7b485f6745-hxwxb\" (UID: \"40deaea9-2a82-456d-802f-7829e6d12a9b\") " pod="openstack/horizon-7b485f6745-hxwxb" Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.317816 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2cf737d2-a567-4afe-9490-b97b6c5d09e6-logs\") pod \"glance-default-external-api-0\" (UID: \"2cf737d2-a567-4afe-9490-b97b6c5d09e6\") " pod="openstack/glance-default-external-api-0" Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.317873 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cf737d2-a567-4afe-9490-b97b6c5d09e6-config-data\") pod \"glance-default-external-api-0\" (UID: \"2cf737d2-a567-4afe-9490-b97b6c5d09e6\") " pod="openstack/glance-default-external-api-0" Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.317890 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2cf737d2-a567-4afe-9490-b97b6c5d09e6-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2cf737d2-a567-4afe-9490-b97b6c5d09e6\") " pod="openstack/glance-default-external-api-0" Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.317917 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cf737d2-a567-4afe-9490-b97b6c5d09e6-scripts\") pod \"glance-default-external-api-0\" (UID: \"2cf737d2-a567-4afe-9490-b97b6c5d09e6\") " pod="openstack/glance-default-external-api-0" Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.317940 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cf737d2-a567-4afe-9490-b97b6c5d09e6-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2cf737d2-a567-4afe-9490-b97b6c5d09e6\") " pod="openstack/glance-default-external-api-0" Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.317962 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40deaea9-2a82-456d-802f-7829e6d12a9b-logs\") pod \"horizon-7b485f6745-hxwxb\" (UID: \"40deaea9-2a82-456d-802f-7829e6d12a9b\") " pod="openstack/horizon-7b485f6745-hxwxb" Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.317995 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/40deaea9-2a82-456d-802f-7829e6d12a9b-horizon-secret-key\") pod \"horizon-7b485f6745-hxwxb\" (UID: \"40deaea9-2a82-456d-802f-7829e6d12a9b\") " pod="openstack/horizon-7b485f6745-hxwxb" Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.318013 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2cf737d2-a567-4afe-9490-b97b6c5d09e6-httpd-run\") pod 
\"glance-default-external-api-0\" (UID: \"2cf737d2-a567-4afe-9490-b97b6c5d09e6\") " pod="openstack/glance-default-external-api-0" Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.318030 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/40deaea9-2a82-456d-802f-7829e6d12a9b-scripts\") pod \"horizon-7b485f6745-hxwxb\" (UID: \"40deaea9-2a82-456d-802f-7829e6d12a9b\") " pod="openstack/horizon-7b485f6745-hxwxb" Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.318063 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/40deaea9-2a82-456d-802f-7829e6d12a9b-config-data\") pod \"horizon-7b485f6745-hxwxb\" (UID: \"40deaea9-2a82-456d-802f-7829e6d12a9b\") " pod="openstack/horizon-7b485f6745-hxwxb" Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.318090 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"2cf737d2-a567-4afe-9490-b97b6c5d09e6\") " pod="openstack/glance-default-external-api-0" Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.318338 4732 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"2cf737d2-a567-4afe-9490-b97b6c5d09e6\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.320081 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2cf737d2-a567-4afe-9490-b97b6c5d09e6-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2cf737d2-a567-4afe-9490-b97b6c5d09e6\") " 
pod="openstack/glance-default-external-api-0" Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.320333 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/40deaea9-2a82-456d-802f-7829e6d12a9b-scripts\") pod \"horizon-7b485f6745-hxwxb\" (UID: \"40deaea9-2a82-456d-802f-7829e6d12a9b\") " pod="openstack/horizon-7b485f6745-hxwxb" Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.320357 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2cf737d2-a567-4afe-9490-b97b6c5d09e6-logs\") pod \"glance-default-external-api-0\" (UID: \"2cf737d2-a567-4afe-9490-b97b6c5d09e6\") " pod="openstack/glance-default-external-api-0" Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.321054 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40deaea9-2a82-456d-802f-7829e6d12a9b-logs\") pod \"horizon-7b485f6745-hxwxb\" (UID: \"40deaea9-2a82-456d-802f-7829e6d12a9b\") " pod="openstack/horizon-7b485f6745-hxwxb" Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.321833 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/40deaea9-2a82-456d-802f-7829e6d12a9b-config-data\") pod \"horizon-7b485f6745-hxwxb\" (UID: \"40deaea9-2a82-456d-802f-7829e6d12a9b\") " pod="openstack/horizon-7b485f6745-hxwxb" Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.326280 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cf737d2-a567-4afe-9490-b97b6c5d09e6-config-data\") pod \"glance-default-external-api-0\" (UID: \"2cf737d2-a567-4afe-9490-b97b6c5d09e6\") " pod="openstack/glance-default-external-api-0" Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.328000 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cf737d2-a567-4afe-9490-b97b6c5d09e6-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2cf737d2-a567-4afe-9490-b97b6c5d09e6\") " pod="openstack/glance-default-external-api-0" Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.328544 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cf737d2-a567-4afe-9490-b97b6c5d09e6-scripts\") pod \"glance-default-external-api-0\" (UID: \"2cf737d2-a567-4afe-9490-b97b6c5d09e6\") " pod="openstack/glance-default-external-api-0" Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.334040 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cf737d2-a567-4afe-9490-b97b6c5d09e6-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2cf737d2-a567-4afe-9490-b97b6c5d09e6\") " pod="openstack/glance-default-external-api-0" Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.334585 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/40deaea9-2a82-456d-802f-7829e6d12a9b-horizon-secret-key\") pod \"horizon-7b485f6745-hxwxb\" (UID: \"40deaea9-2a82-456d-802f-7829e6d12a9b\") " pod="openstack/horizon-7b485f6745-hxwxb" Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.337834 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7c8s\" (UniqueName: \"kubernetes.io/projected/40deaea9-2a82-456d-802f-7829e6d12a9b-kube-api-access-m7c8s\") pod \"horizon-7b485f6745-hxwxb\" (UID: \"40deaea9-2a82-456d-802f-7829e6d12a9b\") " pod="openstack/horizon-7b485f6745-hxwxb" Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.338343 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4zcf\" (UniqueName: 
\"kubernetes.io/projected/2cf737d2-a567-4afe-9490-b97b6c5d09e6-kube-api-access-q4zcf\") pod \"glance-default-external-api-0\" (UID: \"2cf737d2-a567-4afe-9490-b97b6c5d09e6\") " pod="openstack/glance-default-external-api-0" Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.365656 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.413302 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"2cf737d2-a567-4afe-9490-b97b6c5d09e6\") " pod="openstack/glance-default-external-api-0" Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.425535 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-l9kx4" Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.458221 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-nfzft"] Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.463836 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7b485f6745-hxwxb" Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.483494 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.508746 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Apr 02 13:59:48 crc kubenswrapper[4732]: W0402 13:59:48.547821 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0fa7c556_fc74_4001_9066_55e84d9a4670.slice/crio-ed9584750ecabaf7da8f06f68e3c1acb264a190f61149ef6fc4f827eea9e3a4f WatchSource:0}: Error finding container ed9584750ecabaf7da8f06f68e3c1acb264a190f61149ef6fc4f827eea9e3a4f: Status 404 returned error can't find the container with id ed9584750ecabaf7da8f06f68e3c1acb264a190f61149ef6fc4f827eea9e3a4f Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.697006 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-fd6957bc-tm5df"] Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.710656 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-j4fj8"] Apr 02 13:59:48 crc kubenswrapper[4732]: W0402 13:59:48.726453 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc623b208_43b3_4c30_a6a1_9d4849513dfa.slice/crio-eda5997546dc91c262160fa4f74bc4acf143dd66799c7135a59177234ba27abe WatchSource:0}: Error finding container eda5997546dc91c262160fa4f74bc4acf143dd66799c7135a59177234ba27abe: Status 404 returned error can't find the container with id eda5997546dc91c262160fa4f74bc4acf143dd66799c7135a59177234ba27abe Apr 02 13:59:48 crc kubenswrapper[4732]: I0402 13:59:48.850841 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-89fvh"] Apr 02 13:59:48 crc kubenswrapper[4732]: W0402 13:59:48.851397 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod481f6e75_b423_4c6c_a1d6_b43674481fc1.slice/crio-2396f7b4e78e2cb23d1165505e34e72e4f5835ad94dbadeca06ce934f71e6bbc WatchSource:0}: Error finding container 
2396f7b4e78e2cb23d1165505e34e72e4f5835ad94dbadeca06ce934f71e6bbc: Status 404 returned error can't find the container with id 2396f7b4e78e2cb23d1165505e34e72e4f5835ad94dbadeca06ce934f71e6bbc
Apr 02 13:59:49 crc kubenswrapper[4732]: I0402 13:59:49.137528 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-dh5g6"]
Apr 02 13:59:49 crc kubenswrapper[4732]: I0402 13:59:49.155383 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-d54q6"]
Apr 02 13:59:49 crc kubenswrapper[4732]: I0402 13:59:49.186243 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-jwc6l"]
Apr 02 13:59:49 crc kubenswrapper[4732]: I0402 13:59:49.215328 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Apr 02 13:59:49 crc kubenswrapper[4732]: I0402 13:59:49.263150 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-fd6957bc-tm5df" event={"ID":"c623b208-43b3-4c30-a6a1-9d4849513dfa","Type":"ContainerStarted","Data":"eda5997546dc91c262160fa4f74bc4acf143dd66799c7135a59177234ba27abe"}
Apr 02 13:59:49 crc kubenswrapper[4732]: I0402 13:59:49.278137 4732 generic.go:334] "Generic (PLEG): container finished" podID="0fa7c556-fc74-4001-9066-55e84d9a4670" containerID="6129e08289aafa8d58002451608692fecbe699b742dd0c140d595647e68b7c49" exitCode=0
Apr 02 13:59:49 crc kubenswrapper[4732]: I0402 13:59:49.278265 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c9c9f998c-nfzft" event={"ID":"0fa7c556-fc74-4001-9066-55e84d9a4670","Type":"ContainerDied","Data":"6129e08289aafa8d58002451608692fecbe699b742dd0c140d595647e68b7c49"}
Apr 02 13:59:49 crc kubenswrapper[4732]: I0402 13:59:49.278294 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c9c9f998c-nfzft" event={"ID":"0fa7c556-fc74-4001-9066-55e84d9a4670","Type":"ContainerStarted","Data":"ed9584750ecabaf7da8f06f68e3c1acb264a190f61149ef6fc4f827eea9e3a4f"}
Apr 02 13:59:49 crc kubenswrapper[4732]: I0402 13:59:49.295348 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-l9kx4"]
Apr 02 13:59:49 crc kubenswrapper[4732]: I0402 13:59:49.304420 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-89fvh" event={"ID":"481f6e75-b423-4c6c-a1d6-b43674481fc1","Type":"ContainerStarted","Data":"c1ff334494eebb50605f5ce8c2393c63ab95edea02fd9a390bfa160974a7444f"}
Apr 02 13:59:49 crc kubenswrapper[4732]: I0402 13:59:49.305491 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-89fvh" event={"ID":"481f6e75-b423-4c6c-a1d6-b43674481fc1","Type":"ContainerStarted","Data":"2396f7b4e78e2cb23d1165505e34e72e4f5835ad94dbadeca06ce934f71e6bbc"}
Apr 02 13:59:49 crc kubenswrapper[4732]: I0402 13:59:49.317916 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-j4fj8" event={"ID":"28f87ba1-2996-40b9-8c0a-93545e49915e","Type":"ContainerStarted","Data":"99dfb7ae772e0081507ee489bd4b99216c5fdab104692fd5f576570e100b64f4"}
Apr 02 13:59:49 crc kubenswrapper[4732]: I0402 13:59:49.318009 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-j4fj8" event={"ID":"28f87ba1-2996-40b9-8c0a-93545e49915e","Type":"ContainerStarted","Data":"bcac5b3ded494ac2b87a26d4b1483a912e7c1bc566b95c76a677b0ab015fc668"}
Apr 02 13:59:49 crc kubenswrapper[4732]: W0402 13:59:49.334112 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod38e262a1_5383_4dfb_8f89_537e4db5559c.slice/crio-dca2f9ffcc7bc0b2a75459394ce56331497db5203904f76d0c6d5cccbdcd0a8e WatchSource:0}: Error finding container dca2f9ffcc7bc0b2a75459394ce56331497db5203904f76d0c6d5cccbdcd0a8e: Status 404 returned error can't find the container with id dca2f9ffcc7bc0b2a75459394ce56331497db5203904f76d0c6d5cccbdcd0a8e
Apr 02 13:59:49 crc kubenswrapper[4732]: W0402 13:59:49.344820 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40deaea9_2a82_456d_802f_7829e6d12a9b.slice/crio-95df95b67878c1dd7a56156242c96daa45b9f239412ba96926f1608aaa48a6b8 WatchSource:0}: Error finding container 95df95b67878c1dd7a56156242c96daa45b9f239412ba96926f1608aaa48a6b8: Status 404 returned error can't find the container with id 95df95b67878c1dd7a56156242c96daa45b9f239412ba96926f1608aaa48a6b8
Apr 02 13:59:49 crc kubenswrapper[4732]: I0402 13:59:49.379305 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7b485f6745-hxwxb"]
Apr 02 13:59:49 crc kubenswrapper[4732]: I0402 13:59:49.410237 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-89fvh" podStartSLOduration=2.410213753 podStartE2EDuration="2.410213753s" podCreationTimestamp="2026-04-02 13:59:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:59:49.332225532 +0000 UTC m=+1346.236633085" watchObservedRunningTime="2026-04-02 13:59:49.410213753 +0000 UTC m=+1346.314621306"
Apr 02 13:59:49 crc kubenswrapper[4732]: I0402 13:59:49.424451 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-j4fj8" podStartSLOduration=2.424432806 podStartE2EDuration="2.424432806s" podCreationTimestamp="2026-04-02 13:59:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:59:49.351810449 +0000 UTC m=+1346.256218022" watchObservedRunningTime="2026-04-02 13:59:49.424432806 +0000 UTC m=+1346.328840359"
Apr 02 13:59:49 crc kubenswrapper[4732]: I0402 13:59:49.594203 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Apr 02 13:59:49 crc kubenswrapper[4732]: W0402 13:59:49.611208 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2878bd1e_d81d_4cc8_9334_3a20fe0b0d3b.slice/crio-23ac7952fae97d3e0d20680d11e51ff55d660a11bec6fd95bfbb5ad8d0d76152 WatchSource:0}: Error finding container 23ac7952fae97d3e0d20680d11e51ff55d660a11bec6fd95bfbb5ad8d0d76152: Status 404 returned error can't find the container with id 23ac7952fae97d3e0d20680d11e51ff55d660a11bec6fd95bfbb5ad8d0d76152
Apr 02 13:59:49 crc kubenswrapper[4732]: I0402 13:59:49.778479 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c9c9f998c-nfzft"
Apr 02 13:59:49 crc kubenswrapper[4732]: I0402 13:59:49.848644 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0fa7c556-fc74-4001-9066-55e84d9a4670-dns-swift-storage-0\") pod \"0fa7c556-fc74-4001-9066-55e84d9a4670\" (UID: \"0fa7c556-fc74-4001-9066-55e84d9a4670\") "
Apr 02 13:59:49 crc kubenswrapper[4732]: I0402 13:59:49.848746 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0fa7c556-fc74-4001-9066-55e84d9a4670-dns-svc\") pod \"0fa7c556-fc74-4001-9066-55e84d9a4670\" (UID: \"0fa7c556-fc74-4001-9066-55e84d9a4670\") "
Apr 02 13:59:49 crc kubenswrapper[4732]: I0402 13:59:49.848820 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0fa7c556-fc74-4001-9066-55e84d9a4670-ovsdbserver-sb\") pod \"0fa7c556-fc74-4001-9066-55e84d9a4670\" (UID: \"0fa7c556-fc74-4001-9066-55e84d9a4670\") "
Apr 02 13:59:49 crc kubenswrapper[4732]: I0402 13:59:49.848988 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0fa7c556-fc74-4001-9066-55e84d9a4670-config\") pod \"0fa7c556-fc74-4001-9066-55e84d9a4670\" (UID: \"0fa7c556-fc74-4001-9066-55e84d9a4670\") "
Apr 02 13:59:49 crc kubenswrapper[4732]: I0402 13:59:49.849534 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khdrl\" (UniqueName: \"kubernetes.io/projected/0fa7c556-fc74-4001-9066-55e84d9a4670-kube-api-access-khdrl\") pod \"0fa7c556-fc74-4001-9066-55e84d9a4670\" (UID: \"0fa7c556-fc74-4001-9066-55e84d9a4670\") "
Apr 02 13:59:49 crc kubenswrapper[4732]: I0402 13:59:49.849590 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0fa7c556-fc74-4001-9066-55e84d9a4670-ovsdbserver-nb\") pod \"0fa7c556-fc74-4001-9066-55e84d9a4670\" (UID: \"0fa7c556-fc74-4001-9066-55e84d9a4670\") "
Apr 02 13:59:49 crc kubenswrapper[4732]: I0402 13:59:49.857593 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fa7c556-fc74-4001-9066-55e84d9a4670-kube-api-access-khdrl" (OuterVolumeSpecName: "kube-api-access-khdrl") pod "0fa7c556-fc74-4001-9066-55e84d9a4670" (UID: "0fa7c556-fc74-4001-9066-55e84d9a4670"). InnerVolumeSpecName "kube-api-access-khdrl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 13:59:49 crc kubenswrapper[4732]: I0402 13:59:49.873337 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fa7c556-fc74-4001-9066-55e84d9a4670-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0fa7c556-fc74-4001-9066-55e84d9a4670" (UID: "0fa7c556-fc74-4001-9066-55e84d9a4670"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 13:59:49 crc kubenswrapper[4732]: I0402 13:59:49.877954 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fa7c556-fc74-4001-9066-55e84d9a4670-config" (OuterVolumeSpecName: "config") pod "0fa7c556-fc74-4001-9066-55e84d9a4670" (UID: "0fa7c556-fc74-4001-9066-55e84d9a4670"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 13:59:49 crc kubenswrapper[4732]: I0402 13:59:49.877974 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fa7c556-fc74-4001-9066-55e84d9a4670-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0fa7c556-fc74-4001-9066-55e84d9a4670" (UID: "0fa7c556-fc74-4001-9066-55e84d9a4670"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 13:59:49 crc kubenswrapper[4732]: I0402 13:59:49.881833 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fa7c556-fc74-4001-9066-55e84d9a4670-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0fa7c556-fc74-4001-9066-55e84d9a4670" (UID: "0fa7c556-fc74-4001-9066-55e84d9a4670"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 13:59:49 crc kubenswrapper[4732]: I0402 13:59:49.884975 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fa7c556-fc74-4001-9066-55e84d9a4670-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0fa7c556-fc74-4001-9066-55e84d9a4670" (UID: "0fa7c556-fc74-4001-9066-55e84d9a4670"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 13:59:49 crc kubenswrapper[4732]: I0402 13:59:49.954343 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khdrl\" (UniqueName: \"kubernetes.io/projected/0fa7c556-fc74-4001-9066-55e84d9a4670-kube-api-access-khdrl\") on node \"crc\" DevicePath \"\""
Apr 02 13:59:49 crc kubenswrapper[4732]: I0402 13:59:49.954376 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0fa7c556-fc74-4001-9066-55e84d9a4670-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Apr 02 13:59:49 crc kubenswrapper[4732]: I0402 13:59:49.954467 4732 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0fa7c556-fc74-4001-9066-55e84d9a4670-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Apr 02 13:59:49 crc kubenswrapper[4732]: I0402 13:59:49.954477 4732 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0fa7c556-fc74-4001-9066-55e84d9a4670-dns-svc\") on node \"crc\" DevicePath \"\""
Apr 02 13:59:49 crc kubenswrapper[4732]: I0402 13:59:49.954486 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0fa7c556-fc74-4001-9066-55e84d9a4670-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Apr 02 13:59:49 crc kubenswrapper[4732]: I0402 13:59:49.954495 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0fa7c556-fc74-4001-9066-55e84d9a4670-config\") on node \"crc\" DevicePath \"\""
Apr 02 13:59:50 crc kubenswrapper[4732]: I0402 13:59:50.350501 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c9c9f998c-nfzft"
Apr 02 13:59:50 crc kubenswrapper[4732]: I0402 13:59:50.350527 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c9c9f998c-nfzft" event={"ID":"0fa7c556-fc74-4001-9066-55e84d9a4670","Type":"ContainerDied","Data":"ed9584750ecabaf7da8f06f68e3c1acb264a190f61149ef6fc4f827eea9e3a4f"}
Apr 02 13:59:50 crc kubenswrapper[4732]: I0402 13:59:50.350576 4732 scope.go:117] "RemoveContainer" containerID="6129e08289aafa8d58002451608692fecbe699b742dd0c140d595647e68b7c49"
Apr 02 13:59:50 crc kubenswrapper[4732]: I0402 13:59:50.355389 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-jwc6l" event={"ID":"10ba0697-529f-41d3-a1a8-55b50ed024a2","Type":"ContainerStarted","Data":"29ad76abfe869fb1f8b08f0bad1fdc22194c5ae2f1185f58f2ae082436042f36"}
Apr 02 13:59:50 crc kubenswrapper[4732]: I0402 13:59:50.359018 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b485f6745-hxwxb" event={"ID":"40deaea9-2a82-456d-802f-7829e6d12a9b","Type":"ContainerStarted","Data":"95df95b67878c1dd7a56156242c96daa45b9f239412ba96926f1608aaa48a6b8"}
Apr 02 13:59:50 crc kubenswrapper[4732]: I0402 13:59:50.361230 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-dh5g6" event={"ID":"64f807d9-0af7-4723-98b2-dd3cbe55df99","Type":"ContainerStarted","Data":"35fe3abdb2033f0af21275b5e23119e97373deab4f8cccae867278c4df806f93"}
Apr 02 13:59:50 crc kubenswrapper[4732]: I0402 13:59:50.368458 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b","Type":"ContainerStarted","Data":"23ac7952fae97d3e0d20680d11e51ff55d660a11bec6fd95bfbb5ad8d0d76152"}
Apr 02 13:59:50 crc kubenswrapper[4732]: I0402 13:59:50.374595 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2","Type":"ContainerStarted","Data":"83b28ad9663b6eaa93684ced8107088ee5f6704b3379e196f07940d35f654d8e"}
Apr 02 13:59:50 crc kubenswrapper[4732]: I0402 13:59:50.382400 4732 generic.go:334] "Generic (PLEG): container finished" podID="38e262a1-5383-4dfb-8f89-537e4db5559c" containerID="a284bab016250adabb273688b4d9b26e0a6b7383f15a43c13128f2995a2777b7" exitCode=0
Apr 02 13:59:50 crc kubenswrapper[4732]: I0402 13:59:50.382489 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-l9kx4" event={"ID":"38e262a1-5383-4dfb-8f89-537e4db5559c","Type":"ContainerDied","Data":"a284bab016250adabb273688b4d9b26e0a6b7383f15a43c13128f2995a2777b7"}
Apr 02 13:59:50 crc kubenswrapper[4732]: I0402 13:59:50.382529 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-l9kx4" event={"ID":"38e262a1-5383-4dfb-8f89-537e4db5559c","Type":"ContainerStarted","Data":"dca2f9ffcc7bc0b2a75459394ce56331497db5203904f76d0c6d5cccbdcd0a8e"}
Apr 02 13:59:50 crc kubenswrapper[4732]: I0402 13:59:50.391561 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-d54q6" event={"ID":"34aed337-bbff-45a7-b95f-b26c95733c82","Type":"ContainerStarted","Data":"0306e2800195b1ed7eaedaf457b0a9f9703f11e85abde57269508338b34fbeb4"}
Apr 02 13:59:50 crc kubenswrapper[4732]: I0402 13:59:50.463168 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-nfzft"]
Apr 02 13:59:50 crc kubenswrapper[4732]: I0402 13:59:50.476732 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-nfzft"]
Apr 02 13:59:50 crc kubenswrapper[4732]: I0402 13:59:50.702290 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fa7c556-fc74-4001-9066-55e84d9a4670" path="/var/lib/kubelet/pods/0fa7c556-fc74-4001-9066-55e84d9a4670/volumes"
Apr 02 13:59:50 crc kubenswrapper[4732]: I0402 13:59:50.707182 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Apr 02 13:59:50 crc kubenswrapper[4732]: W0402 13:59:50.778904 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2cf737d2_a567_4afe_9490_b97b6c5d09e6.slice/crio-6f6dc487830ac7e6ddfd333c94f11519665603935b3c54e9d164d1551b3e4f3d WatchSource:0}: Error finding container 6f6dc487830ac7e6ddfd333c94f11519665603935b3c54e9d164d1551b3e4f3d: Status 404 returned error can't find the container with id 6f6dc487830ac7e6ddfd333c94f11519665603935b3c54e9d164d1551b3e4f3d
Apr 02 13:59:50 crc kubenswrapper[4732]: I0402 13:59:50.878809 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Apr 02 13:59:50 crc kubenswrapper[4732]: I0402 13:59:50.914368 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7b485f6745-hxwxb"]
Apr 02 13:59:50 crc kubenswrapper[4732]: I0402 13:59:50.967029 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-c799f465-l2dct"]
Apr 02 13:59:50 crc kubenswrapper[4732]: E0402 13:59:50.967755 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fa7c556-fc74-4001-9066-55e84d9a4670" containerName="init"
Apr 02 13:59:50 crc kubenswrapper[4732]: I0402 13:59:50.967783 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fa7c556-fc74-4001-9066-55e84d9a4670" containerName="init"
Apr 02 13:59:50 crc kubenswrapper[4732]: I0402 13:59:50.968013 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fa7c556-fc74-4001-9066-55e84d9a4670" containerName="init"
Apr 02 13:59:50 crc kubenswrapper[4732]: I0402 13:59:50.969279 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-c799f465-l2dct"
Apr 02 13:59:51 crc kubenswrapper[4732]: I0402 13:59:51.026569 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Apr 02 13:59:51 crc kubenswrapper[4732]: I0402 13:59:51.052404 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-c799f465-l2dct"]
Apr 02 13:59:51 crc kubenswrapper[4732]: I0402 13:59:51.067395 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Apr 02 13:59:51 crc kubenswrapper[4732]: I0402 13:59:51.089838 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd752ee8-4492-42c7-9125-974b270451d2-scripts\") pod \"horizon-c799f465-l2dct\" (UID: \"bd752ee8-4492-42c7-9125-974b270451d2\") " pod="openstack/horizon-c799f465-l2dct"
Apr 02 13:59:51 crc kubenswrapper[4732]: I0402 13:59:51.089940 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bd752ee8-4492-42c7-9125-974b270451d2-config-data\") pod \"horizon-c799f465-l2dct\" (UID: \"bd752ee8-4492-42c7-9125-974b270451d2\") " pod="openstack/horizon-c799f465-l2dct"
Apr 02 13:59:51 crc kubenswrapper[4732]: I0402 13:59:51.090007 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/bd752ee8-4492-42c7-9125-974b270451d2-horizon-secret-key\") pod \"horizon-c799f465-l2dct\" (UID: \"bd752ee8-4492-42c7-9125-974b270451d2\") " pod="openstack/horizon-c799f465-l2dct"
Apr 02 13:59:51 crc kubenswrapper[4732]: I0402 13:59:51.090114 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhsrx\" (UniqueName: \"kubernetes.io/projected/bd752ee8-4492-42c7-9125-974b270451d2-kube-api-access-nhsrx\") pod \"horizon-c799f465-l2dct\" (UID: \"bd752ee8-4492-42c7-9125-974b270451d2\") " pod="openstack/horizon-c799f465-l2dct"
Apr 02 13:59:51 crc kubenswrapper[4732]: I0402 13:59:51.090143 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd752ee8-4492-42c7-9125-974b270451d2-logs\") pod \"horizon-c799f465-l2dct\" (UID: \"bd752ee8-4492-42c7-9125-974b270451d2\") " pod="openstack/horizon-c799f465-l2dct"
Apr 02 13:59:51 crc kubenswrapper[4732]: I0402 13:59:51.194233 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd752ee8-4492-42c7-9125-974b270451d2-scripts\") pod \"horizon-c799f465-l2dct\" (UID: \"bd752ee8-4492-42c7-9125-974b270451d2\") " pod="openstack/horizon-c799f465-l2dct"
Apr 02 13:59:51 crc kubenswrapper[4732]: I0402 13:59:51.194550 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bd752ee8-4492-42c7-9125-974b270451d2-config-data\") pod \"horizon-c799f465-l2dct\" (UID: \"bd752ee8-4492-42c7-9125-974b270451d2\") " pod="openstack/horizon-c799f465-l2dct"
Apr 02 13:59:51 crc kubenswrapper[4732]: I0402 13:59:51.194738 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/bd752ee8-4492-42c7-9125-974b270451d2-horizon-secret-key\") pod \"horizon-c799f465-l2dct\" (UID: \"bd752ee8-4492-42c7-9125-974b270451d2\") " pod="openstack/horizon-c799f465-l2dct"
Apr 02 13:59:51 crc kubenswrapper[4732]: I0402 13:59:51.194869 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhsrx\" (UniqueName: \"kubernetes.io/projected/bd752ee8-4492-42c7-9125-974b270451d2-kube-api-access-nhsrx\") pod \"horizon-c799f465-l2dct\" (UID: \"bd752ee8-4492-42c7-9125-974b270451d2\") " pod="openstack/horizon-c799f465-l2dct"
Apr 02 13:59:51 crc kubenswrapper[4732]: I0402 13:59:51.195169 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd752ee8-4492-42c7-9125-974b270451d2-logs\") pod \"horizon-c799f465-l2dct\" (UID: \"bd752ee8-4492-42c7-9125-974b270451d2\") " pod="openstack/horizon-c799f465-l2dct"
Apr 02 13:59:51 crc kubenswrapper[4732]: I0402 13:59:51.195885 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd752ee8-4492-42c7-9125-974b270451d2-scripts\") pod \"horizon-c799f465-l2dct\" (UID: \"bd752ee8-4492-42c7-9125-974b270451d2\") " pod="openstack/horizon-c799f465-l2dct"
Apr 02 13:59:51 crc kubenswrapper[4732]: I0402 13:59:51.196325 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd752ee8-4492-42c7-9125-974b270451d2-logs\") pod \"horizon-c799f465-l2dct\" (UID: \"bd752ee8-4492-42c7-9125-974b270451d2\") " pod="openstack/horizon-c799f465-l2dct"
Apr 02 13:59:51 crc kubenswrapper[4732]: I0402 13:59:51.197409 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bd752ee8-4492-42c7-9125-974b270451d2-config-data\") pod \"horizon-c799f465-l2dct\" (UID: \"bd752ee8-4492-42c7-9125-974b270451d2\") " pod="openstack/horizon-c799f465-l2dct"
Apr 02 13:59:51 crc kubenswrapper[4732]: I0402 13:59:51.206561 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/bd752ee8-4492-42c7-9125-974b270451d2-horizon-secret-key\") pod \"horizon-c799f465-l2dct\" (UID: \"bd752ee8-4492-42c7-9125-974b270451d2\") " pod="openstack/horizon-c799f465-l2dct"
Apr 02 13:59:51 crc kubenswrapper[4732]: I0402 13:59:51.212399 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhsrx\" (UniqueName: \"kubernetes.io/projected/bd752ee8-4492-42c7-9125-974b270451d2-kube-api-access-nhsrx\") pod \"horizon-c799f465-l2dct\" (UID: \"bd752ee8-4492-42c7-9125-974b270451d2\") " pod="openstack/horizon-c799f465-l2dct"
Apr 02 13:59:51 crc kubenswrapper[4732]: I0402 13:59:51.321633 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-c799f465-l2dct"
Apr 02 13:59:51 crc kubenswrapper[4732]: I0402 13:59:51.450572 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2cf737d2-a567-4afe-9490-b97b6c5d09e6","Type":"ContainerStarted","Data":"6f6dc487830ac7e6ddfd333c94f11519665603935b3c54e9d164d1551b3e4f3d"}
Apr 02 13:59:51 crc kubenswrapper[4732]: I0402 13:59:51.452901 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b","Type":"ContainerStarted","Data":"2e7264b3c1df24ae1ba0205d9b07561f3b4ba9436cfa3bfe65c7212df43dc004"}
Apr 02 13:59:51 crc kubenswrapper[4732]: I0402 13:59:51.456479 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-l9kx4" event={"ID":"38e262a1-5383-4dfb-8f89-537e4db5559c","Type":"ContainerStarted","Data":"e2f22566a97573ec2b53998abc6a6f23072b22ca3960da4d5420f6e816b9db5b"}
Apr 02 13:59:51 crc kubenswrapper[4732]: I0402 13:59:51.457734 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57c957c4ff-l9kx4"
Apr 02 13:59:51 crc kubenswrapper[4732]: I0402 13:59:51.490190 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57c957c4ff-l9kx4" podStartSLOduration=4.490166342 podStartE2EDuration="4.490166342s" podCreationTimestamp="2026-04-02 13:59:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:59:51.476018041 +0000 UTC m=+1348.380425614" watchObservedRunningTime="2026-04-02 13:59:51.490166342 +0000 UTC m=+1348.394573895"
Apr 02 13:59:52 crc kubenswrapper[4732]: I0402 13:59:52.001540 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-c799f465-l2dct"]
Apr 02 13:59:52 crc kubenswrapper[4732]: I0402 13:59:52.532762 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b","Type":"ContainerStarted","Data":"6391b06adfefe9e1c2f919abfdf05ce2375ff6ce7ab5fd3ec5d64ca7a1a910b6"}
Apr 02 13:59:52 crc kubenswrapper[4732]: I0402 13:59:52.533246 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b" containerName="glance-log" containerID="cri-o://2e7264b3c1df24ae1ba0205d9b07561f3b4ba9436cfa3bfe65c7212df43dc004" gracePeriod=30
Apr 02 13:59:52 crc kubenswrapper[4732]: I0402 13:59:52.534131 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b" containerName="glance-httpd" containerID="cri-o://6391b06adfefe9e1c2f919abfdf05ce2375ff6ce7ab5fd3ec5d64ca7a1a910b6" gracePeriod=30
Apr 02 13:59:52 crc kubenswrapper[4732]: I0402 13:59:52.539010 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-c799f465-l2dct" event={"ID":"bd752ee8-4492-42c7-9125-974b270451d2","Type":"ContainerStarted","Data":"39dc7bb4bd1c3887cec75ba568de5c655f546b64051a15df04785ccfbee58913"}
Apr 02 13:59:52 crc kubenswrapper[4732]: I0402 13:59:52.550402 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2cf737d2-a567-4afe-9490-b97b6c5d09e6","Type":"ContainerStarted","Data":"e77fdd735b8cff6f5dbb88656da65ba6a92481af4d24778625f0518bf959449a"}
Apr 02 13:59:52 crc kubenswrapper[4732]: I0402 13:59:52.557382 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.557362144 podStartE2EDuration="5.557362144s" podCreationTimestamp="2026-04-02 13:59:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:59:52.552816282 +0000 UTC m=+1349.457223855" watchObservedRunningTime="2026-04-02 13:59:52.557362144 +0000 UTC m=+1349.461769697"
Apr 02 13:59:53 crc kubenswrapper[4732]: I0402 13:59:53.570930 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2cf737d2-a567-4afe-9490-b97b6c5d09e6","Type":"ContainerStarted","Data":"425788db80ca1f6c470809c31f38955b4a7630e234688529f843a925e6641fbd"}
Apr 02 13:59:53 crc kubenswrapper[4732]: I0402 13:59:53.571094 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="2cf737d2-a567-4afe-9490-b97b6c5d09e6" containerName="glance-log" containerID="cri-o://e77fdd735b8cff6f5dbb88656da65ba6a92481af4d24778625f0518bf959449a" gracePeriod=30
Apr 02 13:59:53 crc kubenswrapper[4732]: I0402 13:59:53.571445 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="2cf737d2-a567-4afe-9490-b97b6c5d09e6" containerName="glance-httpd" containerID="cri-o://425788db80ca1f6c470809c31f38955b4a7630e234688529f843a925e6641fbd" gracePeriod=30
Apr 02 13:59:53 crc kubenswrapper[4732]: I0402 13:59:53.582408 4732 generic.go:334] "Generic (PLEG): container finished" podID="2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b" containerID="6391b06adfefe9e1c2f919abfdf05ce2375ff6ce7ab5fd3ec5d64ca7a1a910b6" exitCode=0
Apr 02 13:59:53 crc kubenswrapper[4732]: I0402 13:59:53.582442 4732 generic.go:334] "Generic (PLEG): container finished" podID="2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b" containerID="2e7264b3c1df24ae1ba0205d9b07561f3b4ba9436cfa3bfe65c7212df43dc004" exitCode=143
Apr 02 13:59:53 crc kubenswrapper[4732]: I0402 13:59:53.582482 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b","Type":"ContainerDied","Data":"6391b06adfefe9e1c2f919abfdf05ce2375ff6ce7ab5fd3ec5d64ca7a1a910b6"}
Apr 02 13:59:53 crc kubenswrapper[4732]: I0402 13:59:53.582531 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b","Type":"ContainerDied","Data":"2e7264b3c1df24ae1ba0205d9b07561f3b4ba9436cfa3bfe65c7212df43dc004"}
Apr 02 13:59:53 crc kubenswrapper[4732]: I0402 13:59:53.608461 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.6084400930000005 podStartE2EDuration="6.608440093s" podCreationTimestamp="2026-04-02 13:59:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 13:59:53.592084483 +0000 UTC m=+1350.496492046" watchObservedRunningTime="2026-04-02 13:59:53.608440093 +0000 UTC m=+1350.512847646"
Apr 02 13:59:54 crc kubenswrapper[4732]: I0402 13:59:54.593760 4732 generic.go:334] "Generic (PLEG): container finished" podID="28f87ba1-2996-40b9-8c0a-93545e49915e" containerID="99dfb7ae772e0081507ee489bd4b99216c5fdab104692fd5f576570e100b64f4" exitCode=0
Apr 02 13:59:54 crc kubenswrapper[4732]: I0402 13:59:54.594095 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-j4fj8" event={"ID":"28f87ba1-2996-40b9-8c0a-93545e49915e","Type":"ContainerDied","Data":"99dfb7ae772e0081507ee489bd4b99216c5fdab104692fd5f576570e100b64f4"}
Apr 02 13:59:54 crc kubenswrapper[4732]: I0402 13:59:54.597132 4732 generic.go:334] "Generic (PLEG): container finished" podID="2cf737d2-a567-4afe-9490-b97b6c5d09e6" containerID="425788db80ca1f6c470809c31f38955b4a7630e234688529f843a925e6641fbd" exitCode=0
Apr 02 13:59:54 crc kubenswrapper[4732]: I0402 13:59:54.597171 4732 generic.go:334] "Generic (PLEG): container finished" podID="2cf737d2-a567-4afe-9490-b97b6c5d09e6" containerID="e77fdd735b8cff6f5dbb88656da65ba6a92481af4d24778625f0518bf959449a" exitCode=143
Apr 02 13:59:54 crc kubenswrapper[4732]: I0402 13:59:54.597193 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2cf737d2-a567-4afe-9490-b97b6c5d09e6","Type":"ContainerDied","Data":"425788db80ca1f6c470809c31f38955b4a7630e234688529f843a925e6641fbd"}
Apr 02 13:59:54 crc kubenswrapper[4732]: I0402 13:59:54.597217 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2cf737d2-a567-4afe-9490-b97b6c5d09e6","Type":"ContainerDied","Data":"e77fdd735b8cff6f5dbb88656da65ba6a92481af4d24778625f0518bf959449a"}
Apr 02 13:59:55 crc kubenswrapper[4732]: I0402 13:59:55.610901 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b","Type":"ContainerDied","Data":"23ac7952fae97d3e0d20680d11e51ff55d660a11bec6fd95bfbb5ad8d0d76152"}
Apr 02 13:59:55 crc kubenswrapper[4732]: I0402 13:59:55.610951 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23ac7952fae97d3e0d20680d11e51ff55d660a11bec6fd95bfbb5ad8d0d76152"
Apr 02 13:59:55 crc kubenswrapper[4732]: I0402 13:59:55.661237 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Apr 02 13:59:55 crc kubenswrapper[4732]: I0402 13:59:55.787704 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b-internal-tls-certs\") pod \"2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b\" (UID: \"2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b\") "
Apr 02 13:59:55 crc kubenswrapper[4732]: I0402 13:59:55.788222 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b-combined-ca-bundle\") pod \"2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b\" (UID: \"2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b\") "
Apr 02 13:59:55 crc kubenswrapper[4732]: I0402 13:59:55.788252 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b-config-data\") pod \"2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b\" (UID: \"2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b\") "
Apr 02 13:59:55 crc kubenswrapper[4732]: I0402 13:59:55.788871 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b-logs\") pod \"2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b\" (UID: \"2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b\") "
Apr 02 13:59:55 crc kubenswrapper[4732]: I0402 13:59:55.788942 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b\" (UID: \"2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b\") "
Apr 02 13:59:55 crc kubenswrapper[4732]: I0402 13:59:55.789053 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b-scripts\") pod \"2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b\" (UID: \"2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b\") "
Apr 02 13:59:55 crc kubenswrapper[4732]: I0402 13:59:55.789071 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzz5f\" (UniqueName: \"kubernetes.io/projected/2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b-kube-api-access-jzz5f\") pod \"2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b\" (UID: \"2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b\") "
Apr 02 13:59:55 crc kubenswrapper[4732]: I0402 13:59:55.789113 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b-httpd-run\") pod \"2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b\" (UID: \"2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b\") "
Apr 02 13:59:55 crc kubenswrapper[4732]: I0402 13:59:55.789468 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b-logs" (OuterVolumeSpecName: "logs") pod "2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b" (UID: "2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 02 13:59:55 crc kubenswrapper[4732]: I0402 13:59:55.789545 4732 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b-logs\") on node \"crc\" DevicePath \"\""
Apr 02 13:59:55 crc kubenswrapper[4732]: I0402 13:59:55.791027 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b" (UID: "2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 02 13:59:55 crc kubenswrapper[4732]: I0402 13:59:55.796266 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b-scripts" (OuterVolumeSpecName: "scripts") pod "2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b" (UID: "2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 13:59:55 crc kubenswrapper[4732]: I0402 13:59:55.797584 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b-kube-api-access-jzz5f" (OuterVolumeSpecName: "kube-api-access-jzz5f") pod "2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b" (UID: "2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b"). InnerVolumeSpecName "kube-api-access-jzz5f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 13:59:55 crc kubenswrapper[4732]: I0402 13:59:55.800121 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b" (UID: "2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Apr 02 13:59:55 crc kubenswrapper[4732]: I0402 13:59:55.836762 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b" (UID: "2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b"). InnerVolumeSpecName "combined-ca-bundle".
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 13:59:55 crc kubenswrapper[4732]: I0402 13:59:55.857014 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b-config-data" (OuterVolumeSpecName: "config-data") pod "2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b" (UID: "2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 13:59:55 crc kubenswrapper[4732]: I0402 13:59:55.875751 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b" (UID: "2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 13:59:55 crc kubenswrapper[4732]: I0402 13:59:55.891096 4732 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b-httpd-run\") on node \"crc\" DevicePath \"\"" Apr 02 13:59:55 crc kubenswrapper[4732]: I0402 13:59:55.891141 4732 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Apr 02 13:59:55 crc kubenswrapper[4732]: I0402 13:59:55.891156 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 02 13:59:55 crc kubenswrapper[4732]: I0402 13:59:55.891167 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b-config-data\") on node \"crc\" DevicePath \"\"" Apr 02 
13:59:55 crc kubenswrapper[4732]: I0402 13:59:55.891199 4732 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Apr 02 13:59:55 crc kubenswrapper[4732]: I0402 13:59:55.891211 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b-scripts\") on node \"crc\" DevicePath \"\"" Apr 02 13:59:55 crc kubenswrapper[4732]: I0402 13:59:55.891221 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzz5f\" (UniqueName: \"kubernetes.io/projected/2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b-kube-api-access-jzz5f\") on node \"crc\" DevicePath \"\"" Apr 02 13:59:55 crc kubenswrapper[4732]: I0402 13:59:55.911987 4732 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Apr 02 13:59:55 crc kubenswrapper[4732]: I0402 13:59:55.992865 4732 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.215396 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-fd6957bc-tm5df"] Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.249381 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-59fb764b6d-vml5x"] Apr 02 13:59:56 crc kubenswrapper[4732]: E0402 13:59:56.249850 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b" containerName="glance-log" Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.249877 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b" containerName="glance-log" Apr 02 13:59:56 crc kubenswrapper[4732]: 
E0402 13:59:56.249896 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b" containerName="glance-httpd" Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.249908 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b" containerName="glance-httpd" Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.250188 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b" containerName="glance-httpd" Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.250215 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b" containerName="glance-log" Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.251754 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-59fb764b6d-vml5x" Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.254339 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.266983 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-59fb764b6d-vml5x"] Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.323288 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-c799f465-l2dct"] Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.385632 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-54f994999b-b88d7"] Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.392292 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-54f994999b-b88d7" Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.403264 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/80fe4580-a48f-4cba-b3b3-d6ccbc7f0525-horizon-tls-certs\") pod \"horizon-59fb764b6d-vml5x\" (UID: \"80fe4580-a48f-4cba-b3b3-d6ccbc7f0525\") " pod="openstack/horizon-59fb764b6d-vml5x" Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.403326 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80fe4580-a48f-4cba-b3b3-d6ccbc7f0525-combined-ca-bundle\") pod \"horizon-59fb764b6d-vml5x\" (UID: \"80fe4580-a48f-4cba-b3b3-d6ccbc7f0525\") " pod="openstack/horizon-59fb764b6d-vml5x" Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.403770 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80fe4580-a48f-4cba-b3b3-d6ccbc7f0525-logs\") pod \"horizon-59fb764b6d-vml5x\" (UID: \"80fe4580-a48f-4cba-b3b3-d6ccbc7f0525\") " pod="openstack/horizon-59fb764b6d-vml5x" Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.403855 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/80fe4580-a48f-4cba-b3b3-d6ccbc7f0525-scripts\") pod \"horizon-59fb764b6d-vml5x\" (UID: \"80fe4580-a48f-4cba-b3b3-d6ccbc7f0525\") " pod="openstack/horizon-59fb764b6d-vml5x" Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.403911 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgkzv\" (UniqueName: \"kubernetes.io/projected/80fe4580-a48f-4cba-b3b3-d6ccbc7f0525-kube-api-access-sgkzv\") pod \"horizon-59fb764b6d-vml5x\" (UID: 
\"80fe4580-a48f-4cba-b3b3-d6ccbc7f0525\") " pod="openstack/horizon-59fb764b6d-vml5x" Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.404187 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/80fe4580-a48f-4cba-b3b3-d6ccbc7f0525-horizon-secret-key\") pod \"horizon-59fb764b6d-vml5x\" (UID: \"80fe4580-a48f-4cba-b3b3-d6ccbc7f0525\") " pod="openstack/horizon-59fb764b6d-vml5x" Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.404323 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/80fe4580-a48f-4cba-b3b3-d6ccbc7f0525-config-data\") pod \"horizon-59fb764b6d-vml5x\" (UID: \"80fe4580-a48f-4cba-b3b3-d6ccbc7f0525\") " pod="openstack/horizon-59fb764b6d-vml5x" Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.430781 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-54f994999b-b88d7"] Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.505646 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/97d6e519-a82f-4ce5-9199-4d7db769f86b-horizon-secret-key\") pod \"horizon-54f994999b-b88d7\" (UID: \"97d6e519-a82f-4ce5-9199-4d7db769f86b\") " pod="openstack/horizon-54f994999b-b88d7" Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.505966 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/80fe4580-a48f-4cba-b3b3-d6ccbc7f0525-config-data\") pod \"horizon-59fb764b6d-vml5x\" (UID: \"80fe4580-a48f-4cba-b3b3-d6ccbc7f0525\") " pod="openstack/horizon-59fb764b6d-vml5x" Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.507155 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/97d6e519-a82f-4ce5-9199-4d7db769f86b-scripts\") pod \"horizon-54f994999b-b88d7\" (UID: \"97d6e519-a82f-4ce5-9199-4d7db769f86b\") " pod="openstack/horizon-54f994999b-b88d7" Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.507273 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/80fe4580-a48f-4cba-b3b3-d6ccbc7f0525-horizon-tls-certs\") pod \"horizon-59fb764b6d-vml5x\" (UID: \"80fe4580-a48f-4cba-b3b3-d6ccbc7f0525\") " pod="openstack/horizon-59fb764b6d-vml5x" Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.507295 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80fe4580-a48f-4cba-b3b3-d6ccbc7f0525-combined-ca-bundle\") pod \"horizon-59fb764b6d-vml5x\" (UID: \"80fe4580-a48f-4cba-b3b3-d6ccbc7f0525\") " pod="openstack/horizon-59fb764b6d-vml5x" Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.507355 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80fe4580-a48f-4cba-b3b3-d6ccbc7f0525-logs\") pod \"horizon-59fb764b6d-vml5x\" (UID: \"80fe4580-a48f-4cba-b3b3-d6ccbc7f0525\") " pod="openstack/horizon-59fb764b6d-vml5x" Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.507381 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97d6e519-a82f-4ce5-9199-4d7db769f86b-combined-ca-bundle\") pod \"horizon-54f994999b-b88d7\" (UID: \"97d6e519-a82f-4ce5-9199-4d7db769f86b\") " pod="openstack/horizon-54f994999b-b88d7" Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.507415 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/97d6e519-a82f-4ce5-9199-4d7db769f86b-horizon-tls-certs\") pod \"horizon-54f994999b-b88d7\" (UID: \"97d6e519-a82f-4ce5-9199-4d7db769f86b\") " pod="openstack/horizon-54f994999b-b88d7" Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.507454 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/80fe4580-a48f-4cba-b3b3-d6ccbc7f0525-scripts\") pod \"horizon-59fb764b6d-vml5x\" (UID: \"80fe4580-a48f-4cba-b3b3-d6ccbc7f0525\") " pod="openstack/horizon-59fb764b6d-vml5x" Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.507490 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgkzv\" (UniqueName: \"kubernetes.io/projected/80fe4580-a48f-4cba-b3b3-d6ccbc7f0525-kube-api-access-sgkzv\") pod \"horizon-59fb764b6d-vml5x\" (UID: \"80fe4580-a48f-4cba-b3b3-d6ccbc7f0525\") " pod="openstack/horizon-59fb764b6d-vml5x" Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.507520 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97d6e519-a82f-4ce5-9199-4d7db769f86b-logs\") pod \"horizon-54f994999b-b88d7\" (UID: \"97d6e519-a82f-4ce5-9199-4d7db769f86b\") " pod="openstack/horizon-54f994999b-b88d7" Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.507634 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/97d6e519-a82f-4ce5-9199-4d7db769f86b-config-data\") pod \"horizon-54f994999b-b88d7\" (UID: \"97d6e519-a82f-4ce5-9199-4d7db769f86b\") " pod="openstack/horizon-54f994999b-b88d7" Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.507730 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/80fe4580-a48f-4cba-b3b3-d6ccbc7f0525-horizon-secret-key\") pod \"horizon-59fb764b6d-vml5x\" (UID: \"80fe4580-a48f-4cba-b3b3-d6ccbc7f0525\") " pod="openstack/horizon-59fb764b6d-vml5x" Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.507802 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tj8h\" (UniqueName: \"kubernetes.io/projected/97d6e519-a82f-4ce5-9199-4d7db769f86b-kube-api-access-2tj8h\") pod \"horizon-54f994999b-b88d7\" (UID: \"97d6e519-a82f-4ce5-9199-4d7db769f86b\") " pod="openstack/horizon-54f994999b-b88d7" Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.509219 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/80fe4580-a48f-4cba-b3b3-d6ccbc7f0525-config-data\") pod \"horizon-59fb764b6d-vml5x\" (UID: \"80fe4580-a48f-4cba-b3b3-d6ccbc7f0525\") " pod="openstack/horizon-59fb764b6d-vml5x" Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.510245 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/80fe4580-a48f-4cba-b3b3-d6ccbc7f0525-scripts\") pod \"horizon-59fb764b6d-vml5x\" (UID: \"80fe4580-a48f-4cba-b3b3-d6ccbc7f0525\") " pod="openstack/horizon-59fb764b6d-vml5x" Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.510408 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80fe4580-a48f-4cba-b3b3-d6ccbc7f0525-logs\") pod \"horizon-59fb764b6d-vml5x\" (UID: \"80fe4580-a48f-4cba-b3b3-d6ccbc7f0525\") " pod="openstack/horizon-59fb764b6d-vml5x" Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.513390 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/80fe4580-a48f-4cba-b3b3-d6ccbc7f0525-horizon-tls-certs\") pod \"horizon-59fb764b6d-vml5x\" (UID: 
\"80fe4580-a48f-4cba-b3b3-d6ccbc7f0525\") " pod="openstack/horizon-59fb764b6d-vml5x" Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.513738 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/80fe4580-a48f-4cba-b3b3-d6ccbc7f0525-horizon-secret-key\") pod \"horizon-59fb764b6d-vml5x\" (UID: \"80fe4580-a48f-4cba-b3b3-d6ccbc7f0525\") " pod="openstack/horizon-59fb764b6d-vml5x" Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.515278 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80fe4580-a48f-4cba-b3b3-d6ccbc7f0525-combined-ca-bundle\") pod \"horizon-59fb764b6d-vml5x\" (UID: \"80fe4580-a48f-4cba-b3b3-d6ccbc7f0525\") " pod="openstack/horizon-59fb764b6d-vml5x" Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.526373 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgkzv\" (UniqueName: \"kubernetes.io/projected/80fe4580-a48f-4cba-b3b3-d6ccbc7f0525-kube-api-access-sgkzv\") pod \"horizon-59fb764b6d-vml5x\" (UID: \"80fe4580-a48f-4cba-b3b3-d6ccbc7f0525\") " pod="openstack/horizon-59fb764b6d-vml5x" Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.595041 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-59fb764b6d-vml5x" Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.609032 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/97d6e519-a82f-4ce5-9199-4d7db769f86b-config-data\") pod \"horizon-54f994999b-b88d7\" (UID: \"97d6e519-a82f-4ce5-9199-4d7db769f86b\") " pod="openstack/horizon-54f994999b-b88d7" Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.609114 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tj8h\" (UniqueName: \"kubernetes.io/projected/97d6e519-a82f-4ce5-9199-4d7db769f86b-kube-api-access-2tj8h\") pod \"horizon-54f994999b-b88d7\" (UID: \"97d6e519-a82f-4ce5-9199-4d7db769f86b\") " pod="openstack/horizon-54f994999b-b88d7" Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.609144 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/97d6e519-a82f-4ce5-9199-4d7db769f86b-horizon-secret-key\") pod \"horizon-54f994999b-b88d7\" (UID: \"97d6e519-a82f-4ce5-9199-4d7db769f86b\") " pod="openstack/horizon-54f994999b-b88d7" Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.609179 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/97d6e519-a82f-4ce5-9199-4d7db769f86b-scripts\") pod \"horizon-54f994999b-b88d7\" (UID: \"97d6e519-a82f-4ce5-9199-4d7db769f86b\") " pod="openstack/horizon-54f994999b-b88d7" Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.609224 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97d6e519-a82f-4ce5-9199-4d7db769f86b-combined-ca-bundle\") pod \"horizon-54f994999b-b88d7\" (UID: \"97d6e519-a82f-4ce5-9199-4d7db769f86b\") " pod="openstack/horizon-54f994999b-b88d7" Apr 02 13:59:56 crc 
kubenswrapper[4732]: I0402 13:59:56.609244 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/97d6e519-a82f-4ce5-9199-4d7db769f86b-horizon-tls-certs\") pod \"horizon-54f994999b-b88d7\" (UID: \"97d6e519-a82f-4ce5-9199-4d7db769f86b\") " pod="openstack/horizon-54f994999b-b88d7" Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.609270 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97d6e519-a82f-4ce5-9199-4d7db769f86b-logs\") pod \"horizon-54f994999b-b88d7\" (UID: \"97d6e519-a82f-4ce5-9199-4d7db769f86b\") " pod="openstack/horizon-54f994999b-b88d7" Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.609958 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/97d6e519-a82f-4ce5-9199-4d7db769f86b-scripts\") pod \"horizon-54f994999b-b88d7\" (UID: \"97d6e519-a82f-4ce5-9199-4d7db769f86b\") " pod="openstack/horizon-54f994999b-b88d7" Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.610269 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/97d6e519-a82f-4ce5-9199-4d7db769f86b-config-data\") pod \"horizon-54f994999b-b88d7\" (UID: \"97d6e519-a82f-4ce5-9199-4d7db769f86b\") " pod="openstack/horizon-54f994999b-b88d7" Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.611236 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97d6e519-a82f-4ce5-9199-4d7db769f86b-logs\") pod \"horizon-54f994999b-b88d7\" (UID: \"97d6e519-a82f-4ce5-9199-4d7db769f86b\") " pod="openstack/horizon-54f994999b-b88d7" Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.613080 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/97d6e519-a82f-4ce5-9199-4d7db769f86b-horizon-secret-key\") pod \"horizon-54f994999b-b88d7\" (UID: \"97d6e519-a82f-4ce5-9199-4d7db769f86b\") " pod="openstack/horizon-54f994999b-b88d7" Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.614457 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/97d6e519-a82f-4ce5-9199-4d7db769f86b-horizon-tls-certs\") pod \"horizon-54f994999b-b88d7\" (UID: \"97d6e519-a82f-4ce5-9199-4d7db769f86b\") " pod="openstack/horizon-54f994999b-b88d7" Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.615536 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97d6e519-a82f-4ce5-9199-4d7db769f86b-combined-ca-bundle\") pod \"horizon-54f994999b-b88d7\" (UID: \"97d6e519-a82f-4ce5-9199-4d7db769f86b\") " pod="openstack/horizon-54f994999b-b88d7" Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.620467 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.629849 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tj8h\" (UniqueName: \"kubernetes.io/projected/97d6e519-a82f-4ce5-9199-4d7db769f86b-kube-api-access-2tj8h\") pod \"horizon-54f994999b-b88d7\" (UID: \"97d6e519-a82f-4ce5-9199-4d7db769f86b\") " pod="openstack/horizon-54f994999b-b88d7" Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.721178 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.730168 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-54f994999b-b88d7" Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.734386 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.747558 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.749117 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.752522 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.752771 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.774037 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.918046 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3aed3c4d-3173-407f-9a70-c20ef18a554d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3aed3c4d-3173-407f-9a70-c20ef18a554d\") " pod="openstack/glance-default-internal-api-0" Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.918177 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3aed3c4d-3173-407f-9a70-c20ef18a554d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3aed3c4d-3173-407f-9a70-c20ef18a554d\") " pod="openstack/glance-default-internal-api-0" Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.918308 4732 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3aed3c4d-3173-407f-9a70-c20ef18a554d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3aed3c4d-3173-407f-9a70-c20ef18a554d\") " pod="openstack/glance-default-internal-api-0" Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.918417 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9dfz\" (UniqueName: \"kubernetes.io/projected/3aed3c4d-3173-407f-9a70-c20ef18a554d-kube-api-access-g9dfz\") pod \"glance-default-internal-api-0\" (UID: \"3aed3c4d-3173-407f-9a70-c20ef18a554d\") " pod="openstack/glance-default-internal-api-0" Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.918467 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3aed3c4d-3173-407f-9a70-c20ef18a554d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3aed3c4d-3173-407f-9a70-c20ef18a554d\") " pod="openstack/glance-default-internal-api-0" Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.918580 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"3aed3c4d-3173-407f-9a70-c20ef18a554d\") " pod="openstack/glance-default-internal-api-0" Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.918711 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3aed3c4d-3173-407f-9a70-c20ef18a554d-logs\") pod \"glance-default-internal-api-0\" (UID: \"3aed3c4d-3173-407f-9a70-c20ef18a554d\") " pod="openstack/glance-default-internal-api-0" Apr 02 13:59:56 crc kubenswrapper[4732]: I0402 13:59:56.918786 4732 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3aed3c4d-3173-407f-9a70-c20ef18a554d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3aed3c4d-3173-407f-9a70-c20ef18a554d\") " pod="openstack/glance-default-internal-api-0" Apr 02 13:59:57 crc kubenswrapper[4732]: I0402 13:59:57.020959 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3aed3c4d-3173-407f-9a70-c20ef18a554d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3aed3c4d-3173-407f-9a70-c20ef18a554d\") " pod="openstack/glance-default-internal-api-0" Apr 02 13:59:57 crc kubenswrapper[4732]: I0402 13:59:57.021041 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"3aed3c4d-3173-407f-9a70-c20ef18a554d\") " pod="openstack/glance-default-internal-api-0" Apr 02 13:59:57 crc kubenswrapper[4732]: I0402 13:59:57.021106 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3aed3c4d-3173-407f-9a70-c20ef18a554d-logs\") pod \"glance-default-internal-api-0\" (UID: \"3aed3c4d-3173-407f-9a70-c20ef18a554d\") " pod="openstack/glance-default-internal-api-0" Apr 02 13:59:57 crc kubenswrapper[4732]: I0402 13:59:57.021143 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3aed3c4d-3173-407f-9a70-c20ef18a554d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3aed3c4d-3173-407f-9a70-c20ef18a554d\") " pod="openstack/glance-default-internal-api-0" Apr 02 13:59:57 crc kubenswrapper[4732]: I0402 13:59:57.021174 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/3aed3c4d-3173-407f-9a70-c20ef18a554d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3aed3c4d-3173-407f-9a70-c20ef18a554d\") " pod="openstack/glance-default-internal-api-0" Apr 02 13:59:57 crc kubenswrapper[4732]: I0402 13:59:57.021225 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3aed3c4d-3173-407f-9a70-c20ef18a554d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3aed3c4d-3173-407f-9a70-c20ef18a554d\") " pod="openstack/glance-default-internal-api-0" Apr 02 13:59:57 crc kubenswrapper[4732]: I0402 13:59:57.021281 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3aed3c4d-3173-407f-9a70-c20ef18a554d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3aed3c4d-3173-407f-9a70-c20ef18a554d\") " pod="openstack/glance-default-internal-api-0" Apr 02 13:59:57 crc kubenswrapper[4732]: I0402 13:59:57.021334 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9dfz\" (UniqueName: \"kubernetes.io/projected/3aed3c4d-3173-407f-9a70-c20ef18a554d-kube-api-access-g9dfz\") pod \"glance-default-internal-api-0\" (UID: \"3aed3c4d-3173-407f-9a70-c20ef18a554d\") " pod="openstack/glance-default-internal-api-0" Apr 02 13:59:57 crc kubenswrapper[4732]: I0402 13:59:57.022075 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3aed3c4d-3173-407f-9a70-c20ef18a554d-logs\") pod \"glance-default-internal-api-0\" (UID: \"3aed3c4d-3173-407f-9a70-c20ef18a554d\") " pod="openstack/glance-default-internal-api-0" Apr 02 13:59:57 crc kubenswrapper[4732]: I0402 13:59:57.022495 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/3aed3c4d-3173-407f-9a70-c20ef18a554d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3aed3c4d-3173-407f-9a70-c20ef18a554d\") " pod="openstack/glance-default-internal-api-0" Apr 02 13:59:57 crc kubenswrapper[4732]: I0402 13:59:57.023114 4732 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"3aed3c4d-3173-407f-9a70-c20ef18a554d\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Apr 02 13:59:57 crc kubenswrapper[4732]: I0402 13:59:57.030690 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3aed3c4d-3173-407f-9a70-c20ef18a554d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3aed3c4d-3173-407f-9a70-c20ef18a554d\") " pod="openstack/glance-default-internal-api-0" Apr 02 13:59:57 crc kubenswrapper[4732]: I0402 13:59:57.032734 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3aed3c4d-3173-407f-9a70-c20ef18a554d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3aed3c4d-3173-407f-9a70-c20ef18a554d\") " pod="openstack/glance-default-internal-api-0" Apr 02 13:59:57 crc kubenswrapper[4732]: I0402 13:59:57.034979 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3aed3c4d-3173-407f-9a70-c20ef18a554d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3aed3c4d-3173-407f-9a70-c20ef18a554d\") " pod="openstack/glance-default-internal-api-0" Apr 02 13:59:57 crc kubenswrapper[4732]: I0402 13:59:57.037795 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3aed3c4d-3173-407f-9a70-c20ef18a554d-combined-ca-bundle\") 
pod \"glance-default-internal-api-0\" (UID: \"3aed3c4d-3173-407f-9a70-c20ef18a554d\") " pod="openstack/glance-default-internal-api-0" Apr 02 13:59:57 crc kubenswrapper[4732]: I0402 13:59:57.041894 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9dfz\" (UniqueName: \"kubernetes.io/projected/3aed3c4d-3173-407f-9a70-c20ef18a554d-kube-api-access-g9dfz\") pod \"glance-default-internal-api-0\" (UID: \"3aed3c4d-3173-407f-9a70-c20ef18a554d\") " pod="openstack/glance-default-internal-api-0" Apr 02 13:59:57 crc kubenswrapper[4732]: I0402 13:59:57.063686 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"3aed3c4d-3173-407f-9a70-c20ef18a554d\") " pod="openstack/glance-default-internal-api-0" Apr 02 13:59:57 crc kubenswrapper[4732]: I0402 13:59:57.069455 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Apr 02 13:59:58 crc kubenswrapper[4732]: I0402 13:59:58.428348 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57c957c4ff-l9kx4" Apr 02 13:59:58 crc kubenswrapper[4732]: I0402 13:59:58.494727 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-ftbdg"] Apr 02 13:59:58 crc kubenswrapper[4732]: I0402 13:59:58.495162 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-895cf5cf-ftbdg" podUID="e48613ce-8967-4bcb-b928-eb2b2c662c0d" containerName="dnsmasq-dns" containerID="cri-o://a498398685860509d8cf0661319803fbdbf100ec5d11457ba2baa28161a1acaa" gracePeriod=10 Apr 02 13:59:58 crc kubenswrapper[4732]: I0402 13:59:58.585806 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-895cf5cf-ftbdg" podUID="e48613ce-8967-4bcb-b928-eb2b2c662c0d" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.128:5353: connect: connection refused" Apr 02 13:59:58 crc kubenswrapper[4732]: I0402 13:59:58.690320 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b" path="/var/lib/kubelet/pods/2878bd1e-d81d-4cc8-9334-3a20fe0b0d3b/volumes" Apr 02 13:59:59 crc kubenswrapper[4732]: I0402 13:59:59.654447 4732 generic.go:334] "Generic (PLEG): container finished" podID="e48613ce-8967-4bcb-b928-eb2b2c662c0d" containerID="a498398685860509d8cf0661319803fbdbf100ec5d11457ba2baa28161a1acaa" exitCode=0 Apr 02 13:59:59 crc kubenswrapper[4732]: I0402 13:59:59.654499 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-ftbdg" event={"ID":"e48613ce-8967-4bcb-b928-eb2b2c662c0d","Type":"ContainerDied","Data":"a498398685860509d8cf0661319803fbdbf100ec5d11457ba2baa28161a1acaa"} Apr 02 14:00:00 crc kubenswrapper[4732]: I0402 14:00:00.137177 4732 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29585640-289ph"] Apr 02 14:00:00 crc kubenswrapper[4732]: I0402 14:00:00.138648 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585640-289ph" Apr 02 14:00:00 crc kubenswrapper[4732]: I0402 14:00:00.140416 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 02 14:00:00 crc kubenswrapper[4732]: I0402 14:00:00.140472 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-69g42" Apr 02 14:00:00 crc kubenswrapper[4732]: I0402 14:00:00.140941 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 02 14:00:00 crc kubenswrapper[4732]: I0402 14:00:00.145508 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29585640-n2cb8"] Apr 02 14:00:00 crc kubenswrapper[4732]: I0402 14:00:00.146853 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29585640-n2cb8" Apr 02 14:00:00 crc kubenswrapper[4732]: I0402 14:00:00.149909 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Apr 02 14:00:00 crc kubenswrapper[4732]: I0402 14:00:00.150900 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Apr 02 14:00:00 crc kubenswrapper[4732]: I0402 14:00:00.157161 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585640-289ph"] Apr 02 14:00:00 crc kubenswrapper[4732]: I0402 14:00:00.165644 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29585640-n2cb8"] Apr 02 14:00:00 crc kubenswrapper[4732]: I0402 14:00:00.304398 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8d43ed34-cc78-4cfa-b5bd-b0c892cc6d09-config-volume\") pod \"collect-profiles-29585640-n2cb8\" (UID: \"8d43ed34-cc78-4cfa-b5bd-b0c892cc6d09\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29585640-n2cb8" Apr 02 14:00:00 crc kubenswrapper[4732]: I0402 14:00:00.304443 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzvt5\" (UniqueName: \"kubernetes.io/projected/8d43ed34-cc78-4cfa-b5bd-b0c892cc6d09-kube-api-access-fzvt5\") pod \"collect-profiles-29585640-n2cb8\" (UID: \"8d43ed34-cc78-4cfa-b5bd-b0c892cc6d09\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29585640-n2cb8" Apr 02 14:00:00 crc kubenswrapper[4732]: I0402 14:00:00.304706 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gx6j9\" (UniqueName: 
\"kubernetes.io/projected/99deaef2-ca21-4254-9c26-8200edbbd497-kube-api-access-gx6j9\") pod \"auto-csr-approver-29585640-289ph\" (UID: \"99deaef2-ca21-4254-9c26-8200edbbd497\") " pod="openshift-infra/auto-csr-approver-29585640-289ph" Apr 02 14:00:00 crc kubenswrapper[4732]: I0402 14:00:00.304829 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8d43ed34-cc78-4cfa-b5bd-b0c892cc6d09-secret-volume\") pod \"collect-profiles-29585640-n2cb8\" (UID: \"8d43ed34-cc78-4cfa-b5bd-b0c892cc6d09\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29585640-n2cb8" Apr 02 14:00:00 crc kubenswrapper[4732]: I0402 14:00:00.407081 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8d43ed34-cc78-4cfa-b5bd-b0c892cc6d09-secret-volume\") pod \"collect-profiles-29585640-n2cb8\" (UID: \"8d43ed34-cc78-4cfa-b5bd-b0c892cc6d09\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29585640-n2cb8" Apr 02 14:00:00 crc kubenswrapper[4732]: I0402 14:00:00.407266 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8d43ed34-cc78-4cfa-b5bd-b0c892cc6d09-config-volume\") pod \"collect-profiles-29585640-n2cb8\" (UID: \"8d43ed34-cc78-4cfa-b5bd-b0c892cc6d09\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29585640-n2cb8" Apr 02 14:00:00 crc kubenswrapper[4732]: I0402 14:00:00.407294 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzvt5\" (UniqueName: \"kubernetes.io/projected/8d43ed34-cc78-4cfa-b5bd-b0c892cc6d09-kube-api-access-fzvt5\") pod \"collect-profiles-29585640-n2cb8\" (UID: \"8d43ed34-cc78-4cfa-b5bd-b0c892cc6d09\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29585640-n2cb8" Apr 02 14:00:00 crc kubenswrapper[4732]: I0402 
14:00:00.407360 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gx6j9\" (UniqueName: \"kubernetes.io/projected/99deaef2-ca21-4254-9c26-8200edbbd497-kube-api-access-gx6j9\") pod \"auto-csr-approver-29585640-289ph\" (UID: \"99deaef2-ca21-4254-9c26-8200edbbd497\") " pod="openshift-infra/auto-csr-approver-29585640-289ph" Apr 02 14:00:00 crc kubenswrapper[4732]: I0402 14:00:00.408398 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8d43ed34-cc78-4cfa-b5bd-b0c892cc6d09-config-volume\") pod \"collect-profiles-29585640-n2cb8\" (UID: \"8d43ed34-cc78-4cfa-b5bd-b0c892cc6d09\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29585640-n2cb8" Apr 02 14:00:00 crc kubenswrapper[4732]: I0402 14:00:00.419862 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8d43ed34-cc78-4cfa-b5bd-b0c892cc6d09-secret-volume\") pod \"collect-profiles-29585640-n2cb8\" (UID: \"8d43ed34-cc78-4cfa-b5bd-b0c892cc6d09\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29585640-n2cb8" Apr 02 14:00:00 crc kubenswrapper[4732]: I0402 14:00:00.423534 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzvt5\" (UniqueName: \"kubernetes.io/projected/8d43ed34-cc78-4cfa-b5bd-b0c892cc6d09-kube-api-access-fzvt5\") pod \"collect-profiles-29585640-n2cb8\" (UID: \"8d43ed34-cc78-4cfa-b5bd-b0c892cc6d09\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29585640-n2cb8" Apr 02 14:00:00 crc kubenswrapper[4732]: I0402 14:00:00.424931 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gx6j9\" (UniqueName: \"kubernetes.io/projected/99deaef2-ca21-4254-9c26-8200edbbd497-kube-api-access-gx6j9\") pod \"auto-csr-approver-29585640-289ph\" (UID: \"99deaef2-ca21-4254-9c26-8200edbbd497\") " 
pod="openshift-infra/auto-csr-approver-29585640-289ph" Apr 02 14:00:00 crc kubenswrapper[4732]: I0402 14:00:00.463353 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585640-289ph" Apr 02 14:00:00 crc kubenswrapper[4732]: I0402 14:00:00.469533 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29585640-n2cb8" Apr 02 14:00:01 crc kubenswrapper[4732]: I0402 14:00:01.955479 4732 patch_prober.go:28] interesting pod/machine-config-daemon-6vtmw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 02 14:00:01 crc kubenswrapper[4732]: I0402 14:00:01.956255 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 02 14:00:05 crc kubenswrapper[4732]: E0402 14:00:05.031878 4732 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Apr 02 14:00:05 crc kubenswrapper[4732]: E0402 14:00:05.032345 4732 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n594h4h8bh5d6h66h5h544h8h685h548h98h694hdfh7ch4h5bch68h577h7fh7h5ch5chf7hbdh6bh576h5fbh67fh5h54h7dh654q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nhsrx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-c799f465-l2dct_openstack(bd752ee8-4492-42c7-9125-974b270451d2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Apr 02 14:00:05 crc kubenswrapper[4732]: E0402 14:00:05.034354 4732 
pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-c799f465-l2dct" podUID="bd752ee8-4492-42c7-9125-974b270451d2" Apr 02 14:00:06 crc kubenswrapper[4732]: E0402 14:00:06.932972 4732 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Apr 02 14:00:06 crc kubenswrapper[4732]: E0402 14:00:06.933688 4732 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-29n8v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
placement-db-sync-dh5g6_openstack(64f807d9-0af7-4723-98b2-dd3cbe55df99): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Apr 02 14:00:06 crc kubenswrapper[4732]: E0402 14:00:06.935085 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-dh5g6" podUID="64f807d9-0af7-4723-98b2-dd3cbe55df99" Apr 02 14:00:06 crc kubenswrapper[4732]: E0402 14:00:06.956581 4732 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Apr 02 14:00:06 crc kubenswrapper[4732]: E0402 14:00:06.956775 4732 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n677h76hc7h5bbh98h78hc5hd6h6fh64h674h578h569h694h57fh65h694hd5h98h67fh68h84h559hc7h68bh546h589h5b8h5b6hb8h655h646q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m7c8s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-7b485f6745-hxwxb_openstack(40deaea9-2a82-456d-802f-7829e6d12a9b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Apr 02 14:00:06 crc kubenswrapper[4732]: E0402 
14:00:06.958871 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-7b485f6745-hxwxb" podUID="40deaea9-2a82-456d-802f-7829e6d12a9b" Apr 02 14:00:06 crc kubenswrapper[4732]: E0402 14:00:06.998892 4732 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Apr 02 14:00:06 crc kubenswrapper[4732]: E0402 14:00:06.999234 4732 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n65bh5b8h66bh5b9h596h695h575hfh597h5f9h5c5h5c7hf7h698h596h59dh67chd4h549h59dh57h55h695h99hd5hdh574hf6h669hb9hc6h655q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m7b5c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-fd6957bc-tm5df_openstack(c623b208-43b3-4c30-a6a1-9d4849513dfa): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Apr 02 14:00:07 crc kubenswrapper[4732]: E0402 
14:00:07.001436 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-fd6957bc-tm5df" podUID="c623b208-43b3-4c30-a6a1-9d4849513dfa" Apr 02 14:00:07 crc kubenswrapper[4732]: E0402 14:00:07.413768 4732 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Apr 02 14:00:07 crc kubenswrapper[4732]: E0402 14:00:07.413927 4732 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xb6j4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-d54q6_openstack(34aed337-bbff-45a7-b95f-b26c95733c82): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Apr 02 14:00:07 crc kubenswrapper[4732]: E0402 14:00:07.415096 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-d54q6" 
podUID="34aed337-bbff-45a7-b95f-b26c95733c82"
Apr 02 14:00:07 crc kubenswrapper[4732]: I0402 14:00:07.514330 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-j4fj8"
Apr 02 14:00:07 crc kubenswrapper[4732]: I0402 14:00:07.566203 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28f87ba1-2996-40b9-8c0a-93545e49915e-config-data\") pod \"28f87ba1-2996-40b9-8c0a-93545e49915e\" (UID: \"28f87ba1-2996-40b9-8c0a-93545e49915e\") "
Apr 02 14:00:07 crc kubenswrapper[4732]: I0402 14:00:07.566268 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlbs7\" (UniqueName: \"kubernetes.io/projected/28f87ba1-2996-40b9-8c0a-93545e49915e-kube-api-access-rlbs7\") pod \"28f87ba1-2996-40b9-8c0a-93545e49915e\" (UID: \"28f87ba1-2996-40b9-8c0a-93545e49915e\") "
Apr 02 14:00:07 crc kubenswrapper[4732]: I0402 14:00:07.566297 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28f87ba1-2996-40b9-8c0a-93545e49915e-scripts\") pod \"28f87ba1-2996-40b9-8c0a-93545e49915e\" (UID: \"28f87ba1-2996-40b9-8c0a-93545e49915e\") "
Apr 02 14:00:07 crc kubenswrapper[4732]: I0402 14:00:07.566318 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28f87ba1-2996-40b9-8c0a-93545e49915e-combined-ca-bundle\") pod \"28f87ba1-2996-40b9-8c0a-93545e49915e\" (UID: \"28f87ba1-2996-40b9-8c0a-93545e49915e\") "
Apr 02 14:00:07 crc kubenswrapper[4732]: I0402 14:00:07.566449 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/28f87ba1-2996-40b9-8c0a-93545e49915e-fernet-keys\") pod \"28f87ba1-2996-40b9-8c0a-93545e49915e\" (UID: \"28f87ba1-2996-40b9-8c0a-93545e49915e\") "
Apr 02 14:00:07 crc kubenswrapper[4732]: I0402 14:00:07.566579 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/28f87ba1-2996-40b9-8c0a-93545e49915e-credential-keys\") pod \"28f87ba1-2996-40b9-8c0a-93545e49915e\" (UID: \"28f87ba1-2996-40b9-8c0a-93545e49915e\") "
Apr 02 14:00:07 crc kubenswrapper[4732]: I0402 14:00:07.572852 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28f87ba1-2996-40b9-8c0a-93545e49915e-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "28f87ba1-2996-40b9-8c0a-93545e49915e" (UID: "28f87ba1-2996-40b9-8c0a-93545e49915e"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 14:00:07 crc kubenswrapper[4732]: I0402 14:00:07.572897 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28f87ba1-2996-40b9-8c0a-93545e49915e-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "28f87ba1-2996-40b9-8c0a-93545e49915e" (UID: "28f87ba1-2996-40b9-8c0a-93545e49915e"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 14:00:07 crc kubenswrapper[4732]: I0402 14:00:07.573001 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28f87ba1-2996-40b9-8c0a-93545e49915e-scripts" (OuterVolumeSpecName: "scripts") pod "28f87ba1-2996-40b9-8c0a-93545e49915e" (UID: "28f87ba1-2996-40b9-8c0a-93545e49915e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 14:00:07 crc kubenswrapper[4732]: I0402 14:00:07.573143 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28f87ba1-2996-40b9-8c0a-93545e49915e-kube-api-access-rlbs7" (OuterVolumeSpecName: "kube-api-access-rlbs7") pod "28f87ba1-2996-40b9-8c0a-93545e49915e" (UID: "28f87ba1-2996-40b9-8c0a-93545e49915e"). InnerVolumeSpecName "kube-api-access-rlbs7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 14:00:07 crc kubenswrapper[4732]: I0402 14:00:07.598643 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28f87ba1-2996-40b9-8c0a-93545e49915e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "28f87ba1-2996-40b9-8c0a-93545e49915e" (UID: "28f87ba1-2996-40b9-8c0a-93545e49915e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 14:00:07 crc kubenswrapper[4732]: I0402 14:00:07.610462 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28f87ba1-2996-40b9-8c0a-93545e49915e-config-data" (OuterVolumeSpecName: "config-data") pod "28f87ba1-2996-40b9-8c0a-93545e49915e" (UID: "28f87ba1-2996-40b9-8c0a-93545e49915e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 14:00:07 crc kubenswrapper[4732]: I0402 14:00:07.669047 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28f87ba1-2996-40b9-8c0a-93545e49915e-config-data\") on node \"crc\" DevicePath \"\""
Apr 02 14:00:07 crc kubenswrapper[4732]: I0402 14:00:07.669082 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlbs7\" (UniqueName: \"kubernetes.io/projected/28f87ba1-2996-40b9-8c0a-93545e49915e-kube-api-access-rlbs7\") on node \"crc\" DevicePath \"\""
Apr 02 14:00:07 crc kubenswrapper[4732]: I0402 14:00:07.669093 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28f87ba1-2996-40b9-8c0a-93545e49915e-scripts\") on node \"crc\" DevicePath \"\""
Apr 02 14:00:07 crc kubenswrapper[4732]: I0402 14:00:07.669104 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28f87ba1-2996-40b9-8c0a-93545e49915e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Apr 02 14:00:07 crc kubenswrapper[4732]: I0402 14:00:07.669114 4732 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/28f87ba1-2996-40b9-8c0a-93545e49915e-fernet-keys\") on node \"crc\" DevicePath \"\""
Apr 02 14:00:07 crc kubenswrapper[4732]: I0402 14:00:07.669123 4732 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/28f87ba1-2996-40b9-8c0a-93545e49915e-credential-keys\") on node \"crc\" DevicePath \"\""
Apr 02 14:00:07 crc kubenswrapper[4732]: I0402 14:00:07.731117 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-j4fj8"
Apr 02 14:00:07 crc kubenswrapper[4732]: I0402 14:00:07.731138 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-j4fj8" event={"ID":"28f87ba1-2996-40b9-8c0a-93545e49915e","Type":"ContainerDied","Data":"bcac5b3ded494ac2b87a26d4b1483a912e7c1bc566b95c76a677b0ab015fc668"}
Apr 02 14:00:07 crc kubenswrapper[4732]: I0402 14:00:07.731885 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bcac5b3ded494ac2b87a26d4b1483a912e7c1bc566b95c76a677b0ab015fc668"
Apr 02 14:00:07 crc kubenswrapper[4732]: E0402 14:00:07.735785 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-dh5g6" podUID="64f807d9-0af7-4723-98b2-dd3cbe55df99"
Apr 02 14:00:07 crc kubenswrapper[4732]: E0402 14:00:07.736069 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-d54q6" podUID="34aed337-bbff-45a7-b95f-b26c95733c82"
Apr 02 14:00:08 crc kubenswrapper[4732]: I0402 14:00:08.586800 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-895cf5cf-ftbdg" podUID="e48613ce-8967-4bcb-b928-eb2b2c662c0d" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.128:5353: i/o timeout"
Apr 02 14:00:08 crc kubenswrapper[4732]: I0402 14:00:08.602823 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-j4fj8"]
Apr 02 14:00:08 crc kubenswrapper[4732]: I0402 14:00:08.611801 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-j4fj8"]
Apr 02 14:00:08 crc kubenswrapper[4732]: I0402 14:00:08.699733 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28f87ba1-2996-40b9-8c0a-93545e49915e" path="/var/lib/kubelet/pods/28f87ba1-2996-40b9-8c0a-93545e49915e/volumes"
Apr 02 14:00:08 crc kubenswrapper[4732]: I0402 14:00:08.700453 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-fb5bv"]
Apr 02 14:00:08 crc kubenswrapper[4732]: E0402 14:00:08.707273 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28f87ba1-2996-40b9-8c0a-93545e49915e" containerName="keystone-bootstrap"
Apr 02 14:00:08 crc kubenswrapper[4732]: I0402 14:00:08.707308 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="28f87ba1-2996-40b9-8c0a-93545e49915e" containerName="keystone-bootstrap"
Apr 02 14:00:08 crc kubenswrapper[4732]: I0402 14:00:08.707613 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="28f87ba1-2996-40b9-8c0a-93545e49915e" containerName="keystone-bootstrap"
Apr 02 14:00:08 crc kubenswrapper[4732]: I0402 14:00:08.708178 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-fb5bv"]
Apr 02 14:00:08 crc kubenswrapper[4732]: I0402 14:00:08.708265 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-fb5bv"
Apr 02 14:00:08 crc kubenswrapper[4732]: I0402 14:00:08.710379 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Apr 02 14:00:08 crc kubenswrapper[4732]: I0402 14:00:08.710564 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Apr 02 14:00:08 crc kubenswrapper[4732]: I0402 14:00:08.710783 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Apr 02 14:00:08 crc kubenswrapper[4732]: I0402 14:00:08.710945 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Apr 02 14:00:08 crc kubenswrapper[4732]: I0402 14:00:08.711277 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-scsz6"
Apr 02 14:00:08 crc kubenswrapper[4732]: I0402 14:00:08.792915 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvm69\" (UniqueName: \"kubernetes.io/projected/261837b5-19d6-404f-b88f-b5b6cf88ebec-kube-api-access-kvm69\") pod \"keystone-bootstrap-fb5bv\" (UID: \"261837b5-19d6-404f-b88f-b5b6cf88ebec\") " pod="openstack/keystone-bootstrap-fb5bv"
Apr 02 14:00:08 crc kubenswrapper[4732]: I0402 14:00:08.792978 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/261837b5-19d6-404f-b88f-b5b6cf88ebec-combined-ca-bundle\") pod \"keystone-bootstrap-fb5bv\" (UID: \"261837b5-19d6-404f-b88f-b5b6cf88ebec\") " pod="openstack/keystone-bootstrap-fb5bv"
Apr 02 14:00:08 crc kubenswrapper[4732]: I0402 14:00:08.793024 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/261837b5-19d6-404f-b88f-b5b6cf88ebec-credential-keys\") pod \"keystone-bootstrap-fb5bv\" (UID: \"261837b5-19d6-404f-b88f-b5b6cf88ebec\") " pod="openstack/keystone-bootstrap-fb5bv"
Apr 02 14:00:08 crc kubenswrapper[4732]: I0402 14:00:08.793088 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/261837b5-19d6-404f-b88f-b5b6cf88ebec-scripts\") pod \"keystone-bootstrap-fb5bv\" (UID: \"261837b5-19d6-404f-b88f-b5b6cf88ebec\") " pod="openstack/keystone-bootstrap-fb5bv"
Apr 02 14:00:08 crc kubenswrapper[4732]: I0402 14:00:08.793118 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/261837b5-19d6-404f-b88f-b5b6cf88ebec-config-data\") pod \"keystone-bootstrap-fb5bv\" (UID: \"261837b5-19d6-404f-b88f-b5b6cf88ebec\") " pod="openstack/keystone-bootstrap-fb5bv"
Apr 02 14:00:08 crc kubenswrapper[4732]: I0402 14:00:08.793151 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/261837b5-19d6-404f-b88f-b5b6cf88ebec-fernet-keys\") pod \"keystone-bootstrap-fb5bv\" (UID: \"261837b5-19d6-404f-b88f-b5b6cf88ebec\") " pod="openstack/keystone-bootstrap-fb5bv"
Apr 02 14:00:08 crc kubenswrapper[4732]: I0402 14:00:08.895233 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/261837b5-19d6-404f-b88f-b5b6cf88ebec-combined-ca-bundle\") pod \"keystone-bootstrap-fb5bv\" (UID: \"261837b5-19d6-404f-b88f-b5b6cf88ebec\") " pod="openstack/keystone-bootstrap-fb5bv"
Apr 02 14:00:08 crc kubenswrapper[4732]: I0402 14:00:08.895294 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/261837b5-19d6-404f-b88f-b5b6cf88ebec-credential-keys\") pod \"keystone-bootstrap-fb5bv\" (UID: \"261837b5-19d6-404f-b88f-b5b6cf88ebec\") " pod="openstack/keystone-bootstrap-fb5bv"
Apr 02 14:00:08 crc kubenswrapper[4732]: I0402 14:00:08.895338 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/261837b5-19d6-404f-b88f-b5b6cf88ebec-scripts\") pod \"keystone-bootstrap-fb5bv\" (UID: \"261837b5-19d6-404f-b88f-b5b6cf88ebec\") " pod="openstack/keystone-bootstrap-fb5bv"
Apr 02 14:00:08 crc kubenswrapper[4732]: I0402 14:00:08.895355 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/261837b5-19d6-404f-b88f-b5b6cf88ebec-config-data\") pod \"keystone-bootstrap-fb5bv\" (UID: \"261837b5-19d6-404f-b88f-b5b6cf88ebec\") " pod="openstack/keystone-bootstrap-fb5bv"
Apr 02 14:00:08 crc kubenswrapper[4732]: I0402 14:00:08.895380 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/261837b5-19d6-404f-b88f-b5b6cf88ebec-fernet-keys\") pod \"keystone-bootstrap-fb5bv\" (UID: \"261837b5-19d6-404f-b88f-b5b6cf88ebec\") " pod="openstack/keystone-bootstrap-fb5bv"
Apr 02 14:00:08 crc kubenswrapper[4732]: I0402 14:00:08.895486 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvm69\" (UniqueName: \"kubernetes.io/projected/261837b5-19d6-404f-b88f-b5b6cf88ebec-kube-api-access-kvm69\") pod \"keystone-bootstrap-fb5bv\" (UID: \"261837b5-19d6-404f-b88f-b5b6cf88ebec\") " pod="openstack/keystone-bootstrap-fb5bv"
Apr 02 14:00:08 crc kubenswrapper[4732]: I0402 14:00:08.900798 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/261837b5-19d6-404f-b88f-b5b6cf88ebec-combined-ca-bundle\") pod \"keystone-bootstrap-fb5bv\" (UID: \"261837b5-19d6-404f-b88f-b5b6cf88ebec\") " pod="openstack/keystone-bootstrap-fb5bv"
Apr 02 14:00:08 crc kubenswrapper[4732]: I0402 14:00:08.901122 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/261837b5-19d6-404f-b88f-b5b6cf88ebec-credential-keys\") pod \"keystone-bootstrap-fb5bv\" (UID: \"261837b5-19d6-404f-b88f-b5b6cf88ebec\") " pod="openstack/keystone-bootstrap-fb5bv"
Apr 02 14:00:08 crc kubenswrapper[4732]: I0402 14:00:08.901735 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/261837b5-19d6-404f-b88f-b5b6cf88ebec-fernet-keys\") pod \"keystone-bootstrap-fb5bv\" (UID: \"261837b5-19d6-404f-b88f-b5b6cf88ebec\") " pod="openstack/keystone-bootstrap-fb5bv"
Apr 02 14:00:08 crc kubenswrapper[4732]: I0402 14:00:08.902813 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/261837b5-19d6-404f-b88f-b5b6cf88ebec-config-data\") pod \"keystone-bootstrap-fb5bv\" (UID: \"261837b5-19d6-404f-b88f-b5b6cf88ebec\") " pod="openstack/keystone-bootstrap-fb5bv"
Apr 02 14:00:08 crc kubenswrapper[4732]: I0402 14:00:08.917111 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/261837b5-19d6-404f-b88f-b5b6cf88ebec-scripts\") pod \"keystone-bootstrap-fb5bv\" (UID: \"261837b5-19d6-404f-b88f-b5b6cf88ebec\") " pod="openstack/keystone-bootstrap-fb5bv"
Apr 02 14:00:08 crc kubenswrapper[4732]: I0402 14:00:08.917364 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvm69\" (UniqueName: \"kubernetes.io/projected/261837b5-19d6-404f-b88f-b5b6cf88ebec-kube-api-access-kvm69\") pod \"keystone-bootstrap-fb5bv\" (UID: \"261837b5-19d6-404f-b88f-b5b6cf88ebec\") " pod="openstack/keystone-bootstrap-fb5bv"
Apr 02 14:00:09 crc kubenswrapper[4732]: I0402 14:00:09.029339 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-fb5bv"
Apr 02 14:00:10 crc kubenswrapper[4732]: I0402 14:00:10.760984 4732 generic.go:334] "Generic (PLEG): container finished" podID="481f6e75-b423-4c6c-a1d6-b43674481fc1" containerID="c1ff334494eebb50605f5ce8c2393c63ab95edea02fd9a390bfa160974a7444f" exitCode=0
Apr 02 14:00:10 crc kubenswrapper[4732]: I0402 14:00:10.761073 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-89fvh" event={"ID":"481f6e75-b423-4c6c-a1d6-b43674481fc1","Type":"ContainerDied","Data":"c1ff334494eebb50605f5ce8c2393c63ab95edea02fd9a390bfa160974a7444f"}
Apr 02 14:00:13 crc kubenswrapper[4732]: I0402 14:00:13.587142 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-895cf5cf-ftbdg" podUID="e48613ce-8967-4bcb-b928-eb2b2c662c0d" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.128:5353: i/o timeout"
Apr 02 14:00:13 crc kubenswrapper[4732]: I0402 14:00:13.587988 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-895cf5cf-ftbdg"
Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.198494 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-895cf5cf-ftbdg"
Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.207239 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-c799f465-l2dct"
Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.243727 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-89fvh"
Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.245390 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7b485f6745-hxwxb"
Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.245492 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.256191 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-fd6957bc-tm5df"
Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.260849 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e48613ce-8967-4bcb-b928-eb2b2c662c0d-ovsdbserver-sb\") pod \"e48613ce-8967-4bcb-b928-eb2b2c662c0d\" (UID: \"e48613ce-8967-4bcb-b928-eb2b2c662c0d\") "
Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.261011 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e48613ce-8967-4bcb-b928-eb2b2c662c0d-dns-swift-storage-0\") pod \"e48613ce-8967-4bcb-b928-eb2b2c662c0d\" (UID: \"e48613ce-8967-4bcb-b928-eb2b2c662c0d\") "
Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.261108 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x265x\" (UniqueName: \"kubernetes.io/projected/e48613ce-8967-4bcb-b928-eb2b2c662c0d-kube-api-access-x265x\") pod \"e48613ce-8967-4bcb-b928-eb2b2c662c0d\" (UID: \"e48613ce-8967-4bcb-b928-eb2b2c662c0d\") "
Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.261191 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhsrx\" (UniqueName: \"kubernetes.io/projected/bd752ee8-4492-42c7-9125-974b270451d2-kube-api-access-nhsrx\") pod \"bd752ee8-4492-42c7-9125-974b270451d2\" (UID: \"bd752ee8-4492-42c7-9125-974b270451d2\") "
Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.261296 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd752ee8-4492-42c7-9125-974b270451d2-logs\") pod \"bd752ee8-4492-42c7-9125-974b270451d2\" (UID: \"bd752ee8-4492-42c7-9125-974b270451d2\") "
Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.261385 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd752ee8-4492-42c7-9125-974b270451d2-scripts\") pod \"bd752ee8-4492-42c7-9125-974b270451d2\" (UID: \"bd752ee8-4492-42c7-9125-974b270451d2\") "
Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.261502 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e48613ce-8967-4bcb-b928-eb2b2c662c0d-dns-svc\") pod \"e48613ce-8967-4bcb-b928-eb2b2c662c0d\" (UID: \"e48613ce-8967-4bcb-b928-eb2b2c662c0d\") "
Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.261575 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/bd752ee8-4492-42c7-9125-974b270451d2-horizon-secret-key\") pod \"bd752ee8-4492-42c7-9125-974b270451d2\" (UID: \"bd752ee8-4492-42c7-9125-974b270451d2\") "
Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.261666 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e48613ce-8967-4bcb-b928-eb2b2c662c0d-config\") pod \"e48613ce-8967-4bcb-b928-eb2b2c662c0d\" (UID: \"e48613ce-8967-4bcb-b928-eb2b2c662c0d\") "
Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.261749 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e48613ce-8967-4bcb-b928-eb2b2c662c0d-ovsdbserver-nb\") pod \"e48613ce-8967-4bcb-b928-eb2b2c662c0d\" (UID: \"e48613ce-8967-4bcb-b928-eb2b2c662c0d\") "
Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.261896 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bd752ee8-4492-42c7-9125-974b270451d2-config-data\") pod \"bd752ee8-4492-42c7-9125-974b270451d2\" (UID: \"bd752ee8-4492-42c7-9125-974b270451d2\") "
Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.263136 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd752ee8-4492-42c7-9125-974b270451d2-logs" (OuterVolumeSpecName: "logs") pod "bd752ee8-4492-42c7-9125-974b270451d2" (UID: "bd752ee8-4492-42c7-9125-974b270451d2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.268808 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd752ee8-4492-42c7-9125-974b270451d2-scripts" (OuterVolumeSpecName: "scripts") pod "bd752ee8-4492-42c7-9125-974b270451d2" (UID: "bd752ee8-4492-42c7-9125-974b270451d2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.262888 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd752ee8-4492-42c7-9125-974b270451d2-config-data" (OuterVolumeSpecName: "config-data") pod "bd752ee8-4492-42c7-9125-974b270451d2" (UID: "bd752ee8-4492-42c7-9125-974b270451d2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.271980 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd752ee8-4492-42c7-9125-974b270451d2-kube-api-access-nhsrx" (OuterVolumeSpecName: "kube-api-access-nhsrx") pod "bd752ee8-4492-42c7-9125-974b270451d2" (UID: "bd752ee8-4492-42c7-9125-974b270451d2"). InnerVolumeSpecName "kube-api-access-nhsrx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.286822 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd752ee8-4492-42c7-9125-974b270451d2-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "bd752ee8-4492-42c7-9125-974b270451d2" (UID: "bd752ee8-4492-42c7-9125-974b270451d2"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.290254 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e48613ce-8967-4bcb-b928-eb2b2c662c0d-kube-api-access-x265x" (OuterVolumeSpecName: "kube-api-access-x265x") pod "e48613ce-8967-4bcb-b928-eb2b2c662c0d" (UID: "e48613ce-8967-4bcb-b928-eb2b2c662c0d"). InnerVolumeSpecName "kube-api-access-x265x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.339019 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e48613ce-8967-4bcb-b928-eb2b2c662c0d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e48613ce-8967-4bcb-b928-eb2b2c662c0d" (UID: "e48613ce-8967-4bcb-b928-eb2b2c662c0d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.340602 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e48613ce-8967-4bcb-b928-eb2b2c662c0d-config" (OuterVolumeSpecName: "config") pod "e48613ce-8967-4bcb-b928-eb2b2c662c0d" (UID: "e48613ce-8967-4bcb-b928-eb2b2c662c0d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.342397 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e48613ce-8967-4bcb-b928-eb2b2c662c0d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e48613ce-8967-4bcb-b928-eb2b2c662c0d" (UID: "e48613ce-8967-4bcb-b928-eb2b2c662c0d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.350977 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e48613ce-8967-4bcb-b928-eb2b2c662c0d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e48613ce-8967-4bcb-b928-eb2b2c662c0d" (UID: "e48613ce-8967-4bcb-b928-eb2b2c662c0d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.361198 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e48613ce-8967-4bcb-b928-eb2b2c662c0d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e48613ce-8967-4bcb-b928-eb2b2c662c0d" (UID: "e48613ce-8967-4bcb-b928-eb2b2c662c0d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.363132 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7b5c\" (UniqueName: \"kubernetes.io/projected/c623b208-43b3-4c30-a6a1-9d4849513dfa-kube-api-access-m7b5c\") pod \"c623b208-43b3-4c30-a6a1-9d4849513dfa\" (UID: \"c623b208-43b3-4c30-a6a1-9d4849513dfa\") "
Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.363183 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cf737d2-a567-4afe-9490-b97b6c5d09e6-config-data\") pod \"2cf737d2-a567-4afe-9490-b97b6c5d09e6\" (UID: \"2cf737d2-a567-4afe-9490-b97b6c5d09e6\") "
Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.363210 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/40deaea9-2a82-456d-802f-7829e6d12a9b-scripts\") pod \"40deaea9-2a82-456d-802f-7829e6d12a9b\" (UID: \"40deaea9-2a82-456d-802f-7829e6d12a9b\") "
Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.363255 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2cf737d2-a567-4afe-9490-b97b6c5d09e6-httpd-run\") pod \"2cf737d2-a567-4afe-9490-b97b6c5d09e6\" (UID: \"2cf737d2-a567-4afe-9490-b97b6c5d09e6\") "
Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.363273 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c623b208-43b3-4c30-a6a1-9d4849513dfa-scripts\") pod \"c623b208-43b3-4c30-a6a1-9d4849513dfa\" (UID: \"c623b208-43b3-4c30-a6a1-9d4849513dfa\") "
Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.363288 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjnxk\" (UniqueName: \"kubernetes.io/projected/481f6e75-b423-4c6c-a1d6-b43674481fc1-kube-api-access-tjnxk\") pod \"481f6e75-b423-4c6c-a1d6-b43674481fc1\" (UID: \"481f6e75-b423-4c6c-a1d6-b43674481fc1\") "
Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.363361 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cf737d2-a567-4afe-9490-b97b6c5d09e6-scripts\") pod \"2cf737d2-a567-4afe-9490-b97b6c5d09e6\" (UID: \"2cf737d2-a567-4afe-9490-b97b6c5d09e6\") "
Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.363380 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c623b208-43b3-4c30-a6a1-9d4849513dfa-config-data\") pod \"c623b208-43b3-4c30-a6a1-9d4849513dfa\" (UID: \"c623b208-43b3-4c30-a6a1-9d4849513dfa\") "
Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.363420 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40deaea9-2a82-456d-802f-7829e6d12a9b-logs\") pod \"40deaea9-2a82-456d-802f-7829e6d12a9b\" (UID: \"40deaea9-2a82-456d-802f-7829e6d12a9b\") "
Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.363462 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cf737d2-a567-4afe-9490-b97b6c5d09e6-combined-ca-bundle\") pod \"2cf737d2-a567-4afe-9490-b97b6c5d09e6\" (UID: \"2cf737d2-a567-4afe-9490-b97b6c5d09e6\") "
Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.363485 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c623b208-43b3-4c30-a6a1-9d4849513dfa-logs\") pod \"c623b208-43b3-4c30-a6a1-9d4849513dfa\" (UID: \"c623b208-43b3-4c30-a6a1-9d4849513dfa\") "
Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.363559 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4zcf\" (UniqueName: \"kubernetes.io/projected/2cf737d2-a567-4afe-9490-b97b6c5d09e6-kube-api-access-q4zcf\") pod \"2cf737d2-a567-4afe-9490-b97b6c5d09e6\" (UID: \"2cf737d2-a567-4afe-9490-b97b6c5d09e6\") "
Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.363585 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/40deaea9-2a82-456d-802f-7829e6d12a9b-horizon-secret-key\") pod \"40deaea9-2a82-456d-802f-7829e6d12a9b\" (UID: \"40deaea9-2a82-456d-802f-7829e6d12a9b\") "
Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.363704 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40deaea9-2a82-456d-802f-7829e6d12a9b-scripts" (OuterVolumeSpecName: "scripts") pod "40deaea9-2a82-456d-802f-7829e6d12a9b" (UID: "40deaea9-2a82-456d-802f-7829e6d12a9b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.363740 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/40deaea9-2a82-456d-802f-7829e6d12a9b-config-data\") pod \"40deaea9-2a82-456d-802f-7829e6d12a9b\" (UID: \"40deaea9-2a82-456d-802f-7829e6d12a9b\") "
Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.363790 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7c8s\" (UniqueName: \"kubernetes.io/projected/40deaea9-2a82-456d-802f-7829e6d12a9b-kube-api-access-m7c8s\") pod \"40deaea9-2a82-456d-802f-7829e6d12a9b\" (UID: \"40deaea9-2a82-456d-802f-7829e6d12a9b\") "
Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.363817 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c623b208-43b3-4c30-a6a1-9d4849513dfa-horizon-secret-key\") pod \"c623b208-43b3-4c30-a6a1-9d4849513dfa\" (UID: \"c623b208-43b3-4c30-a6a1-9d4849513dfa\") "
Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.363842 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/481f6e75-b423-4c6c-a1d6-b43674481fc1-combined-ca-bundle\") pod \"481f6e75-b423-4c6c-a1d6-b43674481fc1\" (UID: \"481f6e75-b423-4c6c-a1d6-b43674481fc1\") "
Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.363866 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/481f6e75-b423-4c6c-a1d6-b43674481fc1-config\") pod \"481f6e75-b423-4c6c-a1d6-b43674481fc1\" (UID: \"481f6e75-b423-4c6c-a1d6-b43674481fc1\") "
Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.363899 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2cf737d2-a567-4afe-9490-b97b6c5d09e6-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "2cf737d2-a567-4afe-9490-b97b6c5d09e6" (UID: "2cf737d2-a567-4afe-9490-b97b6c5d09e6"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.363922 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2cf737d2-a567-4afe-9490-b97b6c5d09e6-logs\") pod \"2cf737d2-a567-4afe-9490-b97b6c5d09e6\" (UID: \"2cf737d2-a567-4afe-9490-b97b6c5d09e6\") "
Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.363995 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"2cf737d2-a567-4afe-9490-b97b6c5d09e6\" (UID: \"2cf737d2-a567-4afe-9490-b97b6c5d09e6\") "
Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.364045 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cf737d2-a567-4afe-9490-b97b6c5d09e6-public-tls-certs\") pod \"2cf737d2-a567-4afe-9490-b97b6c5d09e6\" (UID: \"2cf737d2-a567-4afe-9490-b97b6c5d09e6\") "
Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.364824 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e48613ce-8967-4bcb-b928-eb2b2c662c0d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.364840 4732 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e48613ce-8967-4bcb-b928-eb2b2c662c0d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.364850 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x265x\" (UniqueName: \"kubernetes.io/projected/e48613ce-8967-4bcb-b928-eb2b2c662c0d-kube-api-access-x265x\") on node \"crc\" DevicePath \"\""
Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.364866 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhsrx\" (UniqueName: \"kubernetes.io/projected/bd752ee8-4492-42c7-9125-974b270451d2-kube-api-access-nhsrx\") on node \"crc\" DevicePath \"\""
Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.364877 4732 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd752ee8-4492-42c7-9125-974b270451d2-logs\") on node \"crc\" DevicePath \"\""
Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.364888 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd752ee8-4492-42c7-9125-974b270451d2-scripts\") on node \"crc\" DevicePath \"\""
Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.364899 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/40deaea9-2a82-456d-802f-7829e6d12a9b-scripts\") on node \"crc\" DevicePath \"\""
Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.364908 4732 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e48613ce-8967-4bcb-b928-eb2b2c662c0d-dns-svc\") on node \"crc\" DevicePath \"\""
Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.364918 4732 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/bd752ee8-4492-42c7-9125-974b270451d2-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.364927 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e48613ce-8967-4bcb-b928-eb2b2c662c0d-config\") on node \"crc\" DevicePath \"\""
Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.364937 4732 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2cf737d2-a567-4afe-9490-b97b6c5d09e6-httpd-run\") on node \"crc\" DevicePath \"\""
Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.364948 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e48613ce-8967-4bcb-b928-eb2b2c662c0d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.364959 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bd752ee8-4492-42c7-9125-974b270451d2-config-data\") on node \"crc\" DevicePath \"\""
Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.364046 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c623b208-43b3-4c30-a6a1-9d4849513dfa-scripts" (OuterVolumeSpecName: "scripts") pod "c623b208-43b3-4c30-a6a1-9d4849513dfa" (UID: "c623b208-43b3-4c30-a6a1-9d4849513dfa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.364241 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2cf737d2-a567-4afe-9490-b97b6c5d09e6-logs" (OuterVolumeSpecName: "logs") pod "2cf737d2-a567-4afe-9490-b97b6c5d09e6" (UID: "2cf737d2-a567-4afe-9490-b97b6c5d09e6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.364724 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40deaea9-2a82-456d-802f-7829e6d12a9b-config-data" (OuterVolumeSpecName: "config-data") pod "40deaea9-2a82-456d-802f-7829e6d12a9b" (UID: "40deaea9-2a82-456d-802f-7829e6d12a9b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.365602 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c623b208-43b3-4c30-a6a1-9d4849513dfa-logs" (OuterVolumeSpecName: "logs") pod "c623b208-43b3-4c30-a6a1-9d4849513dfa" (UID: "c623b208-43b3-4c30-a6a1-9d4849513dfa"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.365851 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c623b208-43b3-4c30-a6a1-9d4849513dfa-config-data" (OuterVolumeSpecName: "config-data") pod "c623b208-43b3-4c30-a6a1-9d4849513dfa" (UID: "c623b208-43b3-4c30-a6a1-9d4849513dfa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.366063 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40deaea9-2a82-456d-802f-7829e6d12a9b-logs" (OuterVolumeSpecName: "logs") pod "40deaea9-2a82-456d-802f-7829e6d12a9b" (UID: "40deaea9-2a82-456d-802f-7829e6d12a9b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.367469 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40deaea9-2a82-456d-802f-7829e6d12a9b-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "40deaea9-2a82-456d-802f-7829e6d12a9b" (UID: "40deaea9-2a82-456d-802f-7829e6d12a9b"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.368177 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cf737d2-a567-4afe-9490-b97b6c5d09e6-scripts" (OuterVolumeSpecName: "scripts") pod "2cf737d2-a567-4afe-9490-b97b6c5d09e6" (UID: "2cf737d2-a567-4afe-9490-b97b6c5d09e6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.368728 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c623b208-43b3-4c30-a6a1-9d4849513dfa-kube-api-access-m7b5c" (OuterVolumeSpecName: "kube-api-access-m7b5c") pod "c623b208-43b3-4c30-a6a1-9d4849513dfa" (UID: "c623b208-43b3-4c30-a6a1-9d4849513dfa"). InnerVolumeSpecName "kube-api-access-m7b5c". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.368761 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40deaea9-2a82-456d-802f-7829e6d12a9b-kube-api-access-m7c8s" (OuterVolumeSpecName: "kube-api-access-m7c8s") pod "40deaea9-2a82-456d-802f-7829e6d12a9b" (UID: "40deaea9-2a82-456d-802f-7829e6d12a9b"). InnerVolumeSpecName "kube-api-access-m7c8s". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.368952 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "2cf737d2-a567-4afe-9490-b97b6c5d09e6" (UID: "2cf737d2-a567-4afe-9490-b97b6c5d09e6"). InnerVolumeSpecName "local-storage12-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.369554 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/481f6e75-b423-4c6c-a1d6-b43674481fc1-kube-api-access-tjnxk" (OuterVolumeSpecName: "kube-api-access-tjnxk") pod "481f6e75-b423-4c6c-a1d6-b43674481fc1" (UID: "481f6e75-b423-4c6c-a1d6-b43674481fc1"). InnerVolumeSpecName "kube-api-access-tjnxk". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.371042 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cf737d2-a567-4afe-9490-b97b6c5d09e6-kube-api-access-q4zcf" (OuterVolumeSpecName: "kube-api-access-q4zcf") pod "2cf737d2-a567-4afe-9490-b97b6c5d09e6" (UID: "2cf737d2-a567-4afe-9490-b97b6c5d09e6"). InnerVolumeSpecName "kube-api-access-q4zcf". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.371225 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c623b208-43b3-4c30-a6a1-9d4849513dfa-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "c623b208-43b3-4c30-a6a1-9d4849513dfa" (UID: "c623b208-43b3-4c30-a6a1-9d4849513dfa"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.395849 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/481f6e75-b423-4c6c-a1d6-b43674481fc1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "481f6e75-b423-4c6c-a1d6-b43674481fc1" (UID: "481f6e75-b423-4c6c-a1d6-b43674481fc1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.395913 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cf737d2-a567-4afe-9490-b97b6c5d09e6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2cf737d2-a567-4afe-9490-b97b6c5d09e6" (UID: "2cf737d2-a567-4afe-9490-b97b6c5d09e6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.406393 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/481f6e75-b423-4c6c-a1d6-b43674481fc1-config" (OuterVolumeSpecName: "config") pod "481f6e75-b423-4c6c-a1d6-b43674481fc1" (UID: "481f6e75-b423-4c6c-a1d6-b43674481fc1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.415565 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cf737d2-a567-4afe-9490-b97b6c5d09e6-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2cf737d2-a567-4afe-9490-b97b6c5d09e6" (UID: "2cf737d2-a567-4afe-9490-b97b6c5d09e6"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.417117 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cf737d2-a567-4afe-9490-b97b6c5d09e6-config-data" (OuterVolumeSpecName: "config-data") pod "2cf737d2-a567-4afe-9490-b97b6c5d09e6" (UID: "2cf737d2-a567-4afe-9490-b97b6c5d09e6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.466032 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cf737d2-a567-4afe-9490-b97b6c5d09e6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.466070 4732 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c623b208-43b3-4c30-a6a1-9d4849513dfa-logs\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.466083 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4zcf\" (UniqueName: \"kubernetes.io/projected/2cf737d2-a567-4afe-9490-b97b6c5d09e6-kube-api-access-q4zcf\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.466094 4732 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/40deaea9-2a82-456d-802f-7829e6d12a9b-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.466102 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/40deaea9-2a82-456d-802f-7829e6d12a9b-config-data\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.466110 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7c8s\" (UniqueName: \"kubernetes.io/projected/40deaea9-2a82-456d-802f-7829e6d12a9b-kube-api-access-m7c8s\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.466118 4732 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c623b208-43b3-4c30-a6a1-9d4849513dfa-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 
14:00:16.466126 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/481f6e75-b423-4c6c-a1d6-b43674481fc1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.466134 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/481f6e75-b423-4c6c-a1d6-b43674481fc1-config\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.466142 4732 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2cf737d2-a567-4afe-9490-b97b6c5d09e6-logs\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.466169 4732 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.466179 4732 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cf737d2-a567-4afe-9490-b97b6c5d09e6-public-tls-certs\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.466187 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7b5c\" (UniqueName: \"kubernetes.io/projected/c623b208-43b3-4c30-a6a1-9d4849513dfa-kube-api-access-m7b5c\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.466195 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cf737d2-a567-4afe-9490-b97b6c5d09e6-config-data\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.466202 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/c623b208-43b3-4c30-a6a1-9d4849513dfa-scripts\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.466210 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjnxk\" (UniqueName: \"kubernetes.io/projected/481f6e75-b423-4c6c-a1d6-b43674481fc1-kube-api-access-tjnxk\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.466217 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cf737d2-a567-4afe-9490-b97b6c5d09e6-scripts\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.466225 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c623b208-43b3-4c30-a6a1-9d4849513dfa-config-data\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.466234 4732 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40deaea9-2a82-456d-802f-7829e6d12a9b-logs\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.484328 4732 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.568984 4732 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.569352 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.808413 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-89fvh" 
event={"ID":"481f6e75-b423-4c6c-a1d6-b43674481fc1","Type":"ContainerDied","Data":"2396f7b4e78e2cb23d1165505e34e72e4f5835ad94dbadeca06ce934f71e6bbc"} Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.808454 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2396f7b4e78e2cb23d1165505e34e72e4f5835ad94dbadeca06ce934f71e6bbc" Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.808733 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-89fvh" Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.809451 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-c799f465-l2dct" event={"ID":"bd752ee8-4492-42c7-9125-974b270451d2","Type":"ContainerDied","Data":"39dc7bb4bd1c3887cec75ba568de5c655f546b64051a15df04785ccfbee58913"} Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.809479 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-c799f465-l2dct" Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.810372 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7b485f6745-hxwxb" Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.810377 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b485f6745-hxwxb" event={"ID":"40deaea9-2a82-456d-802f-7829e6d12a9b","Type":"ContainerDied","Data":"95df95b67878c1dd7a56156242c96daa45b9f239412ba96926f1608aaa48a6b8"} Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.813083 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.813093 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2cf737d2-a567-4afe-9490-b97b6c5d09e6","Type":"ContainerDied","Data":"6f6dc487830ac7e6ddfd333c94f11519665603935b3c54e9d164d1551b3e4f3d"} Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.813401 4732 scope.go:117] "RemoveContainer" containerID="425788db80ca1f6c470809c31f38955b4a7630e234688529f843a925e6641fbd" Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.817761 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-fd6957bc-tm5df" event={"ID":"c623b208-43b3-4c30-a6a1-9d4849513dfa","Type":"ContainerDied","Data":"eda5997546dc91c262160fa4f74bc4acf143dd66799c7135a59177234ba27abe"} Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.817834 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-fd6957bc-tm5df" Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.836185 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-ftbdg" event={"ID":"e48613ce-8967-4bcb-b928-eb2b2c662c0d","Type":"ContainerDied","Data":"645f2e48aba1dd4aee1325a0560d67b95e7b92a88ba28a4464846b43b1e3d883"} Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.836272 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-895cf5cf-ftbdg" Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.871143 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7b485f6745-hxwxb"] Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.892280 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7b485f6745-hxwxb"] Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.911523 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-c799f465-l2dct"] Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.919173 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-c799f465-l2dct"] Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.949092 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-fd6957bc-tm5df"] Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.963706 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-fd6957bc-tm5df"] Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.974844 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.985118 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Apr 02 14:00:16 crc kubenswrapper[4732]: I0402 14:00:16.992543 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-ftbdg"] Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:16.999663 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-ftbdg"] Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.009091 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Apr 02 14:00:17 crc kubenswrapper[4732]: E0402 14:00:17.009594 4732 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2cf737d2-a567-4afe-9490-b97b6c5d09e6" containerName="glance-httpd" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.009705 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cf737d2-a567-4afe-9490-b97b6c5d09e6" containerName="glance-httpd" Apr 02 14:00:17 crc kubenswrapper[4732]: E0402 14:00:17.009731 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cf737d2-a567-4afe-9490-b97b6c5d09e6" containerName="glance-log" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.009739 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cf737d2-a567-4afe-9490-b97b6c5d09e6" containerName="glance-log" Apr 02 14:00:17 crc kubenswrapper[4732]: E0402 14:00:17.009781 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e48613ce-8967-4bcb-b928-eb2b2c662c0d" containerName="dnsmasq-dns" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.009789 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="e48613ce-8967-4bcb-b928-eb2b2c662c0d" containerName="dnsmasq-dns" Apr 02 14:00:17 crc kubenswrapper[4732]: E0402 14:00:17.009807 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="481f6e75-b423-4c6c-a1d6-b43674481fc1" containerName="neutron-db-sync" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.009813 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="481f6e75-b423-4c6c-a1d6-b43674481fc1" containerName="neutron-db-sync" Apr 02 14:00:17 crc kubenswrapper[4732]: E0402 14:00:17.009857 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e48613ce-8967-4bcb-b928-eb2b2c662c0d" containerName="init" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.009865 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="e48613ce-8967-4bcb-b928-eb2b2c662c0d" containerName="init" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.010068 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="481f6e75-b423-4c6c-a1d6-b43674481fc1" 
containerName="neutron-db-sync" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.010105 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cf737d2-a567-4afe-9490-b97b6c5d09e6" containerName="glance-httpd" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.010119 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="e48613ce-8967-4bcb-b928-eb2b2c662c0d" containerName="dnsmasq-dns" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.010126 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cf737d2-a567-4afe-9490-b97b6c5d09e6" containerName="glance-log" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.011306 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.013213 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.016884 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.018759 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.074992 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fbd853a-4252-4cf9-a5f3-a79c7360a62c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"1fbd853a-4252-4cf9-a5f3-a79c7360a62c\") " pod="openstack/glance-default-external-api-0" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.075070 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1fbd853a-4252-4cf9-a5f3-a79c7360a62c-config-data\") pod \"glance-default-external-api-0\" (UID: \"1fbd853a-4252-4cf9-a5f3-a79c7360a62c\") " pod="openstack/glance-default-external-api-0" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.075098 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1fbd853a-4252-4cf9-a5f3-a79c7360a62c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1fbd853a-4252-4cf9-a5f3-a79c7360a62c\") " pod="openstack/glance-default-external-api-0" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.075128 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fbd853a-4252-4cf9-a5f3-a79c7360a62c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1fbd853a-4252-4cf9-a5f3-a79c7360a62c\") " pod="openstack/glance-default-external-api-0" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.075146 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"1fbd853a-4252-4cf9-a5f3-a79c7360a62c\") " pod="openstack/glance-default-external-api-0" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.075207 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b6qg\" (UniqueName: \"kubernetes.io/projected/1fbd853a-4252-4cf9-a5f3-a79c7360a62c-kube-api-access-7b6qg\") pod \"glance-default-external-api-0\" (UID: \"1fbd853a-4252-4cf9-a5f3-a79c7360a62c\") " pod="openstack/glance-default-external-api-0" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.075238 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fbd853a-4252-4cf9-a5f3-a79c7360a62c-logs\") pod \"glance-default-external-api-0\" (UID: \"1fbd853a-4252-4cf9-a5f3-a79c7360a62c\") " pod="openstack/glance-default-external-api-0" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.075271 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fbd853a-4252-4cf9-a5f3-a79c7360a62c-scripts\") pod \"glance-default-external-api-0\" (UID: \"1fbd853a-4252-4cf9-a5f3-a79c7360a62c\") " pod="openstack/glance-default-external-api-0" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.177058 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fbd853a-4252-4cf9-a5f3-a79c7360a62c-logs\") pod \"glance-default-external-api-0\" (UID: \"1fbd853a-4252-4cf9-a5f3-a79c7360a62c\") " pod="openstack/glance-default-external-api-0" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.177122 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fbd853a-4252-4cf9-a5f3-a79c7360a62c-scripts\") pod \"glance-default-external-api-0\" (UID: \"1fbd853a-4252-4cf9-a5f3-a79c7360a62c\") " pod="openstack/glance-default-external-api-0" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.177176 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fbd853a-4252-4cf9-a5f3-a79c7360a62c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"1fbd853a-4252-4cf9-a5f3-a79c7360a62c\") " pod="openstack/glance-default-external-api-0" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.177233 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1fbd853a-4252-4cf9-a5f3-a79c7360a62c-config-data\") pod \"glance-default-external-api-0\" (UID: \"1fbd853a-4252-4cf9-a5f3-a79c7360a62c\") " pod="openstack/glance-default-external-api-0" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.177261 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1fbd853a-4252-4cf9-a5f3-a79c7360a62c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1fbd853a-4252-4cf9-a5f3-a79c7360a62c\") " pod="openstack/glance-default-external-api-0" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.177298 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fbd853a-4252-4cf9-a5f3-a79c7360a62c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1fbd853a-4252-4cf9-a5f3-a79c7360a62c\") " pod="openstack/glance-default-external-api-0" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.177323 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"1fbd853a-4252-4cf9-a5f3-a79c7360a62c\") " pod="openstack/glance-default-external-api-0" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.177371 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7b6qg\" (UniqueName: \"kubernetes.io/projected/1fbd853a-4252-4cf9-a5f3-a79c7360a62c-kube-api-access-7b6qg\") pod \"glance-default-external-api-0\" (UID: \"1fbd853a-4252-4cf9-a5f3-a79c7360a62c\") " pod="openstack/glance-default-external-api-0" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.177697 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fbd853a-4252-4cf9-a5f3-a79c7360a62c-logs\") pod 
\"glance-default-external-api-0\" (UID: \"1fbd853a-4252-4cf9-a5f3-a79c7360a62c\") " pod="openstack/glance-default-external-api-0" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.178182 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1fbd853a-4252-4cf9-a5f3-a79c7360a62c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1fbd853a-4252-4cf9-a5f3-a79c7360a62c\") " pod="openstack/glance-default-external-api-0" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.178284 4732 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"1fbd853a-4252-4cf9-a5f3-a79c7360a62c\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.181664 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fbd853a-4252-4cf9-a5f3-a79c7360a62c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"1fbd853a-4252-4cf9-a5f3-a79c7360a62c\") " pod="openstack/glance-default-external-api-0" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.182199 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fbd853a-4252-4cf9-a5f3-a79c7360a62c-config-data\") pod \"glance-default-external-api-0\" (UID: \"1fbd853a-4252-4cf9-a5f3-a79c7360a62c\") " pod="openstack/glance-default-external-api-0" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.183440 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fbd853a-4252-4cf9-a5f3-a79c7360a62c-scripts\") pod \"glance-default-external-api-0\" (UID: \"1fbd853a-4252-4cf9-a5f3-a79c7360a62c\") " 
pod="openstack/glance-default-external-api-0" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.186118 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fbd853a-4252-4cf9-a5f3-a79c7360a62c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1fbd853a-4252-4cf9-a5f3-a79c7360a62c\") " pod="openstack/glance-default-external-api-0" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.199404 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b6qg\" (UniqueName: \"kubernetes.io/projected/1fbd853a-4252-4cf9-a5f3-a79c7360a62c-kube-api-access-7b6qg\") pod \"glance-default-external-api-0\" (UID: \"1fbd853a-4252-4cf9-a5f3-a79c7360a62c\") " pod="openstack/glance-default-external-api-0" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.204847 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"1fbd853a-4252-4cf9-a5f3-a79c7360a62c\") " pod="openstack/glance-default-external-api-0" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.325436 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.507776 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-6cb8x"] Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.509579 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-6cb8x" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.535782 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-6cb8x"] Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.584410 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b7924745-2bd5-4642-a2a0-21f8647be92b-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc5c4795-6cb8x\" (UID: \"b7924745-2bd5-4642-a2a0-21f8647be92b\") " pod="openstack/dnsmasq-dns-5ccc5c4795-6cb8x" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.584537 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7924745-2bd5-4642-a2a0-21f8647be92b-dns-svc\") pod \"dnsmasq-dns-5ccc5c4795-6cb8x\" (UID: \"b7924745-2bd5-4642-a2a0-21f8647be92b\") " pod="openstack/dnsmasq-dns-5ccc5c4795-6cb8x" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.584604 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b7924745-2bd5-4642-a2a0-21f8647be92b-dns-swift-storage-0\") pod \"dnsmasq-dns-5ccc5c4795-6cb8x\" (UID: \"b7924745-2bd5-4642-a2a0-21f8647be92b\") " pod="openstack/dnsmasq-dns-5ccc5c4795-6cb8x" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.584678 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-br5bw\" (UniqueName: \"kubernetes.io/projected/b7924745-2bd5-4642-a2a0-21f8647be92b-kube-api-access-br5bw\") pod \"dnsmasq-dns-5ccc5c4795-6cb8x\" (UID: \"b7924745-2bd5-4642-a2a0-21f8647be92b\") " pod="openstack/dnsmasq-dns-5ccc5c4795-6cb8x" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.584748 4732 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b7924745-2bd5-4642-a2a0-21f8647be92b-ovsdbserver-nb\") pod \"dnsmasq-dns-5ccc5c4795-6cb8x\" (UID: \"b7924745-2bd5-4642-a2a0-21f8647be92b\") " pod="openstack/dnsmasq-dns-5ccc5c4795-6cb8x" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.584801 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7924745-2bd5-4642-a2a0-21f8647be92b-config\") pod \"dnsmasq-dns-5ccc5c4795-6cb8x\" (UID: \"b7924745-2bd5-4642-a2a0-21f8647be92b\") " pod="openstack/dnsmasq-dns-5ccc5c4795-6cb8x" Apr 02 14:00:17 crc kubenswrapper[4732]: E0402 14:00:17.607652 4732 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Apr 02 14:00:17 crc kubenswrapper[4732]: E0402 14:00:17.608464 4732 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jd7m5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-jwc6l_openstack(10ba0697-529f-41d3-a1a8-55b50ed024a2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Apr 02 14:00:17 crc kubenswrapper[4732]: E0402 14:00:17.609877 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-jwc6l" podUID="10ba0697-529f-41d3-a1a8-55b50ed024a2" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.654990 4732 scope.go:117] "RemoveContainer" containerID="e77fdd735b8cff6f5dbb88656da65ba6a92481af4d24778625f0518bf959449a" Apr 02 14:00:17 crc kubenswrapper[4732]: W0402 14:00:17.655527 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3aed3c4d_3173_407f_9a70_c20ef18a554d.slice/crio-f29a05790ed5f6792add4a7193cc0c21631d0f9cd2272be4e822f3df3619efa6 WatchSource:0}: Error finding container f29a05790ed5f6792add4a7193cc0c21631d0f9cd2272be4e822f3df3619efa6: Status 404 returned error can't find the container with id f29a05790ed5f6792add4a7193cc0c21631d0f9cd2272be4e822f3df3619efa6 Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.657259 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-cd55d6846-4zs9k"] Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.661757 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-cd55d6846-4zs9k" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.669942 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.670063 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.671491 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.671752 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-7tnt2" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.688051 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b7924745-2bd5-4642-a2a0-21f8647be92b-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc5c4795-6cb8x\" (UID: \"b7924745-2bd5-4642-a2a0-21f8647be92b\") " pod="openstack/dnsmasq-dns-5ccc5c4795-6cb8x" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.688163 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7924745-2bd5-4642-a2a0-21f8647be92b-dns-svc\") pod \"dnsmasq-dns-5ccc5c4795-6cb8x\" (UID: \"b7924745-2bd5-4642-a2a0-21f8647be92b\") " pod="openstack/dnsmasq-dns-5ccc5c4795-6cb8x" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.688216 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b7924745-2bd5-4642-a2a0-21f8647be92b-dns-swift-storage-0\") pod \"dnsmasq-dns-5ccc5c4795-6cb8x\" (UID: \"b7924745-2bd5-4642-a2a0-21f8647be92b\") " pod="openstack/dnsmasq-dns-5ccc5c4795-6cb8x" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.688257 4732 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-br5bw\" (UniqueName: \"kubernetes.io/projected/b7924745-2bd5-4642-a2a0-21f8647be92b-kube-api-access-br5bw\") pod \"dnsmasq-dns-5ccc5c4795-6cb8x\" (UID: \"b7924745-2bd5-4642-a2a0-21f8647be92b\") " pod="openstack/dnsmasq-dns-5ccc5c4795-6cb8x" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.688305 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b7924745-2bd5-4642-a2a0-21f8647be92b-ovsdbserver-nb\") pod \"dnsmasq-dns-5ccc5c4795-6cb8x\" (UID: \"b7924745-2bd5-4642-a2a0-21f8647be92b\") " pod="openstack/dnsmasq-dns-5ccc5c4795-6cb8x" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.688329 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7924745-2bd5-4642-a2a0-21f8647be92b-config\") pod \"dnsmasq-dns-5ccc5c4795-6cb8x\" (UID: \"b7924745-2bd5-4642-a2a0-21f8647be92b\") " pod="openstack/dnsmasq-dns-5ccc5c4795-6cb8x" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.689458 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7924745-2bd5-4642-a2a0-21f8647be92b-config\") pod \"dnsmasq-dns-5ccc5c4795-6cb8x\" (UID: \"b7924745-2bd5-4642-a2a0-21f8647be92b\") " pod="openstack/dnsmasq-dns-5ccc5c4795-6cb8x" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.690200 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b7924745-2bd5-4642-a2a0-21f8647be92b-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc5c4795-6cb8x\" (UID: \"b7924745-2bd5-4642-a2a0-21f8647be92b\") " pod="openstack/dnsmasq-dns-5ccc5c4795-6cb8x" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.692125 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/b7924745-2bd5-4642-a2a0-21f8647be92b-dns-swift-storage-0\") pod \"dnsmasq-dns-5ccc5c4795-6cb8x\" (UID: \"b7924745-2bd5-4642-a2a0-21f8647be92b\") " pod="openstack/dnsmasq-dns-5ccc5c4795-6cb8x" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.692459 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b7924745-2bd5-4642-a2a0-21f8647be92b-ovsdbserver-nb\") pod \"dnsmasq-dns-5ccc5c4795-6cb8x\" (UID: \"b7924745-2bd5-4642-a2a0-21f8647be92b\") " pod="openstack/dnsmasq-dns-5ccc5c4795-6cb8x" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.693196 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7924745-2bd5-4642-a2a0-21f8647be92b-dns-svc\") pod \"dnsmasq-dns-5ccc5c4795-6cb8x\" (UID: \"b7924745-2bd5-4642-a2a0-21f8647be92b\") " pod="openstack/dnsmasq-dns-5ccc5c4795-6cb8x" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.727520 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-br5bw\" (UniqueName: \"kubernetes.io/projected/b7924745-2bd5-4642-a2a0-21f8647be92b-kube-api-access-br5bw\") pod \"dnsmasq-dns-5ccc5c4795-6cb8x\" (UID: \"b7924745-2bd5-4642-a2a0-21f8647be92b\") " pod="openstack/dnsmasq-dns-5ccc5c4795-6cb8x" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.727957 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-cd55d6846-4zs9k"] Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.791698 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16168989-9c80-47a7-92ea-8be3984e5d99-combined-ca-bundle\") pod \"neutron-cd55d6846-4zs9k\" (UID: \"16168989-9c80-47a7-92ea-8be3984e5d99\") " pod="openstack/neutron-cd55d6846-4zs9k" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.792089 4732 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwj2d\" (UniqueName: \"kubernetes.io/projected/16168989-9c80-47a7-92ea-8be3984e5d99-kube-api-access-dwj2d\") pod \"neutron-cd55d6846-4zs9k\" (UID: \"16168989-9c80-47a7-92ea-8be3984e5d99\") " pod="openstack/neutron-cd55d6846-4zs9k" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.792160 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/16168989-9c80-47a7-92ea-8be3984e5d99-httpd-config\") pod \"neutron-cd55d6846-4zs9k\" (UID: \"16168989-9c80-47a7-92ea-8be3984e5d99\") " pod="openstack/neutron-cd55d6846-4zs9k" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.792272 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/16168989-9c80-47a7-92ea-8be3984e5d99-config\") pod \"neutron-cd55d6846-4zs9k\" (UID: \"16168989-9c80-47a7-92ea-8be3984e5d99\") " pod="openstack/neutron-cd55d6846-4zs9k" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.792295 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/16168989-9c80-47a7-92ea-8be3984e5d99-ovndb-tls-certs\") pod \"neutron-cd55d6846-4zs9k\" (UID: \"16168989-9c80-47a7-92ea-8be3984e5d99\") " pod="openstack/neutron-cd55d6846-4zs9k" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.831740 4732 scope.go:117] "RemoveContainer" containerID="a498398685860509d8cf0661319803fbdbf100ec5d11457ba2baa28161a1acaa" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.842971 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-6cb8x" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.880467 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3aed3c4d-3173-407f-9a70-c20ef18a554d","Type":"ContainerStarted","Data":"f29a05790ed5f6792add4a7193cc0c21631d0f9cd2272be4e822f3df3619efa6"} Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.894158 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwj2d\" (UniqueName: \"kubernetes.io/projected/16168989-9c80-47a7-92ea-8be3984e5d99-kube-api-access-dwj2d\") pod \"neutron-cd55d6846-4zs9k\" (UID: \"16168989-9c80-47a7-92ea-8be3984e5d99\") " pod="openstack/neutron-cd55d6846-4zs9k" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.894244 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/16168989-9c80-47a7-92ea-8be3984e5d99-httpd-config\") pod \"neutron-cd55d6846-4zs9k\" (UID: \"16168989-9c80-47a7-92ea-8be3984e5d99\") " pod="openstack/neutron-cd55d6846-4zs9k" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.894312 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/16168989-9c80-47a7-92ea-8be3984e5d99-config\") pod \"neutron-cd55d6846-4zs9k\" (UID: \"16168989-9c80-47a7-92ea-8be3984e5d99\") " pod="openstack/neutron-cd55d6846-4zs9k" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.894336 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/16168989-9c80-47a7-92ea-8be3984e5d99-ovndb-tls-certs\") pod \"neutron-cd55d6846-4zs9k\" (UID: \"16168989-9c80-47a7-92ea-8be3984e5d99\") " pod="openstack/neutron-cd55d6846-4zs9k" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.894446 4732 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16168989-9c80-47a7-92ea-8be3984e5d99-combined-ca-bundle\") pod \"neutron-cd55d6846-4zs9k\" (UID: \"16168989-9c80-47a7-92ea-8be3984e5d99\") " pod="openstack/neutron-cd55d6846-4zs9k" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.903825 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/16168989-9c80-47a7-92ea-8be3984e5d99-ovndb-tls-certs\") pod \"neutron-cd55d6846-4zs9k\" (UID: \"16168989-9c80-47a7-92ea-8be3984e5d99\") " pod="openstack/neutron-cd55d6846-4zs9k" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.911201 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16168989-9c80-47a7-92ea-8be3984e5d99-combined-ca-bundle\") pod \"neutron-cd55d6846-4zs9k\" (UID: \"16168989-9c80-47a7-92ea-8be3984e5d99\") " pod="openstack/neutron-cd55d6846-4zs9k" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.912372 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/16168989-9c80-47a7-92ea-8be3984e5d99-config\") pod \"neutron-cd55d6846-4zs9k\" (UID: \"16168989-9c80-47a7-92ea-8be3984e5d99\") " pod="openstack/neutron-cd55d6846-4zs9k" Apr 02 14:00:17 crc kubenswrapper[4732]: E0402 14:00:17.912950 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-jwc6l" podUID="10ba0697-529f-41d3-a1a8-55b50ed024a2" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.917853 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/16168989-9c80-47a7-92ea-8be3984e5d99-httpd-config\") pod \"neutron-cd55d6846-4zs9k\" (UID: \"16168989-9c80-47a7-92ea-8be3984e5d99\") " pod="openstack/neutron-cd55d6846-4zs9k" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.931317 4732 scope.go:117] "RemoveContainer" containerID="efd3c9b6a794d607a7b118465d06e0041af8c6c5969b81d4c611982fc16c04f5" Apr 02 14:00:17 crc kubenswrapper[4732]: I0402 14:00:17.946977 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwj2d\" (UniqueName: \"kubernetes.io/projected/16168989-9c80-47a7-92ea-8be3984e5d99-kube-api-access-dwj2d\") pod \"neutron-cd55d6846-4zs9k\" (UID: \"16168989-9c80-47a7-92ea-8be3984e5d99\") " pod="openstack/neutron-cd55d6846-4zs9k" Apr 02 14:00:18 crc kubenswrapper[4732]: I0402 14:00:18.033570 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-cd55d6846-4zs9k" Apr 02 14:00:18 crc kubenswrapper[4732]: I0402 14:00:18.234021 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-54f994999b-b88d7"] Apr 02 14:00:18 crc kubenswrapper[4732]: I0402 14:00:18.430631 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585640-289ph"] Apr 02 14:00:18 crc kubenswrapper[4732]: W0402 14:00:18.432041 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod99deaef2_ca21_4254_9c26_8200edbbd497.slice/crio-ee64fa02c151e4753f2c43759329ed3e21cdca7f34eadfa82ed472120ba40fea WatchSource:0}: Error finding container ee64fa02c151e4753f2c43759329ed3e21cdca7f34eadfa82ed472120ba40fea: Status 404 returned error can't find the container with id ee64fa02c151e4753f2c43759329ed3e21cdca7f34eadfa82ed472120ba40fea Apr 02 14:00:18 crc kubenswrapper[4732]: I0402 14:00:18.450841 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-fb5bv"] Apr 02 14:00:18 crc 
kubenswrapper[4732]: W0402 14:00:18.458536 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod261837b5_19d6_404f_b88f_b5b6cf88ebec.slice/crio-893f6c98db8b2fa99416d7a843dc273372c162b036ac63826bc790525ac198f0 WatchSource:0}: Error finding container 893f6c98db8b2fa99416d7a843dc273372c162b036ac63826bc790525ac198f0: Status 404 returned error can't find the container with id 893f6c98db8b2fa99416d7a843dc273372c162b036ac63826bc790525ac198f0 Apr 02 14:00:18 crc kubenswrapper[4732]: I0402 14:00:18.462178 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-59fb764b6d-vml5x"] Apr 02 14:00:18 crc kubenswrapper[4732]: I0402 14:00:18.577555 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29585640-n2cb8"] Apr 02 14:00:18 crc kubenswrapper[4732]: W0402 14:00:18.581789 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d43ed34_cc78_4cfa_b5bd_b0c892cc6d09.slice/crio-fb339696d4e2763ef51366bdc84fb85204fd6fd2bf637b0c5849aa97abb26365 WatchSource:0}: Error finding container fb339696d4e2763ef51366bdc84fb85204fd6fd2bf637b0c5849aa97abb26365: Status 404 returned error can't find the container with id fb339696d4e2763ef51366bdc84fb85204fd6fd2bf637b0c5849aa97abb26365 Apr 02 14:00:18 crc kubenswrapper[4732]: I0402 14:00:18.588429 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-895cf5cf-ftbdg" podUID="e48613ce-8967-4bcb-b928-eb2b2c662c0d" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.128:5353: i/o timeout" Apr 02 14:00:18 crc kubenswrapper[4732]: I0402 14:00:18.667030 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-6cb8x"] Apr 02 14:00:18 crc kubenswrapper[4732]: I0402 14:00:18.703099 4732 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="2cf737d2-a567-4afe-9490-b97b6c5d09e6" path="/var/lib/kubelet/pods/2cf737d2-a567-4afe-9490-b97b6c5d09e6/volumes" Apr 02 14:00:18 crc kubenswrapper[4732]: I0402 14:00:18.703933 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40deaea9-2a82-456d-802f-7829e6d12a9b" path="/var/lib/kubelet/pods/40deaea9-2a82-456d-802f-7829e6d12a9b/volumes" Apr 02 14:00:18 crc kubenswrapper[4732]: I0402 14:00:18.704375 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd752ee8-4492-42c7-9125-974b270451d2" path="/var/lib/kubelet/pods/bd752ee8-4492-42c7-9125-974b270451d2/volumes" Apr 02 14:00:18 crc kubenswrapper[4732]: I0402 14:00:18.721030 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c623b208-43b3-4c30-a6a1-9d4849513dfa" path="/var/lib/kubelet/pods/c623b208-43b3-4c30-a6a1-9d4849513dfa/volumes" Apr 02 14:00:18 crc kubenswrapper[4732]: I0402 14:00:18.721695 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e48613ce-8967-4bcb-b928-eb2b2c662c0d" path="/var/lib/kubelet/pods/e48613ce-8967-4bcb-b928-eb2b2c662c0d/volumes" Apr 02 14:00:18 crc kubenswrapper[4732]: I0402 14:00:18.848364 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Apr 02 14:00:18 crc kubenswrapper[4732]: W0402 14:00:18.880038 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1fbd853a_4252_4cf9_a5f3_a79c7360a62c.slice/crio-f11b23500df71f2522d7c74a4aa7b3e662134098c846722429819d5f8798e6ca WatchSource:0}: Error finding container f11b23500df71f2522d7c74a4aa7b3e662134098c846722429819d5f8798e6ca: Status 404 returned error can't find the container with id f11b23500df71f2522d7c74a4aa7b3e662134098c846722429819d5f8798e6ca Apr 02 14:00:18 crc kubenswrapper[4732]: I0402 14:00:18.896562 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29585640-289ph" event={"ID":"99deaef2-ca21-4254-9c26-8200edbbd497","Type":"ContainerStarted","Data":"ee64fa02c151e4753f2c43759329ed3e21cdca7f34eadfa82ed472120ba40fea"} Apr 02 14:00:18 crc kubenswrapper[4732]: I0402 14:00:18.899269 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54f994999b-b88d7" event={"ID":"97d6e519-a82f-4ce5-9199-4d7db769f86b","Type":"ContainerStarted","Data":"26b9043f81cd671d1aef6e1726f11e0b458731857e31e19797af12f7eceb36cc"} Apr 02 14:00:18 crc kubenswrapper[4732]: I0402 14:00:18.901793 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fb5bv" event={"ID":"261837b5-19d6-404f-b88f-b5b6cf88ebec","Type":"ContainerStarted","Data":"84418e9105b2ad1c6b30a264558b8d882276562246909a9034ee9b5e81184d7a"} Apr 02 14:00:18 crc kubenswrapper[4732]: I0402 14:00:18.901831 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fb5bv" event={"ID":"261837b5-19d6-404f-b88f-b5b6cf88ebec","Type":"ContainerStarted","Data":"893f6c98db8b2fa99416d7a843dc273372c162b036ac63826bc790525ac198f0"} Apr 02 14:00:18 crc kubenswrapper[4732]: I0402 14:00:18.903765 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29585640-n2cb8" event={"ID":"8d43ed34-cc78-4cfa-b5bd-b0c892cc6d09","Type":"ContainerStarted","Data":"4347607c418aa9f67017b66ba5ecfbe94038cd13bb2b1b6bc83411912a4e5471"} Apr 02 14:00:18 crc kubenswrapper[4732]: I0402 14:00:18.903807 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29585640-n2cb8" event={"ID":"8d43ed34-cc78-4cfa-b5bd-b0c892cc6d09","Type":"ContainerStarted","Data":"fb339696d4e2763ef51366bdc84fb85204fd6fd2bf637b0c5849aa97abb26365"} Apr 02 14:00:18 crc kubenswrapper[4732]: I0402 14:00:18.919465 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-6cb8x" 
event={"ID":"b7924745-2bd5-4642-a2a0-21f8647be92b","Type":"ContainerStarted","Data":"8d5336ef4acfd59c12f405bed6fc11bf3c57546864e2fe92f6fa0d933567e76e"}
Apr 02 14:00:18 crc kubenswrapper[4732]: I0402 14:00:18.921843 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-fb5bv" podStartSLOduration=10.921830716 podStartE2EDuration="10.921830716s" podCreationTimestamp="2026-04-02 14:00:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 14:00:18.915889155 +0000 UTC m=+1375.820296708" watchObservedRunningTime="2026-04-02 14:00:18.921830716 +0000 UTC m=+1375.826238269"
Apr 02 14:00:18 crc kubenswrapper[4732]: I0402 14:00:18.928374 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2","Type":"ContainerStarted","Data":"da5dc9a5e046457bbb671c6f3a0e51770bb3f4a02a5572f43becce6ef4dddcf1"}
Apr 02 14:00:18 crc kubenswrapper[4732]: I0402 14:00:18.931064 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-59fb764b6d-vml5x" event={"ID":"80fe4580-a48f-4cba-b3b3-d6ccbc7f0525","Type":"ContainerStarted","Data":"4b1706da63a3a81d63b698d942bdeeacfb33d97d83cb7dc15eb16115530f9f12"}
Apr 02 14:00:18 crc kubenswrapper[4732]: I0402 14:00:18.934002 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3aed3c4d-3173-407f-9a70-c20ef18a554d","Type":"ContainerStarted","Data":"64b90e5513c61a0a240c665e8829faf848278958fab87319d195ef359146e53b"}
Apr 02 14:00:18 crc kubenswrapper[4732]: I0402 14:00:18.939869 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29585640-n2cb8" podStartSLOduration=18.939856161 podStartE2EDuration="18.939856161s" podCreationTimestamp="2026-04-02 14:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 14:00:18.938023512 +0000 UTC m=+1375.842431065" watchObservedRunningTime="2026-04-02 14:00:18.939856161 +0000 UTC m=+1375.844263704"
Apr 02 14:00:19 crc kubenswrapper[4732]: I0402 14:00:19.418280 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-cd55d6846-4zs9k"]
Apr 02 14:00:19 crc kubenswrapper[4732]: W0402 14:00:19.454940 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod16168989_9c80_47a7_92ea_8be3984e5d99.slice/crio-54feea172d4cc2765cc10a2c9f6f31c474a9ea49b5f141adcab410b0f18eb69b WatchSource:0}: Error finding container 54feea172d4cc2765cc10a2c9f6f31c474a9ea49b5f141adcab410b0f18eb69b: Status 404 returned error can't find the container with id 54feea172d4cc2765cc10a2c9f6f31c474a9ea49b5f141adcab410b0f18eb69b
Apr 02 14:00:19 crc kubenswrapper[4732]: I0402 14:00:19.949062 4732 generic.go:334] "Generic (PLEG): container finished" podID="8d43ed34-cc78-4cfa-b5bd-b0c892cc6d09" containerID="4347607c418aa9f67017b66ba5ecfbe94038cd13bb2b1b6bc83411912a4e5471" exitCode=0
Apr 02 14:00:19 crc kubenswrapper[4732]: I0402 14:00:19.949392 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29585640-n2cb8" event={"ID":"8d43ed34-cc78-4cfa-b5bd-b0c892cc6d09","Type":"ContainerDied","Data":"4347607c418aa9f67017b66ba5ecfbe94038cd13bb2b1b6bc83411912a4e5471"}
Apr 02 14:00:19 crc kubenswrapper[4732]: I0402 14:00:19.950768 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1fbd853a-4252-4cf9-a5f3-a79c7360a62c","Type":"ContainerStarted","Data":"f11b23500df71f2522d7c74a4aa7b3e662134098c846722429819d5f8798e6ca"}
Apr 02 14:00:19 crc kubenswrapper[4732]: I0402 14:00:19.952951 4732 generic.go:334] "Generic (PLEG): container finished" podID="b7924745-2bd5-4642-a2a0-21f8647be92b" containerID="aa725130e6c0af58d6bc8fa560fcc0c0ef18a460a1891aac88bb0ea3d0db9f83" exitCode=0
Apr 02 14:00:19 crc kubenswrapper[4732]: I0402 14:00:19.953256 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-6cb8x" event={"ID":"b7924745-2bd5-4642-a2a0-21f8647be92b","Type":"ContainerDied","Data":"aa725130e6c0af58d6bc8fa560fcc0c0ef18a460a1891aac88bb0ea3d0db9f83"}
Apr 02 14:00:19 crc kubenswrapper[4732]: I0402 14:00:19.956880 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3aed3c4d-3173-407f-9a70-c20ef18a554d","Type":"ContainerStarted","Data":"7c190a4e465759d8b097e15e3ca8d2d640bd547fb325eb7b0705d436799a0337"}
Apr 02 14:00:19 crc kubenswrapper[4732]: I0402 14:00:19.958551 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-59fb764b6d-vml5x" event={"ID":"80fe4580-a48f-4cba-b3b3-d6ccbc7f0525","Type":"ContainerStarted","Data":"aecec17f0e79f52981ef4808d1800fe67aab89d44b00ed1c7e216149c5ee7fc8"}
Apr 02 14:00:19 crc kubenswrapper[4732]: I0402 14:00:19.960025 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cd55d6846-4zs9k" event={"ID":"16168989-9c80-47a7-92ea-8be3984e5d99","Type":"ContainerStarted","Data":"54feea172d4cc2765cc10a2c9f6f31c474a9ea49b5f141adcab410b0f18eb69b"}
Apr 02 14:00:19 crc kubenswrapper[4732]: I0402 14:00:19.964055 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54f994999b-b88d7" event={"ID":"97d6e519-a82f-4ce5-9199-4d7db769f86b","Type":"ContainerStarted","Data":"612052c10b96afb9ba8c654733afe65760862981c14176c013a4af21a3353e1c"}
Apr 02 14:00:19 crc kubenswrapper[4732]: I0402 14:00:19.964097 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54f994999b-b88d7" event={"ID":"97d6e519-a82f-4ce5-9199-4d7db769f86b","Type":"ContainerStarted","Data":"a6621bc7a363f165937f7eaacac03d7e1c2081ce5899f99e3a0aff5abadaa465"}
Apr 02 14:00:20 crc kubenswrapper[4732]: I0402 14:00:20.020647 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-76fc857857-knj58"]
Apr 02 14:00:20 crc kubenswrapper[4732]: I0402 14:00:20.027490 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-76fc857857-knj58"
Apr 02 14:00:20 crc kubenswrapper[4732]: I0402 14:00:20.030981 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc"
Apr 02 14:00:20 crc kubenswrapper[4732]: I0402 14:00:20.032210 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc"
Apr 02 14:00:20 crc kubenswrapper[4732]: I0402 14:00:20.046370 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-76fc857857-knj58"]
Apr 02 14:00:20 crc kubenswrapper[4732]: I0402 14:00:20.046413 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9578\" (UniqueName: \"kubernetes.io/projected/88748d2e-8313-467e-b707-e82e1af776d5-kube-api-access-b9578\") pod \"neutron-76fc857857-knj58\" (UID: \"88748d2e-8313-467e-b707-e82e1af776d5\") " pod="openstack/neutron-76fc857857-knj58"
Apr 02 14:00:20 crc kubenswrapper[4732]: I0402 14:00:20.046495 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88748d2e-8313-467e-b707-e82e1af776d5-internal-tls-certs\") pod \"neutron-76fc857857-knj58\" (UID: \"88748d2e-8313-467e-b707-e82e1af776d5\") " pod="openstack/neutron-76fc857857-knj58"
Apr 02 14:00:20 crc kubenswrapper[4732]: I0402 14:00:20.046562 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/88748d2e-8313-467e-b707-e82e1af776d5-config\") pod \"neutron-76fc857857-knj58\" (UID: \"88748d2e-8313-467e-b707-e82e1af776d5\") " pod="openstack/neutron-76fc857857-knj58"
Apr 02 14:00:20 crc kubenswrapper[4732]: I0402 14:00:20.046604 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/88748d2e-8313-467e-b707-e82e1af776d5-httpd-config\") pod \"neutron-76fc857857-knj58\" (UID: \"88748d2e-8313-467e-b707-e82e1af776d5\") " pod="openstack/neutron-76fc857857-knj58"
Apr 02 14:00:20 crc kubenswrapper[4732]: I0402 14:00:20.046684 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88748d2e-8313-467e-b707-e82e1af776d5-combined-ca-bundle\") pod \"neutron-76fc857857-knj58\" (UID: \"88748d2e-8313-467e-b707-e82e1af776d5\") " pod="openstack/neutron-76fc857857-knj58"
Apr 02 14:00:20 crc kubenswrapper[4732]: I0402 14:00:20.046745 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/88748d2e-8313-467e-b707-e82e1af776d5-ovndb-tls-certs\") pod \"neutron-76fc857857-knj58\" (UID: \"88748d2e-8313-467e-b707-e82e1af776d5\") " pod="openstack/neutron-76fc857857-knj58"
Apr 02 14:00:20 crc kubenswrapper[4732]: I0402 14:00:20.046837 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88748d2e-8313-467e-b707-e82e1af776d5-public-tls-certs\") pod \"neutron-76fc857857-knj58\" (UID: \"88748d2e-8313-467e-b707-e82e1af776d5\") " pod="openstack/neutron-76fc857857-knj58"
Apr 02 14:00:20 crc kubenswrapper[4732]: I0402 14:00:20.149104 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88748d2e-8313-467e-b707-e82e1af776d5-public-tls-certs\") pod \"neutron-76fc857857-knj58\" (UID: \"88748d2e-8313-467e-b707-e82e1af776d5\") " pod="openstack/neutron-76fc857857-knj58"
Apr 02 14:00:20 crc kubenswrapper[4732]: I0402 14:00:20.149369 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9578\" (UniqueName: \"kubernetes.io/projected/88748d2e-8313-467e-b707-e82e1af776d5-kube-api-access-b9578\") pod \"neutron-76fc857857-knj58\" (UID: \"88748d2e-8313-467e-b707-e82e1af776d5\") " pod="openstack/neutron-76fc857857-knj58"
Apr 02 14:00:20 crc kubenswrapper[4732]: I0402 14:00:20.149406 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88748d2e-8313-467e-b707-e82e1af776d5-internal-tls-certs\") pod \"neutron-76fc857857-knj58\" (UID: \"88748d2e-8313-467e-b707-e82e1af776d5\") " pod="openstack/neutron-76fc857857-knj58"
Apr 02 14:00:20 crc kubenswrapper[4732]: I0402 14:00:20.149447 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/88748d2e-8313-467e-b707-e82e1af776d5-config\") pod \"neutron-76fc857857-knj58\" (UID: \"88748d2e-8313-467e-b707-e82e1af776d5\") " pod="openstack/neutron-76fc857857-knj58"
Apr 02 14:00:20 crc kubenswrapper[4732]: I0402 14:00:20.149467 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/88748d2e-8313-467e-b707-e82e1af776d5-httpd-config\") pod \"neutron-76fc857857-knj58\" (UID: \"88748d2e-8313-467e-b707-e82e1af776d5\") " pod="openstack/neutron-76fc857857-knj58"
Apr 02 14:00:20 crc kubenswrapper[4732]: I0402 14:00:20.149501 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88748d2e-8313-467e-b707-e82e1af776d5-combined-ca-bundle\") pod \"neutron-76fc857857-knj58\" (UID: \"88748d2e-8313-467e-b707-e82e1af776d5\") " pod="openstack/neutron-76fc857857-knj58"
Apr 02 14:00:20 crc kubenswrapper[4732]: I0402 14:00:20.149543 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/88748d2e-8313-467e-b707-e82e1af776d5-ovndb-tls-certs\") pod \"neutron-76fc857857-knj58\" (UID: \"88748d2e-8313-467e-b707-e82e1af776d5\") " pod="openstack/neutron-76fc857857-knj58"
Apr 02 14:00:20 crc kubenswrapper[4732]: I0402 14:00:20.154787 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88748d2e-8313-467e-b707-e82e1af776d5-internal-tls-certs\") pod \"neutron-76fc857857-knj58\" (UID: \"88748d2e-8313-467e-b707-e82e1af776d5\") " pod="openstack/neutron-76fc857857-knj58"
Apr 02 14:00:20 crc kubenswrapper[4732]: I0402 14:00:20.160643 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88748d2e-8313-467e-b707-e82e1af776d5-combined-ca-bundle\") pod \"neutron-76fc857857-knj58\" (UID: \"88748d2e-8313-467e-b707-e82e1af776d5\") " pod="openstack/neutron-76fc857857-knj58"
Apr 02 14:00:20 crc kubenswrapper[4732]: I0402 14:00:20.160966 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/88748d2e-8313-467e-b707-e82e1af776d5-ovndb-tls-certs\") pod \"neutron-76fc857857-knj58\" (UID: \"88748d2e-8313-467e-b707-e82e1af776d5\") " pod="openstack/neutron-76fc857857-knj58"
Apr 02 14:00:20 crc kubenswrapper[4732]: I0402 14:00:20.168336 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/88748d2e-8313-467e-b707-e82e1af776d5-httpd-config\") pod \"neutron-76fc857857-knj58\" (UID: \"88748d2e-8313-467e-b707-e82e1af776d5\") " pod="openstack/neutron-76fc857857-knj58"
Apr 02 14:00:20 crc kubenswrapper[4732]: I0402 14:00:20.168598 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9578\" (UniqueName: \"kubernetes.io/projected/88748d2e-8313-467e-b707-e82e1af776d5-kube-api-access-b9578\") pod \"neutron-76fc857857-knj58\" (UID: \"88748d2e-8313-467e-b707-e82e1af776d5\") " pod="openstack/neutron-76fc857857-knj58"
Apr 02 14:00:20 crc kubenswrapper[4732]: I0402 14:00:20.169189 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88748d2e-8313-467e-b707-e82e1af776d5-public-tls-certs\") pod \"neutron-76fc857857-knj58\" (UID: \"88748d2e-8313-467e-b707-e82e1af776d5\") " pod="openstack/neutron-76fc857857-knj58"
Apr 02 14:00:20 crc kubenswrapper[4732]: I0402 14:00:20.171166 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/88748d2e-8313-467e-b707-e82e1af776d5-config\") pod \"neutron-76fc857857-knj58\" (UID: \"88748d2e-8313-467e-b707-e82e1af776d5\") " pod="openstack/neutron-76fc857857-knj58"
Apr 02 14:00:20 crc kubenswrapper[4732]: I0402 14:00:20.364249 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-76fc857857-knj58"
Apr 02 14:00:20 crc kubenswrapper[4732]: I0402 14:00:20.986041 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1fbd853a-4252-4cf9-a5f3-a79c7360a62c","Type":"ContainerStarted","Data":"af0f0cd140e2c2dea25a93d14d04b37a18f2e24ae31a55b2beb619dc7d51e799"}
Apr 02 14:00:21 crc kubenswrapper[4732]: I0402 14:00:20.999958 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-6cb8x" event={"ID":"b7924745-2bd5-4642-a2a0-21f8647be92b","Type":"ContainerStarted","Data":"add10d00c96f619f2970add749542c0e114a3d6f8c0bf2d320b1141303f526c9"}
Apr 02 14:00:21 crc kubenswrapper[4732]: I0402 14:00:21.001772 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ccc5c4795-6cb8x"
Apr 02 14:00:21 crc kubenswrapper[4732]: I0402 14:00:21.005891 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2","Type":"ContainerStarted","Data":"e8022f6d0b6e80a948eb75dbb2d62993b11b0f7ded2641993dac62d06557fa61"}
Apr 02 14:00:21 crc kubenswrapper[4732]: I0402 14:00:21.016522 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-59fb764b6d-vml5x" event={"ID":"80fe4580-a48f-4cba-b3b3-d6ccbc7f0525","Type":"ContainerStarted","Data":"56cb1e14c01780829e3d908dfe3c04fea17b711fbfdaa9357cab714feddecfb5"}
Apr 02 14:00:21 crc kubenswrapper[4732]: I0402 14:00:21.024770 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ccc5c4795-6cb8x" podStartSLOduration=4.024746643 podStartE2EDuration="4.024746643s" podCreationTimestamp="2026-04-02 14:00:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 14:00:21.024009603 +0000 UTC m=+1377.928417166" watchObservedRunningTime="2026-04-02 14:00:21.024746643 +0000 UTC m=+1377.929154196"
Apr 02 14:00:21 crc kubenswrapper[4732]: I0402 14:00:21.037797 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cd55d6846-4zs9k" event={"ID":"16168989-9c80-47a7-92ea-8be3984e5d99","Type":"ContainerStarted","Data":"3429e76b65d136138b088fd6ee1f204b388fa132a9e8c6d42c421ee5d3126c70"}
Apr 02 14:00:21 crc kubenswrapper[4732]: I0402 14:00:21.037864 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cd55d6846-4zs9k" event={"ID":"16168989-9c80-47a7-92ea-8be3984e5d99","Type":"ContainerStarted","Data":"0fc3be86c5bbdc911a2aa0733cc4678d1d83c286c3a20005f7c7cdbcde9da39b"}
Apr 02 14:00:21 crc kubenswrapper[4732]: I0402 14:00:21.058873 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-59fb764b6d-vml5x" podStartSLOduration=24.534716029 podStartE2EDuration="25.058851511s" podCreationTimestamp="2026-04-02 13:59:56 +0000 UTC" firstStartedPulling="2026-04-02 14:00:18.466675612 +0000 UTC m=+1375.371083165" lastFinishedPulling="2026-04-02 14:00:18.990811094 +0000 UTC m=+1375.895218647" observedRunningTime="2026-04-02 14:00:21.044804133 +0000 UTC m=+1377.949211706" watchObservedRunningTime="2026-04-02 14:00:21.058851511 +0000 UTC m=+1377.963259064"
Apr 02 14:00:21 crc kubenswrapper[4732]: I0402 14:00:21.089248 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-54f994999b-b88d7" podStartSLOduration=24.619177485 podStartE2EDuration="25.08922801s" podCreationTimestamp="2026-04-02 13:59:56 +0000 UTC" firstStartedPulling="2026-04-02 14:00:18.262717667 +0000 UTC m=+1375.167125220" lastFinishedPulling="2026-04-02 14:00:18.732768192 +0000 UTC m=+1375.637175745" observedRunningTime="2026-04-02 14:00:21.081357208 +0000 UTC m=+1377.985764771" watchObservedRunningTime="2026-04-02 14:00:21.08922801 +0000 UTC m=+1377.993635573"
Apr 02 14:00:21 crc kubenswrapper[4732]: I0402 14:00:21.129017 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-76fc857857-knj58"]
Apr 02 14:00:21 crc kubenswrapper[4732]: I0402 14:00:21.131091 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=25.131070687 podStartE2EDuration="25.131070687s" podCreationTimestamp="2026-04-02 13:59:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 14:00:21.105539199 +0000 UTC m=+1378.009946762" watchObservedRunningTime="2026-04-02 14:00:21.131070687 +0000 UTC m=+1378.035478240"
Apr 02 14:00:21 crc kubenswrapper[4732]: I0402 14:00:21.150933 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-cd55d6846-4zs9k" podStartSLOduration=4.150914042 podStartE2EDuration="4.150914042s" podCreationTimestamp="2026-04-02 14:00:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 14:00:21.136306708 +0000 UTC m=+1378.040714271" watchObservedRunningTime="2026-04-02 14:00:21.150914042 +0000 UTC m=+1378.055321595"
Apr 02 14:00:21 crc kubenswrapper[4732]: I0402 14:00:21.500782 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29585640-n2cb8"
Apr 02 14:00:21 crc kubenswrapper[4732]: I0402 14:00:21.597335 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8d43ed34-cc78-4cfa-b5bd-b0c892cc6d09-secret-volume\") pod \"8d43ed34-cc78-4cfa-b5bd-b0c892cc6d09\" (UID: \"8d43ed34-cc78-4cfa-b5bd-b0c892cc6d09\") "
Apr 02 14:00:21 crc kubenswrapper[4732]: I0402 14:00:21.597423 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8d43ed34-cc78-4cfa-b5bd-b0c892cc6d09-config-volume\") pod \"8d43ed34-cc78-4cfa-b5bd-b0c892cc6d09\" (UID: \"8d43ed34-cc78-4cfa-b5bd-b0c892cc6d09\") "
Apr 02 14:00:21 crc kubenswrapper[4732]: I0402 14:00:21.597633 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzvt5\" (UniqueName: \"kubernetes.io/projected/8d43ed34-cc78-4cfa-b5bd-b0c892cc6d09-kube-api-access-fzvt5\") pod \"8d43ed34-cc78-4cfa-b5bd-b0c892cc6d09\" (UID: \"8d43ed34-cc78-4cfa-b5bd-b0c892cc6d09\") "
Apr 02 14:00:21 crc kubenswrapper[4732]: I0402 14:00:21.598308 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d43ed34-cc78-4cfa-b5bd-b0c892cc6d09-config-volume" (OuterVolumeSpecName: "config-volume") pod "8d43ed34-cc78-4cfa-b5bd-b0c892cc6d09" (UID: "8d43ed34-cc78-4cfa-b5bd-b0c892cc6d09"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 14:00:21 crc kubenswrapper[4732]: I0402 14:00:21.604899 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d43ed34-cc78-4cfa-b5bd-b0c892cc6d09-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8d43ed34-cc78-4cfa-b5bd-b0c892cc6d09" (UID: "8d43ed34-cc78-4cfa-b5bd-b0c892cc6d09"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 14:00:21 crc kubenswrapper[4732]: I0402 14:00:21.604945 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d43ed34-cc78-4cfa-b5bd-b0c892cc6d09-kube-api-access-fzvt5" (OuterVolumeSpecName: "kube-api-access-fzvt5") pod "8d43ed34-cc78-4cfa-b5bd-b0c892cc6d09" (UID: "8d43ed34-cc78-4cfa-b5bd-b0c892cc6d09"). InnerVolumeSpecName "kube-api-access-fzvt5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 14:00:21 crc kubenswrapper[4732]: I0402 14:00:21.699922 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzvt5\" (UniqueName: \"kubernetes.io/projected/8d43ed34-cc78-4cfa-b5bd-b0c892cc6d09-kube-api-access-fzvt5\") on node \"crc\" DevicePath \"\""
Apr 02 14:00:21 crc kubenswrapper[4732]: I0402 14:00:21.699955 4732 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8d43ed34-cc78-4cfa-b5bd-b0c892cc6d09-secret-volume\") on node \"crc\" DevicePath \"\""
Apr 02 14:00:21 crc kubenswrapper[4732]: I0402 14:00:21.699967 4732 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8d43ed34-cc78-4cfa-b5bd-b0c892cc6d09-config-volume\") on node \"crc\" DevicePath \"\""
Apr 02 14:00:22 crc kubenswrapper[4732]: I0402 14:00:22.093942 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1fbd853a-4252-4cf9-a5f3-a79c7360a62c","Type":"ContainerStarted","Data":"4fbd4e38d326424f02f37bb618d5c6d08b882e6c122760e527d6ba58f02e1e92"}
Apr 02 14:00:22 crc kubenswrapper[4732]: I0402 14:00:22.123240 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-76fc857857-knj58" event={"ID":"88748d2e-8313-467e-b707-e82e1af776d5","Type":"ContainerStarted","Data":"4f997405de3433e224878d0c0a68fb4b21ec3d43799d53e42d66304cefa5956d"}
Apr 02 14:00:22 crc kubenswrapper[4732]: I0402 14:00:22.123332 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-76fc857857-knj58"
Apr 02 14:00:22 crc kubenswrapper[4732]: I0402 14:00:22.123366 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-76fc857857-knj58" event={"ID":"88748d2e-8313-467e-b707-e82e1af776d5","Type":"ContainerStarted","Data":"4bcf35b9dca26827b3b6375d989726ce646c242d2a9befcc53a1c6bbdc4e4e3d"}
Apr 02 14:00:22 crc kubenswrapper[4732]: I0402 14:00:22.123382 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-76fc857857-knj58" event={"ID":"88748d2e-8313-467e-b707-e82e1af776d5","Type":"ContainerStarted","Data":"aae01bcd908a540852406a3baf1c25bd0e1b0e43b9f76a9580475d13a78f0edb"}
Apr 02 14:00:22 crc kubenswrapper[4732]: I0402 14:00:22.128240 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.128222223 podStartE2EDuration="6.128222223s" podCreationTimestamp="2026-04-02 14:00:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 14:00:22.117522075 +0000 UTC m=+1379.021929648" watchObservedRunningTime="2026-04-02 14:00:22.128222223 +0000 UTC m=+1379.032629766"
Apr 02 14:00:22 crc kubenswrapper[4732]: I0402 14:00:22.140037 4732 generic.go:334] "Generic (PLEG): container finished" podID="99deaef2-ca21-4254-9c26-8200edbbd497" containerID="af6f7141a9c8da30c13a335a189e8641b675cd71fc8da314fc45e4b603f98b2b" exitCode=0
Apr 02 14:00:22 crc kubenswrapper[4732]: I0402 14:00:22.140349 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585640-289ph" event={"ID":"99deaef2-ca21-4254-9c26-8200edbbd497","Type":"ContainerDied","Data":"af6f7141a9c8da30c13a335a189e8641b675cd71fc8da314fc45e4b603f98b2b"}
Apr 02 14:00:22 crc kubenswrapper[4732]: I0402 14:00:22.160342 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29585640-n2cb8"
Apr 02 14:00:22 crc kubenswrapper[4732]: I0402 14:00:22.161772 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29585640-n2cb8" event={"ID":"8d43ed34-cc78-4cfa-b5bd-b0c892cc6d09","Type":"ContainerDied","Data":"fb339696d4e2763ef51366bdc84fb85204fd6fd2bf637b0c5849aa97abb26365"}
Apr 02 14:00:22 crc kubenswrapper[4732]: I0402 14:00:22.161898 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb339696d4e2763ef51366bdc84fb85204fd6fd2bf637b0c5849aa97abb26365"
Apr 02 14:00:22 crc kubenswrapper[4732]: I0402 14:00:22.162365 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-cd55d6846-4zs9k"
Apr 02 14:00:22 crc kubenswrapper[4732]: I0402 14:00:22.173545 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-76fc857857-knj58" podStartSLOduration=3.173519414 podStartE2EDuration="3.173519414s" podCreationTimestamp="2026-04-02 14:00:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 14:00:22.150839463 +0000 UTC m=+1379.055247026" watchObservedRunningTime="2026-04-02 14:00:22.173519414 +0000 UTC m=+1379.077926967"
Apr 02 14:00:24 crc kubenswrapper[4732]: I0402 14:00:24.179334 4732 generic.go:334] "Generic (PLEG): container finished" podID="261837b5-19d6-404f-b88f-b5b6cf88ebec" containerID="84418e9105b2ad1c6b30a264558b8d882276562246909a9034ee9b5e81184d7a" exitCode=0
Apr 02 14:00:24 crc kubenswrapper[4732]: I0402 14:00:24.179420 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fb5bv" event={"ID":"261837b5-19d6-404f-b88f-b5b6cf88ebec","Type":"ContainerDied","Data":"84418e9105b2ad1c6b30a264558b8d882276562246909a9034ee9b5e81184d7a"}
Apr 02 14:00:26 crc kubenswrapper[4732]: I0402 14:00:26.596141 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-59fb764b6d-vml5x"
Apr 02 14:00:26 crc kubenswrapper[4732]: I0402 14:00:26.596719 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-59fb764b6d-vml5x"
Apr 02 14:00:26 crc kubenswrapper[4732]: I0402 14:00:26.731553 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-54f994999b-b88d7"
Apr 02 14:00:26 crc kubenswrapper[4732]: I0402 14:00:26.731600 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-54f994999b-b88d7"
Apr 02 14:00:27 crc kubenswrapper[4732]: I0402 14:00:27.070623 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Apr 02 14:00:27 crc kubenswrapper[4732]: I0402 14:00:27.070718 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Apr 02 14:00:27 crc kubenswrapper[4732]: I0402 14:00:27.070771 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Apr 02 14:00:27 crc kubenswrapper[4732]: I0402 14:00:27.070961 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Apr 02 14:00:27 crc kubenswrapper[4732]: I0402 14:00:27.096785 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Apr 02 14:00:27 crc kubenswrapper[4732]: I0402 14:00:27.106954 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Apr 02 14:00:27 crc kubenswrapper[4732]: I0402 14:00:27.326262 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Apr 02 14:00:27 crc kubenswrapper[4732]: I0402 14:00:27.326314 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Apr 02 14:00:27 crc kubenswrapper[4732]: I0402 14:00:27.355722 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Apr 02 14:00:27 crc kubenswrapper[4732]: I0402 14:00:27.368746 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Apr 02 14:00:27 crc kubenswrapper[4732]: I0402 14:00:27.844789 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5ccc5c4795-6cb8x"
Apr 02 14:00:27 crc kubenswrapper[4732]: I0402 14:00:27.933684 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-l9kx4"]
Apr 02 14:00:27 crc kubenswrapper[4732]: I0402 14:00:27.933912 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57c957c4ff-l9kx4" podUID="38e262a1-5383-4dfb-8f89-537e4db5559c" containerName="dnsmasq-dns" containerID="cri-o://e2f22566a97573ec2b53998abc6a6f23072b22ca3960da4d5420f6e816b9db5b" gracePeriod=10
Apr 02 14:00:28 crc kubenswrapper[4732]: I0402 14:00:28.237237 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Apr 02 14:00:28 crc kubenswrapper[4732]: I0402 14:00:28.237456 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Apr 02 14:00:28 crc kubenswrapper[4732]: I0402 14:00:28.427259 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57c957c4ff-l9kx4" podUID="38e262a1-5383-4dfb-8f89-537e4db5559c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.145:5353: connect: connection refused"
Apr 02 14:00:29 crc kubenswrapper[4732]: I0402 14:00:29.246049 4732 generic.go:334] "Generic (PLEG): container finished" podID="38e262a1-5383-4dfb-8f89-537e4db5559c" containerID="e2f22566a97573ec2b53998abc6a6f23072b22ca3960da4d5420f6e816b9db5b" exitCode=0
Apr 02 14:00:29 crc kubenswrapper[4732]: I0402 14:00:29.246136 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-l9kx4" event={"ID":"38e262a1-5383-4dfb-8f89-537e4db5559c","Type":"ContainerDied","Data":"e2f22566a97573ec2b53998abc6a6f23072b22ca3960da4d5420f6e816b9db5b"}
Apr 02 14:00:29 crc kubenswrapper[4732]: I0402 14:00:29.434701 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Apr 02 14:00:29 crc kubenswrapper[4732]: I0402 14:00:29.434800 4732 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 02 14:00:29 crc kubenswrapper[4732]: I0402 14:00:29.441276 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Apr 02 14:00:30 crc kubenswrapper[4732]: I0402 14:00:30.361739 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Apr 02 14:00:30 crc kubenswrapper[4732]: I0402 14:00:30.361849 4732 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 02 14:00:30 crc kubenswrapper[4732]: I0402 14:00:30.362324 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Apr 02 14:00:31 crc kubenswrapper[4732]: I0402 14:00:31.304978 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fb5bv" event={"ID":"261837b5-19d6-404f-b88f-b5b6cf88ebec","Type":"ContainerDied","Data":"893f6c98db8b2fa99416d7a843dc273372c162b036ac63826bc790525ac198f0"}
Apr 02 14:00:31 crc kubenswrapper[4732]: I0402 14:00:31.305327 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="893f6c98db8b2fa99416d7a843dc273372c162b036ac63826bc790525ac198f0"
Apr 02 14:00:31 crc kubenswrapper[4732]: I0402 14:00:31.411666 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-fb5bv"
Apr 02 14:00:31 crc kubenswrapper[4732]: I0402 14:00:31.438104 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585640-289ph"
Apr 02 14:00:31 crc kubenswrapper[4732]: I0402 14:00:31.506203 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/261837b5-19d6-404f-b88f-b5b6cf88ebec-scripts\") pod \"261837b5-19d6-404f-b88f-b5b6cf88ebec\" (UID: \"261837b5-19d6-404f-b88f-b5b6cf88ebec\") "
Apr 02 14:00:31 crc kubenswrapper[4732]: I0402 14:00:31.506265 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/261837b5-19d6-404f-b88f-b5b6cf88ebec-combined-ca-bundle\") pod \"261837b5-19d6-404f-b88f-b5b6cf88ebec\" (UID: \"261837b5-19d6-404f-b88f-b5b6cf88ebec\") "
Apr 02 14:00:31 crc kubenswrapper[4732]: I0402 14:00:31.506318 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvm69\" (UniqueName: \"kubernetes.io/projected/261837b5-19d6-404f-b88f-b5b6cf88ebec-kube-api-access-kvm69\") pod \"261837b5-19d6-404f-b88f-b5b6cf88ebec\" (UID: \"261837b5-19d6-404f-b88f-b5b6cf88ebec\") "
Apr 02 14:00:31 crc kubenswrapper[4732]: I0402 14:00:31.506376 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/261837b5-19d6-404f-b88f-b5b6cf88ebec-credential-keys\") pod \"261837b5-19d6-404f-b88f-b5b6cf88ebec\" (UID: \"261837b5-19d6-404f-b88f-b5b6cf88ebec\") "
Apr 02 14:00:31 crc kubenswrapper[4732]: I0402 14:00:31.506407 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/261837b5-19d6-404f-b88f-b5b6cf88ebec-fernet-keys\") pod \"261837b5-19d6-404f-b88f-b5b6cf88ebec\" (UID: \"261837b5-19d6-404f-b88f-b5b6cf88ebec\") "
Apr 02 14:00:31 crc kubenswrapper[4732]: I0402 14:00:31.506476 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/261837b5-19d6-404f-b88f-b5b6cf88ebec-config-data\") pod \"261837b5-19d6-404f-b88f-b5b6cf88ebec\" (UID: \"261837b5-19d6-404f-b88f-b5b6cf88ebec\") "
Apr 02 14:00:31 crc kubenswrapper[4732]: I0402 14:00:31.506527 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gx6j9\" (UniqueName: \"kubernetes.io/projected/99deaef2-ca21-4254-9c26-8200edbbd497-kube-api-access-gx6j9\") pod \"99deaef2-ca21-4254-9c26-8200edbbd497\" (UID: \"99deaef2-ca21-4254-9c26-8200edbbd497\") "
Apr 02 14:00:31 crc kubenswrapper[4732]: I0402 14:00:31.518885 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/261837b5-19d6-404f-b88f-b5b6cf88ebec-scripts" (OuterVolumeSpecName: "scripts") pod "261837b5-19d6-404f-b88f-b5b6cf88ebec" (UID: "261837b5-19d6-404f-b88f-b5b6cf88ebec"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 14:00:31 crc kubenswrapper[4732]: I0402 14:00:31.519452 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/261837b5-19d6-404f-b88f-b5b6cf88ebec-kube-api-access-kvm69" (OuterVolumeSpecName: "kube-api-access-kvm69") pod "261837b5-19d6-404f-b88f-b5b6cf88ebec" (UID: "261837b5-19d6-404f-b88f-b5b6cf88ebec"). InnerVolumeSpecName "kube-api-access-kvm69". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 14:00:31 crc kubenswrapper[4732]: I0402 14:00:31.519828 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99deaef2-ca21-4254-9c26-8200edbbd497-kube-api-access-gx6j9" (OuterVolumeSpecName: "kube-api-access-gx6j9") pod "99deaef2-ca21-4254-9c26-8200edbbd497" (UID: "99deaef2-ca21-4254-9c26-8200edbbd497"). InnerVolumeSpecName "kube-api-access-gx6j9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 14:00:31 crc kubenswrapper[4732]: I0402 14:00:31.523450 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/261837b5-19d6-404f-b88f-b5b6cf88ebec-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "261837b5-19d6-404f-b88f-b5b6cf88ebec" (UID: "261837b5-19d6-404f-b88f-b5b6cf88ebec"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 14:00:31 crc kubenswrapper[4732]: I0402 14:00:31.525707 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/261837b5-19d6-404f-b88f-b5b6cf88ebec-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "261837b5-19d6-404f-b88f-b5b6cf88ebec" (UID: "261837b5-19d6-404f-b88f-b5b6cf88ebec"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 14:00:31 crc kubenswrapper[4732]: I0402 14:00:31.552016 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/261837b5-19d6-404f-b88f-b5b6cf88ebec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "261837b5-19d6-404f-b88f-b5b6cf88ebec" (UID: "261837b5-19d6-404f-b88f-b5b6cf88ebec"). InnerVolumeSpecName "combined-ca-bundle".
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:00:31 crc kubenswrapper[4732]: I0402 14:00:31.604597 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/261837b5-19d6-404f-b88f-b5b6cf88ebec-config-data" (OuterVolumeSpecName: "config-data") pod "261837b5-19d6-404f-b88f-b5b6cf88ebec" (UID: "261837b5-19d6-404f-b88f-b5b6cf88ebec"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:00:31 crc kubenswrapper[4732]: I0402 14:00:31.609505 4732 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/261837b5-19d6-404f-b88f-b5b6cf88ebec-credential-keys\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:31 crc kubenswrapper[4732]: I0402 14:00:31.609651 4732 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/261837b5-19d6-404f-b88f-b5b6cf88ebec-fernet-keys\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:31 crc kubenswrapper[4732]: I0402 14:00:31.609725 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/261837b5-19d6-404f-b88f-b5b6cf88ebec-config-data\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:31 crc kubenswrapper[4732]: I0402 14:00:31.609789 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gx6j9\" (UniqueName: \"kubernetes.io/projected/99deaef2-ca21-4254-9c26-8200edbbd497-kube-api-access-gx6j9\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:31 crc kubenswrapper[4732]: I0402 14:00:31.609970 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/261837b5-19d6-404f-b88f-b5b6cf88ebec-scripts\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:31 crc kubenswrapper[4732]: I0402 14:00:31.610723 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/261837b5-19d6-404f-b88f-b5b6cf88ebec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:31 crc kubenswrapper[4732]: I0402 14:00:31.610894 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvm69\" (UniqueName: \"kubernetes.io/projected/261837b5-19d6-404f-b88f-b5b6cf88ebec-kube-api-access-kvm69\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:31 crc kubenswrapper[4732]: I0402 14:00:31.829757 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-l9kx4" Apr 02 14:00:31 crc kubenswrapper[4732]: I0402 14:00:31.922352 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/38e262a1-5383-4dfb-8f89-537e4db5559c-ovsdbserver-sb\") pod \"38e262a1-5383-4dfb-8f89-537e4db5559c\" (UID: \"38e262a1-5383-4dfb-8f89-537e4db5559c\") " Apr 02 14:00:31 crc kubenswrapper[4732]: I0402 14:00:31.922542 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38e262a1-5383-4dfb-8f89-537e4db5559c-ovsdbserver-nb\") pod \"38e262a1-5383-4dfb-8f89-537e4db5559c\" (UID: \"38e262a1-5383-4dfb-8f89-537e4db5559c\") " Apr 02 14:00:31 crc kubenswrapper[4732]: I0402 14:00:31.922589 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6sr66\" (UniqueName: \"kubernetes.io/projected/38e262a1-5383-4dfb-8f89-537e4db5559c-kube-api-access-6sr66\") pod \"38e262a1-5383-4dfb-8f89-537e4db5559c\" (UID: \"38e262a1-5383-4dfb-8f89-537e4db5559c\") " Apr 02 14:00:31 crc kubenswrapper[4732]: I0402 14:00:31.922648 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38e262a1-5383-4dfb-8f89-537e4db5559c-config\") pod \"38e262a1-5383-4dfb-8f89-537e4db5559c\" (UID: \"38e262a1-5383-4dfb-8f89-537e4db5559c\") 
" Apr 02 14:00:31 crc kubenswrapper[4732]: I0402 14:00:31.922710 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/38e262a1-5383-4dfb-8f89-537e4db5559c-dns-swift-storage-0\") pod \"38e262a1-5383-4dfb-8f89-537e4db5559c\" (UID: \"38e262a1-5383-4dfb-8f89-537e4db5559c\") " Apr 02 14:00:31 crc kubenswrapper[4732]: I0402 14:00:31.922812 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38e262a1-5383-4dfb-8f89-537e4db5559c-dns-svc\") pod \"38e262a1-5383-4dfb-8f89-537e4db5559c\" (UID: \"38e262a1-5383-4dfb-8f89-537e4db5559c\") " Apr 02 14:00:31 crc kubenswrapper[4732]: I0402 14:00:31.926061 4732 patch_prober.go:28] interesting pod/machine-config-daemon-6vtmw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 02 14:00:31 crc kubenswrapper[4732]: I0402 14:00:31.926105 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 02 14:00:31 crc kubenswrapper[4732]: I0402 14:00:31.926143 4732 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" Apr 02 14:00:31 crc kubenswrapper[4732]: I0402 14:00:31.926934 4732 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7fb2687018e193fb92c41619c313936d4cbab14821cf21277c10428a796150c1"} 
pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Apr 02 14:00:31 crc kubenswrapper[4732]: I0402 14:00:31.926981 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" containerName="machine-config-daemon" containerID="cri-o://7fb2687018e193fb92c41619c313936d4cbab14821cf21277c10428a796150c1" gracePeriod=600 Apr 02 14:00:31 crc kubenswrapper[4732]: I0402 14:00:31.944595 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38e262a1-5383-4dfb-8f89-537e4db5559c-kube-api-access-6sr66" (OuterVolumeSpecName: "kube-api-access-6sr66") pod "38e262a1-5383-4dfb-8f89-537e4db5559c" (UID: "38e262a1-5383-4dfb-8f89-537e4db5559c"). InnerVolumeSpecName "kube-api-access-6sr66". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:00:31 crc kubenswrapper[4732]: I0402 14:00:31.997082 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38e262a1-5383-4dfb-8f89-537e4db5559c-config" (OuterVolumeSpecName: "config") pod "38e262a1-5383-4dfb-8f89-537e4db5559c" (UID: "38e262a1-5383-4dfb-8f89-537e4db5559c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.004941 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38e262a1-5383-4dfb-8f89-537e4db5559c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "38e262a1-5383-4dfb-8f89-537e4db5559c" (UID: "38e262a1-5383-4dfb-8f89-537e4db5559c"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.026800 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6sr66\" (UniqueName: \"kubernetes.io/projected/38e262a1-5383-4dfb-8f89-537e4db5559c-kube-api-access-6sr66\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.026831 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38e262a1-5383-4dfb-8f89-537e4db5559c-config\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.026840 4732 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/38e262a1-5383-4dfb-8f89-537e4db5559c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.030068 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38e262a1-5383-4dfb-8f89-537e4db5559c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "38e262a1-5383-4dfb-8f89-537e4db5559c" (UID: "38e262a1-5383-4dfb-8f89-537e4db5559c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.031883 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38e262a1-5383-4dfb-8f89-537e4db5559c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "38e262a1-5383-4dfb-8f89-537e4db5559c" (UID: "38e262a1-5383-4dfb-8f89-537e4db5559c"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.037063 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38e262a1-5383-4dfb-8f89-537e4db5559c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "38e262a1-5383-4dfb-8f89-537e4db5559c" (UID: "38e262a1-5383-4dfb-8f89-537e4db5559c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.128656 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38e262a1-5383-4dfb-8f89-537e4db5559c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.128692 4732 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38e262a1-5383-4dfb-8f89-537e4db5559c-dns-svc\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.128701 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/38e262a1-5383-4dfb-8f89-537e4db5559c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.329665 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-l9kx4" Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.329841 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-l9kx4" event={"ID":"38e262a1-5383-4dfb-8f89-537e4db5559c","Type":"ContainerDied","Data":"dca2f9ffcc7bc0b2a75459394ce56331497db5203904f76d0c6d5cccbdcd0a8e"} Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.329956 4732 scope.go:117] "RemoveContainer" containerID="e2f22566a97573ec2b53998abc6a6f23072b22ca3960da4d5420f6e816b9db5b" Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.348199 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-d54q6" event={"ID":"34aed337-bbff-45a7-b95f-b26c95733c82","Type":"ContainerStarted","Data":"d8df9e280fcecedceff628b0366a8cfd007c16737c9bc8bf5c8a30e7d6f7222a"} Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.361420 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585640-289ph" event={"ID":"99deaef2-ca21-4254-9c26-8200edbbd497","Type":"ContainerDied","Data":"ee64fa02c151e4753f2c43759329ed3e21cdca7f34eadfa82ed472120ba40fea"} Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.361476 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee64fa02c151e4753f2c43759329ed3e21cdca7f34eadfa82ed472120ba40fea" Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.361549 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29585640-289ph" Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.372828 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-d54q6" podStartSLOduration=3.317782304 podStartE2EDuration="45.372815537s" podCreationTimestamp="2026-04-02 13:59:47 +0000 UTC" firstStartedPulling="2026-04-02 13:59:49.261609379 +0000 UTC m=+1346.166016932" lastFinishedPulling="2026-04-02 14:00:31.316642612 +0000 UTC m=+1388.221050165" observedRunningTime="2026-04-02 14:00:32.36994939 +0000 UTC m=+1389.274356933" watchObservedRunningTime="2026-04-02 14:00:32.372815537 +0000 UTC m=+1389.277223090" Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.394901 4732 generic.go:334] "Generic (PLEG): container finished" podID="38409e5e-4545-49da-8f6c-4bfb30582878" containerID="7fb2687018e193fb92c41619c313936d4cbab14821cf21277c10428a796150c1" exitCode=0 Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.395302 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" event={"ID":"38409e5e-4545-49da-8f6c-4bfb30582878","Type":"ContainerDied","Data":"7fb2687018e193fb92c41619c313936d4cbab14821cf21277c10428a796150c1"} Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.407501 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-dh5g6" event={"ID":"64f807d9-0af7-4723-98b2-dd3cbe55df99","Type":"ContainerStarted","Data":"640881bf9aa75a112cdc7f52c1419406cdb9d754515996f2df4a58ab10015672"} Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.451988 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-dh5g6" podStartSLOduration=3.373438404 podStartE2EDuration="45.45197119s" podCreationTimestamp="2026-04-02 13:59:47 +0000 UTC" firstStartedPulling="2026-04-02 13:59:49.244244191 +0000 UTC m=+1346.148651744" 
lastFinishedPulling="2026-04-02 14:00:31.322776977 +0000 UTC m=+1388.227184530" observedRunningTime="2026-04-02 14:00:32.449100132 +0000 UTC m=+1389.353507685" watchObservedRunningTime="2026-04-02 14:00:32.45197119 +0000 UTC m=+1389.356378743" Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.461813 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-fb5bv" Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.463897 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2","Type":"ContainerStarted","Data":"620c17f9c4acee5d78d07655bb56f5be139ff6a014f3b638f1d82c103c177549"} Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.548673 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-d4c8876f7-592x4"] Apr 02 14:00:32 crc kubenswrapper[4732]: E0402 14:00:32.549072 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d43ed34-cc78-4cfa-b5bd-b0c892cc6d09" containerName="collect-profiles" Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.549084 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d43ed34-cc78-4cfa-b5bd-b0c892cc6d09" containerName="collect-profiles" Apr 02 14:00:32 crc kubenswrapper[4732]: E0402 14:00:32.549093 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38e262a1-5383-4dfb-8f89-537e4db5559c" containerName="init" Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.549099 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="38e262a1-5383-4dfb-8f89-537e4db5559c" containerName="init" Apr 02 14:00:32 crc kubenswrapper[4732]: E0402 14:00:32.549115 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38e262a1-5383-4dfb-8f89-537e4db5559c" containerName="dnsmasq-dns" Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.549121 4732 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="38e262a1-5383-4dfb-8f89-537e4db5559c" containerName="dnsmasq-dns" Apr 02 14:00:32 crc kubenswrapper[4732]: E0402 14:00:32.549134 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99deaef2-ca21-4254-9c26-8200edbbd497" containerName="oc" Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.549140 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="99deaef2-ca21-4254-9c26-8200edbbd497" containerName="oc" Apr 02 14:00:32 crc kubenswrapper[4732]: E0402 14:00:32.549152 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="261837b5-19d6-404f-b88f-b5b6cf88ebec" containerName="keystone-bootstrap" Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.549158 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="261837b5-19d6-404f-b88f-b5b6cf88ebec" containerName="keystone-bootstrap" Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.549308 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="99deaef2-ca21-4254-9c26-8200edbbd497" containerName="oc" Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.549326 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="38e262a1-5383-4dfb-8f89-537e4db5559c" containerName="dnsmasq-dns" Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.549337 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="261837b5-19d6-404f-b88f-b5b6cf88ebec" containerName="keystone-bootstrap" Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.549347 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d43ed34-cc78-4cfa-b5bd-b0c892cc6d09" containerName="collect-profiles" Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.549916 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-d4c8876f7-592x4" Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.573516 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.573967 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.574592 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-scsz6" Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.576604 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.577196 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.577724 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.606750 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-d4c8876f7-592x4"] Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.635478 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29585634-7tqn2"] Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.638816 4732 scope.go:117] "RemoveContainer" containerID="a284bab016250adabb273688b4d9b26e0a6b7383f15a43c13128f2995a2777b7" Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.646114 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5acfdea3-28ba-47f3-860c-6e7af2fe3222-fernet-keys\") pod \"keystone-d4c8876f7-592x4\" (UID: \"5acfdea3-28ba-47f3-860c-6e7af2fe3222\") " pod="openstack/keystone-d4c8876f7-592x4" Apr 02 14:00:32 crc 
kubenswrapper[4732]: I0402 14:00:32.646278 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvtfd\" (UniqueName: \"kubernetes.io/projected/5acfdea3-28ba-47f3-860c-6e7af2fe3222-kube-api-access-dvtfd\") pod \"keystone-d4c8876f7-592x4\" (UID: \"5acfdea3-28ba-47f3-860c-6e7af2fe3222\") " pod="openstack/keystone-d4c8876f7-592x4" Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.646377 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5acfdea3-28ba-47f3-860c-6e7af2fe3222-config-data\") pod \"keystone-d4c8876f7-592x4\" (UID: \"5acfdea3-28ba-47f3-860c-6e7af2fe3222\") " pod="openstack/keystone-d4c8876f7-592x4" Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.646421 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5acfdea3-28ba-47f3-860c-6e7af2fe3222-scripts\") pod \"keystone-d4c8876f7-592x4\" (UID: \"5acfdea3-28ba-47f3-860c-6e7af2fe3222\") " pod="openstack/keystone-d4c8876f7-592x4" Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.646449 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5acfdea3-28ba-47f3-860c-6e7af2fe3222-internal-tls-certs\") pod \"keystone-d4c8876f7-592x4\" (UID: \"5acfdea3-28ba-47f3-860c-6e7af2fe3222\") " pod="openstack/keystone-d4c8876f7-592x4" Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.646491 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5acfdea3-28ba-47f3-860c-6e7af2fe3222-public-tls-certs\") pod \"keystone-d4c8876f7-592x4\" (UID: \"5acfdea3-28ba-47f3-860c-6e7af2fe3222\") " pod="openstack/keystone-d4c8876f7-592x4" Apr 02 14:00:32 crc 
kubenswrapper[4732]: I0402 14:00:32.646512 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5acfdea3-28ba-47f3-860c-6e7af2fe3222-credential-keys\") pod \"keystone-d4c8876f7-592x4\" (UID: \"5acfdea3-28ba-47f3-860c-6e7af2fe3222\") " pod="openstack/keystone-d4c8876f7-592x4" Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.646546 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5acfdea3-28ba-47f3-860c-6e7af2fe3222-combined-ca-bundle\") pod \"keystone-d4c8876f7-592x4\" (UID: \"5acfdea3-28ba-47f3-860c-6e7af2fe3222\") " pod="openstack/keystone-d4c8876f7-592x4" Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.665725 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29585634-7tqn2"] Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.703340 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="761913ea-6b4c-408f-a0ec-8d9a0179832a" path="/var/lib/kubelet/pods/761913ea-6b4c-408f-a0ec-8d9a0179832a/volumes" Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.709681 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-l9kx4"] Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.717884 4732 scope.go:117] "RemoveContainer" containerID="6beef2fa99836ab6f985ec458e30c6e22b8f1d0b42722462a9fe13d02e226853" Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.715599 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-l9kx4"] Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.748264 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5acfdea3-28ba-47f3-860c-6e7af2fe3222-combined-ca-bundle\") pod 
\"keystone-d4c8876f7-592x4\" (UID: \"5acfdea3-28ba-47f3-860c-6e7af2fe3222\") " pod="openstack/keystone-d4c8876f7-592x4" Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.748350 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5acfdea3-28ba-47f3-860c-6e7af2fe3222-fernet-keys\") pod \"keystone-d4c8876f7-592x4\" (UID: \"5acfdea3-28ba-47f3-860c-6e7af2fe3222\") " pod="openstack/keystone-d4c8876f7-592x4" Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.748431 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvtfd\" (UniqueName: \"kubernetes.io/projected/5acfdea3-28ba-47f3-860c-6e7af2fe3222-kube-api-access-dvtfd\") pod \"keystone-d4c8876f7-592x4\" (UID: \"5acfdea3-28ba-47f3-860c-6e7af2fe3222\") " pod="openstack/keystone-d4c8876f7-592x4" Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.748495 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5acfdea3-28ba-47f3-860c-6e7af2fe3222-config-data\") pod \"keystone-d4c8876f7-592x4\" (UID: \"5acfdea3-28ba-47f3-860c-6e7af2fe3222\") " pod="openstack/keystone-d4c8876f7-592x4" Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.748532 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5acfdea3-28ba-47f3-860c-6e7af2fe3222-scripts\") pod \"keystone-d4c8876f7-592x4\" (UID: \"5acfdea3-28ba-47f3-860c-6e7af2fe3222\") " pod="openstack/keystone-d4c8876f7-592x4" Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.748564 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5acfdea3-28ba-47f3-860c-6e7af2fe3222-internal-tls-certs\") pod \"keystone-d4c8876f7-592x4\" (UID: \"5acfdea3-28ba-47f3-860c-6e7af2fe3222\") " 
pod="openstack/keystone-d4c8876f7-592x4" Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.748602 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5acfdea3-28ba-47f3-860c-6e7af2fe3222-credential-keys\") pod \"keystone-d4c8876f7-592x4\" (UID: \"5acfdea3-28ba-47f3-860c-6e7af2fe3222\") " pod="openstack/keystone-d4c8876f7-592x4" Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.748636 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5acfdea3-28ba-47f3-860c-6e7af2fe3222-public-tls-certs\") pod \"keystone-d4c8876f7-592x4\" (UID: \"5acfdea3-28ba-47f3-860c-6e7af2fe3222\") " pod="openstack/keystone-d4c8876f7-592x4" Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.754870 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5acfdea3-28ba-47f3-860c-6e7af2fe3222-public-tls-certs\") pod \"keystone-d4c8876f7-592x4\" (UID: \"5acfdea3-28ba-47f3-860c-6e7af2fe3222\") " pod="openstack/keystone-d4c8876f7-592x4" Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.756314 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5acfdea3-28ba-47f3-860c-6e7af2fe3222-fernet-keys\") pod \"keystone-d4c8876f7-592x4\" (UID: \"5acfdea3-28ba-47f3-860c-6e7af2fe3222\") " pod="openstack/keystone-d4c8876f7-592x4" Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.760071 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5acfdea3-28ba-47f3-860c-6e7af2fe3222-config-data\") pod \"keystone-d4c8876f7-592x4\" (UID: \"5acfdea3-28ba-47f3-860c-6e7af2fe3222\") " pod="openstack/keystone-d4c8876f7-592x4" Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.771826 4732 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5acfdea3-28ba-47f3-860c-6e7af2fe3222-internal-tls-certs\") pod \"keystone-d4c8876f7-592x4\" (UID: \"5acfdea3-28ba-47f3-860c-6e7af2fe3222\") " pod="openstack/keystone-d4c8876f7-592x4" Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.772136 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5acfdea3-28ba-47f3-860c-6e7af2fe3222-credential-keys\") pod \"keystone-d4c8876f7-592x4\" (UID: \"5acfdea3-28ba-47f3-860c-6e7af2fe3222\") " pod="openstack/keystone-d4c8876f7-592x4" Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.772376 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5acfdea3-28ba-47f3-860c-6e7af2fe3222-scripts\") pod \"keystone-d4c8876f7-592x4\" (UID: \"5acfdea3-28ba-47f3-860c-6e7af2fe3222\") " pod="openstack/keystone-d4c8876f7-592x4" Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.776890 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvtfd\" (UniqueName: \"kubernetes.io/projected/5acfdea3-28ba-47f3-860c-6e7af2fe3222-kube-api-access-dvtfd\") pod \"keystone-d4c8876f7-592x4\" (UID: \"5acfdea3-28ba-47f3-860c-6e7af2fe3222\") " pod="openstack/keystone-d4c8876f7-592x4" Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.776969 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5acfdea3-28ba-47f3-860c-6e7af2fe3222-combined-ca-bundle\") pod \"keystone-d4c8876f7-592x4\" (UID: \"5acfdea3-28ba-47f3-860c-6e7af2fe3222\") " pod="openstack/keystone-d4c8876f7-592x4" Apr 02 14:00:32 crc kubenswrapper[4732]: I0402 14:00:32.922962 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-d4c8876f7-592x4" Apr 02 14:00:33 crc kubenswrapper[4732]: W0402 14:00:33.420213 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5acfdea3_28ba_47f3_860c_6e7af2fe3222.slice/crio-f802d81361129e91b45cbfd03474f2a1b10fce7d7c4c0a6300b2534ce641a06a WatchSource:0}: Error finding container f802d81361129e91b45cbfd03474f2a1b10fce7d7c4c0a6300b2534ce641a06a: Status 404 returned error can't find the container with id f802d81361129e91b45cbfd03474f2a1b10fce7d7c4c0a6300b2534ce641a06a Apr 02 14:00:33 crc kubenswrapper[4732]: I0402 14:00:33.459910 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-d4c8876f7-592x4"] Apr 02 14:00:33 crc kubenswrapper[4732]: I0402 14:00:33.474459 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-jwc6l" event={"ID":"10ba0697-529f-41d3-a1a8-55b50ed024a2","Type":"ContainerStarted","Data":"29099b7d93412a262eb083604ee4e20a01b86f78aa60f531a53fea918e12553a"} Apr 02 14:00:33 crc kubenswrapper[4732]: I0402 14:00:33.493838 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d4c8876f7-592x4" event={"ID":"5acfdea3-28ba-47f3-860c-6e7af2fe3222","Type":"ContainerStarted","Data":"f802d81361129e91b45cbfd03474f2a1b10fce7d7c4c0a6300b2534ce641a06a"} Apr 02 14:00:33 crc kubenswrapper[4732]: I0402 14:00:33.505217 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-jwc6l" podStartSLOduration=3.435205349 podStartE2EDuration="46.505202456s" podCreationTimestamp="2026-04-02 13:59:47 +0000 UTC" firstStartedPulling="2026-04-02 13:59:49.261789714 +0000 UTC m=+1346.166197267" lastFinishedPulling="2026-04-02 14:00:32.331786821 +0000 UTC m=+1389.236194374" observedRunningTime="2026-04-02 14:00:33.499735629 +0000 UTC m=+1390.404143182" watchObservedRunningTime="2026-04-02 14:00:33.505202456 +0000 UTC 
m=+1390.409609999" Apr 02 14:00:33 crc kubenswrapper[4732]: I0402 14:00:33.507606 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" event={"ID":"38409e5e-4545-49da-8f6c-4bfb30582878","Type":"ContainerStarted","Data":"de6153f9349b412a56e88983b18d3d8fdd63881d0461412cebd345d437c6871b"} Apr 02 14:00:34 crc kubenswrapper[4732]: I0402 14:00:34.522653 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d4c8876f7-592x4" event={"ID":"5acfdea3-28ba-47f3-860c-6e7af2fe3222","Type":"ContainerStarted","Data":"2c5a017552633b578016cb93f8399c58c06be0e99fbc85b8a212d9dc8dd5f946"} Apr 02 14:00:34 crc kubenswrapper[4732]: I0402 14:00:34.570393 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-d4c8876f7-592x4" podStartSLOduration=2.5703711350000003 podStartE2EDuration="2.570371135s" podCreationTimestamp="2026-04-02 14:00:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 14:00:34.547072287 +0000 UTC m=+1391.451479850" watchObservedRunningTime="2026-04-02 14:00:34.570371135 +0000 UTC m=+1391.474778688" Apr 02 14:00:34 crc kubenswrapper[4732]: I0402 14:00:34.695944 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38e262a1-5383-4dfb-8f89-537e4db5559c" path="/var/lib/kubelet/pods/38e262a1-5383-4dfb-8f89-537e4db5559c/volumes" Apr 02 14:00:35 crc kubenswrapper[4732]: I0402 14:00:35.534671 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-d4c8876f7-592x4" Apr 02 14:00:36 crc kubenswrapper[4732]: I0402 14:00:36.598603 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-59fb764b6d-vml5x" podUID="80fe4580-a48f-4cba-b3b3-d6ccbc7f0525" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.149:8443/dashboard/auth/login/?next=/dashboard/\": 
dial tcp 10.217.0.149:8443: connect: connection refused" Apr 02 14:00:36 crc kubenswrapper[4732]: I0402 14:00:36.733815 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-54f994999b-b88d7" podUID="97d6e519-a82f-4ce5-9199-4d7db769f86b" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.150:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.150:8443: connect: connection refused" Apr 02 14:00:37 crc kubenswrapper[4732]: I0402 14:00:37.557996 4732 generic.go:334] "Generic (PLEG): container finished" podID="64f807d9-0af7-4723-98b2-dd3cbe55df99" containerID="640881bf9aa75a112cdc7f52c1419406cdb9d754515996f2df4a58ab10015672" exitCode=0 Apr 02 14:00:37 crc kubenswrapper[4732]: I0402 14:00:37.558040 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-dh5g6" event={"ID":"64f807d9-0af7-4723-98b2-dd3cbe55df99","Type":"ContainerDied","Data":"640881bf9aa75a112cdc7f52c1419406cdb9d754515996f2df4a58ab10015672"} Apr 02 14:00:39 crc kubenswrapper[4732]: I0402 14:00:39.310188 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-dh5g6" Apr 02 14:00:39 crc kubenswrapper[4732]: I0402 14:00:39.372608 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64f807d9-0af7-4723-98b2-dd3cbe55df99-scripts\") pod \"64f807d9-0af7-4723-98b2-dd3cbe55df99\" (UID: \"64f807d9-0af7-4723-98b2-dd3cbe55df99\") " Apr 02 14:00:39 crc kubenswrapper[4732]: I0402 14:00:39.372723 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64f807d9-0af7-4723-98b2-dd3cbe55df99-logs\") pod \"64f807d9-0af7-4723-98b2-dd3cbe55df99\" (UID: \"64f807d9-0af7-4723-98b2-dd3cbe55df99\") " Apr 02 14:00:39 crc kubenswrapper[4732]: I0402 14:00:39.372838 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29n8v\" (UniqueName: \"kubernetes.io/projected/64f807d9-0af7-4723-98b2-dd3cbe55df99-kube-api-access-29n8v\") pod \"64f807d9-0af7-4723-98b2-dd3cbe55df99\" (UID: \"64f807d9-0af7-4723-98b2-dd3cbe55df99\") " Apr 02 14:00:39 crc kubenswrapper[4732]: I0402 14:00:39.372867 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64f807d9-0af7-4723-98b2-dd3cbe55df99-combined-ca-bundle\") pod \"64f807d9-0af7-4723-98b2-dd3cbe55df99\" (UID: \"64f807d9-0af7-4723-98b2-dd3cbe55df99\") " Apr 02 14:00:39 crc kubenswrapper[4732]: I0402 14:00:39.372919 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64f807d9-0af7-4723-98b2-dd3cbe55df99-config-data\") pod \"64f807d9-0af7-4723-98b2-dd3cbe55df99\" (UID: \"64f807d9-0af7-4723-98b2-dd3cbe55df99\") " Apr 02 14:00:39 crc kubenswrapper[4732]: I0402 14:00:39.373759 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/64f807d9-0af7-4723-98b2-dd3cbe55df99-logs" (OuterVolumeSpecName: "logs") pod "64f807d9-0af7-4723-98b2-dd3cbe55df99" (UID: "64f807d9-0af7-4723-98b2-dd3cbe55df99"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 14:00:39 crc kubenswrapper[4732]: I0402 14:00:39.384777 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64f807d9-0af7-4723-98b2-dd3cbe55df99-scripts" (OuterVolumeSpecName: "scripts") pod "64f807d9-0af7-4723-98b2-dd3cbe55df99" (UID: "64f807d9-0af7-4723-98b2-dd3cbe55df99"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:00:39 crc kubenswrapper[4732]: I0402 14:00:39.385781 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64f807d9-0af7-4723-98b2-dd3cbe55df99-kube-api-access-29n8v" (OuterVolumeSpecName: "kube-api-access-29n8v") pod "64f807d9-0af7-4723-98b2-dd3cbe55df99" (UID: "64f807d9-0af7-4723-98b2-dd3cbe55df99"). InnerVolumeSpecName "kube-api-access-29n8v". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:00:39 crc kubenswrapper[4732]: I0402 14:00:39.399446 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64f807d9-0af7-4723-98b2-dd3cbe55df99-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "64f807d9-0af7-4723-98b2-dd3cbe55df99" (UID: "64f807d9-0af7-4723-98b2-dd3cbe55df99"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:00:39 crc kubenswrapper[4732]: I0402 14:00:39.408791 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64f807d9-0af7-4723-98b2-dd3cbe55df99-config-data" (OuterVolumeSpecName: "config-data") pod "64f807d9-0af7-4723-98b2-dd3cbe55df99" (UID: "64f807d9-0af7-4723-98b2-dd3cbe55df99"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:00:39 crc kubenswrapper[4732]: I0402 14:00:39.474867 4732 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64f807d9-0af7-4723-98b2-dd3cbe55df99-logs\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:39 crc kubenswrapper[4732]: I0402 14:00:39.474902 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29n8v\" (UniqueName: \"kubernetes.io/projected/64f807d9-0af7-4723-98b2-dd3cbe55df99-kube-api-access-29n8v\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:39 crc kubenswrapper[4732]: I0402 14:00:39.474912 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64f807d9-0af7-4723-98b2-dd3cbe55df99-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:39 crc kubenswrapper[4732]: I0402 14:00:39.474920 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64f807d9-0af7-4723-98b2-dd3cbe55df99-config-data\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:39 crc kubenswrapper[4732]: I0402 14:00:39.474929 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64f807d9-0af7-4723-98b2-dd3cbe55df99-scripts\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:39 crc kubenswrapper[4732]: I0402 14:00:39.578193 4732 generic.go:334] "Generic (PLEG): container finished" podID="34aed337-bbff-45a7-b95f-b26c95733c82" containerID="d8df9e280fcecedceff628b0366a8cfd007c16737c9bc8bf5c8a30e7d6f7222a" exitCode=0 Apr 02 14:00:39 crc kubenswrapper[4732]: I0402 14:00:39.578299 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-d54q6" event={"ID":"34aed337-bbff-45a7-b95f-b26c95733c82","Type":"ContainerDied","Data":"d8df9e280fcecedceff628b0366a8cfd007c16737c9bc8bf5c8a30e7d6f7222a"} Apr 02 14:00:39 crc kubenswrapper[4732]: I0402 14:00:39.581948 4732 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-dh5g6" event={"ID":"64f807d9-0af7-4723-98b2-dd3cbe55df99","Type":"ContainerDied","Data":"35fe3abdb2033f0af21275b5e23119e97373deab4f8cccae867278c4df806f93"} Apr 02 14:00:39 crc kubenswrapper[4732]: I0402 14:00:39.581990 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35fe3abdb2033f0af21275b5e23119e97373deab4f8cccae867278c4df806f93" Apr 02 14:00:39 crc kubenswrapper[4732]: I0402 14:00:39.581991 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-dh5g6" Apr 02 14:00:39 crc kubenswrapper[4732]: I0402 14:00:39.676915 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5688fc477d-p59pf"] Apr 02 14:00:39 crc kubenswrapper[4732]: E0402 14:00:39.677485 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64f807d9-0af7-4723-98b2-dd3cbe55df99" containerName="placement-db-sync" Apr 02 14:00:39 crc kubenswrapper[4732]: I0402 14:00:39.677500 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="64f807d9-0af7-4723-98b2-dd3cbe55df99" containerName="placement-db-sync" Apr 02 14:00:39 crc kubenswrapper[4732]: I0402 14:00:39.677723 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="64f807d9-0af7-4723-98b2-dd3cbe55df99" containerName="placement-db-sync" Apr 02 14:00:39 crc kubenswrapper[4732]: I0402 14:00:39.683736 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5688fc477d-p59pf" Apr 02 14:00:39 crc kubenswrapper[4732]: I0402 14:00:39.690833 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5688fc477d-p59pf"] Apr 02 14:00:39 crc kubenswrapper[4732]: I0402 14:00:39.693838 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Apr 02 14:00:39 crc kubenswrapper[4732]: I0402 14:00:39.694014 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Apr 02 14:00:39 crc kubenswrapper[4732]: I0402 14:00:39.694257 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Apr 02 14:00:39 crc kubenswrapper[4732]: I0402 14:00:39.694601 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Apr 02 14:00:39 crc kubenswrapper[4732]: I0402 14:00:39.694701 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-t4l5w" Apr 02 14:00:39 crc kubenswrapper[4732]: I0402 14:00:39.780023 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c11a1fe8-1217-4e5b-b172-642b85527099-scripts\") pod \"placement-5688fc477d-p59pf\" (UID: \"c11a1fe8-1217-4e5b-b172-642b85527099\") " pod="openstack/placement-5688fc477d-p59pf" Apr 02 14:00:39 crc kubenswrapper[4732]: I0402 14:00:39.780141 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c11a1fe8-1217-4e5b-b172-642b85527099-internal-tls-certs\") pod \"placement-5688fc477d-p59pf\" (UID: \"c11a1fe8-1217-4e5b-b172-642b85527099\") " pod="openstack/placement-5688fc477d-p59pf" Apr 02 14:00:39 crc kubenswrapper[4732]: I0402 14:00:39.780171 4732 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cn5l\" (UniqueName: \"kubernetes.io/projected/c11a1fe8-1217-4e5b-b172-642b85527099-kube-api-access-5cn5l\") pod \"placement-5688fc477d-p59pf\" (UID: \"c11a1fe8-1217-4e5b-b172-642b85527099\") " pod="openstack/placement-5688fc477d-p59pf" Apr 02 14:00:39 crc kubenswrapper[4732]: I0402 14:00:39.780217 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c11a1fe8-1217-4e5b-b172-642b85527099-config-data\") pod \"placement-5688fc477d-p59pf\" (UID: \"c11a1fe8-1217-4e5b-b172-642b85527099\") " pod="openstack/placement-5688fc477d-p59pf" Apr 02 14:00:39 crc kubenswrapper[4732]: I0402 14:00:39.780290 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c11a1fe8-1217-4e5b-b172-642b85527099-combined-ca-bundle\") pod \"placement-5688fc477d-p59pf\" (UID: \"c11a1fe8-1217-4e5b-b172-642b85527099\") " pod="openstack/placement-5688fc477d-p59pf" Apr 02 14:00:39 crc kubenswrapper[4732]: I0402 14:00:39.780331 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c11a1fe8-1217-4e5b-b172-642b85527099-public-tls-certs\") pod \"placement-5688fc477d-p59pf\" (UID: \"c11a1fe8-1217-4e5b-b172-642b85527099\") " pod="openstack/placement-5688fc477d-p59pf" Apr 02 14:00:39 crc kubenswrapper[4732]: I0402 14:00:39.780370 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c11a1fe8-1217-4e5b-b172-642b85527099-logs\") pod \"placement-5688fc477d-p59pf\" (UID: \"c11a1fe8-1217-4e5b-b172-642b85527099\") " pod="openstack/placement-5688fc477d-p59pf" Apr 02 14:00:39 crc kubenswrapper[4732]: I0402 14:00:39.881991 4732 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c11a1fe8-1217-4e5b-b172-642b85527099-scripts\") pod \"placement-5688fc477d-p59pf\" (UID: \"c11a1fe8-1217-4e5b-b172-642b85527099\") " pod="openstack/placement-5688fc477d-p59pf" Apr 02 14:00:39 crc kubenswrapper[4732]: I0402 14:00:39.882416 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c11a1fe8-1217-4e5b-b172-642b85527099-internal-tls-certs\") pod \"placement-5688fc477d-p59pf\" (UID: \"c11a1fe8-1217-4e5b-b172-642b85527099\") " pod="openstack/placement-5688fc477d-p59pf" Apr 02 14:00:39 crc kubenswrapper[4732]: I0402 14:00:39.882446 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cn5l\" (UniqueName: \"kubernetes.io/projected/c11a1fe8-1217-4e5b-b172-642b85527099-kube-api-access-5cn5l\") pod \"placement-5688fc477d-p59pf\" (UID: \"c11a1fe8-1217-4e5b-b172-642b85527099\") " pod="openstack/placement-5688fc477d-p59pf" Apr 02 14:00:39 crc kubenswrapper[4732]: I0402 14:00:39.882483 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c11a1fe8-1217-4e5b-b172-642b85527099-config-data\") pod \"placement-5688fc477d-p59pf\" (UID: \"c11a1fe8-1217-4e5b-b172-642b85527099\") " pod="openstack/placement-5688fc477d-p59pf" Apr 02 14:00:39 crc kubenswrapper[4732]: I0402 14:00:39.882512 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c11a1fe8-1217-4e5b-b172-642b85527099-combined-ca-bundle\") pod \"placement-5688fc477d-p59pf\" (UID: \"c11a1fe8-1217-4e5b-b172-642b85527099\") " pod="openstack/placement-5688fc477d-p59pf" Apr 02 14:00:39 crc kubenswrapper[4732]: I0402 14:00:39.882538 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c11a1fe8-1217-4e5b-b172-642b85527099-public-tls-certs\") pod \"placement-5688fc477d-p59pf\" (UID: \"c11a1fe8-1217-4e5b-b172-642b85527099\") " pod="openstack/placement-5688fc477d-p59pf" Apr 02 14:00:39 crc kubenswrapper[4732]: I0402 14:00:39.882567 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c11a1fe8-1217-4e5b-b172-642b85527099-logs\") pod \"placement-5688fc477d-p59pf\" (UID: \"c11a1fe8-1217-4e5b-b172-642b85527099\") " pod="openstack/placement-5688fc477d-p59pf" Apr 02 14:00:39 crc kubenswrapper[4732]: I0402 14:00:39.883336 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c11a1fe8-1217-4e5b-b172-642b85527099-logs\") pod \"placement-5688fc477d-p59pf\" (UID: \"c11a1fe8-1217-4e5b-b172-642b85527099\") " pod="openstack/placement-5688fc477d-p59pf" Apr 02 14:00:39 crc kubenswrapper[4732]: I0402 14:00:39.886783 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c11a1fe8-1217-4e5b-b172-642b85527099-scripts\") pod \"placement-5688fc477d-p59pf\" (UID: \"c11a1fe8-1217-4e5b-b172-642b85527099\") " pod="openstack/placement-5688fc477d-p59pf" Apr 02 14:00:39 crc kubenswrapper[4732]: I0402 14:00:39.886941 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c11a1fe8-1217-4e5b-b172-642b85527099-internal-tls-certs\") pod \"placement-5688fc477d-p59pf\" (UID: \"c11a1fe8-1217-4e5b-b172-642b85527099\") " pod="openstack/placement-5688fc477d-p59pf" Apr 02 14:00:39 crc kubenswrapper[4732]: I0402 14:00:39.887440 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c11a1fe8-1217-4e5b-b172-642b85527099-combined-ca-bundle\") pod \"placement-5688fc477d-p59pf\" 
(UID: \"c11a1fe8-1217-4e5b-b172-642b85527099\") " pod="openstack/placement-5688fc477d-p59pf" Apr 02 14:00:39 crc kubenswrapper[4732]: I0402 14:00:39.890601 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c11a1fe8-1217-4e5b-b172-642b85527099-public-tls-certs\") pod \"placement-5688fc477d-p59pf\" (UID: \"c11a1fe8-1217-4e5b-b172-642b85527099\") " pod="openstack/placement-5688fc477d-p59pf" Apr 02 14:00:39 crc kubenswrapper[4732]: I0402 14:00:39.898351 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c11a1fe8-1217-4e5b-b172-642b85527099-config-data\") pod \"placement-5688fc477d-p59pf\" (UID: \"c11a1fe8-1217-4e5b-b172-642b85527099\") " pod="openstack/placement-5688fc477d-p59pf" Apr 02 14:00:39 crc kubenswrapper[4732]: I0402 14:00:39.902262 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cn5l\" (UniqueName: \"kubernetes.io/projected/c11a1fe8-1217-4e5b-b172-642b85527099-kube-api-access-5cn5l\") pod \"placement-5688fc477d-p59pf\" (UID: \"c11a1fe8-1217-4e5b-b172-642b85527099\") " pod="openstack/placement-5688fc477d-p59pf" Apr 02 14:00:40 crc kubenswrapper[4732]: I0402 14:00:40.032499 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5688fc477d-p59pf" Apr 02 14:00:40 crc kubenswrapper[4732]: I0402 14:00:40.552791 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5688fc477d-p59pf"] Apr 02 14:00:40 crc kubenswrapper[4732]: I0402 14:00:40.604110 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2","Type":"ContainerStarted","Data":"2f8c6d45bd4fd75cf6720514103ee845ac7693894f5b6a7280e4e418f6a8dc3a"} Apr 02 14:00:40 crc kubenswrapper[4732]: I0402 14:00:40.604450 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2" containerName="ceilometer-central-agent" containerID="cri-o://da5dc9a5e046457bbb671c6f3a0e51770bb3f4a02a5572f43becce6ef4dddcf1" gracePeriod=30 Apr 02 14:00:40 crc kubenswrapper[4732]: I0402 14:00:40.605445 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Apr 02 14:00:40 crc kubenswrapper[4732]: I0402 14:00:40.605846 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2" containerName="proxy-httpd" containerID="cri-o://2f8c6d45bd4fd75cf6720514103ee845ac7693894f5b6a7280e4e418f6a8dc3a" gracePeriod=30 Apr 02 14:00:40 crc kubenswrapper[4732]: I0402 14:00:40.605964 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2" containerName="sg-core" containerID="cri-o://620c17f9c4acee5d78d07655bb56f5be139ff6a014f3b638f1d82c103c177549" gracePeriod=30 Apr 02 14:00:40 crc kubenswrapper[4732]: I0402 14:00:40.606185 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2" containerName="ceilometer-notification-agent" 
containerID="cri-o://e8022f6d0b6e80a948eb75dbb2d62993b11b0f7ded2641993dac62d06557fa61" gracePeriod=30 Apr 02 14:00:40 crc kubenswrapper[4732]: I0402 14:00:40.610604 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5688fc477d-p59pf" event={"ID":"c11a1fe8-1217-4e5b-b172-642b85527099","Type":"ContainerStarted","Data":"83afd951ea2917a3f4bb396e24f4f648cd8a6fdf2e6f39e3174ed6dde9ac5a6e"} Apr 02 14:00:40 crc kubenswrapper[4732]: I0402 14:00:40.632185 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.921762465 podStartE2EDuration="53.632161373s" podCreationTimestamp="2026-04-02 13:59:47 +0000 UTC" firstStartedPulling="2026-04-02 13:59:49.271266129 +0000 UTC m=+1346.175673672" lastFinishedPulling="2026-04-02 14:00:39.981665027 +0000 UTC m=+1396.886072580" observedRunningTime="2026-04-02 14:00:40.624570219 +0000 UTC m=+1397.528977782" watchObservedRunningTime="2026-04-02 14:00:40.632161373 +0000 UTC m=+1397.536568926" Apr 02 14:00:40 crc kubenswrapper[4732]: I0402 14:00:40.934081 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-d54q6" Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.107920 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/34aed337-bbff-45a7-b95f-b26c95733c82-db-sync-config-data\") pod \"34aed337-bbff-45a7-b95f-b26c95733c82\" (UID: \"34aed337-bbff-45a7-b95f-b26c95733c82\") " Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.108416 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34aed337-bbff-45a7-b95f-b26c95733c82-combined-ca-bundle\") pod \"34aed337-bbff-45a7-b95f-b26c95733c82\" (UID: \"34aed337-bbff-45a7-b95f-b26c95733c82\") " Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.108598 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xb6j4\" (UniqueName: \"kubernetes.io/projected/34aed337-bbff-45a7-b95f-b26c95733c82-kube-api-access-xb6j4\") pod \"34aed337-bbff-45a7-b95f-b26c95733c82\" (UID: \"34aed337-bbff-45a7-b95f-b26c95733c82\") " Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.113804 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34aed337-bbff-45a7-b95f-b26c95733c82-kube-api-access-xb6j4" (OuterVolumeSpecName: "kube-api-access-xb6j4") pod "34aed337-bbff-45a7-b95f-b26c95733c82" (UID: "34aed337-bbff-45a7-b95f-b26c95733c82"). InnerVolumeSpecName "kube-api-access-xb6j4". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.113838 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34aed337-bbff-45a7-b95f-b26c95733c82-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "34aed337-bbff-45a7-b95f-b26c95733c82" (UID: "34aed337-bbff-45a7-b95f-b26c95733c82"). 
InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.146205 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34aed337-bbff-45a7-b95f-b26c95733c82-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "34aed337-bbff-45a7-b95f-b26c95733c82" (UID: "34aed337-bbff-45a7-b95f-b26c95733c82"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.211432 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xb6j4\" (UniqueName: \"kubernetes.io/projected/34aed337-bbff-45a7-b95f-b26c95733c82-kube-api-access-xb6j4\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.211486 4732 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/34aed337-bbff-45a7-b95f-b26c95733c82-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.211500 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34aed337-bbff-45a7-b95f-b26c95733c82-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.586073 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.624483 4732 generic.go:334] "Generic (PLEG): container finished" podID="d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2" containerID="2f8c6d45bd4fd75cf6720514103ee845ac7693894f5b6a7280e4e418f6a8dc3a" exitCode=0 Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.624523 4732 generic.go:334] "Generic (PLEG): container finished" podID="d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2" containerID="620c17f9c4acee5d78d07655bb56f5be139ff6a014f3b638f1d82c103c177549" exitCode=2 Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.624533 4732 generic.go:334] "Generic (PLEG): container finished" podID="d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2" containerID="e8022f6d0b6e80a948eb75dbb2d62993b11b0f7ded2641993dac62d06557fa61" exitCode=0 Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.624542 4732 generic.go:334] "Generic (PLEG): container finished" podID="d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2" containerID="da5dc9a5e046457bbb671c6f3a0e51770bb3f4a02a5572f43becce6ef4dddcf1" exitCode=0 Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.624593 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2","Type":"ContainerDied","Data":"2f8c6d45bd4fd75cf6720514103ee845ac7693894f5b6a7280e4e418f6a8dc3a"} Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.624644 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2","Type":"ContainerDied","Data":"620c17f9c4acee5d78d07655bb56f5be139ff6a014f3b638f1d82c103c177549"} Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.624660 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2","Type":"ContainerDied","Data":"e8022f6d0b6e80a948eb75dbb2d62993b11b0f7ded2641993dac62d06557fa61"} Apr 02 14:00:41 crc 
kubenswrapper[4732]: I0402 14:00:41.624674 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2","Type":"ContainerDied","Data":"da5dc9a5e046457bbb671c6f3a0e51770bb3f4a02a5572f43becce6ef4dddcf1"} Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.624685 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2","Type":"ContainerDied","Data":"83b28ad9663b6eaa93684ced8107088ee5f6704b3379e196f07940d35f654d8e"} Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.624702 4732 scope.go:117] "RemoveContainer" containerID="2f8c6d45bd4fd75cf6720514103ee845ac7693894f5b6a7280e4e418f6a8dc3a" Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.624902 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.631285 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5688fc477d-p59pf" event={"ID":"c11a1fe8-1217-4e5b-b172-642b85527099","Type":"ContainerStarted","Data":"e70d098726b7cceec23f9b8b6b9b7b81fc25df6c08073a3e050300b6d6a40d37"} Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.631318 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5688fc477d-p59pf" event={"ID":"c11a1fe8-1217-4e5b-b172-642b85527099","Type":"ContainerStarted","Data":"cb9a9a86013f516e45687c5588c9d54233eec6af2a9b6126ba8dcfb65cdeeb64"} Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.631337 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5688fc477d-p59pf" Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.631408 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5688fc477d-p59pf" Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.644783 4732 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/barbican-db-sync-d54q6" event={"ID":"34aed337-bbff-45a7-b95f-b26c95733c82","Type":"ContainerDied","Data":"0306e2800195b1ed7eaedaf457b0a9f9703f11e85abde57269508338b34fbeb4"} Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.644831 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0306e2800195b1ed7eaedaf457b0a9f9703f11e85abde57269508338b34fbeb4" Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.644897 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-d54q6" Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.666421 4732 scope.go:117] "RemoveContainer" containerID="620c17f9c4acee5d78d07655bb56f5be139ff6a014f3b638f1d82c103c177549" Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.688204 4732 scope.go:117] "RemoveContainer" containerID="e8022f6d0b6e80a948eb75dbb2d62993b11b0f7ded2641993dac62d06557fa61" Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.706655 4732 scope.go:117] "RemoveContainer" containerID="da5dc9a5e046457bbb671c6f3a0e51770bb3f4a02a5572f43becce6ef4dddcf1" Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.721191 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2-config-data\") pod \"d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2\" (UID: \"d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2\") " Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.721257 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2-log-httpd\") pod \"d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2\" (UID: \"d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2\") " Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.723844 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2" (UID: "d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.724099 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kt6px\" (UniqueName: \"kubernetes.io/projected/d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2-kube-api-access-kt6px\") pod \"d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2\" (UID: \"d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2\") " Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.724181 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2-run-httpd\") pod \"d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2\" (UID: \"d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2\") " Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.724212 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2-combined-ca-bundle\") pod \"d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2\" (UID: \"d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2\") " Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.724386 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2-sg-core-conf-yaml\") pod \"d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2\" (UID: \"d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2\") " Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.724410 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2-scripts\") pod 
\"d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2\" (UID: \"d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2\") " Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.726657 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2" (UID: "d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.727394 4732 scope.go:117] "RemoveContainer" containerID="2f8c6d45bd4fd75cf6720514103ee845ac7693894f5b6a7280e4e418f6a8dc3a" Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.728421 4732 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2-run-httpd\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.728439 4732 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2-log-httpd\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:41 crc kubenswrapper[4732]: E0402 14:00:41.730916 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f8c6d45bd4fd75cf6720514103ee845ac7693894f5b6a7280e4e418f6a8dc3a\": container with ID starting with 2f8c6d45bd4fd75cf6720514103ee845ac7693894f5b6a7280e4e418f6a8dc3a not found: ID does not exist" containerID="2f8c6d45bd4fd75cf6720514103ee845ac7693894f5b6a7280e4e418f6a8dc3a" Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.730967 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f8c6d45bd4fd75cf6720514103ee845ac7693894f5b6a7280e4e418f6a8dc3a"} err="failed to get container status 
\"2f8c6d45bd4fd75cf6720514103ee845ac7693894f5b6a7280e4e418f6a8dc3a\": rpc error: code = NotFound desc = could not find container \"2f8c6d45bd4fd75cf6720514103ee845ac7693894f5b6a7280e4e418f6a8dc3a\": container with ID starting with 2f8c6d45bd4fd75cf6720514103ee845ac7693894f5b6a7280e4e418f6a8dc3a not found: ID does not exist" Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.730999 4732 scope.go:117] "RemoveContainer" containerID="620c17f9c4acee5d78d07655bb56f5be139ff6a014f3b638f1d82c103c177549" Apr 02 14:00:41 crc kubenswrapper[4732]: E0402 14:00:41.732512 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"620c17f9c4acee5d78d07655bb56f5be139ff6a014f3b638f1d82c103c177549\": container with ID starting with 620c17f9c4acee5d78d07655bb56f5be139ff6a014f3b638f1d82c103c177549 not found: ID does not exist" containerID="620c17f9c4acee5d78d07655bb56f5be139ff6a014f3b638f1d82c103c177549" Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.733447 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"620c17f9c4acee5d78d07655bb56f5be139ff6a014f3b638f1d82c103c177549"} err="failed to get container status \"620c17f9c4acee5d78d07655bb56f5be139ff6a014f3b638f1d82c103c177549\": rpc error: code = NotFound desc = could not find container \"620c17f9c4acee5d78d07655bb56f5be139ff6a014f3b638f1d82c103c177549\": container with ID starting with 620c17f9c4acee5d78d07655bb56f5be139ff6a014f3b638f1d82c103c177549 not found: ID does not exist" Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.733567 4732 scope.go:117] "RemoveContainer" containerID="e8022f6d0b6e80a948eb75dbb2d62993b11b0f7ded2641993dac62d06557fa61" Apr 02 14:00:41 crc kubenswrapper[4732]: E0402 14:00:41.735660 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e8022f6d0b6e80a948eb75dbb2d62993b11b0f7ded2641993dac62d06557fa61\": container with ID starting with e8022f6d0b6e80a948eb75dbb2d62993b11b0f7ded2641993dac62d06557fa61 not found: ID does not exist" containerID="e8022f6d0b6e80a948eb75dbb2d62993b11b0f7ded2641993dac62d06557fa61" Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.735711 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8022f6d0b6e80a948eb75dbb2d62993b11b0f7ded2641993dac62d06557fa61"} err="failed to get container status \"e8022f6d0b6e80a948eb75dbb2d62993b11b0f7ded2641993dac62d06557fa61\": rpc error: code = NotFound desc = could not find container \"e8022f6d0b6e80a948eb75dbb2d62993b11b0f7ded2641993dac62d06557fa61\": container with ID starting with e8022f6d0b6e80a948eb75dbb2d62993b11b0f7ded2641993dac62d06557fa61 not found: ID does not exist" Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.735743 4732 scope.go:117] "RemoveContainer" containerID="da5dc9a5e046457bbb671c6f3a0e51770bb3f4a02a5572f43becce6ef4dddcf1" Apr 02 14:00:41 crc kubenswrapper[4732]: E0402 14:00:41.739782 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da5dc9a5e046457bbb671c6f3a0e51770bb3f4a02a5572f43becce6ef4dddcf1\": container with ID starting with da5dc9a5e046457bbb671c6f3a0e51770bb3f4a02a5572f43becce6ef4dddcf1 not found: ID does not exist" containerID="da5dc9a5e046457bbb671c6f3a0e51770bb3f4a02a5572f43becce6ef4dddcf1" Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.739836 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da5dc9a5e046457bbb671c6f3a0e51770bb3f4a02a5572f43becce6ef4dddcf1"} err="failed to get container status \"da5dc9a5e046457bbb671c6f3a0e51770bb3f4a02a5572f43becce6ef4dddcf1\": rpc error: code = NotFound desc = could not find container \"da5dc9a5e046457bbb671c6f3a0e51770bb3f4a02a5572f43becce6ef4dddcf1\": container with ID 
starting with da5dc9a5e046457bbb671c6f3a0e51770bb3f4a02a5572f43becce6ef4dddcf1 not found: ID does not exist" Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.739865 4732 scope.go:117] "RemoveContainer" containerID="2f8c6d45bd4fd75cf6720514103ee845ac7693894f5b6a7280e4e418f6a8dc3a" Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.743751 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f8c6d45bd4fd75cf6720514103ee845ac7693894f5b6a7280e4e418f6a8dc3a"} err="failed to get container status \"2f8c6d45bd4fd75cf6720514103ee845ac7693894f5b6a7280e4e418f6a8dc3a\": rpc error: code = NotFound desc = could not find container \"2f8c6d45bd4fd75cf6720514103ee845ac7693894f5b6a7280e4e418f6a8dc3a\": container with ID starting with 2f8c6d45bd4fd75cf6720514103ee845ac7693894f5b6a7280e4e418f6a8dc3a not found: ID does not exist" Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.743794 4732 scope.go:117] "RemoveContainer" containerID="620c17f9c4acee5d78d07655bb56f5be139ff6a014f3b638f1d82c103c177549" Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.747806 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"620c17f9c4acee5d78d07655bb56f5be139ff6a014f3b638f1d82c103c177549"} err="failed to get container status \"620c17f9c4acee5d78d07655bb56f5be139ff6a014f3b638f1d82c103c177549\": rpc error: code = NotFound desc = could not find container \"620c17f9c4acee5d78d07655bb56f5be139ff6a014f3b638f1d82c103c177549\": container with ID starting with 620c17f9c4acee5d78d07655bb56f5be139ff6a014f3b638f1d82c103c177549 not found: ID does not exist" Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.747857 4732 scope.go:117] "RemoveContainer" containerID="e8022f6d0b6e80a948eb75dbb2d62993b11b0f7ded2641993dac62d06557fa61" Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.748410 4732 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e8022f6d0b6e80a948eb75dbb2d62993b11b0f7ded2641993dac62d06557fa61"} err="failed to get container status \"e8022f6d0b6e80a948eb75dbb2d62993b11b0f7ded2641993dac62d06557fa61\": rpc error: code = NotFound desc = could not find container \"e8022f6d0b6e80a948eb75dbb2d62993b11b0f7ded2641993dac62d06557fa61\": container with ID starting with e8022f6d0b6e80a948eb75dbb2d62993b11b0f7ded2641993dac62d06557fa61 not found: ID does not exist" Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.748479 4732 scope.go:117] "RemoveContainer" containerID="da5dc9a5e046457bbb671c6f3a0e51770bb3f4a02a5572f43becce6ef4dddcf1" Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.749316 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da5dc9a5e046457bbb671c6f3a0e51770bb3f4a02a5572f43becce6ef4dddcf1"} err="failed to get container status \"da5dc9a5e046457bbb671c6f3a0e51770bb3f4a02a5572f43becce6ef4dddcf1\": rpc error: code = NotFound desc = could not find container \"da5dc9a5e046457bbb671c6f3a0e51770bb3f4a02a5572f43becce6ef4dddcf1\": container with ID starting with da5dc9a5e046457bbb671c6f3a0e51770bb3f4a02a5572f43becce6ef4dddcf1 not found: ID does not exist" Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.749343 4732 scope.go:117] "RemoveContainer" containerID="2f8c6d45bd4fd75cf6720514103ee845ac7693894f5b6a7280e4e418f6a8dc3a" Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.750469 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f8c6d45bd4fd75cf6720514103ee845ac7693894f5b6a7280e4e418f6a8dc3a"} err="failed to get container status \"2f8c6d45bd4fd75cf6720514103ee845ac7693894f5b6a7280e4e418f6a8dc3a\": rpc error: code = NotFound desc = could not find container \"2f8c6d45bd4fd75cf6720514103ee845ac7693894f5b6a7280e4e418f6a8dc3a\": container with ID starting with 2f8c6d45bd4fd75cf6720514103ee845ac7693894f5b6a7280e4e418f6a8dc3a not found: ID does not 
exist" Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.750497 4732 scope.go:117] "RemoveContainer" containerID="620c17f9c4acee5d78d07655bb56f5be139ff6a014f3b638f1d82c103c177549" Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.751108 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"620c17f9c4acee5d78d07655bb56f5be139ff6a014f3b638f1d82c103c177549"} err="failed to get container status \"620c17f9c4acee5d78d07655bb56f5be139ff6a014f3b638f1d82c103c177549\": rpc error: code = NotFound desc = could not find container \"620c17f9c4acee5d78d07655bb56f5be139ff6a014f3b638f1d82c103c177549\": container with ID starting with 620c17f9c4acee5d78d07655bb56f5be139ff6a014f3b638f1d82c103c177549 not found: ID does not exist" Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.751204 4732 scope.go:117] "RemoveContainer" containerID="e8022f6d0b6e80a948eb75dbb2d62993b11b0f7ded2641993dac62d06557fa61" Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.751495 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8022f6d0b6e80a948eb75dbb2d62993b11b0f7ded2641993dac62d06557fa61"} err="failed to get container status \"e8022f6d0b6e80a948eb75dbb2d62993b11b0f7ded2641993dac62d06557fa61\": rpc error: code = NotFound desc = could not find container \"e8022f6d0b6e80a948eb75dbb2d62993b11b0f7ded2641993dac62d06557fa61\": container with ID starting with e8022f6d0b6e80a948eb75dbb2d62993b11b0f7ded2641993dac62d06557fa61 not found: ID does not exist" Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.751588 4732 scope.go:117] "RemoveContainer" containerID="da5dc9a5e046457bbb671c6f3a0e51770bb3f4a02a5572f43becce6ef4dddcf1" Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.754891 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2-scripts" (OuterVolumeSpecName: "scripts") pod 
"d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2" (UID: "d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.755753 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da5dc9a5e046457bbb671c6f3a0e51770bb3f4a02a5572f43becce6ef4dddcf1"} err="failed to get container status \"da5dc9a5e046457bbb671c6f3a0e51770bb3f4a02a5572f43becce6ef4dddcf1\": rpc error: code = NotFound desc = could not find container \"da5dc9a5e046457bbb671c6f3a0e51770bb3f4a02a5572f43becce6ef4dddcf1\": container with ID starting with da5dc9a5e046457bbb671c6f3a0e51770bb3f4a02a5572f43becce6ef4dddcf1 not found: ID does not exist" Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.755803 4732 scope.go:117] "RemoveContainer" containerID="2f8c6d45bd4fd75cf6720514103ee845ac7693894f5b6a7280e4e418f6a8dc3a" Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.759743 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f8c6d45bd4fd75cf6720514103ee845ac7693894f5b6a7280e4e418f6a8dc3a"} err="failed to get container status \"2f8c6d45bd4fd75cf6720514103ee845ac7693894f5b6a7280e4e418f6a8dc3a\": rpc error: code = NotFound desc = could not find container \"2f8c6d45bd4fd75cf6720514103ee845ac7693894f5b6a7280e4e418f6a8dc3a\": container with ID starting with 2f8c6d45bd4fd75cf6720514103ee845ac7693894f5b6a7280e4e418f6a8dc3a not found: ID does not exist" Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.759782 4732 scope.go:117] "RemoveContainer" containerID="620c17f9c4acee5d78d07655bb56f5be139ff6a014f3b638f1d82c103c177549" Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.761517 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"620c17f9c4acee5d78d07655bb56f5be139ff6a014f3b638f1d82c103c177549"} err="failed to get container status 
\"620c17f9c4acee5d78d07655bb56f5be139ff6a014f3b638f1d82c103c177549\": rpc error: code = NotFound desc = could not find container \"620c17f9c4acee5d78d07655bb56f5be139ff6a014f3b638f1d82c103c177549\": container with ID starting with 620c17f9c4acee5d78d07655bb56f5be139ff6a014f3b638f1d82c103c177549 not found: ID does not exist" Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.761551 4732 scope.go:117] "RemoveContainer" containerID="e8022f6d0b6e80a948eb75dbb2d62993b11b0f7ded2641993dac62d06557fa61" Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.767784 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8022f6d0b6e80a948eb75dbb2d62993b11b0f7ded2641993dac62d06557fa61"} err="failed to get container status \"e8022f6d0b6e80a948eb75dbb2d62993b11b0f7ded2641993dac62d06557fa61\": rpc error: code = NotFound desc = could not find container \"e8022f6d0b6e80a948eb75dbb2d62993b11b0f7ded2641993dac62d06557fa61\": container with ID starting with e8022f6d0b6e80a948eb75dbb2d62993b11b0f7ded2641993dac62d06557fa61 not found: ID does not exist" Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.767831 4732 scope.go:117] "RemoveContainer" containerID="da5dc9a5e046457bbb671c6f3a0e51770bb3f4a02a5572f43becce6ef4dddcf1" Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.771829 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2-kube-api-access-kt6px" (OuterVolumeSpecName: "kube-api-access-kt6px") pod "d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2" (UID: "d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2"). InnerVolumeSpecName "kube-api-access-kt6px". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.771837 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da5dc9a5e046457bbb671c6f3a0e51770bb3f4a02a5572f43becce6ef4dddcf1"} err="failed to get container status \"da5dc9a5e046457bbb671c6f3a0e51770bb3f4a02a5572f43becce6ef4dddcf1\": rpc error: code = NotFound desc = could not find container \"da5dc9a5e046457bbb671c6f3a0e51770bb3f4a02a5572f43becce6ef4dddcf1\": container with ID starting with da5dc9a5e046457bbb671c6f3a0e51770bb3f4a02a5572f43becce6ef4dddcf1 not found: ID does not exist" Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.779412 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2" (UID: "d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.830221 4732 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.830513 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2-scripts\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.830595 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kt6px\" (UniqueName: \"kubernetes.io/projected/d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2-kube-api-access-kt6px\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.893860 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2" (UID: "d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.953128 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:41 crc kubenswrapper[4732]: I0402 14:00:41.953269 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2-config-data" (OuterVolumeSpecName: "config-data") pod "d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2" (UID: "d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.014912 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-5688fc477d-p59pf" podStartSLOduration=3.014890728 podStartE2EDuration="3.014890728s" podCreationTimestamp="2026-04-02 14:00:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 14:00:41.659867952 +0000 UTC m=+1398.564275515" watchObservedRunningTime="2026-04-02 14:00:42.014890728 +0000 UTC m=+1398.919298281" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.018850 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-sxl2t"] Apr 02 14:00:42 crc kubenswrapper[4732]: E0402 14:00:42.019190 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2" containerName="ceilometer-notification-agent" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.019201 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2" containerName="ceilometer-notification-agent" Apr 02 14:00:42 crc kubenswrapper[4732]: E0402 14:00:42.019216 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2" containerName="ceilometer-central-agent" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.019223 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2" containerName="ceilometer-central-agent" Apr 02 14:00:42 crc kubenswrapper[4732]: E0402 14:00:42.019253 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2" containerName="sg-core" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.019260 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2" 
containerName="sg-core" Apr 02 14:00:42 crc kubenswrapper[4732]: E0402 14:00:42.019271 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34aed337-bbff-45a7-b95f-b26c95733c82" containerName="barbican-db-sync" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.019277 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="34aed337-bbff-45a7-b95f-b26c95733c82" containerName="barbican-db-sync" Apr 02 14:00:42 crc kubenswrapper[4732]: E0402 14:00:42.019294 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2" containerName="proxy-httpd" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.019301 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2" containerName="proxy-httpd" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.019453 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2" containerName="sg-core" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.019467 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2" containerName="ceilometer-central-agent" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.019477 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2" containerName="proxy-httpd" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.019487 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2" containerName="ceilometer-notification-agent" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.019500 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="34aed337-bbff-45a7-b95f-b26c95733c82" containerName="barbican-db-sync" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.020334 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-sxl2t" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.043677 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-sxl2t"] Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.054469 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzjzq\" (UniqueName: \"kubernetes.io/projected/df175395-7cbc-4158-907a-b0ffce6a6efb-kube-api-access-mzjzq\") pod \"dnsmasq-dns-688c87cc99-sxl2t\" (UID: \"df175395-7cbc-4158-907a-b0ffce6a6efb\") " pod="openstack/dnsmasq-dns-688c87cc99-sxl2t" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.054825 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/df175395-7cbc-4158-907a-b0ffce6a6efb-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-sxl2t\" (UID: \"df175395-7cbc-4158-907a-b0ffce6a6efb\") " pod="openstack/dnsmasq-dns-688c87cc99-sxl2t" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.054931 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df175395-7cbc-4158-907a-b0ffce6a6efb-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-sxl2t\" (UID: \"df175395-7cbc-4158-907a-b0ffce6a6efb\") " pod="openstack/dnsmasq-dns-688c87cc99-sxl2t" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.055075 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df175395-7cbc-4158-907a-b0ffce6a6efb-config\") pod \"dnsmasq-dns-688c87cc99-sxl2t\" (UID: \"df175395-7cbc-4158-907a-b0ffce6a6efb\") " pod="openstack/dnsmasq-dns-688c87cc99-sxl2t" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.055185 4732 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df175395-7cbc-4158-907a-b0ffce6a6efb-dns-svc\") pod \"dnsmasq-dns-688c87cc99-sxl2t\" (UID: \"df175395-7cbc-4158-907a-b0ffce6a6efb\") " pod="openstack/dnsmasq-dns-688c87cc99-sxl2t" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.055326 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df175395-7cbc-4158-907a-b0ffce6a6efb-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-sxl2t\" (UID: \"df175395-7cbc-4158-907a-b0ffce6a6efb\") " pod="openstack/dnsmasq-dns-688c87cc99-sxl2t" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.055541 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2-config-data\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.083682 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-98d7bf879-xkszz"] Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.085593 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-98d7bf879-xkszz" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.097683 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-pxmzf" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.097941 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.098109 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.121467 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-f897cfb64-ql8wz"] Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.123648 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-f897cfb64-ql8wz" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.130977 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.162006 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/df175395-7cbc-4158-907a-b0ffce6a6efb-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-sxl2t\" (UID: \"df175395-7cbc-4158-907a-b0ffce6a6efb\") " pod="openstack/dnsmasq-dns-688c87cc99-sxl2t" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.162049 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df175395-7cbc-4158-907a-b0ffce6a6efb-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-sxl2t\" (UID: \"df175395-7cbc-4158-907a-b0ffce6a6efb\") " pod="openstack/dnsmasq-dns-688c87cc99-sxl2t" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.162118 4732 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df175395-7cbc-4158-907a-b0ffce6a6efb-config\") pod \"dnsmasq-dns-688c87cc99-sxl2t\" (UID: \"df175395-7cbc-4158-907a-b0ffce6a6efb\") " pod="openstack/dnsmasq-dns-688c87cc99-sxl2t" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.162152 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df175395-7cbc-4158-907a-b0ffce6a6efb-dns-svc\") pod \"dnsmasq-dns-688c87cc99-sxl2t\" (UID: \"df175395-7cbc-4158-907a-b0ffce6a6efb\") " pod="openstack/dnsmasq-dns-688c87cc99-sxl2t" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.162173 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df175395-7cbc-4158-907a-b0ffce6a6efb-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-sxl2t\" (UID: \"df175395-7cbc-4158-907a-b0ffce6a6efb\") " pod="openstack/dnsmasq-dns-688c87cc99-sxl2t" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.162215 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzjzq\" (UniqueName: \"kubernetes.io/projected/df175395-7cbc-4158-907a-b0ffce6a6efb-kube-api-access-mzjzq\") pod \"dnsmasq-dns-688c87cc99-sxl2t\" (UID: \"df175395-7cbc-4158-907a-b0ffce6a6efb\") " pod="openstack/dnsmasq-dns-688c87cc99-sxl2t" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.163452 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/df175395-7cbc-4158-907a-b0ffce6a6efb-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-sxl2t\" (UID: \"df175395-7cbc-4158-907a-b0ffce6a6efb\") " pod="openstack/dnsmasq-dns-688c87cc99-sxl2t" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.165222 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df175395-7cbc-4158-907a-b0ffce6a6efb-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-sxl2t\" (UID: \"df175395-7cbc-4158-907a-b0ffce6a6efb\") " pod="openstack/dnsmasq-dns-688c87cc99-sxl2t" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.166467 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df175395-7cbc-4158-907a-b0ffce6a6efb-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-sxl2t\" (UID: \"df175395-7cbc-4158-907a-b0ffce6a6efb\") " pod="openstack/dnsmasq-dns-688c87cc99-sxl2t" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.171529 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df175395-7cbc-4158-907a-b0ffce6a6efb-config\") pod \"dnsmasq-dns-688c87cc99-sxl2t\" (UID: \"df175395-7cbc-4158-907a-b0ffce6a6efb\") " pod="openstack/dnsmasq-dns-688c87cc99-sxl2t" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.173919 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df175395-7cbc-4158-907a-b0ffce6a6efb-dns-svc\") pod \"dnsmasq-dns-688c87cc99-sxl2t\" (UID: \"df175395-7cbc-4158-907a-b0ffce6a6efb\") " pod="openstack/dnsmasq-dns-688c87cc99-sxl2t" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.207547 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzjzq\" (UniqueName: \"kubernetes.io/projected/df175395-7cbc-4158-907a-b0ffce6a6efb-kube-api-access-mzjzq\") pod \"dnsmasq-dns-688c87cc99-sxl2t\" (UID: \"df175395-7cbc-4158-907a-b0ffce6a6efb\") " pod="openstack/dnsmasq-dns-688c87cc99-sxl2t" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.240712 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-98d7bf879-xkszz"] Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.264965 4732 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8eed39a7-f437-403d-acab-246fa6d25c4b-config-data\") pod \"barbican-worker-98d7bf879-xkszz\" (UID: \"8eed39a7-f437-403d-acab-246fa6d25c4b\") " pod="openstack/barbican-worker-98d7bf879-xkszz" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.265212 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2mbg\" (UniqueName: \"kubernetes.io/projected/5e017590-845a-4f52-a6ae-258890dd6388-kube-api-access-g2mbg\") pod \"barbican-keystone-listener-f897cfb64-ql8wz\" (UID: \"5e017590-845a-4f52-a6ae-258890dd6388\") " pod="openstack/barbican-keystone-listener-f897cfb64-ql8wz" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.265287 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e017590-845a-4f52-a6ae-258890dd6388-logs\") pod \"barbican-keystone-listener-f897cfb64-ql8wz\" (UID: \"5e017590-845a-4f52-a6ae-258890dd6388\") " pod="openstack/barbican-keystone-listener-f897cfb64-ql8wz" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.265381 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eed39a7-f437-403d-acab-246fa6d25c4b-combined-ca-bundle\") pod \"barbican-worker-98d7bf879-xkszz\" (UID: \"8eed39a7-f437-403d-acab-246fa6d25c4b\") " pod="openstack/barbican-worker-98d7bf879-xkszz" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.265570 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctt8x\" (UniqueName: \"kubernetes.io/projected/8eed39a7-f437-403d-acab-246fa6d25c4b-kube-api-access-ctt8x\") pod \"barbican-worker-98d7bf879-xkszz\" (UID: \"8eed39a7-f437-403d-acab-246fa6d25c4b\") 
" pod="openstack/barbican-worker-98d7bf879-xkszz" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.265804 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8eed39a7-f437-403d-acab-246fa6d25c4b-logs\") pod \"barbican-worker-98d7bf879-xkszz\" (UID: \"8eed39a7-f437-403d-acab-246fa6d25c4b\") " pod="openstack/barbican-worker-98d7bf879-xkszz" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.265874 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e017590-845a-4f52-a6ae-258890dd6388-combined-ca-bundle\") pod \"barbican-keystone-listener-f897cfb64-ql8wz\" (UID: \"5e017590-845a-4f52-a6ae-258890dd6388\") " pod="openstack/barbican-keystone-listener-f897cfb64-ql8wz" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.265905 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5e017590-845a-4f52-a6ae-258890dd6388-config-data-custom\") pod \"barbican-keystone-listener-f897cfb64-ql8wz\" (UID: \"5e017590-845a-4f52-a6ae-258890dd6388\") " pod="openstack/barbican-keystone-listener-f897cfb64-ql8wz" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.266000 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e017590-845a-4f52-a6ae-258890dd6388-config-data\") pod \"barbican-keystone-listener-f897cfb64-ql8wz\" (UID: \"5e017590-845a-4f52-a6ae-258890dd6388\") " pod="openstack/barbican-keystone-listener-f897cfb64-ql8wz" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.266530 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/8eed39a7-f437-403d-acab-246fa6d25c4b-config-data-custom\") pod \"barbican-worker-98d7bf879-xkszz\" (UID: \"8eed39a7-f437-403d-acab-246fa6d25c4b\") " pod="openstack/barbican-worker-98d7bf879-xkszz" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.303077 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-f897cfb64-ql8wz"] Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.354584 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6488b8fdcd-j9s62"] Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.356114 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6488b8fdcd-j9s62" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.359750 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.360291 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-sxl2t" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.366272 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6488b8fdcd-j9s62"] Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.368041 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e017590-845a-4f52-a6ae-258890dd6388-config-data\") pod \"barbican-keystone-listener-f897cfb64-ql8wz\" (UID: \"5e017590-845a-4f52-a6ae-258890dd6388\") " pod="openstack/barbican-keystone-listener-f897cfb64-ql8wz" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.368179 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8eed39a7-f437-403d-acab-246fa6d25c4b-config-data-custom\") pod \"barbican-worker-98d7bf879-xkszz\" (UID: \"8eed39a7-f437-403d-acab-246fa6d25c4b\") " pod="openstack/barbican-worker-98d7bf879-xkszz" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.368841 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8eed39a7-f437-403d-acab-246fa6d25c4b-config-data\") pod \"barbican-worker-98d7bf879-xkszz\" (UID: \"8eed39a7-f437-403d-acab-246fa6d25c4b\") " pod="openstack/barbican-worker-98d7bf879-xkszz" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.368887 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2mbg\" (UniqueName: \"kubernetes.io/projected/5e017590-845a-4f52-a6ae-258890dd6388-kube-api-access-g2mbg\") pod \"barbican-keystone-listener-f897cfb64-ql8wz\" (UID: \"5e017590-845a-4f52-a6ae-258890dd6388\") " pod="openstack/barbican-keystone-listener-f897cfb64-ql8wz" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.368922 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e017590-845a-4f52-a6ae-258890dd6388-logs\") pod \"barbican-keystone-listener-f897cfb64-ql8wz\" (UID: \"5e017590-845a-4f52-a6ae-258890dd6388\") " pod="openstack/barbican-keystone-listener-f897cfb64-ql8wz" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.368975 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eed39a7-f437-403d-acab-246fa6d25c4b-combined-ca-bundle\") pod \"barbican-worker-98d7bf879-xkszz\" (UID: \"8eed39a7-f437-403d-acab-246fa6d25c4b\") " pod="openstack/barbican-worker-98d7bf879-xkszz" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.369032 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctt8x\" (UniqueName: \"kubernetes.io/projected/8eed39a7-f437-403d-acab-246fa6d25c4b-kube-api-access-ctt8x\") pod \"barbican-worker-98d7bf879-xkszz\" (UID: \"8eed39a7-f437-403d-acab-246fa6d25c4b\") " pod="openstack/barbican-worker-98d7bf879-xkszz" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.369105 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8eed39a7-f437-403d-acab-246fa6d25c4b-logs\") pod \"barbican-worker-98d7bf879-xkszz\" (UID: \"8eed39a7-f437-403d-acab-246fa6d25c4b\") " pod="openstack/barbican-worker-98d7bf879-xkszz" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.369139 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e017590-845a-4f52-a6ae-258890dd6388-combined-ca-bundle\") pod \"barbican-keystone-listener-f897cfb64-ql8wz\" (UID: \"5e017590-845a-4f52-a6ae-258890dd6388\") " pod="openstack/barbican-keystone-listener-f897cfb64-ql8wz" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.369162 4732 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5e017590-845a-4f52-a6ae-258890dd6388-config-data-custom\") pod \"barbican-keystone-listener-f897cfb64-ql8wz\" (UID: \"5e017590-845a-4f52-a6ae-258890dd6388\") " pod="openstack/barbican-keystone-listener-f897cfb64-ql8wz" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.369690 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8eed39a7-f437-403d-acab-246fa6d25c4b-logs\") pod \"barbican-worker-98d7bf879-xkszz\" (UID: \"8eed39a7-f437-403d-acab-246fa6d25c4b\") " pod="openstack/barbican-worker-98d7bf879-xkszz" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.369851 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e017590-845a-4f52-a6ae-258890dd6388-logs\") pod \"barbican-keystone-listener-f897cfb64-ql8wz\" (UID: \"5e017590-845a-4f52-a6ae-258890dd6388\") " pod="openstack/barbican-keystone-listener-f897cfb64-ql8wz" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.377390 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8eed39a7-f437-403d-acab-246fa6d25c4b-config-data-custom\") pod \"barbican-worker-98d7bf879-xkszz\" (UID: \"8eed39a7-f437-403d-acab-246fa6d25c4b\") " pod="openstack/barbican-worker-98d7bf879-xkszz" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.377990 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e017590-845a-4f52-a6ae-258890dd6388-combined-ca-bundle\") pod \"barbican-keystone-listener-f897cfb64-ql8wz\" (UID: \"5e017590-845a-4f52-a6ae-258890dd6388\") " pod="openstack/barbican-keystone-listener-f897cfb64-ql8wz" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.378323 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5e017590-845a-4f52-a6ae-258890dd6388-config-data-custom\") pod \"barbican-keystone-listener-f897cfb64-ql8wz\" (UID: \"5e017590-845a-4f52-a6ae-258890dd6388\") " pod="openstack/barbican-keystone-listener-f897cfb64-ql8wz" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.380296 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e017590-845a-4f52-a6ae-258890dd6388-config-data\") pod \"barbican-keystone-listener-f897cfb64-ql8wz\" (UID: \"5e017590-845a-4f52-a6ae-258890dd6388\") " pod="openstack/barbican-keystone-listener-f897cfb64-ql8wz" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.385965 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eed39a7-f437-403d-acab-246fa6d25c4b-combined-ca-bundle\") pod \"barbican-worker-98d7bf879-xkszz\" (UID: \"8eed39a7-f437-403d-acab-246fa6d25c4b\") " pod="openstack/barbican-worker-98d7bf879-xkszz" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.392438 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8eed39a7-f437-403d-acab-246fa6d25c4b-config-data\") pod \"barbican-worker-98d7bf879-xkszz\" (UID: \"8eed39a7-f437-403d-acab-246fa6d25c4b\") " pod="openstack/barbican-worker-98d7bf879-xkszz" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.396271 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctt8x\" (UniqueName: \"kubernetes.io/projected/8eed39a7-f437-403d-acab-246fa6d25c4b-kube-api-access-ctt8x\") pod \"barbican-worker-98d7bf879-xkszz\" (UID: \"8eed39a7-f437-403d-acab-246fa6d25c4b\") " pod="openstack/barbican-worker-98d7bf879-xkszz" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.400130 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-g2mbg\" (UniqueName: \"kubernetes.io/projected/5e017590-845a-4f52-a6ae-258890dd6388-kube-api-access-g2mbg\") pod \"barbican-keystone-listener-f897cfb64-ql8wz\" (UID: \"5e017590-845a-4f52-a6ae-258890dd6388\") " pod="openstack/barbican-keystone-listener-f897cfb64-ql8wz" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.427197 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.455017 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-98d7bf879-xkszz" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.458433 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.471615 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mhhm\" (UniqueName: \"kubernetes.io/projected/4c0b34b5-36cf-4f5d-91f2-618ff3a41adb-kube-api-access-7mhhm\") pod \"barbican-api-6488b8fdcd-j9s62\" (UID: \"4c0b34b5-36cf-4f5d-91f2-618ff3a41adb\") " pod="openstack/barbican-api-6488b8fdcd-j9s62" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.471671 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c0b34b5-36cf-4f5d-91f2-618ff3a41adb-config-data\") pod \"barbican-api-6488b8fdcd-j9s62\" (UID: \"4c0b34b5-36cf-4f5d-91f2-618ff3a41adb\") " pod="openstack/barbican-api-6488b8fdcd-j9s62" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.471719 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c0b34b5-36cf-4f5d-91f2-618ff3a41adb-logs\") pod \"barbican-api-6488b8fdcd-j9s62\" (UID: \"4c0b34b5-36cf-4f5d-91f2-618ff3a41adb\") " pod="openstack/barbican-api-6488b8fdcd-j9s62" Apr 02 
14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.471735 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4c0b34b5-36cf-4f5d-91f2-618ff3a41adb-config-data-custom\") pod \"barbican-api-6488b8fdcd-j9s62\" (UID: \"4c0b34b5-36cf-4f5d-91f2-618ff3a41adb\") " pod="openstack/barbican-api-6488b8fdcd-j9s62" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.471796 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c0b34b5-36cf-4f5d-91f2-618ff3a41adb-combined-ca-bundle\") pod \"barbican-api-6488b8fdcd-j9s62\" (UID: \"4c0b34b5-36cf-4f5d-91f2-618ff3a41adb\") " pod="openstack/barbican-api-6488b8fdcd-j9s62" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.476756 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.479817 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.494813 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.495051 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.500476 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-f897cfb64-ql8wz" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.517678 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.573460 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c0e53540-98b3-463a-9611-a48c2fbfc0f5-run-httpd\") pod \"ceilometer-0\" (UID: \"c0e53540-98b3-463a-9611-a48c2fbfc0f5\") " pod="openstack/ceilometer-0" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.573797 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c0b34b5-36cf-4f5d-91f2-618ff3a41adb-combined-ca-bundle\") pod \"barbican-api-6488b8fdcd-j9s62\" (UID: \"4c0b34b5-36cf-4f5d-91f2-618ff3a41adb\") " pod="openstack/barbican-api-6488b8fdcd-j9s62" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.573821 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c0e53540-98b3-463a-9611-a48c2fbfc0f5-log-httpd\") pod \"ceilometer-0\" (UID: \"c0e53540-98b3-463a-9611-a48c2fbfc0f5\") " pod="openstack/ceilometer-0" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.573838 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c0e53540-98b3-463a-9611-a48c2fbfc0f5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c0e53540-98b3-463a-9611-a48c2fbfc0f5\") " pod="openstack/ceilometer-0" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.573907 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c0e53540-98b3-463a-9611-a48c2fbfc0f5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c0e53540-98b3-463a-9611-a48c2fbfc0f5\") " pod="openstack/ceilometer-0" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.573936 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kx4kr\" (UniqueName: \"kubernetes.io/projected/c0e53540-98b3-463a-9611-a48c2fbfc0f5-kube-api-access-kx4kr\") pod \"ceilometer-0\" (UID: \"c0e53540-98b3-463a-9611-a48c2fbfc0f5\") " pod="openstack/ceilometer-0" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.574031 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mhhm\" (UniqueName: \"kubernetes.io/projected/4c0b34b5-36cf-4f5d-91f2-618ff3a41adb-kube-api-access-7mhhm\") pod \"barbican-api-6488b8fdcd-j9s62\" (UID: \"4c0b34b5-36cf-4f5d-91f2-618ff3a41adb\") " pod="openstack/barbican-api-6488b8fdcd-j9s62" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.574107 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c0b34b5-36cf-4f5d-91f2-618ff3a41adb-config-data\") pod \"barbican-api-6488b8fdcd-j9s62\" (UID: \"4c0b34b5-36cf-4f5d-91f2-618ff3a41adb\") " pod="openstack/barbican-api-6488b8fdcd-j9s62" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.574254 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0e53540-98b3-463a-9611-a48c2fbfc0f5-scripts\") pod \"ceilometer-0\" (UID: \"c0e53540-98b3-463a-9611-a48c2fbfc0f5\") " pod="openstack/ceilometer-0" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.574326 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c0b34b5-36cf-4f5d-91f2-618ff3a41adb-logs\") pod \"barbican-api-6488b8fdcd-j9s62\" (UID: 
\"4c0b34b5-36cf-4f5d-91f2-618ff3a41adb\") " pod="openstack/barbican-api-6488b8fdcd-j9s62" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.574350 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0e53540-98b3-463a-9611-a48c2fbfc0f5-config-data\") pod \"ceilometer-0\" (UID: \"c0e53540-98b3-463a-9611-a48c2fbfc0f5\") " pod="openstack/ceilometer-0" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.574373 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4c0b34b5-36cf-4f5d-91f2-618ff3a41adb-config-data-custom\") pod \"barbican-api-6488b8fdcd-j9s62\" (UID: \"4c0b34b5-36cf-4f5d-91f2-618ff3a41adb\") " pod="openstack/barbican-api-6488b8fdcd-j9s62" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.575037 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c0b34b5-36cf-4f5d-91f2-618ff3a41adb-logs\") pod \"barbican-api-6488b8fdcd-j9s62\" (UID: \"4c0b34b5-36cf-4f5d-91f2-618ff3a41adb\") " pod="openstack/barbican-api-6488b8fdcd-j9s62" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.578712 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c0b34b5-36cf-4f5d-91f2-618ff3a41adb-combined-ca-bundle\") pod \"barbican-api-6488b8fdcd-j9s62\" (UID: \"4c0b34b5-36cf-4f5d-91f2-618ff3a41adb\") " pod="openstack/barbican-api-6488b8fdcd-j9s62" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.582426 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c0b34b5-36cf-4f5d-91f2-618ff3a41adb-config-data\") pod \"barbican-api-6488b8fdcd-j9s62\" (UID: \"4c0b34b5-36cf-4f5d-91f2-618ff3a41adb\") " pod="openstack/barbican-api-6488b8fdcd-j9s62" Apr 02 14:00:42 crc 
kubenswrapper[4732]: I0402 14:00:42.588285 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4c0b34b5-36cf-4f5d-91f2-618ff3a41adb-config-data-custom\") pod \"barbican-api-6488b8fdcd-j9s62\" (UID: \"4c0b34b5-36cf-4f5d-91f2-618ff3a41adb\") " pod="openstack/barbican-api-6488b8fdcd-j9s62" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.593972 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mhhm\" (UniqueName: \"kubernetes.io/projected/4c0b34b5-36cf-4f5d-91f2-618ff3a41adb-kube-api-access-7mhhm\") pod \"barbican-api-6488b8fdcd-j9s62\" (UID: \"4c0b34b5-36cf-4f5d-91f2-618ff3a41adb\") " pod="openstack/barbican-api-6488b8fdcd-j9s62" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.637871 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6488b8fdcd-j9s62" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.667427 4732 generic.go:334] "Generic (PLEG): container finished" podID="10ba0697-529f-41d3-a1a8-55b50ed024a2" containerID="29099b7d93412a262eb083604ee4e20a01b86f78aa60f531a53fea918e12553a" exitCode=0 Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.667494 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-jwc6l" event={"ID":"10ba0697-529f-41d3-a1a8-55b50ed024a2","Type":"ContainerDied","Data":"29099b7d93412a262eb083604ee4e20a01b86f78aa60f531a53fea918e12553a"} Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.675969 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c0e53540-98b3-463a-9611-a48c2fbfc0f5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c0e53540-98b3-463a-9611-a48c2fbfc0f5\") " pod="openstack/ceilometer-0" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.676000 4732 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c0e53540-98b3-463a-9611-a48c2fbfc0f5-log-httpd\") pod \"ceilometer-0\" (UID: \"c0e53540-98b3-463a-9611-a48c2fbfc0f5\") " pod="openstack/ceilometer-0" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.676064 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0e53540-98b3-463a-9611-a48c2fbfc0f5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c0e53540-98b3-463a-9611-a48c2fbfc0f5\") " pod="openstack/ceilometer-0" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.676091 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kx4kr\" (UniqueName: \"kubernetes.io/projected/c0e53540-98b3-463a-9611-a48c2fbfc0f5-kube-api-access-kx4kr\") pod \"ceilometer-0\" (UID: \"c0e53540-98b3-463a-9611-a48c2fbfc0f5\") " pod="openstack/ceilometer-0" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.676158 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0e53540-98b3-463a-9611-a48c2fbfc0f5-scripts\") pod \"ceilometer-0\" (UID: \"c0e53540-98b3-463a-9611-a48c2fbfc0f5\") " pod="openstack/ceilometer-0" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.676184 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0e53540-98b3-463a-9611-a48c2fbfc0f5-config-data\") pod \"ceilometer-0\" (UID: \"c0e53540-98b3-463a-9611-a48c2fbfc0f5\") " pod="openstack/ceilometer-0" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.676242 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c0e53540-98b3-463a-9611-a48c2fbfc0f5-run-httpd\") pod \"ceilometer-0\" (UID: \"c0e53540-98b3-463a-9611-a48c2fbfc0f5\") " pod="openstack/ceilometer-0" Apr 02 
14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.676648 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c0e53540-98b3-463a-9611-a48c2fbfc0f5-run-httpd\") pod \"ceilometer-0\" (UID: \"c0e53540-98b3-463a-9611-a48c2fbfc0f5\") " pod="openstack/ceilometer-0" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.679377 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c0e53540-98b3-463a-9611-a48c2fbfc0f5-log-httpd\") pod \"ceilometer-0\" (UID: \"c0e53540-98b3-463a-9611-a48c2fbfc0f5\") " pod="openstack/ceilometer-0" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.693996 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0e53540-98b3-463a-9611-a48c2fbfc0f5-config-data\") pod \"ceilometer-0\" (UID: \"c0e53540-98b3-463a-9611-a48c2fbfc0f5\") " pod="openstack/ceilometer-0" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.696172 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0e53540-98b3-463a-9611-a48c2fbfc0f5-scripts\") pod \"ceilometer-0\" (UID: \"c0e53540-98b3-463a-9611-a48c2fbfc0f5\") " pod="openstack/ceilometer-0" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.701844 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c0e53540-98b3-463a-9611-a48c2fbfc0f5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c0e53540-98b3-463a-9611-a48c2fbfc0f5\") " pod="openstack/ceilometer-0" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.710183 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kx4kr\" (UniqueName: \"kubernetes.io/projected/c0e53540-98b3-463a-9611-a48c2fbfc0f5-kube-api-access-kx4kr\") pod \"ceilometer-0\" (UID: 
\"c0e53540-98b3-463a-9611-a48c2fbfc0f5\") " pod="openstack/ceilometer-0" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.710267 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0e53540-98b3-463a-9611-a48c2fbfc0f5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c0e53540-98b3-463a-9611-a48c2fbfc0f5\") " pod="openstack/ceilometer-0" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.715196 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2" path="/var/lib/kubelet/pods/d9a50fe7-e9c1-4b4f-b263-9afd882e4ea2/volumes" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.951573 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Apr 02 14:00:42 crc kubenswrapper[4732]: I0402 14:00:42.996350 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-sxl2t"] Apr 02 14:00:43 crc kubenswrapper[4732]: W0402 14:00:43.027288 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf175395_7cbc_4158_907a_b0ffce6a6efb.slice/crio-e875c5961bfe1a0e28d994fff2e74baf4626c1e3462d487d459b48ba81bfec43 WatchSource:0}: Error finding container e875c5961bfe1a0e28d994fff2e74baf4626c1e3462d487d459b48ba81bfec43: Status 404 returned error can't find the container with id e875c5961bfe1a0e28d994fff2e74baf4626c1e3462d487d459b48ba81bfec43 Apr 02 14:00:43 crc kubenswrapper[4732]: I0402 14:00:43.068894 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-98d7bf879-xkszz"] Apr 02 14:00:43 crc kubenswrapper[4732]: W0402 14:00:43.070626 4732 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8eed39a7_f437_403d_acab_246fa6d25c4b.slice/crio-ee97b5eeb709905a40d2ebeca6ccccc2014c8048426b125e2b43af2fb4ef66a3 WatchSource:0}: Error finding container ee97b5eeb709905a40d2ebeca6ccccc2014c8048426b125e2b43af2fb4ef66a3: Status 404 returned error can't find the container with id ee97b5eeb709905a40d2ebeca6ccccc2014c8048426b125e2b43af2fb4ef66a3 Apr 02 14:00:43 crc kubenswrapper[4732]: I0402 14:00:43.121025 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-f897cfb64-ql8wz"] Apr 02 14:00:43 crc kubenswrapper[4732]: W0402 14:00:43.126544 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e017590_845a_4f52_a6ae_258890dd6388.slice/crio-d55a4773a4c341c4daa91f22be1bcb72069b7b2c6feba511ee2fee1cd7c6154d WatchSource:0}: Error finding container d55a4773a4c341c4daa91f22be1bcb72069b7b2c6feba511ee2fee1cd7c6154d: Status 404 returned error can't find the container with id d55a4773a4c341c4daa91f22be1bcb72069b7b2c6feba511ee2fee1cd7c6154d Apr 02 14:00:43 crc kubenswrapper[4732]: I0402 14:00:43.201493 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6488b8fdcd-j9s62"] Apr 02 14:00:43 crc kubenswrapper[4732]: I0402 14:00:43.442747 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Apr 02 14:00:43 crc kubenswrapper[4732]: W0402 14:00:43.449362 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc0e53540_98b3_463a_9611_a48c2fbfc0f5.slice/crio-e992f20926ac9ac17ef5bd416178fca892141381cc8ab3ce4a581e7afd28138f WatchSource:0}: Error finding container e992f20926ac9ac17ef5bd416178fca892141381cc8ab3ce4a581e7afd28138f: Status 404 returned error can't find the container with id e992f20926ac9ac17ef5bd416178fca892141381cc8ab3ce4a581e7afd28138f Apr 02 
14:00:43 crc kubenswrapper[4732]: I0402 14:00:43.718831 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c0e53540-98b3-463a-9611-a48c2fbfc0f5","Type":"ContainerStarted","Data":"e992f20926ac9ac17ef5bd416178fca892141381cc8ab3ce4a581e7afd28138f"} Apr 02 14:00:43 crc kubenswrapper[4732]: I0402 14:00:43.721794 4732 generic.go:334] "Generic (PLEG): container finished" podID="df175395-7cbc-4158-907a-b0ffce6a6efb" containerID="e76ca9aaf5f8fc2b70cb140c6f56b7a8c1165dd43deace98f1541d4c99b9337f" exitCode=0 Apr 02 14:00:43 crc kubenswrapper[4732]: I0402 14:00:43.721885 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-sxl2t" event={"ID":"df175395-7cbc-4158-907a-b0ffce6a6efb","Type":"ContainerDied","Data":"e76ca9aaf5f8fc2b70cb140c6f56b7a8c1165dd43deace98f1541d4c99b9337f"} Apr 02 14:00:43 crc kubenswrapper[4732]: I0402 14:00:43.721923 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-sxl2t" event={"ID":"df175395-7cbc-4158-907a-b0ffce6a6efb","Type":"ContainerStarted","Data":"e875c5961bfe1a0e28d994fff2e74baf4626c1e3462d487d459b48ba81bfec43"} Apr 02 14:00:43 crc kubenswrapper[4732]: I0402 14:00:43.740439 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6488b8fdcd-j9s62" event={"ID":"4c0b34b5-36cf-4f5d-91f2-618ff3a41adb","Type":"ContainerStarted","Data":"3a555b820f07523abb2da70694769e55897c6dcea2fadc09fa51cb7d48986781"} Apr 02 14:00:43 crc kubenswrapper[4732]: I0402 14:00:43.740492 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6488b8fdcd-j9s62" event={"ID":"4c0b34b5-36cf-4f5d-91f2-618ff3a41adb","Type":"ContainerStarted","Data":"cf9df17eaa8995dc201ebfe7375574ee5f065a9488d521516cac30a5a10298e0"} Apr 02 14:00:43 crc kubenswrapper[4732]: I0402 14:00:43.743041 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-f897cfb64-ql8wz" 
event={"ID":"5e017590-845a-4f52-a6ae-258890dd6388","Type":"ContainerStarted","Data":"d55a4773a4c341c4daa91f22be1bcb72069b7b2c6feba511ee2fee1cd7c6154d"} Apr 02 14:00:43 crc kubenswrapper[4732]: I0402 14:00:43.749821 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-98d7bf879-xkszz" event={"ID":"8eed39a7-f437-403d-acab-246fa6d25c4b","Type":"ContainerStarted","Data":"ee97b5eeb709905a40d2ebeca6ccccc2014c8048426b125e2b43af2fb4ef66a3"} Apr 02 14:00:44 crc kubenswrapper[4732]: I0402 14:00:44.077998 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-jwc6l" Apr 02 14:00:44 crc kubenswrapper[4732]: I0402 14:00:44.117297 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10ba0697-529f-41d3-a1a8-55b50ed024a2-combined-ca-bundle\") pod \"10ba0697-529f-41d3-a1a8-55b50ed024a2\" (UID: \"10ba0697-529f-41d3-a1a8-55b50ed024a2\") " Apr 02 14:00:44 crc kubenswrapper[4732]: I0402 14:00:44.117388 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10ba0697-529f-41d3-a1a8-55b50ed024a2-scripts\") pod \"10ba0697-529f-41d3-a1a8-55b50ed024a2\" (UID: \"10ba0697-529f-41d3-a1a8-55b50ed024a2\") " Apr 02 14:00:44 crc kubenswrapper[4732]: I0402 14:00:44.117462 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/10ba0697-529f-41d3-a1a8-55b50ed024a2-etc-machine-id\") pod \"10ba0697-529f-41d3-a1a8-55b50ed024a2\" (UID: \"10ba0697-529f-41d3-a1a8-55b50ed024a2\") " Apr 02 14:00:44 crc kubenswrapper[4732]: I0402 14:00:44.117530 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jd7m5\" (UniqueName: \"kubernetes.io/projected/10ba0697-529f-41d3-a1a8-55b50ed024a2-kube-api-access-jd7m5\") pod 
\"10ba0697-529f-41d3-a1a8-55b50ed024a2\" (UID: \"10ba0697-529f-41d3-a1a8-55b50ed024a2\") " Apr 02 14:00:44 crc kubenswrapper[4732]: I0402 14:00:44.117570 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10ba0697-529f-41d3-a1a8-55b50ed024a2-config-data\") pod \"10ba0697-529f-41d3-a1a8-55b50ed024a2\" (UID: \"10ba0697-529f-41d3-a1a8-55b50ed024a2\") " Apr 02 14:00:44 crc kubenswrapper[4732]: I0402 14:00:44.117653 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/10ba0697-529f-41d3-a1a8-55b50ed024a2-db-sync-config-data\") pod \"10ba0697-529f-41d3-a1a8-55b50ed024a2\" (UID: \"10ba0697-529f-41d3-a1a8-55b50ed024a2\") " Apr 02 14:00:44 crc kubenswrapper[4732]: I0402 14:00:44.119033 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/10ba0697-529f-41d3-a1a8-55b50ed024a2-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "10ba0697-529f-41d3-a1a8-55b50ed024a2" (UID: "10ba0697-529f-41d3-a1a8-55b50ed024a2"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 14:00:44 crc kubenswrapper[4732]: I0402 14:00:44.126891 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10ba0697-529f-41d3-a1a8-55b50ed024a2-kube-api-access-jd7m5" (OuterVolumeSpecName: "kube-api-access-jd7m5") pod "10ba0697-529f-41d3-a1a8-55b50ed024a2" (UID: "10ba0697-529f-41d3-a1a8-55b50ed024a2"). InnerVolumeSpecName "kube-api-access-jd7m5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:00:44 crc kubenswrapper[4732]: I0402 14:00:44.128747 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10ba0697-529f-41d3-a1a8-55b50ed024a2-scripts" (OuterVolumeSpecName: "scripts") pod "10ba0697-529f-41d3-a1a8-55b50ed024a2" (UID: "10ba0697-529f-41d3-a1a8-55b50ed024a2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:00:44 crc kubenswrapper[4732]: I0402 14:00:44.130853 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10ba0697-529f-41d3-a1a8-55b50ed024a2-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "10ba0697-529f-41d3-a1a8-55b50ed024a2" (UID: "10ba0697-529f-41d3-a1a8-55b50ed024a2"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:00:44 crc kubenswrapper[4732]: I0402 14:00:44.169713 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10ba0697-529f-41d3-a1a8-55b50ed024a2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "10ba0697-529f-41d3-a1a8-55b50ed024a2" (UID: "10ba0697-529f-41d3-a1a8-55b50ed024a2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:00:44 crc kubenswrapper[4732]: I0402 14:00:44.177776 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10ba0697-529f-41d3-a1a8-55b50ed024a2-config-data" (OuterVolumeSpecName: "config-data") pod "10ba0697-529f-41d3-a1a8-55b50ed024a2" (UID: "10ba0697-529f-41d3-a1a8-55b50ed024a2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:00:44 crc kubenswrapper[4732]: I0402 14:00:44.236934 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10ba0697-529f-41d3-a1a8-55b50ed024a2-config-data\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:44 crc kubenswrapper[4732]: I0402 14:00:44.236972 4732 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/10ba0697-529f-41d3-a1a8-55b50ed024a2-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:44 crc kubenswrapper[4732]: I0402 14:00:44.236987 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10ba0697-529f-41d3-a1a8-55b50ed024a2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:44 crc kubenswrapper[4732]: I0402 14:00:44.237000 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10ba0697-529f-41d3-a1a8-55b50ed024a2-scripts\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:44 crc kubenswrapper[4732]: I0402 14:00:44.237012 4732 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/10ba0697-529f-41d3-a1a8-55b50ed024a2-etc-machine-id\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:44 crc kubenswrapper[4732]: I0402 14:00:44.237025 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jd7m5\" (UniqueName: \"kubernetes.io/projected/10ba0697-529f-41d3-a1a8-55b50ed024a2-kube-api-access-jd7m5\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:44 crc kubenswrapper[4732]: I0402 14:00:44.760271 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c0e53540-98b3-463a-9611-a48c2fbfc0f5","Type":"ContainerStarted","Data":"f9dbf40bc89d07703caf265aa216a0d8f8682ac26fc7209514eb887f20e06559"} Apr 02 14:00:44 crc 
kubenswrapper[4732]: I0402 14:00:44.766924 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-sxl2t" event={"ID":"df175395-7cbc-4158-907a-b0ffce6a6efb","Type":"ContainerStarted","Data":"8cf41a4bd5d2561802f652340045d6cce5f45df033272a00228b23776c4106cd"} Apr 02 14:00:44 crc kubenswrapper[4732]: I0402 14:00:44.767193 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-688c87cc99-sxl2t" Apr 02 14:00:44 crc kubenswrapper[4732]: I0402 14:00:44.769770 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6488b8fdcd-j9s62" event={"ID":"4c0b34b5-36cf-4f5d-91f2-618ff3a41adb","Type":"ContainerStarted","Data":"aeb47ad0e1a6ab9d71c5c23188f45a87d4579aee6e0d77d78be8c7c18a512803"} Apr 02 14:00:44 crc kubenswrapper[4732]: I0402 14:00:44.769922 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6488b8fdcd-j9s62" Apr 02 14:00:44 crc kubenswrapper[4732]: I0402 14:00:44.771156 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-jwc6l" event={"ID":"10ba0697-529f-41d3-a1a8-55b50ed024a2","Type":"ContainerDied","Data":"29ad76abfe869fb1f8b08f0bad1fdc22194c5ae2f1185f58f2ae082436042f36"} Apr 02 14:00:44 crc kubenswrapper[4732]: I0402 14:00:44.771178 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29ad76abfe869fb1f8b08f0bad1fdc22194c5ae2f1185f58f2ae082436042f36" Apr 02 14:00:44 crc kubenswrapper[4732]: I0402 14:00:44.771260 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-jwc6l" Apr 02 14:00:44 crc kubenswrapper[4732]: I0402 14:00:44.790717 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-688c87cc99-sxl2t" podStartSLOduration=3.790699664 podStartE2EDuration="3.790699664s" podCreationTimestamp="2026-04-02 14:00:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 14:00:44.78979162 +0000 UTC m=+1401.694199173" watchObservedRunningTime="2026-04-02 14:00:44.790699664 +0000 UTC m=+1401.695107207" Apr 02 14:00:44 crc kubenswrapper[4732]: I0402 14:00:44.875705 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6488b8fdcd-j9s62" podStartSLOduration=2.8756689939999998 podStartE2EDuration="2.875668994s" podCreationTimestamp="2026-04-02 14:00:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 14:00:44.809021668 +0000 UTC m=+1401.713429221" watchObservedRunningTime="2026-04-02 14:00:44.875668994 +0000 UTC m=+1401.780076547" Apr 02 14:00:44 crc kubenswrapper[4732]: I0402 14:00:44.889107 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Apr 02 14:00:44 crc kubenswrapper[4732]: E0402 14:00:44.889590 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10ba0697-529f-41d3-a1a8-55b50ed024a2" containerName="cinder-db-sync" Apr 02 14:00:44 crc kubenswrapper[4732]: I0402 14:00:44.889644 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="10ba0697-529f-41d3-a1a8-55b50ed024a2" containerName="cinder-db-sync" Apr 02 14:00:44 crc kubenswrapper[4732]: I0402 14:00:44.889890 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="10ba0697-529f-41d3-a1a8-55b50ed024a2" containerName="cinder-db-sync" Apr 02 14:00:44 crc kubenswrapper[4732]: 
I0402 14:00:44.891260 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Apr 02 14:00:44 crc kubenswrapper[4732]: I0402 14:00:44.896128 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Apr 02 14:00:44 crc kubenswrapper[4732]: I0402 14:00:44.896200 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Apr 02 14:00:44 crc kubenswrapper[4732]: I0402 14:00:44.896386 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Apr 02 14:00:44 crc kubenswrapper[4732]: I0402 14:00:44.896495 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-mcjr4" Apr 02 14:00:44 crc kubenswrapper[4732]: I0402 14:00:44.919898 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Apr 02 14:00:44 crc kubenswrapper[4732]: I0402 14:00:44.953461 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3ab5d4f-9beb-4c41-954d-bc917548f495-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c3ab5d4f-9beb-4c41-954d-bc917548f495\") " pod="openstack/cinder-scheduler-0" Apr 02 14:00:44 crc kubenswrapper[4732]: I0402 14:00:44.953668 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3ab5d4f-9beb-4c41-954d-bc917548f495-scripts\") pod \"cinder-scheduler-0\" (UID: \"c3ab5d4f-9beb-4c41-954d-bc917548f495\") " pod="openstack/cinder-scheduler-0" Apr 02 14:00:44 crc kubenswrapper[4732]: I0402 14:00:44.953756 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3ab5d4f-9beb-4c41-954d-bc917548f495-config-data\") pod 
\"cinder-scheduler-0\" (UID: \"c3ab5d4f-9beb-4c41-954d-bc917548f495\") " pod="openstack/cinder-scheduler-0" Apr 02 14:00:44 crc kubenswrapper[4732]: I0402 14:00:44.953937 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcm4x\" (UniqueName: \"kubernetes.io/projected/c3ab5d4f-9beb-4c41-954d-bc917548f495-kube-api-access-xcm4x\") pod \"cinder-scheduler-0\" (UID: \"c3ab5d4f-9beb-4c41-954d-bc917548f495\") " pod="openstack/cinder-scheduler-0" Apr 02 14:00:44 crc kubenswrapper[4732]: I0402 14:00:44.954117 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c3ab5d4f-9beb-4c41-954d-bc917548f495-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c3ab5d4f-9beb-4c41-954d-bc917548f495\") " pod="openstack/cinder-scheduler-0" Apr 02 14:00:44 crc kubenswrapper[4732]: I0402 14:00:44.954161 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c3ab5d4f-9beb-4c41-954d-bc917548f495-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c3ab5d4f-9beb-4c41-954d-bc917548f495\") " pod="openstack/cinder-scheduler-0" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.005411 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-sxl2t"] Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.053970 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-kk8pk"] Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.057765 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcm4x\" (UniqueName: \"kubernetes.io/projected/c3ab5d4f-9beb-4c41-954d-bc917548f495-kube-api-access-xcm4x\") pod \"cinder-scheduler-0\" (UID: \"c3ab5d4f-9beb-4c41-954d-bc917548f495\") " pod="openstack/cinder-scheduler-0" 
Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.057889 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c3ab5d4f-9beb-4c41-954d-bc917548f495-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c3ab5d4f-9beb-4c41-954d-bc917548f495\") " pod="openstack/cinder-scheduler-0" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.057929 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c3ab5d4f-9beb-4c41-954d-bc917548f495-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c3ab5d4f-9beb-4c41-954d-bc917548f495\") " pod="openstack/cinder-scheduler-0" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.058004 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3ab5d4f-9beb-4c41-954d-bc917548f495-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c3ab5d4f-9beb-4c41-954d-bc917548f495\") " pod="openstack/cinder-scheduler-0" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.058067 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3ab5d4f-9beb-4c41-954d-bc917548f495-scripts\") pod \"cinder-scheduler-0\" (UID: \"c3ab5d4f-9beb-4c41-954d-bc917548f495\") " pod="openstack/cinder-scheduler-0" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.058100 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3ab5d4f-9beb-4c41-954d-bc917548f495-config-data\") pod \"cinder-scheduler-0\" (UID: \"c3ab5d4f-9beb-4c41-954d-bc917548f495\") " pod="openstack/cinder-scheduler-0" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.063744 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/c3ab5d4f-9beb-4c41-954d-bc917548f495-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c3ab5d4f-9beb-4c41-954d-bc917548f495\") " pod="openstack/cinder-scheduler-0" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.064743 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3ab5d4f-9beb-4c41-954d-bc917548f495-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c3ab5d4f-9beb-4c41-954d-bc917548f495\") " pod="openstack/cinder-scheduler-0" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.069449 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c3ab5d4f-9beb-4c41-954d-bc917548f495-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c3ab5d4f-9beb-4c41-954d-bc917548f495\") " pod="openstack/cinder-scheduler-0" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.070015 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3ab5d4f-9beb-4c41-954d-bc917548f495-scripts\") pod \"cinder-scheduler-0\" (UID: \"c3ab5d4f-9beb-4c41-954d-bc917548f495\") " pod="openstack/cinder-scheduler-0" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.072003 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-kk8pk"] Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.072130 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-kk8pk" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.082996 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3ab5d4f-9beb-4c41-954d-bc917548f495-config-data\") pod \"cinder-scheduler-0\" (UID: \"c3ab5d4f-9beb-4c41-954d-bc917548f495\") " pod="openstack/cinder-scheduler-0" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.087200 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcm4x\" (UniqueName: \"kubernetes.io/projected/c3ab5d4f-9beb-4c41-954d-bc917548f495-kube-api-access-xcm4x\") pod \"cinder-scheduler-0\" (UID: \"c3ab5d4f-9beb-4c41-954d-bc917548f495\") " pod="openstack/cinder-scheduler-0" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.160580 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56849534-b6ee-4c76-a706-22af33d61018-config\") pod \"dnsmasq-dns-6bb4fc677f-kk8pk\" (UID: \"56849534-b6ee-4c76-a706-22af33d61018\") " pod="openstack/dnsmasq-dns-6bb4fc677f-kk8pk" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.160953 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhdbx\" (UniqueName: \"kubernetes.io/projected/56849534-b6ee-4c76-a706-22af33d61018-kube-api-access-zhdbx\") pod \"dnsmasq-dns-6bb4fc677f-kk8pk\" (UID: \"56849534-b6ee-4c76-a706-22af33d61018\") " pod="openstack/dnsmasq-dns-6bb4fc677f-kk8pk" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.161011 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/56849534-b6ee-4c76-a706-22af33d61018-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-kk8pk\" (UID: \"56849534-b6ee-4c76-a706-22af33d61018\") " 
pod="openstack/dnsmasq-dns-6bb4fc677f-kk8pk" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.161176 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/56849534-b6ee-4c76-a706-22af33d61018-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-kk8pk\" (UID: \"56849534-b6ee-4c76-a706-22af33d61018\") " pod="openstack/dnsmasq-dns-6bb4fc677f-kk8pk" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.161228 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56849534-b6ee-4c76-a706-22af33d61018-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-kk8pk\" (UID: \"56849534-b6ee-4c76-a706-22af33d61018\") " pod="openstack/dnsmasq-dns-6bb4fc677f-kk8pk" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.161270 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/56849534-b6ee-4c76-a706-22af33d61018-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-kk8pk\" (UID: \"56849534-b6ee-4c76-a706-22af33d61018\") " pod="openstack/dnsmasq-dns-6bb4fc677f-kk8pk" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.231427 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.232265 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.233069 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.242238 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.263665 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/56849534-b6ee-4c76-a706-22af33d61018-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-kk8pk\" (UID: \"56849534-b6ee-4c76-a706-22af33d61018\") " pod="openstack/dnsmasq-dns-6bb4fc677f-kk8pk" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.263715 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56849534-b6ee-4c76-a706-22af33d61018-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-kk8pk\" (UID: \"56849534-b6ee-4c76-a706-22af33d61018\") " pod="openstack/dnsmasq-dns-6bb4fc677f-kk8pk" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.263747 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/56849534-b6ee-4c76-a706-22af33d61018-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-kk8pk\" (UID: \"56849534-b6ee-4c76-a706-22af33d61018\") " pod="openstack/dnsmasq-dns-6bb4fc677f-kk8pk" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.263846 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56849534-b6ee-4c76-a706-22af33d61018-config\") pod \"dnsmasq-dns-6bb4fc677f-kk8pk\" (UID: \"56849534-b6ee-4c76-a706-22af33d61018\") " pod="openstack/dnsmasq-dns-6bb4fc677f-kk8pk" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.263915 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhdbx\" (UniqueName: 
\"kubernetes.io/projected/56849534-b6ee-4c76-a706-22af33d61018-kube-api-access-zhdbx\") pod \"dnsmasq-dns-6bb4fc677f-kk8pk\" (UID: \"56849534-b6ee-4c76-a706-22af33d61018\") " pod="openstack/dnsmasq-dns-6bb4fc677f-kk8pk" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.265072 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/56849534-b6ee-4c76-a706-22af33d61018-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-kk8pk\" (UID: \"56849534-b6ee-4c76-a706-22af33d61018\") " pod="openstack/dnsmasq-dns-6bb4fc677f-kk8pk" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.265199 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/56849534-b6ee-4c76-a706-22af33d61018-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-kk8pk\" (UID: \"56849534-b6ee-4c76-a706-22af33d61018\") " pod="openstack/dnsmasq-dns-6bb4fc677f-kk8pk" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.265241 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56849534-b6ee-4c76-a706-22af33d61018-config\") pod \"dnsmasq-dns-6bb4fc677f-kk8pk\" (UID: \"56849534-b6ee-4c76-a706-22af33d61018\") " pod="openstack/dnsmasq-dns-6bb4fc677f-kk8pk" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.265775 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56849534-b6ee-4c76-a706-22af33d61018-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-kk8pk\" (UID: \"56849534-b6ee-4c76-a706-22af33d61018\") " pod="openstack/dnsmasq-dns-6bb4fc677f-kk8pk" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.266190 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/56849534-b6ee-4c76-a706-22af33d61018-dns-swift-storage-0\") pod 
\"dnsmasq-dns-6bb4fc677f-kk8pk\" (UID: \"56849534-b6ee-4c76-a706-22af33d61018\") " pod="openstack/dnsmasq-dns-6bb4fc677f-kk8pk" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.267291 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/56849534-b6ee-4c76-a706-22af33d61018-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-kk8pk\" (UID: \"56849534-b6ee-4c76-a706-22af33d61018\") " pod="openstack/dnsmasq-dns-6bb4fc677f-kk8pk" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.279816 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.287874 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhdbx\" (UniqueName: \"kubernetes.io/projected/56849534-b6ee-4c76-a706-22af33d61018-kube-api-access-zhdbx\") pod \"dnsmasq-dns-6bb4fc677f-kk8pk\" (UID: \"56849534-b6ee-4c76-a706-22af33d61018\") " pod="openstack/dnsmasq-dns-6bb4fc677f-kk8pk" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.368192 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52795ed2-6cd9-4380-8826-68d5f55d87df-scripts\") pod \"cinder-api-0\" (UID: \"52795ed2-6cd9-4380-8826-68d5f55d87df\") " pod="openstack/cinder-api-0" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.368331 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpf8s\" (UniqueName: \"kubernetes.io/projected/52795ed2-6cd9-4380-8826-68d5f55d87df-kube-api-access-tpf8s\") pod \"cinder-api-0\" (UID: \"52795ed2-6cd9-4380-8826-68d5f55d87df\") " pod="openstack/cinder-api-0" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.368875 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/52795ed2-6cd9-4380-8826-68d5f55d87df-config-data-custom\") pod \"cinder-api-0\" (UID: \"52795ed2-6cd9-4380-8826-68d5f55d87df\") " pod="openstack/cinder-api-0" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.369063 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52795ed2-6cd9-4380-8826-68d5f55d87df-config-data\") pod \"cinder-api-0\" (UID: \"52795ed2-6cd9-4380-8826-68d5f55d87df\") " pod="openstack/cinder-api-0" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.369115 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/52795ed2-6cd9-4380-8826-68d5f55d87df-etc-machine-id\") pod \"cinder-api-0\" (UID: \"52795ed2-6cd9-4380-8826-68d5f55d87df\") " pod="openstack/cinder-api-0" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.369253 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52795ed2-6cd9-4380-8826-68d5f55d87df-logs\") pod \"cinder-api-0\" (UID: \"52795ed2-6cd9-4380-8826-68d5f55d87df\") " pod="openstack/cinder-api-0" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.369316 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52795ed2-6cd9-4380-8826-68d5f55d87df-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"52795ed2-6cd9-4380-8826-68d5f55d87df\") " pod="openstack/cinder-api-0" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.471484 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpf8s\" (UniqueName: \"kubernetes.io/projected/52795ed2-6cd9-4380-8826-68d5f55d87df-kube-api-access-tpf8s\") pod \"cinder-api-0\" (UID: 
\"52795ed2-6cd9-4380-8826-68d5f55d87df\") " pod="openstack/cinder-api-0" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.471555 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/52795ed2-6cd9-4380-8826-68d5f55d87df-config-data-custom\") pod \"cinder-api-0\" (UID: \"52795ed2-6cd9-4380-8826-68d5f55d87df\") " pod="openstack/cinder-api-0" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.471635 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52795ed2-6cd9-4380-8826-68d5f55d87df-config-data\") pod \"cinder-api-0\" (UID: \"52795ed2-6cd9-4380-8826-68d5f55d87df\") " pod="openstack/cinder-api-0" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.471686 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/52795ed2-6cd9-4380-8826-68d5f55d87df-etc-machine-id\") pod \"cinder-api-0\" (UID: \"52795ed2-6cd9-4380-8826-68d5f55d87df\") " pod="openstack/cinder-api-0" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.471747 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52795ed2-6cd9-4380-8826-68d5f55d87df-logs\") pod \"cinder-api-0\" (UID: \"52795ed2-6cd9-4380-8826-68d5f55d87df\") " pod="openstack/cinder-api-0" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.471778 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52795ed2-6cd9-4380-8826-68d5f55d87df-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"52795ed2-6cd9-4380-8826-68d5f55d87df\") " pod="openstack/cinder-api-0" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.471870 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/52795ed2-6cd9-4380-8826-68d5f55d87df-scripts\") pod \"cinder-api-0\" (UID: \"52795ed2-6cd9-4380-8826-68d5f55d87df\") " pod="openstack/cinder-api-0" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.472271 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/52795ed2-6cd9-4380-8826-68d5f55d87df-etc-machine-id\") pod \"cinder-api-0\" (UID: \"52795ed2-6cd9-4380-8826-68d5f55d87df\") " pod="openstack/cinder-api-0" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.472346 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52795ed2-6cd9-4380-8826-68d5f55d87df-logs\") pod \"cinder-api-0\" (UID: \"52795ed2-6cd9-4380-8826-68d5f55d87df\") " pod="openstack/cinder-api-0" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.474965 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52795ed2-6cd9-4380-8826-68d5f55d87df-scripts\") pod \"cinder-api-0\" (UID: \"52795ed2-6cd9-4380-8826-68d5f55d87df\") " pod="openstack/cinder-api-0" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.475191 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52795ed2-6cd9-4380-8826-68d5f55d87df-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"52795ed2-6cd9-4380-8826-68d5f55d87df\") " pod="openstack/cinder-api-0" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.484252 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-kk8pk" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.481798 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52795ed2-6cd9-4380-8826-68d5f55d87df-config-data\") pod \"cinder-api-0\" (UID: \"52795ed2-6cd9-4380-8826-68d5f55d87df\") " pod="openstack/cinder-api-0" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.490229 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/52795ed2-6cd9-4380-8826-68d5f55d87df-config-data-custom\") pod \"cinder-api-0\" (UID: \"52795ed2-6cd9-4380-8826-68d5f55d87df\") " pod="openstack/cinder-api-0" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.490383 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpf8s\" (UniqueName: \"kubernetes.io/projected/52795ed2-6cd9-4380-8826-68d5f55d87df-kube-api-access-tpf8s\") pod \"cinder-api-0\" (UID: \"52795ed2-6cd9-4380-8826-68d5f55d87df\") " pod="openstack/cinder-api-0" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.590678 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.670315 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6c879c6666-5kls7"] Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.672255 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6c879c6666-5kls7" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.677431 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.677469 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.689498 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6c879c6666-5kls7"] Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.778215 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/362f9e50-6f86-41ff-ae02-e0b8565fa55f-internal-tls-certs\") pod \"barbican-api-6c879c6666-5kls7\" (UID: \"362f9e50-6f86-41ff-ae02-e0b8565fa55f\") " pod="openstack/barbican-api-6c879c6666-5kls7" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.778318 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/362f9e50-6f86-41ff-ae02-e0b8565fa55f-config-data\") pod \"barbican-api-6c879c6666-5kls7\" (UID: \"362f9e50-6f86-41ff-ae02-e0b8565fa55f\") " pod="openstack/barbican-api-6c879c6666-5kls7" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.778353 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/362f9e50-6f86-41ff-ae02-e0b8565fa55f-combined-ca-bundle\") pod \"barbican-api-6c879c6666-5kls7\" (UID: \"362f9e50-6f86-41ff-ae02-e0b8565fa55f\") " pod="openstack/barbican-api-6c879c6666-5kls7" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.778377 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/362f9e50-6f86-41ff-ae02-e0b8565fa55f-logs\") pod \"barbican-api-6c879c6666-5kls7\" (UID: \"362f9e50-6f86-41ff-ae02-e0b8565fa55f\") " pod="openstack/barbican-api-6c879c6666-5kls7" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.778405 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/362f9e50-6f86-41ff-ae02-e0b8565fa55f-config-data-custom\") pod \"barbican-api-6c879c6666-5kls7\" (UID: \"362f9e50-6f86-41ff-ae02-e0b8565fa55f\") " pod="openstack/barbican-api-6c879c6666-5kls7" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.778435 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/362f9e50-6f86-41ff-ae02-e0b8565fa55f-public-tls-certs\") pod \"barbican-api-6c879c6666-5kls7\" (UID: \"362f9e50-6f86-41ff-ae02-e0b8565fa55f\") " pod="openstack/barbican-api-6c879c6666-5kls7" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.778526 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvw9v\" (UniqueName: \"kubernetes.io/projected/362f9e50-6f86-41ff-ae02-e0b8565fa55f-kube-api-access-vvw9v\") pod \"barbican-api-6c879c6666-5kls7\" (UID: \"362f9e50-6f86-41ff-ae02-e0b8565fa55f\") " pod="openstack/barbican-api-6c879c6666-5kls7" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.791196 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6488b8fdcd-j9s62" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.880326 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvw9v\" (UniqueName: \"kubernetes.io/projected/362f9e50-6f86-41ff-ae02-e0b8565fa55f-kube-api-access-vvw9v\") pod \"barbican-api-6c879c6666-5kls7\" (UID: \"362f9e50-6f86-41ff-ae02-e0b8565fa55f\") " 
pod="openstack/barbican-api-6c879c6666-5kls7" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.880487 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/362f9e50-6f86-41ff-ae02-e0b8565fa55f-internal-tls-certs\") pod \"barbican-api-6c879c6666-5kls7\" (UID: \"362f9e50-6f86-41ff-ae02-e0b8565fa55f\") " pod="openstack/barbican-api-6c879c6666-5kls7" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.880543 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/362f9e50-6f86-41ff-ae02-e0b8565fa55f-config-data\") pod \"barbican-api-6c879c6666-5kls7\" (UID: \"362f9e50-6f86-41ff-ae02-e0b8565fa55f\") " pod="openstack/barbican-api-6c879c6666-5kls7" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.880587 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/362f9e50-6f86-41ff-ae02-e0b8565fa55f-combined-ca-bundle\") pod \"barbican-api-6c879c6666-5kls7\" (UID: \"362f9e50-6f86-41ff-ae02-e0b8565fa55f\") " pod="openstack/barbican-api-6c879c6666-5kls7" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.880668 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/362f9e50-6f86-41ff-ae02-e0b8565fa55f-logs\") pod \"barbican-api-6c879c6666-5kls7\" (UID: \"362f9e50-6f86-41ff-ae02-e0b8565fa55f\") " pod="openstack/barbican-api-6c879c6666-5kls7" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.880693 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/362f9e50-6f86-41ff-ae02-e0b8565fa55f-config-data-custom\") pod \"barbican-api-6c879c6666-5kls7\" (UID: \"362f9e50-6f86-41ff-ae02-e0b8565fa55f\") " pod="openstack/barbican-api-6c879c6666-5kls7" Apr 02 
14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.880723 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/362f9e50-6f86-41ff-ae02-e0b8565fa55f-public-tls-certs\") pod \"barbican-api-6c879c6666-5kls7\" (UID: \"362f9e50-6f86-41ff-ae02-e0b8565fa55f\") " pod="openstack/barbican-api-6c879c6666-5kls7" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.885254 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/362f9e50-6f86-41ff-ae02-e0b8565fa55f-logs\") pod \"barbican-api-6c879c6666-5kls7\" (UID: \"362f9e50-6f86-41ff-ae02-e0b8565fa55f\") " pod="openstack/barbican-api-6c879c6666-5kls7" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.892063 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/362f9e50-6f86-41ff-ae02-e0b8565fa55f-internal-tls-certs\") pod \"barbican-api-6c879c6666-5kls7\" (UID: \"362f9e50-6f86-41ff-ae02-e0b8565fa55f\") " pod="openstack/barbican-api-6c879c6666-5kls7" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.892125 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/362f9e50-6f86-41ff-ae02-e0b8565fa55f-public-tls-certs\") pod \"barbican-api-6c879c6666-5kls7\" (UID: \"362f9e50-6f86-41ff-ae02-e0b8565fa55f\") " pod="openstack/barbican-api-6c879c6666-5kls7" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.893015 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/362f9e50-6f86-41ff-ae02-e0b8565fa55f-config-data-custom\") pod \"barbican-api-6c879c6666-5kls7\" (UID: \"362f9e50-6f86-41ff-ae02-e0b8565fa55f\") " pod="openstack/barbican-api-6c879c6666-5kls7" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.893851 4732 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/362f9e50-6f86-41ff-ae02-e0b8565fa55f-config-data\") pod \"barbican-api-6c879c6666-5kls7\" (UID: \"362f9e50-6f86-41ff-ae02-e0b8565fa55f\") " pod="openstack/barbican-api-6c879c6666-5kls7" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.893933 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/362f9e50-6f86-41ff-ae02-e0b8565fa55f-combined-ca-bundle\") pod \"barbican-api-6c879c6666-5kls7\" (UID: \"362f9e50-6f86-41ff-ae02-e0b8565fa55f\") " pod="openstack/barbican-api-6c879c6666-5kls7" Apr 02 14:00:45 crc kubenswrapper[4732]: I0402 14:00:45.912969 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvw9v\" (UniqueName: \"kubernetes.io/projected/362f9e50-6f86-41ff-ae02-e0b8565fa55f-kube-api-access-vvw9v\") pod \"barbican-api-6c879c6666-5kls7\" (UID: \"362f9e50-6f86-41ff-ae02-e0b8565fa55f\") " pod="openstack/barbican-api-6c879c6666-5kls7" Apr 02 14:00:46 crc kubenswrapper[4732]: I0402 14:00:46.026299 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6c879c6666-5kls7" Apr 02 14:00:46 crc kubenswrapper[4732]: I0402 14:00:46.043637 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Apr 02 14:00:46 crc kubenswrapper[4732]: I0402 14:00:46.200841 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-kk8pk"] Apr 02 14:00:46 crc kubenswrapper[4732]: I0402 14:00:46.347724 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Apr 02 14:00:46 crc kubenswrapper[4732]: I0402 14:00:46.521444 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6c879c6666-5kls7"] Apr 02 14:00:46 crc kubenswrapper[4732]: I0402 14:00:46.534004 4732 scope.go:117] "RemoveContainer" containerID="654a1e24249d45635d250229ae3e0174a0763ce4c01ca462e084011d1cce4241" Apr 02 14:00:46 crc kubenswrapper[4732]: I0402 14:00:46.821358 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6c879c6666-5kls7" event={"ID":"362f9e50-6f86-41ff-ae02-e0b8565fa55f","Type":"ContainerStarted","Data":"dbc4dcbb0435d2e1e5cd09ca76bd813f15d9487fcfe0d3f6f46aa0cb6dd49daf"} Apr 02 14:00:46 crc kubenswrapper[4732]: I0402 14:00:46.827569 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-kk8pk" event={"ID":"56849534-b6ee-4c76-a706-22af33d61018","Type":"ContainerStarted","Data":"95c2e76f85264ee19c5f471b3778dba7420a8e8aea1b68c330f24d9465562e2a"} Apr 02 14:00:46 crc kubenswrapper[4732]: I0402 14:00:46.827603 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-kk8pk" event={"ID":"56849534-b6ee-4c76-a706-22af33d61018","Type":"ContainerStarted","Data":"3f109315f5b2612396b6a55ba5afc71c7e59da9514d24bbef834eab5f9134bb6"} Apr 02 14:00:46 crc kubenswrapper[4732]: I0402 14:00:46.830419 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-98d7bf879-xkszz" 
event={"ID":"8eed39a7-f437-403d-acab-246fa6d25c4b","Type":"ContainerStarted","Data":"5771a65898ae3b5195ada8e85459d1a572364091557a40e8bbcc8018ed0f6613"} Apr 02 14:00:46 crc kubenswrapper[4732]: I0402 14:00:46.830463 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-98d7bf879-xkszz" event={"ID":"8eed39a7-f437-403d-acab-246fa6d25c4b","Type":"ContainerStarted","Data":"e5e6399bc0929955d16f620ca0efcbfd4837f6c581552d387587af3b3fcbc6ff"} Apr 02 14:00:46 crc kubenswrapper[4732]: I0402 14:00:46.833240 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"52795ed2-6cd9-4380-8826-68d5f55d87df","Type":"ContainerStarted","Data":"21b65e6f470af4c9cc9e135beffdc6b3c501e4f2db97ae18433c050d66197062"} Apr 02 14:00:46 crc kubenswrapper[4732]: I0402 14:00:46.839339 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c0e53540-98b3-463a-9611-a48c2fbfc0f5","Type":"ContainerStarted","Data":"70f43e416f820f0faed0922b41d00581677129965f3e933afec756d385dec3cf"} Apr 02 14:00:46 crc kubenswrapper[4732]: I0402 14:00:46.845802 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c3ab5d4f-9beb-4c41-954d-bc917548f495","Type":"ContainerStarted","Data":"d02977a3dde2f199a60cc75fb084b56c23c5917739586275ee9a5ded8a4a43c8"} Apr 02 14:00:46 crc kubenswrapper[4732]: I0402 14:00:46.891097 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-688c87cc99-sxl2t" podUID="df175395-7cbc-4158-907a-b0ffce6a6efb" containerName="dnsmasq-dns" containerID="cri-o://8cf41a4bd5d2561802f652340045d6cce5f45df033272a00228b23776c4106cd" gracePeriod=10 Apr 02 14:00:46 crc kubenswrapper[4732]: I0402 14:00:46.891660 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-f897cfb64-ql8wz" 
event={"ID":"5e017590-845a-4f52-a6ae-258890dd6388","Type":"ContainerStarted","Data":"d30bcc56037558f7e69aadf5904b90cb16855c7c6db3f5437513a8e6eacd22e7"} Apr 02 14:00:46 crc kubenswrapper[4732]: I0402 14:00:46.891697 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-f897cfb64-ql8wz" event={"ID":"5e017590-845a-4f52-a6ae-258890dd6388","Type":"ContainerStarted","Data":"130d0fa16f57ff282ca350c17e0ab55269b6472976a6a8de526e3e1acdca01c1"} Apr 02 14:00:46 crc kubenswrapper[4732]: I0402 14:00:46.893273 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-98d7bf879-xkszz" podStartSLOduration=2.501565364 podStartE2EDuration="4.893256392s" podCreationTimestamp="2026-04-02 14:00:42 +0000 UTC" firstStartedPulling="2026-04-02 14:00:43.096599511 +0000 UTC m=+1400.001007064" lastFinishedPulling="2026-04-02 14:00:45.488290539 +0000 UTC m=+1402.392698092" observedRunningTime="2026-04-02 14:00:46.891266638 +0000 UTC m=+1403.795674201" watchObservedRunningTime="2026-04-02 14:00:46.893256392 +0000 UTC m=+1403.797663945" Apr 02 14:00:46 crc kubenswrapper[4732]: I0402 14:00:46.924812 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-f897cfb64-ql8wz" podStartSLOduration=2.498100869 podStartE2EDuration="4.924790831s" podCreationTimestamp="2026-04-02 14:00:42 +0000 UTC" firstStartedPulling="2026-04-02 14:00:43.129715803 +0000 UTC m=+1400.034123356" lastFinishedPulling="2026-04-02 14:00:45.556405765 +0000 UTC m=+1402.460813318" observedRunningTime="2026-04-02 14:00:46.920095735 +0000 UTC m=+1403.824503288" watchObservedRunningTime="2026-04-02 14:00:46.924790831 +0000 UTC m=+1403.829198384" Apr 02 14:00:47 crc kubenswrapper[4732]: I0402 14:00:47.909650 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"52795ed2-6cd9-4380-8826-68d5f55d87df","Type":"ContainerStarted","Data":"c0c0352edd155d350c9fc88dd046ff9f7dbe5dc2eac74913f48bc823d5d4cb81"} Apr 02 14:00:47 crc kubenswrapper[4732]: I0402 14:00:47.917762 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6c879c6666-5kls7" event={"ID":"362f9e50-6f86-41ff-ae02-e0b8565fa55f","Type":"ContainerStarted","Data":"301326bcd644c15518986374a4f0151315c16c1e6e8096d879037cc8ee80096a"} Apr 02 14:00:47 crc kubenswrapper[4732]: I0402 14:00:47.921880 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c0e53540-98b3-463a-9611-a48c2fbfc0f5","Type":"ContainerStarted","Data":"bdd7bd8126ffb3128ae660abacfacf4439445b786a5f93e6bf789109a2cab1eb"} Apr 02 14:00:47 crc kubenswrapper[4732]: I0402 14:00:47.931829 4732 generic.go:334] "Generic (PLEG): container finished" podID="df175395-7cbc-4158-907a-b0ffce6a6efb" containerID="8cf41a4bd5d2561802f652340045d6cce5f45df033272a00228b23776c4106cd" exitCode=0 Apr 02 14:00:47 crc kubenswrapper[4732]: I0402 14:00:47.931899 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-sxl2t" event={"ID":"df175395-7cbc-4158-907a-b0ffce6a6efb","Type":"ContainerDied","Data":"8cf41a4bd5d2561802f652340045d6cce5f45df033272a00228b23776c4106cd"} Apr 02 14:00:47 crc kubenswrapper[4732]: I0402 14:00:47.934394 4732 generic.go:334] "Generic (PLEG): container finished" podID="56849534-b6ee-4c76-a706-22af33d61018" containerID="95c2e76f85264ee19c5f471b3778dba7420a8e8aea1b68c330f24d9465562e2a" exitCode=0 Apr 02 14:00:47 crc kubenswrapper[4732]: I0402 14:00:47.934927 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-kk8pk" event={"ID":"56849534-b6ee-4c76-a706-22af33d61018","Type":"ContainerDied","Data":"95c2e76f85264ee19c5f471b3778dba7420a8e8aea1b68c330f24d9465562e2a"} Apr 02 14:00:48 crc kubenswrapper[4732]: I0402 14:00:48.056664 4732 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/neutron-cd55d6846-4zs9k" Apr 02 14:00:48 crc kubenswrapper[4732]: I0402 14:00:48.111430 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-sxl2t" Apr 02 14:00:48 crc kubenswrapper[4732]: I0402 14:00:48.270202 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/df175395-7cbc-4158-907a-b0ffce6a6efb-dns-swift-storage-0\") pod \"df175395-7cbc-4158-907a-b0ffce6a6efb\" (UID: \"df175395-7cbc-4158-907a-b0ffce6a6efb\") " Apr 02 14:00:48 crc kubenswrapper[4732]: I0402 14:00:48.270342 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df175395-7cbc-4158-907a-b0ffce6a6efb-ovsdbserver-nb\") pod \"df175395-7cbc-4158-907a-b0ffce6a6efb\" (UID: \"df175395-7cbc-4158-907a-b0ffce6a6efb\") " Apr 02 14:00:48 crc kubenswrapper[4732]: I0402 14:00:48.270422 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzjzq\" (UniqueName: \"kubernetes.io/projected/df175395-7cbc-4158-907a-b0ffce6a6efb-kube-api-access-mzjzq\") pod \"df175395-7cbc-4158-907a-b0ffce6a6efb\" (UID: \"df175395-7cbc-4158-907a-b0ffce6a6efb\") " Apr 02 14:00:48 crc kubenswrapper[4732]: I0402 14:00:48.270478 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df175395-7cbc-4158-907a-b0ffce6a6efb-ovsdbserver-sb\") pod \"df175395-7cbc-4158-907a-b0ffce6a6efb\" (UID: \"df175395-7cbc-4158-907a-b0ffce6a6efb\") " Apr 02 14:00:48 crc kubenswrapper[4732]: I0402 14:00:48.270544 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df175395-7cbc-4158-907a-b0ffce6a6efb-config\") pod \"df175395-7cbc-4158-907a-b0ffce6a6efb\" 
(UID: \"df175395-7cbc-4158-907a-b0ffce6a6efb\") " Apr 02 14:00:48 crc kubenswrapper[4732]: I0402 14:00:48.270584 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df175395-7cbc-4158-907a-b0ffce6a6efb-dns-svc\") pod \"df175395-7cbc-4158-907a-b0ffce6a6efb\" (UID: \"df175395-7cbc-4158-907a-b0ffce6a6efb\") " Apr 02 14:00:48 crc kubenswrapper[4732]: I0402 14:00:48.284441 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df175395-7cbc-4158-907a-b0ffce6a6efb-kube-api-access-mzjzq" (OuterVolumeSpecName: "kube-api-access-mzjzq") pod "df175395-7cbc-4158-907a-b0ffce6a6efb" (UID: "df175395-7cbc-4158-907a-b0ffce6a6efb"). InnerVolumeSpecName "kube-api-access-mzjzq". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:00:48 crc kubenswrapper[4732]: I0402 14:00:48.374317 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzjzq\" (UniqueName: \"kubernetes.io/projected/df175395-7cbc-4158-907a-b0ffce6a6efb-kube-api-access-mzjzq\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:48 crc kubenswrapper[4732]: I0402 14:00:48.375966 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-76fc857857-knj58"] Apr 02 14:00:48 crc kubenswrapper[4732]: I0402 14:00:48.376232 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-76fc857857-knj58" podUID="88748d2e-8313-467e-b707-e82e1af776d5" containerName="neutron-api" containerID="cri-o://4bcf35b9dca26827b3b6375d989726ce646c242d2a9befcc53a1c6bbdc4e4e3d" gracePeriod=30 Apr 02 14:00:48 crc kubenswrapper[4732]: I0402 14:00:48.376974 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-76fc857857-knj58" podUID="88748d2e-8313-467e-b707-e82e1af776d5" containerName="neutron-httpd" containerID="cri-o://4f997405de3433e224878d0c0a68fb4b21ec3d43799d53e42d66304cefa5956d" 
gracePeriod=30 Apr 02 14:00:48 crc kubenswrapper[4732]: I0402 14:00:48.425721 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df175395-7cbc-4158-907a-b0ffce6a6efb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "df175395-7cbc-4158-907a-b0ffce6a6efb" (UID: "df175395-7cbc-4158-907a-b0ffce6a6efb"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 14:00:48 crc kubenswrapper[4732]: I0402 14:00:48.462724 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-58f8f59779-9rrsx"] Apr 02 14:00:48 crc kubenswrapper[4732]: E0402 14:00:48.463350 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df175395-7cbc-4158-907a-b0ffce6a6efb" containerName="dnsmasq-dns" Apr 02 14:00:48 crc kubenswrapper[4732]: I0402 14:00:48.463367 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="df175395-7cbc-4158-907a-b0ffce6a6efb" containerName="dnsmasq-dns" Apr 02 14:00:48 crc kubenswrapper[4732]: E0402 14:00:48.463392 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df175395-7cbc-4158-907a-b0ffce6a6efb" containerName="init" Apr 02 14:00:48 crc kubenswrapper[4732]: I0402 14:00:48.463399 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="df175395-7cbc-4158-907a-b0ffce6a6efb" containerName="init" Apr 02 14:00:48 crc kubenswrapper[4732]: I0402 14:00:48.463566 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="df175395-7cbc-4158-907a-b0ffce6a6efb" containerName="dnsmasq-dns" Apr 02 14:00:48 crc kubenswrapper[4732]: I0402 14:00:48.464447 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-58f8f59779-9rrsx" Apr 02 14:00:48 crc kubenswrapper[4732]: I0402 14:00:48.475830 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df175395-7cbc-4158-907a-b0ffce6a6efb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:48 crc kubenswrapper[4732]: I0402 14:00:48.515678 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-58f8f59779-9rrsx"] Apr 02 14:00:48 crc kubenswrapper[4732]: I0402 14:00:48.526061 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df175395-7cbc-4158-907a-b0ffce6a6efb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "df175395-7cbc-4158-907a-b0ffce6a6efb" (UID: "df175395-7cbc-4158-907a-b0ffce6a6efb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 14:00:48 crc kubenswrapper[4732]: I0402 14:00:48.526424 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df175395-7cbc-4158-907a-b0ffce6a6efb-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "df175395-7cbc-4158-907a-b0ffce6a6efb" (UID: "df175395-7cbc-4158-907a-b0ffce6a6efb"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 14:00:48 crc kubenswrapper[4732]: I0402 14:00:48.527107 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df175395-7cbc-4158-907a-b0ffce6a6efb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "df175395-7cbc-4158-907a-b0ffce6a6efb" (UID: "df175395-7cbc-4158-907a-b0ffce6a6efb"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 14:00:48 crc kubenswrapper[4732]: I0402 14:00:48.529723 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df175395-7cbc-4158-907a-b0ffce6a6efb-config" (OuterVolumeSpecName: "config") pod "df175395-7cbc-4158-907a-b0ffce6a6efb" (UID: "df175395-7cbc-4158-907a-b0ffce6a6efb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 14:00:48 crc kubenswrapper[4732]: I0402 14:00:48.579842 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd0e816f-6d8e-4ed8-884c-ee38cec72d94-ovndb-tls-certs\") pod \"neutron-58f8f59779-9rrsx\" (UID: \"cd0e816f-6d8e-4ed8-884c-ee38cec72d94\") " pod="openstack/neutron-58f8f59779-9rrsx" Apr 02 14:00:48 crc kubenswrapper[4732]: I0402 14:00:48.579964 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd0e816f-6d8e-4ed8-884c-ee38cec72d94-internal-tls-certs\") pod \"neutron-58f8f59779-9rrsx\" (UID: \"cd0e816f-6d8e-4ed8-884c-ee38cec72d94\") " pod="openstack/neutron-58f8f59779-9rrsx" Apr 02 14:00:48 crc kubenswrapper[4732]: I0402 14:00:48.580070 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd0e816f-6d8e-4ed8-884c-ee38cec72d94-combined-ca-bundle\") pod \"neutron-58f8f59779-9rrsx\" (UID: \"cd0e816f-6d8e-4ed8-884c-ee38cec72d94\") " pod="openstack/neutron-58f8f59779-9rrsx" Apr 02 14:00:48 crc kubenswrapper[4732]: I0402 14:00:48.580115 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cd0e816f-6d8e-4ed8-884c-ee38cec72d94-httpd-config\") pod \"neutron-58f8f59779-9rrsx\" (UID: 
\"cd0e816f-6d8e-4ed8-884c-ee38cec72d94\") " pod="openstack/neutron-58f8f59779-9rrsx" Apr 02 14:00:48 crc kubenswrapper[4732]: I0402 14:00:48.580144 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd0e816f-6d8e-4ed8-884c-ee38cec72d94-public-tls-certs\") pod \"neutron-58f8f59779-9rrsx\" (UID: \"cd0e816f-6d8e-4ed8-884c-ee38cec72d94\") " pod="openstack/neutron-58f8f59779-9rrsx" Apr 02 14:00:48 crc kubenswrapper[4732]: I0402 14:00:48.580189 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zq5n\" (UniqueName: \"kubernetes.io/projected/cd0e816f-6d8e-4ed8-884c-ee38cec72d94-kube-api-access-5zq5n\") pod \"neutron-58f8f59779-9rrsx\" (UID: \"cd0e816f-6d8e-4ed8-884c-ee38cec72d94\") " pod="openstack/neutron-58f8f59779-9rrsx" Apr 02 14:00:48 crc kubenswrapper[4732]: I0402 14:00:48.580219 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cd0e816f-6d8e-4ed8-884c-ee38cec72d94-config\") pod \"neutron-58f8f59779-9rrsx\" (UID: \"cd0e816f-6d8e-4ed8-884c-ee38cec72d94\") " pod="openstack/neutron-58f8f59779-9rrsx" Apr 02 14:00:48 crc kubenswrapper[4732]: I0402 14:00:48.580287 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df175395-7cbc-4158-907a-b0ffce6a6efb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:48 crc kubenswrapper[4732]: I0402 14:00:48.580302 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df175395-7cbc-4158-907a-b0ffce6a6efb-config\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:48 crc kubenswrapper[4732]: I0402 14:00:48.580316 4732 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/df175395-7cbc-4158-907a-b0ffce6a6efb-dns-svc\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:48 crc kubenswrapper[4732]: I0402 14:00:48.580328 4732 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/df175395-7cbc-4158-907a-b0ffce6a6efb-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:48 crc kubenswrapper[4732]: I0402 14:00:48.631015 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Apr 02 14:00:48 crc kubenswrapper[4732]: I0402 14:00:48.685991 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd0e816f-6d8e-4ed8-884c-ee38cec72d94-combined-ca-bundle\") pod \"neutron-58f8f59779-9rrsx\" (UID: \"cd0e816f-6d8e-4ed8-884c-ee38cec72d94\") " pod="openstack/neutron-58f8f59779-9rrsx" Apr 02 14:00:48 crc kubenswrapper[4732]: I0402 14:00:48.686074 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cd0e816f-6d8e-4ed8-884c-ee38cec72d94-httpd-config\") pod \"neutron-58f8f59779-9rrsx\" (UID: \"cd0e816f-6d8e-4ed8-884c-ee38cec72d94\") " pod="openstack/neutron-58f8f59779-9rrsx" Apr 02 14:00:48 crc kubenswrapper[4732]: I0402 14:00:48.686100 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd0e816f-6d8e-4ed8-884c-ee38cec72d94-public-tls-certs\") pod \"neutron-58f8f59779-9rrsx\" (UID: \"cd0e816f-6d8e-4ed8-884c-ee38cec72d94\") " pod="openstack/neutron-58f8f59779-9rrsx" Apr 02 14:00:48 crc kubenswrapper[4732]: I0402 14:00:48.686131 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zq5n\" (UniqueName: \"kubernetes.io/projected/cd0e816f-6d8e-4ed8-884c-ee38cec72d94-kube-api-access-5zq5n\") pod \"neutron-58f8f59779-9rrsx\" (UID: 
\"cd0e816f-6d8e-4ed8-884c-ee38cec72d94\") " pod="openstack/neutron-58f8f59779-9rrsx" Apr 02 14:00:48 crc kubenswrapper[4732]: I0402 14:00:48.686161 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cd0e816f-6d8e-4ed8-884c-ee38cec72d94-config\") pod \"neutron-58f8f59779-9rrsx\" (UID: \"cd0e816f-6d8e-4ed8-884c-ee38cec72d94\") " pod="openstack/neutron-58f8f59779-9rrsx" Apr 02 14:00:48 crc kubenswrapper[4732]: I0402 14:00:48.686199 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd0e816f-6d8e-4ed8-884c-ee38cec72d94-ovndb-tls-certs\") pod \"neutron-58f8f59779-9rrsx\" (UID: \"cd0e816f-6d8e-4ed8-884c-ee38cec72d94\") " pod="openstack/neutron-58f8f59779-9rrsx" Apr 02 14:00:48 crc kubenswrapper[4732]: I0402 14:00:48.686272 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd0e816f-6d8e-4ed8-884c-ee38cec72d94-internal-tls-certs\") pod \"neutron-58f8f59779-9rrsx\" (UID: \"cd0e816f-6d8e-4ed8-884c-ee38cec72d94\") " pod="openstack/neutron-58f8f59779-9rrsx" Apr 02 14:00:48 crc kubenswrapper[4732]: I0402 14:00:48.697453 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd0e816f-6d8e-4ed8-884c-ee38cec72d94-public-tls-certs\") pod \"neutron-58f8f59779-9rrsx\" (UID: \"cd0e816f-6d8e-4ed8-884c-ee38cec72d94\") " pod="openstack/neutron-58f8f59779-9rrsx" Apr 02 14:00:48 crc kubenswrapper[4732]: I0402 14:00:48.700557 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd0e816f-6d8e-4ed8-884c-ee38cec72d94-ovndb-tls-certs\") pod \"neutron-58f8f59779-9rrsx\" (UID: \"cd0e816f-6d8e-4ed8-884c-ee38cec72d94\") " pod="openstack/neutron-58f8f59779-9rrsx" Apr 02 14:00:48 crc kubenswrapper[4732]: 
I0402 14:00:48.703069 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cd0e816f-6d8e-4ed8-884c-ee38cec72d94-httpd-config\") pod \"neutron-58f8f59779-9rrsx\" (UID: \"cd0e816f-6d8e-4ed8-884c-ee38cec72d94\") " pod="openstack/neutron-58f8f59779-9rrsx" Apr 02 14:00:48 crc kubenswrapper[4732]: I0402 14:00:48.706405 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd0e816f-6d8e-4ed8-884c-ee38cec72d94-internal-tls-certs\") pod \"neutron-58f8f59779-9rrsx\" (UID: \"cd0e816f-6d8e-4ed8-884c-ee38cec72d94\") " pod="openstack/neutron-58f8f59779-9rrsx" Apr 02 14:00:48 crc kubenswrapper[4732]: I0402 14:00:48.707561 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/cd0e816f-6d8e-4ed8-884c-ee38cec72d94-config\") pod \"neutron-58f8f59779-9rrsx\" (UID: \"cd0e816f-6d8e-4ed8-884c-ee38cec72d94\") " pod="openstack/neutron-58f8f59779-9rrsx" Apr 02 14:00:48 crc kubenswrapper[4732]: I0402 14:00:48.710336 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd0e816f-6d8e-4ed8-884c-ee38cec72d94-combined-ca-bundle\") pod \"neutron-58f8f59779-9rrsx\" (UID: \"cd0e816f-6d8e-4ed8-884c-ee38cec72d94\") " pod="openstack/neutron-58f8f59779-9rrsx" Apr 02 14:00:48 crc kubenswrapper[4732]: I0402 14:00:48.721532 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zq5n\" (UniqueName: \"kubernetes.io/projected/cd0e816f-6d8e-4ed8-884c-ee38cec72d94-kube-api-access-5zq5n\") pod \"neutron-58f8f59779-9rrsx\" (UID: \"cd0e816f-6d8e-4ed8-884c-ee38cec72d94\") " pod="openstack/neutron-58f8f59779-9rrsx" Apr 02 14:00:48 crc kubenswrapper[4732]: I0402 14:00:48.773774 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-76fc857857-knj58" 
podUID="88748d2e-8313-467e-b707-e82e1af776d5" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.158:9696/\": read tcp 10.217.0.2:38596->10.217.0.158:9696: read: connection reset by peer" Apr 02 14:00:48 crc kubenswrapper[4732]: I0402 14:00:48.914387 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-58f8f59779-9rrsx" Apr 02 14:00:48 crc kubenswrapper[4732]: I0402 14:00:48.961252 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-sxl2t" event={"ID":"df175395-7cbc-4158-907a-b0ffce6a6efb","Type":"ContainerDied","Data":"e875c5961bfe1a0e28d994fff2e74baf4626c1e3462d487d459b48ba81bfec43"} Apr 02 14:00:48 crc kubenswrapper[4732]: I0402 14:00:48.961337 4732 scope.go:117] "RemoveContainer" containerID="8cf41a4bd5d2561802f652340045d6cce5f45df033272a00228b23776c4106cd" Apr 02 14:00:48 crc kubenswrapper[4732]: I0402 14:00:48.961571 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-sxl2t" Apr 02 14:00:48 crc kubenswrapper[4732]: I0402 14:00:48.974166 4732 generic.go:334] "Generic (PLEG): container finished" podID="88748d2e-8313-467e-b707-e82e1af776d5" containerID="4f997405de3433e224878d0c0a68fb4b21ec3d43799d53e42d66304cefa5956d" exitCode=0 Apr 02 14:00:48 crc kubenswrapper[4732]: I0402 14:00:48.974250 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-76fc857857-knj58" event={"ID":"88748d2e-8313-467e-b707-e82e1af776d5","Type":"ContainerDied","Data":"4f997405de3433e224878d0c0a68fb4b21ec3d43799d53e42d66304cefa5956d"} Apr 02 14:00:48 crc kubenswrapper[4732]: I0402 14:00:48.994019 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-kk8pk" event={"ID":"56849534-b6ee-4c76-a706-22af33d61018","Type":"ContainerStarted","Data":"fc38836c00961eba57500c9612eea7c44644cb5293906ff4ac62e9cc85f02adf"} Apr 02 14:00:48 crc kubenswrapper[4732]: I0402 14:00:48.995147 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bb4fc677f-kk8pk" Apr 02 14:00:49 crc kubenswrapper[4732]: I0402 14:00:49.002702 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-sxl2t"] Apr 02 14:00:49 crc kubenswrapper[4732]: I0402 14:00:49.010898 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6c879c6666-5kls7" event={"ID":"362f9e50-6f86-41ff-ae02-e0b8565fa55f","Type":"ContainerStarted","Data":"b98896b48ddb8cbe9ce2f53f7d90bb561362f8cf206b3ee382b2e11c3b71bd50"} Apr 02 14:00:49 crc kubenswrapper[4732]: I0402 14:00:49.011059 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6c879c6666-5kls7" Apr 02 14:00:49 crc kubenswrapper[4732]: I0402 14:00:49.011358 4732 scope.go:117] "RemoveContainer" containerID="e76ca9aaf5f8fc2b70cb140c6f56b7a8c1165dd43deace98f1541d4c99b9337f" Apr 02 14:00:49 crc 
kubenswrapper[4732]: I0402 14:00:49.013523 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6c879c6666-5kls7" Apr 02 14:00:49 crc kubenswrapper[4732]: I0402 14:00:49.020543 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-sxl2t"] Apr 02 14:00:49 crc kubenswrapper[4732]: I0402 14:00:49.024714 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bb4fc677f-kk8pk" podStartSLOduration=4.024692018 podStartE2EDuration="4.024692018s" podCreationTimestamp="2026-04-02 14:00:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 14:00:49.011950145 +0000 UTC m=+1405.916357728" watchObservedRunningTime="2026-04-02 14:00:49.024692018 +0000 UTC m=+1405.929099571" Apr 02 14:00:49 crc kubenswrapper[4732]: I0402 14:00:49.046460 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6c879c6666-5kls7" podStartSLOduration=4.046436694 podStartE2EDuration="4.046436694s" podCreationTimestamp="2026-04-02 14:00:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 14:00:49.035436368 +0000 UTC m=+1405.939843931" watchObservedRunningTime="2026-04-02 14:00:49.046436694 +0000 UTC m=+1405.950844247" Apr 02 14:00:49 crc kubenswrapper[4732]: I0402 14:00:49.638478 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-58f8f59779-9rrsx"] Apr 02 14:00:49 crc kubenswrapper[4732]: I0402 14:00:49.753886 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-59fb764b6d-vml5x" Apr 02 14:00:49 crc kubenswrapper[4732]: I0402 14:00:49.820149 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-54f994999b-b88d7" Apr 02 14:00:50 crc 
kubenswrapper[4732]: I0402 14:00:50.039241 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c0e53540-98b3-463a-9611-a48c2fbfc0f5","Type":"ContainerStarted","Data":"e9c0400338b7d02a8e2c26227681c2de2c1fcd12163690172f04be090f9a7be0"} Apr 02 14:00:50 crc kubenswrapper[4732]: I0402 14:00:50.041025 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Apr 02 14:00:50 crc kubenswrapper[4732]: I0402 14:00:50.047727 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c3ab5d4f-9beb-4c41-954d-bc917548f495","Type":"ContainerStarted","Data":"e2529b80f2329b677cbfb274eef570beba102ba661be4a7cae791968ff0383d7"} Apr 02 14:00:50 crc kubenswrapper[4732]: I0402 14:00:50.058224 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-58f8f59779-9rrsx" event={"ID":"cd0e816f-6d8e-4ed8-884c-ee38cec72d94","Type":"ContainerStarted","Data":"e868b42cc8744196e6800ce7db4e52778e2a6a0fe70d129a1e06eff57d5f702b"} Apr 02 14:00:50 crc kubenswrapper[4732]: I0402 14:00:50.082164 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="52795ed2-6cd9-4380-8826-68d5f55d87df" containerName="cinder-api-log" containerID="cri-o://c0c0352edd155d350c9fc88dd046ff9f7dbe5dc2eac74913f48bc823d5d4cb81" gracePeriod=30 Apr 02 14:00:50 crc kubenswrapper[4732]: I0402 14:00:50.082475 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"52795ed2-6cd9-4380-8826-68d5f55d87df","Type":"ContainerStarted","Data":"ba990cca85af2a75a0dffffeb85ff242c91dca204e5aca5bbb384560071ec6e5"} Apr 02 14:00:50 crc kubenswrapper[4732]: I0402 14:00:50.083052 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Apr 02 14:00:50 crc kubenswrapper[4732]: I0402 14:00:50.083121 4732 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/cinder-api-0" podUID="52795ed2-6cd9-4380-8826-68d5f55d87df" containerName="cinder-api" containerID="cri-o://ba990cca85af2a75a0dffffeb85ff242c91dca204e5aca5bbb384560071ec6e5" gracePeriod=30 Apr 02 14:00:50 crc kubenswrapper[4732]: I0402 14:00:50.086849 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.911669821 podStartE2EDuration="8.086835164s" podCreationTimestamp="2026-04-02 14:00:42 +0000 UTC" firstStartedPulling="2026-04-02 14:00:43.452251543 +0000 UTC m=+1400.356659096" lastFinishedPulling="2026-04-02 14:00:49.627416886 +0000 UTC m=+1406.531824439" observedRunningTime="2026-04-02 14:00:50.083469533 +0000 UTC m=+1406.987877106" watchObservedRunningTime="2026-04-02 14:00:50.086835164 +0000 UTC m=+1406.991242717" Apr 02 14:00:50 crc kubenswrapper[4732]: I0402 14:00:50.139189 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.139150184 podStartE2EDuration="5.139150184s" podCreationTimestamp="2026-04-02 14:00:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 14:00:50.124719975 +0000 UTC m=+1407.029127548" watchObservedRunningTime="2026-04-02 14:00:50.139150184 +0000 UTC m=+1407.043557737" Apr 02 14:00:50 crc kubenswrapper[4732]: I0402 14:00:50.365250 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-76fc857857-knj58" podUID="88748d2e-8313-467e-b707-e82e1af776d5" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.158:9696/\": dial tcp 10.217.0.158:9696: connect: connection refused" Apr 02 14:00:50 crc kubenswrapper[4732]: I0402 14:00:50.712809 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df175395-7cbc-4158-907a-b0ffce6a6efb" path="/var/lib/kubelet/pods/df175395-7cbc-4158-907a-b0ffce6a6efb/volumes" Apr 02 14:00:51 crc 
kubenswrapper[4732]: I0402 14:00:51.055233 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.131882 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-58f8f59779-9rrsx" event={"ID":"cd0e816f-6d8e-4ed8-884c-ee38cec72d94","Type":"ContainerStarted","Data":"cddd423e9c22a3236c4b14ad53383df4a70f521fbb443edc280ac775b501ae35"} Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.132201 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-58f8f59779-9rrsx" event={"ID":"cd0e816f-6d8e-4ed8-884c-ee38cec72d94","Type":"ContainerStarted","Data":"76a03b97813c0f4a9de4fc3fa35c63110b6a1e8bd802f75af5ecfd52ba0da3d3"} Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.132798 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-58f8f59779-9rrsx" Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.155900 4732 generic.go:334] "Generic (PLEG): container finished" podID="52795ed2-6cd9-4380-8826-68d5f55d87df" containerID="ba990cca85af2a75a0dffffeb85ff242c91dca204e5aca5bbb384560071ec6e5" exitCode=0 Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.155937 4732 generic.go:334] "Generic (PLEG): container finished" podID="52795ed2-6cd9-4380-8826-68d5f55d87df" containerID="c0c0352edd155d350c9fc88dd046ff9f7dbe5dc2eac74913f48bc823d5d4cb81" exitCode=143 Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.156010 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"52795ed2-6cd9-4380-8826-68d5f55d87df","Type":"ContainerDied","Data":"ba990cca85af2a75a0dffffeb85ff242c91dca204e5aca5bbb384560071ec6e5"} Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.156037 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"52795ed2-6cd9-4380-8826-68d5f55d87df","Type":"ContainerDied","Data":"c0c0352edd155d350c9fc88dd046ff9f7dbe5dc2eac74913f48bc823d5d4cb81"} Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.156048 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"52795ed2-6cd9-4380-8826-68d5f55d87df","Type":"ContainerDied","Data":"21b65e6f470af4c9cc9e135beffdc6b3c501e4f2db97ae18433c050d66197062"} Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.156065 4732 scope.go:117] "RemoveContainer" containerID="ba990cca85af2a75a0dffffeb85ff242c91dca204e5aca5bbb384560071ec6e5" Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.156185 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.177878 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c3ab5d4f-9beb-4c41-954d-bc917548f495","Type":"ContainerStarted","Data":"55e06268ab600cfcf099a3af14bdb9abe464b33cd010165cea22950b4e3c9066"} Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.179892 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-58f8f59779-9rrsx" podStartSLOduration=3.179870323 podStartE2EDuration="3.179870323s" podCreationTimestamp="2026-04-02 14:00:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 14:00:51.169040982 +0000 UTC m=+1408.073448535" watchObservedRunningTime="2026-04-02 14:00:51.179870323 +0000 UTC m=+1408.084277886" Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.209190 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52795ed2-6cd9-4380-8826-68d5f55d87df-scripts\") pod \"52795ed2-6cd9-4380-8826-68d5f55d87df\" (UID: 
\"52795ed2-6cd9-4380-8826-68d5f55d87df\") " Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.209248 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/52795ed2-6cd9-4380-8826-68d5f55d87df-config-data-custom\") pod \"52795ed2-6cd9-4380-8826-68d5f55d87df\" (UID: \"52795ed2-6cd9-4380-8826-68d5f55d87df\") " Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.209381 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52795ed2-6cd9-4380-8826-68d5f55d87df-logs\") pod \"52795ed2-6cd9-4380-8826-68d5f55d87df\" (UID: \"52795ed2-6cd9-4380-8826-68d5f55d87df\") " Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.209438 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52795ed2-6cd9-4380-8826-68d5f55d87df-config-data\") pod \"52795ed2-6cd9-4380-8826-68d5f55d87df\" (UID: \"52795ed2-6cd9-4380-8826-68d5f55d87df\") " Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.209489 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52795ed2-6cd9-4380-8826-68d5f55d87df-combined-ca-bundle\") pod \"52795ed2-6cd9-4380-8826-68d5f55d87df\" (UID: \"52795ed2-6cd9-4380-8826-68d5f55d87df\") " Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.209526 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/52795ed2-6cd9-4380-8826-68d5f55d87df-etc-machine-id\") pod \"52795ed2-6cd9-4380-8826-68d5f55d87df\" (UID: \"52795ed2-6cd9-4380-8826-68d5f55d87df\") " Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.209634 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpf8s\" (UniqueName: 
\"kubernetes.io/projected/52795ed2-6cd9-4380-8826-68d5f55d87df-kube-api-access-tpf8s\") pod \"52795ed2-6cd9-4380-8826-68d5f55d87df\" (UID: \"52795ed2-6cd9-4380-8826-68d5f55d87df\") " Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.209935 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.285788632 podStartE2EDuration="7.209909483s" podCreationTimestamp="2026-04-02 14:00:44 +0000 UTC" firstStartedPulling="2026-04-02 14:00:46.065934451 +0000 UTC m=+1402.970342004" lastFinishedPulling="2026-04-02 14:00:47.990055292 +0000 UTC m=+1404.894462855" observedRunningTime="2026-04-02 14:00:51.207719024 +0000 UTC m=+1408.112126597" watchObservedRunningTime="2026-04-02 14:00:51.209909483 +0000 UTC m=+1408.114317046" Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.210171 4732 scope.go:117] "RemoveContainer" containerID="c0c0352edd155d350c9fc88dd046ff9f7dbe5dc2eac74913f48bc823d5d4cb81" Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.211101 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/52795ed2-6cd9-4380-8826-68d5f55d87df-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "52795ed2-6cd9-4380-8826-68d5f55d87df" (UID: "52795ed2-6cd9-4380-8826-68d5f55d87df"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.211152 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52795ed2-6cd9-4380-8826-68d5f55d87df-logs" (OuterVolumeSpecName: "logs") pod "52795ed2-6cd9-4380-8826-68d5f55d87df" (UID: "52795ed2-6cd9-4380-8826-68d5f55d87df"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.222904 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52795ed2-6cd9-4380-8826-68d5f55d87df-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "52795ed2-6cd9-4380-8826-68d5f55d87df" (UID: "52795ed2-6cd9-4380-8826-68d5f55d87df"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.223541 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52795ed2-6cd9-4380-8826-68d5f55d87df-scripts" (OuterVolumeSpecName: "scripts") pod "52795ed2-6cd9-4380-8826-68d5f55d87df" (UID: "52795ed2-6cd9-4380-8826-68d5f55d87df"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.235195 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52795ed2-6cd9-4380-8826-68d5f55d87df-kube-api-access-tpf8s" (OuterVolumeSpecName: "kube-api-access-tpf8s") pod "52795ed2-6cd9-4380-8826-68d5f55d87df" (UID: "52795ed2-6cd9-4380-8826-68d5f55d87df"). InnerVolumeSpecName "kube-api-access-tpf8s". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.274948 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52795ed2-6cd9-4380-8826-68d5f55d87df-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "52795ed2-6cd9-4380-8826-68d5f55d87df" (UID: "52795ed2-6cd9-4380-8826-68d5f55d87df"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.278101 4732 scope.go:117] "RemoveContainer" containerID="ba990cca85af2a75a0dffffeb85ff242c91dca204e5aca5bbb384560071ec6e5" Apr 02 14:00:51 crc kubenswrapper[4732]: E0402 14:00:51.278905 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba990cca85af2a75a0dffffeb85ff242c91dca204e5aca5bbb384560071ec6e5\": container with ID starting with ba990cca85af2a75a0dffffeb85ff242c91dca204e5aca5bbb384560071ec6e5 not found: ID does not exist" containerID="ba990cca85af2a75a0dffffeb85ff242c91dca204e5aca5bbb384560071ec6e5" Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.278937 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba990cca85af2a75a0dffffeb85ff242c91dca204e5aca5bbb384560071ec6e5"} err="failed to get container status \"ba990cca85af2a75a0dffffeb85ff242c91dca204e5aca5bbb384560071ec6e5\": rpc error: code = NotFound desc = could not find container \"ba990cca85af2a75a0dffffeb85ff242c91dca204e5aca5bbb384560071ec6e5\": container with ID starting with ba990cca85af2a75a0dffffeb85ff242c91dca204e5aca5bbb384560071ec6e5 not found: ID does not exist" Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.278959 4732 scope.go:117] "RemoveContainer" containerID="c0c0352edd155d350c9fc88dd046ff9f7dbe5dc2eac74913f48bc823d5d4cb81" Apr 02 14:00:51 crc kubenswrapper[4732]: E0402 14:00:51.283475 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0c0352edd155d350c9fc88dd046ff9f7dbe5dc2eac74913f48bc823d5d4cb81\": container with ID starting with c0c0352edd155d350c9fc88dd046ff9f7dbe5dc2eac74913f48bc823d5d4cb81 not found: ID does not exist" containerID="c0c0352edd155d350c9fc88dd046ff9f7dbe5dc2eac74913f48bc823d5d4cb81" Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.283514 
4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0c0352edd155d350c9fc88dd046ff9f7dbe5dc2eac74913f48bc823d5d4cb81"} err="failed to get container status \"c0c0352edd155d350c9fc88dd046ff9f7dbe5dc2eac74913f48bc823d5d4cb81\": rpc error: code = NotFound desc = could not find container \"c0c0352edd155d350c9fc88dd046ff9f7dbe5dc2eac74913f48bc823d5d4cb81\": container with ID starting with c0c0352edd155d350c9fc88dd046ff9f7dbe5dc2eac74913f48bc823d5d4cb81 not found: ID does not exist"
Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.283536 4732 scope.go:117] "RemoveContainer" containerID="ba990cca85af2a75a0dffffeb85ff242c91dca204e5aca5bbb384560071ec6e5"
Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.287159 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba990cca85af2a75a0dffffeb85ff242c91dca204e5aca5bbb384560071ec6e5"} err="failed to get container status \"ba990cca85af2a75a0dffffeb85ff242c91dca204e5aca5bbb384560071ec6e5\": rpc error: code = NotFound desc = could not find container \"ba990cca85af2a75a0dffffeb85ff242c91dca204e5aca5bbb384560071ec6e5\": container with ID starting with ba990cca85af2a75a0dffffeb85ff242c91dca204e5aca5bbb384560071ec6e5 not found: ID does not exist"
Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.287204 4732 scope.go:117] "RemoveContainer" containerID="c0c0352edd155d350c9fc88dd046ff9f7dbe5dc2eac74913f48bc823d5d4cb81"
Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.298089 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0c0352edd155d350c9fc88dd046ff9f7dbe5dc2eac74913f48bc823d5d4cb81"} err="failed to get container status \"c0c0352edd155d350c9fc88dd046ff9f7dbe5dc2eac74913f48bc823d5d4cb81\": rpc error: code = NotFound desc = could not find container \"c0c0352edd155d350c9fc88dd046ff9f7dbe5dc2eac74913f48bc823d5d4cb81\": container with ID starting with c0c0352edd155d350c9fc88dd046ff9f7dbe5dc2eac74913f48bc823d5d4cb81 not found: ID does not exist"
Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.311940 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpf8s\" (UniqueName: \"kubernetes.io/projected/52795ed2-6cd9-4380-8826-68d5f55d87df-kube-api-access-tpf8s\") on node \"crc\" DevicePath \"\""
Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.312988 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52795ed2-6cd9-4380-8826-68d5f55d87df-scripts\") on node \"crc\" DevicePath \"\""
Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.313018 4732 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/52795ed2-6cd9-4380-8826-68d5f55d87df-config-data-custom\") on node \"crc\" DevicePath \"\""
Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.313031 4732 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52795ed2-6cd9-4380-8826-68d5f55d87df-logs\") on node \"crc\" DevicePath \"\""
Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.313041 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52795ed2-6cd9-4380-8826-68d5f55d87df-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.313054 4732 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/52795ed2-6cd9-4380-8826-68d5f55d87df-etc-machine-id\") on node \"crc\" DevicePath \"\""
Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.323996 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52795ed2-6cd9-4380-8826-68d5f55d87df-config-data" (OuterVolumeSpecName: "config-data") pod "52795ed2-6cd9-4380-8826-68d5f55d87df" (UID: "52795ed2-6cd9-4380-8826-68d5f55d87df"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.417315 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52795ed2-6cd9-4380-8826-68d5f55d87df-config-data\") on node \"crc\" DevicePath \"\""
Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.485982 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.497804 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"]
Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.523364 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Apr 02 14:00:51 crc kubenswrapper[4732]: E0402 14:00:51.536498 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52795ed2-6cd9-4380-8826-68d5f55d87df" containerName="cinder-api-log"
Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.536551 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="52795ed2-6cd9-4380-8826-68d5f55d87df" containerName="cinder-api-log"
Apr 02 14:00:51 crc kubenswrapper[4732]: E0402 14:00:51.536574 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52795ed2-6cd9-4380-8826-68d5f55d87df" containerName="cinder-api"
Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.536584 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="52795ed2-6cd9-4380-8826-68d5f55d87df" containerName="cinder-api"
Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.536874 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="52795ed2-6cd9-4380-8826-68d5f55d87df" containerName="cinder-api-log"
Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.536896 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="52795ed2-6cd9-4380-8826-68d5f55d87df" containerName="cinder-api"
Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.537815 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.537900 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.539757 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc"
Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.540487 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc"
Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.540804 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.730106 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpglf\" (UniqueName: \"kubernetes.io/projected/9fc3b5a4-f5bf-44ea-aa53-d93a32900271-kube-api-access-kpglf\") pod \"cinder-api-0\" (UID: \"9fc3b5a4-f5bf-44ea-aa53-d93a32900271\") " pod="openstack/cinder-api-0"
Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.730514 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fc3b5a4-f5bf-44ea-aa53-d93a32900271-config-data\") pod \"cinder-api-0\" (UID: \"9fc3b5a4-f5bf-44ea-aa53-d93a32900271\") " pod="openstack/cinder-api-0"
Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.730687 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9fc3b5a4-f5bf-44ea-aa53-d93a32900271-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"9fc3b5a4-f5bf-44ea-aa53-d93a32900271\") " pod="openstack/cinder-api-0"
Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.730831 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9fc3b5a4-f5bf-44ea-aa53-d93a32900271-scripts\") pod \"cinder-api-0\" (UID: \"9fc3b5a4-f5bf-44ea-aa53-d93a32900271\") " pod="openstack/cinder-api-0"
Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.730975 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9fc3b5a4-f5bf-44ea-aa53-d93a32900271-logs\") pod \"cinder-api-0\" (UID: \"9fc3b5a4-f5bf-44ea-aa53-d93a32900271\") " pod="openstack/cinder-api-0"
Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.731007 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fc3b5a4-f5bf-44ea-aa53-d93a32900271-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9fc3b5a4-f5bf-44ea-aa53-d93a32900271\") " pod="openstack/cinder-api-0"
Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.731770 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9fc3b5a4-f5bf-44ea-aa53-d93a32900271-config-data-custom\") pod \"cinder-api-0\" (UID: \"9fc3b5a4-f5bf-44ea-aa53-d93a32900271\") " pod="openstack/cinder-api-0"
Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.733281 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9fc3b5a4-f5bf-44ea-aa53-d93a32900271-public-tls-certs\") pod \"cinder-api-0\" (UID: \"9fc3b5a4-f5bf-44ea-aa53-d93a32900271\") " pod="openstack/cinder-api-0"
Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.733471 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9fc3b5a4-f5bf-44ea-aa53-d93a32900271-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9fc3b5a4-f5bf-44ea-aa53-d93a32900271\") " pod="openstack/cinder-api-0"
Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.835207 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fc3b5a4-f5bf-44ea-aa53-d93a32900271-config-data\") pod \"cinder-api-0\" (UID: \"9fc3b5a4-f5bf-44ea-aa53-d93a32900271\") " pod="openstack/cinder-api-0"
Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.835516 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9fc3b5a4-f5bf-44ea-aa53-d93a32900271-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"9fc3b5a4-f5bf-44ea-aa53-d93a32900271\") " pod="openstack/cinder-api-0"
Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.835556 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9fc3b5a4-f5bf-44ea-aa53-d93a32900271-scripts\") pod \"cinder-api-0\" (UID: \"9fc3b5a4-f5bf-44ea-aa53-d93a32900271\") " pod="openstack/cinder-api-0"
Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.835577 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9fc3b5a4-f5bf-44ea-aa53-d93a32900271-logs\") pod \"cinder-api-0\" (UID: \"9fc3b5a4-f5bf-44ea-aa53-d93a32900271\") " pod="openstack/cinder-api-0"
Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.835595 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fc3b5a4-f5bf-44ea-aa53-d93a32900271-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9fc3b5a4-f5bf-44ea-aa53-d93a32900271\") " pod="openstack/cinder-api-0"
Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.835671 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9fc3b5a4-f5bf-44ea-aa53-d93a32900271-config-data-custom\") pod \"cinder-api-0\" (UID: \"9fc3b5a4-f5bf-44ea-aa53-d93a32900271\") " pod="openstack/cinder-api-0"
Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.835716 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9fc3b5a4-f5bf-44ea-aa53-d93a32900271-public-tls-certs\") pod \"cinder-api-0\" (UID: \"9fc3b5a4-f5bf-44ea-aa53-d93a32900271\") " pod="openstack/cinder-api-0"
Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.835798 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9fc3b5a4-f5bf-44ea-aa53-d93a32900271-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9fc3b5a4-f5bf-44ea-aa53-d93a32900271\") " pod="openstack/cinder-api-0"
Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.835897 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpglf\" (UniqueName: \"kubernetes.io/projected/9fc3b5a4-f5bf-44ea-aa53-d93a32900271-kube-api-access-kpglf\") pod \"cinder-api-0\" (UID: \"9fc3b5a4-f5bf-44ea-aa53-d93a32900271\") " pod="openstack/cinder-api-0"
Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.836277 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9fc3b5a4-f5bf-44ea-aa53-d93a32900271-logs\") pod \"cinder-api-0\" (UID: \"9fc3b5a4-f5bf-44ea-aa53-d93a32900271\") " pod="openstack/cinder-api-0"
Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.838119 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9fc3b5a4-f5bf-44ea-aa53-d93a32900271-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9fc3b5a4-f5bf-44ea-aa53-d93a32900271\") " pod="openstack/cinder-api-0"
Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.842673 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9fc3b5a4-f5bf-44ea-aa53-d93a32900271-scripts\") pod \"cinder-api-0\" (UID: \"9fc3b5a4-f5bf-44ea-aa53-d93a32900271\") " pod="openstack/cinder-api-0"
Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.842790 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9fc3b5a4-f5bf-44ea-aa53-d93a32900271-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"9fc3b5a4-f5bf-44ea-aa53-d93a32900271\") " pod="openstack/cinder-api-0"
Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.844601 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fc3b5a4-f5bf-44ea-aa53-d93a32900271-config-data\") pod \"cinder-api-0\" (UID: \"9fc3b5a4-f5bf-44ea-aa53-d93a32900271\") " pod="openstack/cinder-api-0"
Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.845320 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9fc3b5a4-f5bf-44ea-aa53-d93a32900271-config-data-custom\") pod \"cinder-api-0\" (UID: \"9fc3b5a4-f5bf-44ea-aa53-d93a32900271\") " pod="openstack/cinder-api-0"
Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.846376 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9fc3b5a4-f5bf-44ea-aa53-d93a32900271-public-tls-certs\") pod \"cinder-api-0\" (UID: \"9fc3b5a4-f5bf-44ea-aa53-d93a32900271\") " pod="openstack/cinder-api-0"
Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.854105 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpglf\" (UniqueName: \"kubernetes.io/projected/9fc3b5a4-f5bf-44ea-aa53-d93a32900271-kube-api-access-kpglf\") pod \"cinder-api-0\" (UID: \"9fc3b5a4-f5bf-44ea-aa53-d93a32900271\") " pod="openstack/cinder-api-0"
Apr 02 14:00:51 crc kubenswrapper[4732]: I0402 14:00:51.861223 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fc3b5a4-f5bf-44ea-aa53-d93a32900271-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9fc3b5a4-f5bf-44ea-aa53-d93a32900271\") " pod="openstack/cinder-api-0"
Apr 02 14:00:52 crc kubenswrapper[4732]: I0402 14:00:52.156660 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Apr 02 14:00:52 crc kubenswrapper[4732]: I0402 14:00:52.575262 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-59fb764b6d-vml5x"
Apr 02 14:00:52 crc kubenswrapper[4732]: I0402 14:00:52.667374 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-54f994999b-b88d7"
Apr 02 14:00:52 crc kubenswrapper[4732]: I0402 14:00:52.702377 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52795ed2-6cd9-4380-8826-68d5f55d87df" path="/var/lib/kubelet/pods/52795ed2-6cd9-4380-8826-68d5f55d87df/volumes"
Apr 02 14:00:52 crc kubenswrapper[4732]: I0402 14:00:52.763680 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-59fb764b6d-vml5x"]
Apr 02 14:00:52 crc kubenswrapper[4732]: I0402 14:00:52.770917 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Apr 02 14:00:53 crc kubenswrapper[4732]: I0402 14:00:53.202442 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9fc3b5a4-f5bf-44ea-aa53-d93a32900271","Type":"ContainerStarted","Data":"c15079111e365d8ade5213bef57a47ddbdb6ab521da1e44d961a0a7386ef80ef"}
Apr 02 14:00:53 crc kubenswrapper[4732]: I0402 14:00:53.202686 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-59fb764b6d-vml5x" podUID="80fe4580-a48f-4cba-b3b3-d6ccbc7f0525" containerName="horizon-log" containerID="cri-o://aecec17f0e79f52981ef4808d1800fe67aab89d44b00ed1c7e216149c5ee7fc8" gracePeriod=30
Apr 02 14:00:53 crc kubenswrapper[4732]: I0402 14:00:53.203075 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-59fb764b6d-vml5x" podUID="80fe4580-a48f-4cba-b3b3-d6ccbc7f0525" containerName="horizon" containerID="cri-o://56cb1e14c01780829e3d908dfe3c04fea17b711fbfdaa9357cab714feddecfb5" gracePeriod=30
Apr 02 14:00:53 crc kubenswrapper[4732]: I0402 14:00:53.873432 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-76fc857857-knj58"
Apr 02 14:00:54 crc kubenswrapper[4732]: I0402 14:00:54.002409 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/88748d2e-8313-467e-b707-e82e1af776d5-httpd-config\") pod \"88748d2e-8313-467e-b707-e82e1af776d5\" (UID: \"88748d2e-8313-467e-b707-e82e1af776d5\") "
Apr 02 14:00:54 crc kubenswrapper[4732]: I0402 14:00:54.002464 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88748d2e-8313-467e-b707-e82e1af776d5-internal-tls-certs\") pod \"88748d2e-8313-467e-b707-e82e1af776d5\" (UID: \"88748d2e-8313-467e-b707-e82e1af776d5\") "
Apr 02 14:00:54 crc kubenswrapper[4732]: I0402 14:00:54.002507 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/88748d2e-8313-467e-b707-e82e1af776d5-ovndb-tls-certs\") pod \"88748d2e-8313-467e-b707-e82e1af776d5\" (UID: \"88748d2e-8313-467e-b707-e82e1af776d5\") "
Apr 02 14:00:54 crc kubenswrapper[4732]: I0402 14:00:54.002627 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9578\" (UniqueName: \"kubernetes.io/projected/88748d2e-8313-467e-b707-e82e1af776d5-kube-api-access-b9578\") pod \"88748d2e-8313-467e-b707-e82e1af776d5\" (UID: \"88748d2e-8313-467e-b707-e82e1af776d5\") "
Apr 02 14:00:54 crc kubenswrapper[4732]: I0402 14:00:54.002682 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88748d2e-8313-467e-b707-e82e1af776d5-combined-ca-bundle\") pod \"88748d2e-8313-467e-b707-e82e1af776d5\" (UID: \"88748d2e-8313-467e-b707-e82e1af776d5\") "
Apr 02 14:00:54 crc kubenswrapper[4732]: I0402 14:00:54.002754 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/88748d2e-8313-467e-b707-e82e1af776d5-config\") pod \"88748d2e-8313-467e-b707-e82e1af776d5\" (UID: \"88748d2e-8313-467e-b707-e82e1af776d5\") "
Apr 02 14:00:54 crc kubenswrapper[4732]: I0402 14:00:54.002779 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88748d2e-8313-467e-b707-e82e1af776d5-public-tls-certs\") pod \"88748d2e-8313-467e-b707-e82e1af776d5\" (UID: \"88748d2e-8313-467e-b707-e82e1af776d5\") "
Apr 02 14:00:54 crc kubenswrapper[4732]: I0402 14:00:54.011780 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88748d2e-8313-467e-b707-e82e1af776d5-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "88748d2e-8313-467e-b707-e82e1af776d5" (UID: "88748d2e-8313-467e-b707-e82e1af776d5"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 14:00:54 crc kubenswrapper[4732]: I0402 14:00:54.011846 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88748d2e-8313-467e-b707-e82e1af776d5-kube-api-access-b9578" (OuterVolumeSpecName: "kube-api-access-b9578") pod "88748d2e-8313-467e-b707-e82e1af776d5" (UID: "88748d2e-8313-467e-b707-e82e1af776d5"). InnerVolumeSpecName "kube-api-access-b9578". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 14:00:54 crc kubenswrapper[4732]: I0402 14:00:54.095952 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88748d2e-8313-467e-b707-e82e1af776d5-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "88748d2e-8313-467e-b707-e82e1af776d5" (UID: "88748d2e-8313-467e-b707-e82e1af776d5"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 14:00:54 crc kubenswrapper[4732]: I0402 14:00:54.097357 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88748d2e-8313-467e-b707-e82e1af776d5-config" (OuterVolumeSpecName: "config") pod "88748d2e-8313-467e-b707-e82e1af776d5" (UID: "88748d2e-8313-467e-b707-e82e1af776d5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 14:00:54 crc kubenswrapper[4732]: I0402 14:00:54.104875 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/88748d2e-8313-467e-b707-e82e1af776d5-config\") on node \"crc\" DevicePath \"\""
Apr 02 14:00:54 crc kubenswrapper[4732]: I0402 14:00:54.104922 4732 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88748d2e-8313-467e-b707-e82e1af776d5-public-tls-certs\") on node \"crc\" DevicePath \"\""
Apr 02 14:00:54 crc kubenswrapper[4732]: I0402 14:00:54.104933 4732 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/88748d2e-8313-467e-b707-e82e1af776d5-httpd-config\") on node \"crc\" DevicePath \"\""
Apr 02 14:00:54 crc kubenswrapper[4732]: I0402 14:00:54.104942 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9578\" (UniqueName: \"kubernetes.io/projected/88748d2e-8313-467e-b707-e82e1af776d5-kube-api-access-b9578\") on node \"crc\" DevicePath \"\""
Apr 02 14:00:54 crc kubenswrapper[4732]: I0402 14:00:54.109063 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88748d2e-8313-467e-b707-e82e1af776d5-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "88748d2e-8313-467e-b707-e82e1af776d5" (UID: "88748d2e-8313-467e-b707-e82e1af776d5"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 14:00:54 crc kubenswrapper[4732]: I0402 14:00:54.109687 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88748d2e-8313-467e-b707-e82e1af776d5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "88748d2e-8313-467e-b707-e82e1af776d5" (UID: "88748d2e-8313-467e-b707-e82e1af776d5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 14:00:54 crc kubenswrapper[4732]: I0402 14:00:54.122849 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88748d2e-8313-467e-b707-e82e1af776d5-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "88748d2e-8313-467e-b707-e82e1af776d5" (UID: "88748d2e-8313-467e-b707-e82e1af776d5"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 14:00:54 crc kubenswrapper[4732]: I0402 14:00:54.206398 4732 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88748d2e-8313-467e-b707-e82e1af776d5-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Apr 02 14:00:54 crc kubenswrapper[4732]: I0402 14:00:54.206434 4732 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/88748d2e-8313-467e-b707-e82e1af776d5-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Apr 02 14:00:54 crc kubenswrapper[4732]: I0402 14:00:54.206446 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88748d2e-8313-467e-b707-e82e1af776d5-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Apr 02 14:00:54 crc kubenswrapper[4732]: I0402 14:00:54.265652 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9fc3b5a4-f5bf-44ea-aa53-d93a32900271","Type":"ContainerStarted","Data":"d6508dd9fa04cd37d23467892fd95029cd0336833c6fdf2c2d234e611877f16d"}
Apr 02 14:00:54 crc kubenswrapper[4732]: I0402 14:00:54.271145 4732 generic.go:334] "Generic (PLEG): container finished" podID="88748d2e-8313-467e-b707-e82e1af776d5" containerID="4bcf35b9dca26827b3b6375d989726ce646c242d2a9befcc53a1c6bbdc4e4e3d" exitCode=0
Apr 02 14:00:54 crc kubenswrapper[4732]: I0402 14:00:54.271186 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-76fc857857-knj58" event={"ID":"88748d2e-8313-467e-b707-e82e1af776d5","Type":"ContainerDied","Data":"4bcf35b9dca26827b3b6375d989726ce646c242d2a9befcc53a1c6bbdc4e4e3d"}
Apr 02 14:00:54 crc kubenswrapper[4732]: I0402 14:00:54.271213 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-76fc857857-knj58" event={"ID":"88748d2e-8313-467e-b707-e82e1af776d5","Type":"ContainerDied","Data":"aae01bcd908a540852406a3baf1c25bd0e1b0e43b9f76a9580475d13a78f0edb"}
Apr 02 14:00:54 crc kubenswrapper[4732]: I0402 14:00:54.271230 4732 scope.go:117] "RemoveContainer" containerID="4f997405de3433e224878d0c0a68fb4b21ec3d43799d53e42d66304cefa5956d"
Apr 02 14:00:54 crc kubenswrapper[4732]: I0402 14:00:54.271356 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-76fc857857-knj58"
Apr 02 14:00:54 crc kubenswrapper[4732]: I0402 14:00:54.358682 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-76fc857857-knj58"]
Apr 02 14:00:54 crc kubenswrapper[4732]: I0402 14:00:54.423375 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-76fc857857-knj58"]
Apr 02 14:00:54 crc kubenswrapper[4732]: I0402 14:00:54.423376 4732 scope.go:117] "RemoveContainer" containerID="4bcf35b9dca26827b3b6375d989726ce646c242d2a9befcc53a1c6bbdc4e4e3d"
Apr 02 14:00:54 crc kubenswrapper[4732]: I0402 14:00:54.466093 4732 scope.go:117] "RemoveContainer" containerID="4f997405de3433e224878d0c0a68fb4b21ec3d43799d53e42d66304cefa5956d"
Apr 02 14:00:54 crc kubenswrapper[4732]: E0402 14:00:54.469677 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f997405de3433e224878d0c0a68fb4b21ec3d43799d53e42d66304cefa5956d\": container with ID starting with 4f997405de3433e224878d0c0a68fb4b21ec3d43799d53e42d66304cefa5956d not found: ID does not exist" containerID="4f997405de3433e224878d0c0a68fb4b21ec3d43799d53e42d66304cefa5956d"
Apr 02 14:00:54 crc kubenswrapper[4732]: I0402 14:00:54.469731 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f997405de3433e224878d0c0a68fb4b21ec3d43799d53e42d66304cefa5956d"} err="failed to get container status \"4f997405de3433e224878d0c0a68fb4b21ec3d43799d53e42d66304cefa5956d\": rpc error: code = NotFound desc = could not find container \"4f997405de3433e224878d0c0a68fb4b21ec3d43799d53e42d66304cefa5956d\": container with ID starting with 4f997405de3433e224878d0c0a68fb4b21ec3d43799d53e42d66304cefa5956d not found: ID does not exist"
Apr 02 14:00:54 crc kubenswrapper[4732]: I0402 14:00:54.469761 4732 scope.go:117] "RemoveContainer" containerID="4bcf35b9dca26827b3b6375d989726ce646c242d2a9befcc53a1c6bbdc4e4e3d"
Apr 02 14:00:54 crc kubenswrapper[4732]: E0402 14:00:54.471563 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bcf35b9dca26827b3b6375d989726ce646c242d2a9befcc53a1c6bbdc4e4e3d\": container with ID starting with 4bcf35b9dca26827b3b6375d989726ce646c242d2a9befcc53a1c6bbdc4e4e3d not found: ID does not exist" containerID="4bcf35b9dca26827b3b6375d989726ce646c242d2a9befcc53a1c6bbdc4e4e3d"
Apr 02 14:00:54 crc kubenswrapper[4732]: I0402 14:00:54.471593 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bcf35b9dca26827b3b6375d989726ce646c242d2a9befcc53a1c6bbdc4e4e3d"} err="failed to get container status \"4bcf35b9dca26827b3b6375d989726ce646c242d2a9befcc53a1c6bbdc4e4e3d\": rpc error: code = NotFound desc = could not find container \"4bcf35b9dca26827b3b6375d989726ce646c242d2a9befcc53a1c6bbdc4e4e3d\": container with ID starting with 4bcf35b9dca26827b3b6375d989726ce646c242d2a9befcc53a1c6bbdc4e4e3d not found: ID does not exist"
Apr 02 14:00:54 crc kubenswrapper[4732]: I0402 14:00:54.694443 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88748d2e-8313-467e-b707-e82e1af776d5" path="/var/lib/kubelet/pods/88748d2e-8313-467e-b707-e82e1af776d5/volumes"
Apr 02 14:00:54 crc kubenswrapper[4732]: I0402 14:00:54.983128 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6488b8fdcd-j9s62"
Apr 02 14:00:55 crc kubenswrapper[4732]: I0402 14:00:55.228516 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6488b8fdcd-j9s62"
Apr 02 14:00:55 crc kubenswrapper[4732]: I0402 14:00:55.233467 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Apr 02 14:00:55 crc kubenswrapper[4732]: I0402 14:00:55.305442 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9fc3b5a4-f5bf-44ea-aa53-d93a32900271","Type":"ContainerStarted","Data":"ae11ed9e8db50a7f374b7743aa50e1f197c30b5fcd38db89e85cd57b998054a3"}
Apr 02 14:00:55 crc kubenswrapper[4732]: I0402 14:00:55.305961 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Apr 02 14:00:55 crc kubenswrapper[4732]: I0402 14:00:55.345602 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.345584418 podStartE2EDuration="4.345584418s" podCreationTimestamp="2026-04-02 14:00:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 14:00:55.337178771 +0000 UTC m=+1412.241586324" watchObservedRunningTime="2026-04-02 14:00:55.345584418 +0000 UTC m=+1412.249991971"
Apr 02 14:00:55 crc kubenswrapper[4732]: I0402 14:00:55.485746 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6bb4fc677f-kk8pk"
Apr 02 14:00:55 crc kubenswrapper[4732]: I0402 14:00:55.587029 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-6cb8x"]
Apr 02 14:00:55 crc kubenswrapper[4732]: I0402 14:00:55.587430 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ccc5c4795-6cb8x" podUID="b7924745-2bd5-4642-a2a0-21f8647be92b" containerName="dnsmasq-dns" containerID="cri-o://add10d00c96f619f2970add749542c0e114a3d6f8c0bf2d320b1141303f526c9" gracePeriod=10
Apr 02 14:00:55 crc kubenswrapper[4732]: I0402 14:00:55.661223 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Apr 02 14:00:55 crc kubenswrapper[4732]: I0402 14:00:55.733918 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Apr 02 14:00:56 crc kubenswrapper[4732]: I0402 14:00:56.224657 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-6cb8x"
Apr 02 14:00:56 crc kubenswrapper[4732]: I0402 14:00:56.329145 4732 generic.go:334] "Generic (PLEG): container finished" podID="b7924745-2bd5-4642-a2a0-21f8647be92b" containerID="add10d00c96f619f2970add749542c0e114a3d6f8c0bf2d320b1141303f526c9" exitCode=0
Apr 02 14:00:56 crc kubenswrapper[4732]: I0402 14:00:56.329967 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-6cb8x"
Apr 02 14:00:56 crc kubenswrapper[4732]: I0402 14:00:56.330012 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-6cb8x" event={"ID":"b7924745-2bd5-4642-a2a0-21f8647be92b","Type":"ContainerDied","Data":"add10d00c96f619f2970add749542c0e114a3d6f8c0bf2d320b1141303f526c9"}
Apr 02 14:00:56 crc kubenswrapper[4732]: I0402 14:00:56.330514 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-6cb8x" event={"ID":"b7924745-2bd5-4642-a2a0-21f8647be92b","Type":"ContainerDied","Data":"8d5336ef4acfd59c12f405bed6fc11bf3c57546864e2fe92f6fa0d933567e76e"}
Apr 02 14:00:56 crc kubenswrapper[4732]: I0402 14:00:56.330537 4732 scope.go:117] "RemoveContainer" containerID="add10d00c96f619f2970add749542c0e114a3d6f8c0bf2d320b1141303f526c9"
Apr 02 14:00:56 crc kubenswrapper[4732]: I0402 14:00:56.331003 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="c3ab5d4f-9beb-4c41-954d-bc917548f495" containerName="cinder-scheduler" containerID="cri-o://e2529b80f2329b677cbfb274eef570beba102ba661be4a7cae791968ff0383d7" gracePeriod=30
Apr 02 14:00:56 crc kubenswrapper[4732]: I0402 14:00:56.331024 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="c3ab5d4f-9beb-4c41-954d-bc917548f495" containerName="probe" containerID="cri-o://55e06268ab600cfcf099a3af14bdb9abe464b33cd010165cea22950b4e3c9066" gracePeriod=30
Apr 02 14:00:56 crc kubenswrapper[4732]: I0402 14:00:56.363943 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7924745-2bd5-4642-a2a0-21f8647be92b-dns-svc\") pod \"b7924745-2bd5-4642-a2a0-21f8647be92b\" (UID: \"b7924745-2bd5-4642-a2a0-21f8647be92b\") "
Apr 02 14:00:56 crc kubenswrapper[4732]: I0402 14:00:56.364610 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b7924745-2bd5-4642-a2a0-21f8647be92b-dns-swift-storage-0\") pod \"b7924745-2bd5-4642-a2a0-21f8647be92b\" (UID: \"b7924745-2bd5-4642-a2a0-21f8647be92b\") "
Apr 02 14:00:56 crc kubenswrapper[4732]: I0402 14:00:56.364950 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7924745-2bd5-4642-a2a0-21f8647be92b-config\") pod \"b7924745-2bd5-4642-a2a0-21f8647be92b\" (UID: \"b7924745-2bd5-4642-a2a0-21f8647be92b\") "
Apr 02 14:00:56 crc kubenswrapper[4732]: I0402 14:00:56.365091 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b7924745-2bd5-4642-a2a0-21f8647be92b-ovsdbserver-sb\") pod \"b7924745-2bd5-4642-a2a0-21f8647be92b\" (UID: \"b7924745-2bd5-4642-a2a0-21f8647be92b\") "
Apr 02 14:00:56 crc kubenswrapper[4732]: I0402 14:00:56.365238 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-br5bw\" (UniqueName: \"kubernetes.io/projected/b7924745-2bd5-4642-a2a0-21f8647be92b-kube-api-access-br5bw\") pod \"b7924745-2bd5-4642-a2a0-21f8647be92b\" (UID: \"b7924745-2bd5-4642-a2a0-21f8647be92b\") "
Apr 02 14:00:56 crc kubenswrapper[4732]: I0402 14:00:56.365437 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b7924745-2bd5-4642-a2a0-21f8647be92b-ovsdbserver-nb\") pod \"b7924745-2bd5-4642-a2a0-21f8647be92b\" (UID: \"b7924745-2bd5-4642-a2a0-21f8647be92b\") "
Apr 02 14:00:56 crc kubenswrapper[4732]: I0402 14:00:56.378058 4732 scope.go:117] "RemoveContainer" containerID="aa725130e6c0af58d6bc8fa560fcc0c0ef18a460a1891aac88bb0ea3d0db9f83"
Apr 02 14:00:56 crc kubenswrapper[4732]: I0402 14:00:56.380814 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7924745-2bd5-4642-a2a0-21f8647be92b-kube-api-access-br5bw" (OuterVolumeSpecName: "kube-api-access-br5bw") pod "b7924745-2bd5-4642-a2a0-21f8647be92b" (UID: "b7924745-2bd5-4642-a2a0-21f8647be92b"). InnerVolumeSpecName "kube-api-access-br5bw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 14:00:56 crc kubenswrapper[4732]: I0402 14:00:56.470492 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7924745-2bd5-4642-a2a0-21f8647be92b-config" (OuterVolumeSpecName: "config") pod "b7924745-2bd5-4642-a2a0-21f8647be92b" (UID: "b7924745-2bd5-4642-a2a0-21f8647be92b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 14:00:56 crc kubenswrapper[4732]: I0402 14:00:56.471071 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7924745-2bd5-4642-a2a0-21f8647be92b-config\") pod \"b7924745-2bd5-4642-a2a0-21f8647be92b\" (UID: \"b7924745-2bd5-4642-a2a0-21f8647be92b\") "
Apr 02 14:00:56 crc kubenswrapper[4732]: I0402 14:00:56.471885 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-br5bw\" (UniqueName: \"kubernetes.io/projected/b7924745-2bd5-4642-a2a0-21f8647be92b-kube-api-access-br5bw\") on node \"crc\" DevicePath \"\""
Apr 02 14:00:56 crc kubenswrapper[4732]: W0402 14:00:56.472230 4732 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/b7924745-2bd5-4642-a2a0-21f8647be92b/volumes/kubernetes.io~configmap/config
Apr 02 14:00:56 crc kubenswrapper[4732]: I0402 14:00:56.472244 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7924745-2bd5-4642-a2a0-21f8647be92b-config" (OuterVolumeSpecName: "config") pod "b7924745-2bd5-4642-a2a0-21f8647be92b" (UID: "b7924745-2bd5-4642-a2a0-21f8647be92b").
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 14:00:56 crc kubenswrapper[4732]: I0402 14:00:56.478870 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7924745-2bd5-4642-a2a0-21f8647be92b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b7924745-2bd5-4642-a2a0-21f8647be92b" (UID: "b7924745-2bd5-4642-a2a0-21f8647be92b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 14:00:56 crc kubenswrapper[4732]: I0402 14:00:56.488784 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7924745-2bd5-4642-a2a0-21f8647be92b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b7924745-2bd5-4642-a2a0-21f8647be92b" (UID: "b7924745-2bd5-4642-a2a0-21f8647be92b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 14:00:56 crc kubenswrapper[4732]: I0402 14:00:56.497200 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7924745-2bd5-4642-a2a0-21f8647be92b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b7924745-2bd5-4642-a2a0-21f8647be92b" (UID: "b7924745-2bd5-4642-a2a0-21f8647be92b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 14:00:56 crc kubenswrapper[4732]: I0402 14:00:56.498076 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7924745-2bd5-4642-a2a0-21f8647be92b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b7924745-2bd5-4642-a2a0-21f8647be92b" (UID: "b7924745-2bd5-4642-a2a0-21f8647be92b"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 14:00:56 crc kubenswrapper[4732]: I0402 14:00:56.538507 4732 scope.go:117] "RemoveContainer" containerID="add10d00c96f619f2970add749542c0e114a3d6f8c0bf2d320b1141303f526c9" Apr 02 14:00:56 crc kubenswrapper[4732]: E0402 14:00:56.542026 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"add10d00c96f619f2970add749542c0e114a3d6f8c0bf2d320b1141303f526c9\": container with ID starting with add10d00c96f619f2970add749542c0e114a3d6f8c0bf2d320b1141303f526c9 not found: ID does not exist" containerID="add10d00c96f619f2970add749542c0e114a3d6f8c0bf2d320b1141303f526c9" Apr 02 14:00:56 crc kubenswrapper[4732]: I0402 14:00:56.542160 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"add10d00c96f619f2970add749542c0e114a3d6f8c0bf2d320b1141303f526c9"} err="failed to get container status \"add10d00c96f619f2970add749542c0e114a3d6f8c0bf2d320b1141303f526c9\": rpc error: code = NotFound desc = could not find container \"add10d00c96f619f2970add749542c0e114a3d6f8c0bf2d320b1141303f526c9\": container with ID starting with add10d00c96f619f2970add749542c0e114a3d6f8c0bf2d320b1141303f526c9 not found: ID does not exist" Apr 02 14:00:56 crc kubenswrapper[4732]: I0402 14:00:56.542239 4732 scope.go:117] "RemoveContainer" containerID="aa725130e6c0af58d6bc8fa560fcc0c0ef18a460a1891aac88bb0ea3d0db9f83" Apr 02 14:00:56 crc kubenswrapper[4732]: E0402 14:00:56.543455 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa725130e6c0af58d6bc8fa560fcc0c0ef18a460a1891aac88bb0ea3d0db9f83\": container with ID starting with aa725130e6c0af58d6bc8fa560fcc0c0ef18a460a1891aac88bb0ea3d0db9f83 not found: ID does not exist" containerID="aa725130e6c0af58d6bc8fa560fcc0c0ef18a460a1891aac88bb0ea3d0db9f83" Apr 02 14:00:56 crc kubenswrapper[4732]: I0402 14:00:56.543483 
4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa725130e6c0af58d6bc8fa560fcc0c0ef18a460a1891aac88bb0ea3d0db9f83"} err="failed to get container status \"aa725130e6c0af58d6bc8fa560fcc0c0ef18a460a1891aac88bb0ea3d0db9f83\": rpc error: code = NotFound desc = could not find container \"aa725130e6c0af58d6bc8fa560fcc0c0ef18a460a1891aac88bb0ea3d0db9f83\": container with ID starting with aa725130e6c0af58d6bc8fa560fcc0c0ef18a460a1891aac88bb0ea3d0db9f83 not found: ID does not exist" Apr 02 14:00:56 crc kubenswrapper[4732]: I0402 14:00:56.573225 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7924745-2bd5-4642-a2a0-21f8647be92b-config\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:56 crc kubenswrapper[4732]: I0402 14:00:56.576548 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b7924745-2bd5-4642-a2a0-21f8647be92b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:56 crc kubenswrapper[4732]: I0402 14:00:56.576651 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b7924745-2bd5-4642-a2a0-21f8647be92b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:56 crc kubenswrapper[4732]: I0402 14:00:56.576754 4732 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7924745-2bd5-4642-a2a0-21f8647be92b-dns-svc\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:56 crc kubenswrapper[4732]: I0402 14:00:56.576839 4732 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b7924745-2bd5-4642-a2a0-21f8647be92b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:56 crc kubenswrapper[4732]: I0402 14:00:56.596338 4732 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/horizon-59fb764b6d-vml5x" podUID="80fe4580-a48f-4cba-b3b3-d6ccbc7f0525" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.149:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.149:8443: connect: connection refused" Apr 02 14:00:56 crc kubenswrapper[4732]: I0402 14:00:56.676190 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-6cb8x"] Apr 02 14:00:56 crc kubenswrapper[4732]: I0402 14:00:56.703035 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-6cb8x"] Apr 02 14:00:57 crc kubenswrapper[4732]: I0402 14:00:57.379013 4732 generic.go:334] "Generic (PLEG): container finished" podID="80fe4580-a48f-4cba-b3b3-d6ccbc7f0525" containerID="56cb1e14c01780829e3d908dfe3c04fea17b711fbfdaa9357cab714feddecfb5" exitCode=0 Apr 02 14:00:57 crc kubenswrapper[4732]: I0402 14:00:57.379104 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-59fb764b6d-vml5x" event={"ID":"80fe4580-a48f-4cba-b3b3-d6ccbc7f0525","Type":"ContainerDied","Data":"56cb1e14c01780829e3d908dfe3c04fea17b711fbfdaa9357cab714feddecfb5"} Apr 02 14:00:57 crc kubenswrapper[4732]: I0402 14:00:57.384479 4732 generic.go:334] "Generic (PLEG): container finished" podID="c3ab5d4f-9beb-4c41-954d-bc917548f495" containerID="55e06268ab600cfcf099a3af14bdb9abe464b33cd010165cea22950b4e3c9066" exitCode=0 Apr 02 14:00:57 crc kubenswrapper[4732]: I0402 14:00:57.384535 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c3ab5d4f-9beb-4c41-954d-bc917548f495","Type":"ContainerDied","Data":"55e06268ab600cfcf099a3af14bdb9abe464b33cd010165cea22950b4e3c9066"} Apr 02 14:00:57 crc kubenswrapper[4732]: I0402 14:00:57.966409 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6c879c6666-5kls7" Apr 02 14:00:58 crc kubenswrapper[4732]: I0402 14:00:58.054169 4732 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6c879c6666-5kls7" Apr 02 14:00:58 crc kubenswrapper[4732]: I0402 14:00:58.131004 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6488b8fdcd-j9s62"] Apr 02 14:00:58 crc kubenswrapper[4732]: I0402 14:00:58.131213 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6488b8fdcd-j9s62" podUID="4c0b34b5-36cf-4f5d-91f2-618ff3a41adb" containerName="barbican-api-log" containerID="cri-o://3a555b820f07523abb2da70694769e55897c6dcea2fadc09fa51cb7d48986781" gracePeriod=30 Apr 02 14:00:58 crc kubenswrapper[4732]: I0402 14:00:58.132467 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6488b8fdcd-j9s62" podUID="4c0b34b5-36cf-4f5d-91f2-618ff3a41adb" containerName="barbican-api" containerID="cri-o://aeb47ad0e1a6ab9d71c5c23188f45a87d4579aee6e0d77d78be8c7c18a512803" gracePeriod=30 Apr 02 14:00:58 crc kubenswrapper[4732]: I0402 14:00:58.407911 4732 generic.go:334] "Generic (PLEG): container finished" podID="4c0b34b5-36cf-4f5d-91f2-618ff3a41adb" containerID="3a555b820f07523abb2da70694769e55897c6dcea2fadc09fa51cb7d48986781" exitCode=143 Apr 02 14:00:58 crc kubenswrapper[4732]: I0402 14:00:58.408367 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6488b8fdcd-j9s62" event={"ID":"4c0b34b5-36cf-4f5d-91f2-618ff3a41adb","Type":"ContainerDied","Data":"3a555b820f07523abb2da70694769e55897c6dcea2fadc09fa51cb7d48986781"} Apr 02 14:00:58 crc kubenswrapper[4732]: I0402 14:00:58.410673 4732 generic.go:334] "Generic (PLEG): container finished" podID="c3ab5d4f-9beb-4c41-954d-bc917548f495" containerID="e2529b80f2329b677cbfb274eef570beba102ba661be4a7cae791968ff0383d7" exitCode=0 Apr 02 14:00:58 crc kubenswrapper[4732]: I0402 14:00:58.410810 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"c3ab5d4f-9beb-4c41-954d-bc917548f495","Type":"ContainerDied","Data":"e2529b80f2329b677cbfb274eef570beba102ba661be4a7cae791968ff0383d7"} Apr 02 14:00:58 crc kubenswrapper[4732]: I0402 14:00:58.694051 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7924745-2bd5-4642-a2a0-21f8647be92b" path="/var/lib/kubelet/pods/b7924745-2bd5-4642-a2a0-21f8647be92b/volumes" Apr 02 14:00:58 crc kubenswrapper[4732]: I0402 14:00:58.785501 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Apr 02 14:00:58 crc kubenswrapper[4732]: I0402 14:00:58.820985 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c3ab5d4f-9beb-4c41-954d-bc917548f495-etc-machine-id\") pod \"c3ab5d4f-9beb-4c41-954d-bc917548f495\" (UID: \"c3ab5d4f-9beb-4c41-954d-bc917548f495\") " Apr 02 14:00:58 crc kubenswrapper[4732]: I0402 14:00:58.821039 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c3ab5d4f-9beb-4c41-954d-bc917548f495-config-data-custom\") pod \"c3ab5d4f-9beb-4c41-954d-bc917548f495\" (UID: \"c3ab5d4f-9beb-4c41-954d-bc917548f495\") " Apr 02 14:00:58 crc kubenswrapper[4732]: I0402 14:00:58.821078 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcm4x\" (UniqueName: \"kubernetes.io/projected/c3ab5d4f-9beb-4c41-954d-bc917548f495-kube-api-access-xcm4x\") pod \"c3ab5d4f-9beb-4c41-954d-bc917548f495\" (UID: \"c3ab5d4f-9beb-4c41-954d-bc917548f495\") " Apr 02 14:00:58 crc kubenswrapper[4732]: I0402 14:00:58.821099 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3ab5d4f-9beb-4c41-954d-bc917548f495-scripts\") pod \"c3ab5d4f-9beb-4c41-954d-bc917548f495\" (UID: \"c3ab5d4f-9beb-4c41-954d-bc917548f495\") " 
Apr 02 14:00:58 crc kubenswrapper[4732]: I0402 14:00:58.821132 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3ab5d4f-9beb-4c41-954d-bc917548f495-config-data\") pod \"c3ab5d4f-9beb-4c41-954d-bc917548f495\" (UID: \"c3ab5d4f-9beb-4c41-954d-bc917548f495\") " Apr 02 14:00:58 crc kubenswrapper[4732]: I0402 14:00:58.821233 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3ab5d4f-9beb-4c41-954d-bc917548f495-combined-ca-bundle\") pod \"c3ab5d4f-9beb-4c41-954d-bc917548f495\" (UID: \"c3ab5d4f-9beb-4c41-954d-bc917548f495\") " Apr 02 14:00:58 crc kubenswrapper[4732]: I0402 14:00:58.821918 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c3ab5d4f-9beb-4c41-954d-bc917548f495-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "c3ab5d4f-9beb-4c41-954d-bc917548f495" (UID: "c3ab5d4f-9beb-4c41-954d-bc917548f495"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 14:00:58 crc kubenswrapper[4732]: I0402 14:00:58.822281 4732 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c3ab5d4f-9beb-4c41-954d-bc917548f495-etc-machine-id\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:58 crc kubenswrapper[4732]: I0402 14:00:58.826767 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3ab5d4f-9beb-4c41-954d-bc917548f495-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c3ab5d4f-9beb-4c41-954d-bc917548f495" (UID: "c3ab5d4f-9beb-4c41-954d-bc917548f495"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:00:58 crc kubenswrapper[4732]: I0402 14:00:58.828088 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3ab5d4f-9beb-4c41-954d-bc917548f495-scripts" (OuterVolumeSpecName: "scripts") pod "c3ab5d4f-9beb-4c41-954d-bc917548f495" (UID: "c3ab5d4f-9beb-4c41-954d-bc917548f495"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:00:58 crc kubenswrapper[4732]: I0402 14:00:58.849467 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3ab5d4f-9beb-4c41-954d-bc917548f495-kube-api-access-xcm4x" (OuterVolumeSpecName: "kube-api-access-xcm4x") pod "c3ab5d4f-9beb-4c41-954d-bc917548f495" (UID: "c3ab5d4f-9beb-4c41-954d-bc917548f495"). InnerVolumeSpecName "kube-api-access-xcm4x". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:00:58 crc kubenswrapper[4732]: I0402 14:00:58.921780 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3ab5d4f-9beb-4c41-954d-bc917548f495-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c3ab5d4f-9beb-4c41-954d-bc917548f495" (UID: "c3ab5d4f-9beb-4c41-954d-bc917548f495"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:00:58 crc kubenswrapper[4732]: I0402 14:00:58.924813 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3ab5d4f-9beb-4c41-954d-bc917548f495-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:58 crc kubenswrapper[4732]: I0402 14:00:58.924844 4732 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c3ab5d4f-9beb-4c41-954d-bc917548f495-config-data-custom\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:58 crc kubenswrapper[4732]: I0402 14:00:58.924855 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcm4x\" (UniqueName: \"kubernetes.io/projected/c3ab5d4f-9beb-4c41-954d-bc917548f495-kube-api-access-xcm4x\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:58 crc kubenswrapper[4732]: I0402 14:00:58.924867 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3ab5d4f-9beb-4c41-954d-bc917548f495-scripts\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:58 crc kubenswrapper[4732]: I0402 14:00:58.951722 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3ab5d4f-9beb-4c41-954d-bc917548f495-config-data" (OuterVolumeSpecName: "config-data") pod "c3ab5d4f-9beb-4c41-954d-bc917548f495" (UID: "c3ab5d4f-9beb-4c41-954d-bc917548f495"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:00:59 crc kubenswrapper[4732]: I0402 14:00:59.026836 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3ab5d4f-9beb-4c41-954d-bc917548f495-config-data\") on node \"crc\" DevicePath \"\"" Apr 02 14:00:59 crc kubenswrapper[4732]: I0402 14:00:59.435127 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c3ab5d4f-9beb-4c41-954d-bc917548f495","Type":"ContainerDied","Data":"d02977a3dde2f199a60cc75fb084b56c23c5917739586275ee9a5ded8a4a43c8"} Apr 02 14:00:59 crc kubenswrapper[4732]: I0402 14:00:59.435479 4732 scope.go:117] "RemoveContainer" containerID="55e06268ab600cfcf099a3af14bdb9abe464b33cd010165cea22950b4e3c9066" Apr 02 14:00:59 crc kubenswrapper[4732]: I0402 14:00:59.435283 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Apr 02 14:00:59 crc kubenswrapper[4732]: I0402 14:00:59.471190 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Apr 02 14:00:59 crc kubenswrapper[4732]: I0402 14:00:59.478574 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Apr 02 14:00:59 crc kubenswrapper[4732]: I0402 14:00:59.486050 4732 scope.go:117] "RemoveContainer" containerID="e2529b80f2329b677cbfb274eef570beba102ba661be4a7cae791968ff0383d7" Apr 02 14:00:59 crc kubenswrapper[4732]: I0402 14:00:59.487360 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Apr 02 14:00:59 crc kubenswrapper[4732]: E0402 14:00:59.487867 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88748d2e-8313-467e-b707-e82e1af776d5" containerName="neutron-api" Apr 02 14:00:59 crc kubenswrapper[4732]: I0402 14:00:59.487885 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="88748d2e-8313-467e-b707-e82e1af776d5" containerName="neutron-api" 
Apr 02 14:00:59 crc kubenswrapper[4732]: E0402 14:00:59.487900 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7924745-2bd5-4642-a2a0-21f8647be92b" containerName="init" Apr 02 14:00:59 crc kubenswrapper[4732]: I0402 14:00:59.487906 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7924745-2bd5-4642-a2a0-21f8647be92b" containerName="init" Apr 02 14:00:59 crc kubenswrapper[4732]: E0402 14:00:59.487931 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7924745-2bd5-4642-a2a0-21f8647be92b" containerName="dnsmasq-dns" Apr 02 14:00:59 crc kubenswrapper[4732]: I0402 14:00:59.487939 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7924745-2bd5-4642-a2a0-21f8647be92b" containerName="dnsmasq-dns" Apr 02 14:00:59 crc kubenswrapper[4732]: E0402 14:00:59.487951 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3ab5d4f-9beb-4c41-954d-bc917548f495" containerName="probe" Apr 02 14:00:59 crc kubenswrapper[4732]: I0402 14:00:59.487957 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3ab5d4f-9beb-4c41-954d-bc917548f495" containerName="probe" Apr 02 14:00:59 crc kubenswrapper[4732]: E0402 14:00:59.487966 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88748d2e-8313-467e-b707-e82e1af776d5" containerName="neutron-httpd" Apr 02 14:00:59 crc kubenswrapper[4732]: I0402 14:00:59.487972 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="88748d2e-8313-467e-b707-e82e1af776d5" containerName="neutron-httpd" Apr 02 14:00:59 crc kubenswrapper[4732]: E0402 14:00:59.487988 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3ab5d4f-9beb-4c41-954d-bc917548f495" containerName="cinder-scheduler" Apr 02 14:00:59 crc kubenswrapper[4732]: I0402 14:00:59.487994 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3ab5d4f-9beb-4c41-954d-bc917548f495" containerName="cinder-scheduler" Apr 02 14:00:59 crc kubenswrapper[4732]: I0402 14:00:59.488163 
4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3ab5d4f-9beb-4c41-954d-bc917548f495" containerName="probe" Apr 02 14:00:59 crc kubenswrapper[4732]: I0402 14:00:59.488176 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3ab5d4f-9beb-4c41-954d-bc917548f495" containerName="cinder-scheduler" Apr 02 14:00:59 crc kubenswrapper[4732]: I0402 14:00:59.488193 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="88748d2e-8313-467e-b707-e82e1af776d5" containerName="neutron-httpd" Apr 02 14:00:59 crc kubenswrapper[4732]: I0402 14:00:59.488206 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7924745-2bd5-4642-a2a0-21f8647be92b" containerName="dnsmasq-dns" Apr 02 14:00:59 crc kubenswrapper[4732]: I0402 14:00:59.488214 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="88748d2e-8313-467e-b707-e82e1af776d5" containerName="neutron-api" Apr 02 14:00:59 crc kubenswrapper[4732]: I0402 14:00:59.489090 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Apr 02 14:00:59 crc kubenswrapper[4732]: I0402 14:00:59.493022 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Apr 02 14:00:59 crc kubenswrapper[4732]: I0402 14:00:59.519745 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Apr 02 14:00:59 crc kubenswrapper[4732]: I0402 14:00:59.536017 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7613bfc-a605-4485-b771-242a65e30df8-scripts\") pod \"cinder-scheduler-0\" (UID: \"d7613bfc-a605-4485-b771-242a65e30df8\") " pod="openstack/cinder-scheduler-0" Apr 02 14:00:59 crc kubenswrapper[4732]: I0402 14:00:59.536160 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d7613bfc-a605-4485-b771-242a65e30df8-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d7613bfc-a605-4485-b771-242a65e30df8\") " pod="openstack/cinder-scheduler-0" Apr 02 14:00:59 crc kubenswrapper[4732]: I0402 14:00:59.536223 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7613bfc-a605-4485-b771-242a65e30df8-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d7613bfc-a605-4485-b771-242a65e30df8\") " pod="openstack/cinder-scheduler-0" Apr 02 14:00:59 crc kubenswrapper[4732]: I0402 14:00:59.536252 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fv2bs\" (UniqueName: \"kubernetes.io/projected/d7613bfc-a605-4485-b771-242a65e30df8-kube-api-access-fv2bs\") pod \"cinder-scheduler-0\" (UID: \"d7613bfc-a605-4485-b771-242a65e30df8\") " pod="openstack/cinder-scheduler-0" Apr 02 14:00:59 crc kubenswrapper[4732]: I0402 
14:00:59.536268 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7613bfc-a605-4485-b771-242a65e30df8-config-data\") pod \"cinder-scheduler-0\" (UID: \"d7613bfc-a605-4485-b771-242a65e30df8\") " pod="openstack/cinder-scheduler-0" Apr 02 14:00:59 crc kubenswrapper[4732]: I0402 14:00:59.536297 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d7613bfc-a605-4485-b771-242a65e30df8-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d7613bfc-a605-4485-b771-242a65e30df8\") " pod="openstack/cinder-scheduler-0" Apr 02 14:00:59 crc kubenswrapper[4732]: I0402 14:00:59.638252 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7613bfc-a605-4485-b771-242a65e30df8-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d7613bfc-a605-4485-b771-242a65e30df8\") " pod="openstack/cinder-scheduler-0" Apr 02 14:00:59 crc kubenswrapper[4732]: I0402 14:00:59.638319 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fv2bs\" (UniqueName: \"kubernetes.io/projected/d7613bfc-a605-4485-b771-242a65e30df8-kube-api-access-fv2bs\") pod \"cinder-scheduler-0\" (UID: \"d7613bfc-a605-4485-b771-242a65e30df8\") " pod="openstack/cinder-scheduler-0" Apr 02 14:00:59 crc kubenswrapper[4732]: I0402 14:00:59.638344 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7613bfc-a605-4485-b771-242a65e30df8-config-data\") pod \"cinder-scheduler-0\" (UID: \"d7613bfc-a605-4485-b771-242a65e30df8\") " pod="openstack/cinder-scheduler-0" Apr 02 14:00:59 crc kubenswrapper[4732]: I0402 14:00:59.638383 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/d7613bfc-a605-4485-b771-242a65e30df8-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d7613bfc-a605-4485-b771-242a65e30df8\") " pod="openstack/cinder-scheduler-0" Apr 02 14:00:59 crc kubenswrapper[4732]: I0402 14:00:59.638426 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7613bfc-a605-4485-b771-242a65e30df8-scripts\") pod \"cinder-scheduler-0\" (UID: \"d7613bfc-a605-4485-b771-242a65e30df8\") " pod="openstack/cinder-scheduler-0" Apr 02 14:00:59 crc kubenswrapper[4732]: I0402 14:00:59.638524 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d7613bfc-a605-4485-b771-242a65e30df8-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d7613bfc-a605-4485-b771-242a65e30df8\") " pod="openstack/cinder-scheduler-0" Apr 02 14:00:59 crc kubenswrapper[4732]: I0402 14:00:59.639371 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d7613bfc-a605-4485-b771-242a65e30df8-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d7613bfc-a605-4485-b771-242a65e30df8\") " pod="openstack/cinder-scheduler-0" Apr 02 14:00:59 crc kubenswrapper[4732]: I0402 14:00:59.643325 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d7613bfc-a605-4485-b771-242a65e30df8-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d7613bfc-a605-4485-b771-242a65e30df8\") " pod="openstack/cinder-scheduler-0" Apr 02 14:00:59 crc kubenswrapper[4732]: I0402 14:00:59.643679 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7613bfc-a605-4485-b771-242a65e30df8-config-data\") pod \"cinder-scheduler-0\" (UID: \"d7613bfc-a605-4485-b771-242a65e30df8\") " 
pod="openstack/cinder-scheduler-0" Apr 02 14:00:59 crc kubenswrapper[4732]: I0402 14:00:59.647377 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7613bfc-a605-4485-b771-242a65e30df8-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d7613bfc-a605-4485-b771-242a65e30df8\") " pod="openstack/cinder-scheduler-0" Apr 02 14:00:59 crc kubenswrapper[4732]: I0402 14:00:59.649329 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7613bfc-a605-4485-b771-242a65e30df8-scripts\") pod \"cinder-scheduler-0\" (UID: \"d7613bfc-a605-4485-b771-242a65e30df8\") " pod="openstack/cinder-scheduler-0" Apr 02 14:00:59 crc kubenswrapper[4732]: I0402 14:00:59.662278 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fv2bs\" (UniqueName: \"kubernetes.io/projected/d7613bfc-a605-4485-b771-242a65e30df8-kube-api-access-fv2bs\") pod \"cinder-scheduler-0\" (UID: \"d7613bfc-a605-4485-b771-242a65e30df8\") " pod="openstack/cinder-scheduler-0" Apr 02 14:00:59 crc kubenswrapper[4732]: I0402 14:00:59.806794 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Apr 02 14:01:00 crc kubenswrapper[4732]: I0402 14:01:00.144861 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29585641-zldph"] Apr 02 14:01:00 crc kubenswrapper[4732]: I0402 14:01:00.146508 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29585641-zldph" Apr 02 14:01:00 crc kubenswrapper[4732]: I0402 14:01:00.157177 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29585641-zldph"] Apr 02 14:01:00 crc kubenswrapper[4732]: I0402 14:01:00.249712 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/61cd5173-b5d3-4cd7-a8ea-e4300054f364-fernet-keys\") pod \"keystone-cron-29585641-zldph\" (UID: \"61cd5173-b5d3-4cd7-a8ea-e4300054f364\") " pod="openstack/keystone-cron-29585641-zldph" Apr 02 14:01:00 crc kubenswrapper[4732]: I0402 14:01:00.249794 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fml4k\" (UniqueName: \"kubernetes.io/projected/61cd5173-b5d3-4cd7-a8ea-e4300054f364-kube-api-access-fml4k\") pod \"keystone-cron-29585641-zldph\" (UID: \"61cd5173-b5d3-4cd7-a8ea-e4300054f364\") " pod="openstack/keystone-cron-29585641-zldph" Apr 02 14:01:00 crc kubenswrapper[4732]: I0402 14:01:00.249835 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61cd5173-b5d3-4cd7-a8ea-e4300054f364-config-data\") pod \"keystone-cron-29585641-zldph\" (UID: \"61cd5173-b5d3-4cd7-a8ea-e4300054f364\") " pod="openstack/keystone-cron-29585641-zldph" Apr 02 14:01:00 crc kubenswrapper[4732]: I0402 14:01:00.249916 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61cd5173-b5d3-4cd7-a8ea-e4300054f364-combined-ca-bundle\") pod \"keystone-cron-29585641-zldph\" (UID: \"61cd5173-b5d3-4cd7-a8ea-e4300054f364\") " pod="openstack/keystone-cron-29585641-zldph" Apr 02 14:01:00 crc kubenswrapper[4732]: I0402 14:01:00.293221 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/cinder-scheduler-0"] Apr 02 14:01:00 crc kubenswrapper[4732]: I0402 14:01:00.352575 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61cd5173-b5d3-4cd7-a8ea-e4300054f364-combined-ca-bundle\") pod \"keystone-cron-29585641-zldph\" (UID: \"61cd5173-b5d3-4cd7-a8ea-e4300054f364\") " pod="openstack/keystone-cron-29585641-zldph" Apr 02 14:01:00 crc kubenswrapper[4732]: I0402 14:01:00.352773 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/61cd5173-b5d3-4cd7-a8ea-e4300054f364-fernet-keys\") pod \"keystone-cron-29585641-zldph\" (UID: \"61cd5173-b5d3-4cd7-a8ea-e4300054f364\") " pod="openstack/keystone-cron-29585641-zldph" Apr 02 14:01:00 crc kubenswrapper[4732]: I0402 14:01:00.352857 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fml4k\" (UniqueName: \"kubernetes.io/projected/61cd5173-b5d3-4cd7-a8ea-e4300054f364-kube-api-access-fml4k\") pod \"keystone-cron-29585641-zldph\" (UID: \"61cd5173-b5d3-4cd7-a8ea-e4300054f364\") " pod="openstack/keystone-cron-29585641-zldph" Apr 02 14:01:00 crc kubenswrapper[4732]: I0402 14:01:00.352927 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61cd5173-b5d3-4cd7-a8ea-e4300054f364-config-data\") pod \"keystone-cron-29585641-zldph\" (UID: \"61cd5173-b5d3-4cd7-a8ea-e4300054f364\") " pod="openstack/keystone-cron-29585641-zldph" Apr 02 14:01:00 crc kubenswrapper[4732]: I0402 14:01:00.362648 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61cd5173-b5d3-4cd7-a8ea-e4300054f364-combined-ca-bundle\") pod \"keystone-cron-29585641-zldph\" (UID: \"61cd5173-b5d3-4cd7-a8ea-e4300054f364\") " pod="openstack/keystone-cron-29585641-zldph" Apr 02 14:01:00 
crc kubenswrapper[4732]: I0402 14:01:00.372130 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/61cd5173-b5d3-4cd7-a8ea-e4300054f364-fernet-keys\") pod \"keystone-cron-29585641-zldph\" (UID: \"61cd5173-b5d3-4cd7-a8ea-e4300054f364\") " pod="openstack/keystone-cron-29585641-zldph" Apr 02 14:01:00 crc kubenswrapper[4732]: I0402 14:01:00.376217 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fml4k\" (UniqueName: \"kubernetes.io/projected/61cd5173-b5d3-4cd7-a8ea-e4300054f364-kube-api-access-fml4k\") pod \"keystone-cron-29585641-zldph\" (UID: \"61cd5173-b5d3-4cd7-a8ea-e4300054f364\") " pod="openstack/keystone-cron-29585641-zldph" Apr 02 14:01:00 crc kubenswrapper[4732]: I0402 14:01:00.376952 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61cd5173-b5d3-4cd7-a8ea-e4300054f364-config-data\") pod \"keystone-cron-29585641-zldph\" (UID: \"61cd5173-b5d3-4cd7-a8ea-e4300054f364\") " pod="openstack/keystone-cron-29585641-zldph" Apr 02 14:01:00 crc kubenswrapper[4732]: I0402 14:01:00.445077 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d7613bfc-a605-4485-b771-242a65e30df8","Type":"ContainerStarted","Data":"29ac40cbad5ebecf70239ab6cda1dc1ee969828a278dd5a6dd5ad56a0bbc263a"} Apr 02 14:01:00 crc kubenswrapper[4732]: I0402 14:01:00.470646 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29585641-zldph" Apr 02 14:01:00 crc kubenswrapper[4732]: I0402 14:01:00.695750 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3ab5d4f-9beb-4c41-954d-bc917548f495" path="/var/lib/kubelet/pods/c3ab5d4f-9beb-4c41-954d-bc917548f495/volumes" Apr 02 14:01:00 crc kubenswrapper[4732]: I0402 14:01:00.980334 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29585641-zldph"] Apr 02 14:01:01 crc kubenswrapper[4732]: I0402 14:01:01.470790 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29585641-zldph" event={"ID":"61cd5173-b5d3-4cd7-a8ea-e4300054f364","Type":"ContainerStarted","Data":"bbce644b1d5dc055ad2bf893ee8a95ccaf16cd8af4e6543b531916768af477c9"} Apr 02 14:01:01 crc kubenswrapper[4732]: I0402 14:01:01.471150 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29585641-zldph" event={"ID":"61cd5173-b5d3-4cd7-a8ea-e4300054f364","Type":"ContainerStarted","Data":"93d96bcafaec09d84be8e4f1592fd5620ed2a1b33ac0eb5820af16d96d8f5022"} Apr 02 14:01:01 crc kubenswrapper[4732]: I0402 14:01:01.474116 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d7613bfc-a605-4485-b771-242a65e30df8","Type":"ContainerStarted","Data":"83ca9f5cfe0ac35b499e5bc8c392da72bb16f25a39a757ce3e049f5e22353a09"} Apr 02 14:01:01 crc kubenswrapper[4732]: I0402 14:01:01.499319 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29585641-zldph" podStartSLOduration=1.499296883 podStartE2EDuration="1.499296883s" podCreationTimestamp="2026-04-02 14:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 14:01:01.49807343 +0000 UTC m=+1418.402481003" watchObservedRunningTime="2026-04-02 14:01:01.499296883 +0000 UTC m=+1418.403704436" Apr 02 
14:01:01 crc kubenswrapper[4732]: I0402 14:01:01.874696 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6488b8fdcd-j9s62" Apr 02 14:01:01 crc kubenswrapper[4732]: I0402 14:01:01.984742 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4c0b34b5-36cf-4f5d-91f2-618ff3a41adb-config-data-custom\") pod \"4c0b34b5-36cf-4f5d-91f2-618ff3a41adb\" (UID: \"4c0b34b5-36cf-4f5d-91f2-618ff3a41adb\") " Apr 02 14:01:01 crc kubenswrapper[4732]: I0402 14:01:01.984832 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mhhm\" (UniqueName: \"kubernetes.io/projected/4c0b34b5-36cf-4f5d-91f2-618ff3a41adb-kube-api-access-7mhhm\") pod \"4c0b34b5-36cf-4f5d-91f2-618ff3a41adb\" (UID: \"4c0b34b5-36cf-4f5d-91f2-618ff3a41adb\") " Apr 02 14:01:01 crc kubenswrapper[4732]: I0402 14:01:01.984979 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c0b34b5-36cf-4f5d-91f2-618ff3a41adb-logs\") pod \"4c0b34b5-36cf-4f5d-91f2-618ff3a41adb\" (UID: \"4c0b34b5-36cf-4f5d-91f2-618ff3a41adb\") " Apr 02 14:01:01 crc kubenswrapper[4732]: I0402 14:01:01.985008 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c0b34b5-36cf-4f5d-91f2-618ff3a41adb-config-data\") pod \"4c0b34b5-36cf-4f5d-91f2-618ff3a41adb\" (UID: \"4c0b34b5-36cf-4f5d-91f2-618ff3a41adb\") " Apr 02 14:01:01 crc kubenswrapper[4732]: I0402 14:01:01.985028 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c0b34b5-36cf-4f5d-91f2-618ff3a41adb-combined-ca-bundle\") pod \"4c0b34b5-36cf-4f5d-91f2-618ff3a41adb\" (UID: \"4c0b34b5-36cf-4f5d-91f2-618ff3a41adb\") " Apr 02 14:01:01 crc kubenswrapper[4732]: I0402 
14:01:01.985555 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c0b34b5-36cf-4f5d-91f2-618ff3a41adb-logs" (OuterVolumeSpecName: "logs") pod "4c0b34b5-36cf-4f5d-91f2-618ff3a41adb" (UID: "4c0b34b5-36cf-4f5d-91f2-618ff3a41adb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 14:01:01 crc kubenswrapper[4732]: I0402 14:01:01.991782 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c0b34b5-36cf-4f5d-91f2-618ff3a41adb-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "4c0b34b5-36cf-4f5d-91f2-618ff3a41adb" (UID: "4c0b34b5-36cf-4f5d-91f2-618ff3a41adb"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:01:01 crc kubenswrapper[4732]: I0402 14:01:01.991804 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c0b34b5-36cf-4f5d-91f2-618ff3a41adb-kube-api-access-7mhhm" (OuterVolumeSpecName: "kube-api-access-7mhhm") pod "4c0b34b5-36cf-4f5d-91f2-618ff3a41adb" (UID: "4c0b34b5-36cf-4f5d-91f2-618ff3a41adb"). InnerVolumeSpecName "kube-api-access-7mhhm". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:01:02 crc kubenswrapper[4732]: I0402 14:01:02.032128 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c0b34b5-36cf-4f5d-91f2-618ff3a41adb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4c0b34b5-36cf-4f5d-91f2-618ff3a41adb" (UID: "4c0b34b5-36cf-4f5d-91f2-618ff3a41adb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:01:02 crc kubenswrapper[4732]: I0402 14:01:02.059474 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c0b34b5-36cf-4f5d-91f2-618ff3a41adb-config-data" (OuterVolumeSpecName: "config-data") pod "4c0b34b5-36cf-4f5d-91f2-618ff3a41adb" (UID: "4c0b34b5-36cf-4f5d-91f2-618ff3a41adb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:01:02 crc kubenswrapper[4732]: I0402 14:01:02.086996 4732 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c0b34b5-36cf-4f5d-91f2-618ff3a41adb-logs\") on node \"crc\" DevicePath \"\"" Apr 02 14:01:02 crc kubenswrapper[4732]: I0402 14:01:02.087028 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c0b34b5-36cf-4f5d-91f2-618ff3a41adb-config-data\") on node \"crc\" DevicePath \"\"" Apr 02 14:01:02 crc kubenswrapper[4732]: I0402 14:01:02.087119 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c0b34b5-36cf-4f5d-91f2-618ff3a41adb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 02 14:01:02 crc kubenswrapper[4732]: I0402 14:01:02.087160 4732 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4c0b34b5-36cf-4f5d-91f2-618ff3a41adb-config-data-custom\") on node \"crc\" DevicePath \"\"" Apr 02 14:01:02 crc kubenswrapper[4732]: I0402 14:01:02.087171 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mhhm\" (UniqueName: \"kubernetes.io/projected/4c0b34b5-36cf-4f5d-91f2-618ff3a41adb-kube-api-access-7mhhm\") on node \"crc\" DevicePath \"\"" Apr 02 14:01:02 crc kubenswrapper[4732]: I0402 14:01:02.484000 4732 generic.go:334] "Generic (PLEG): container finished" podID="4c0b34b5-36cf-4f5d-91f2-618ff3a41adb" 
containerID="aeb47ad0e1a6ab9d71c5c23188f45a87d4579aee6e0d77d78be8c7c18a512803" exitCode=0 Apr 02 14:01:02 crc kubenswrapper[4732]: I0402 14:01:02.484060 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6488b8fdcd-j9s62" event={"ID":"4c0b34b5-36cf-4f5d-91f2-618ff3a41adb","Type":"ContainerDied","Data":"aeb47ad0e1a6ab9d71c5c23188f45a87d4579aee6e0d77d78be8c7c18a512803"} Apr 02 14:01:02 crc kubenswrapper[4732]: I0402 14:01:02.484088 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6488b8fdcd-j9s62" event={"ID":"4c0b34b5-36cf-4f5d-91f2-618ff3a41adb","Type":"ContainerDied","Data":"cf9df17eaa8995dc201ebfe7375574ee5f065a9488d521516cac30a5a10298e0"} Apr 02 14:01:02 crc kubenswrapper[4732]: I0402 14:01:02.484110 4732 scope.go:117] "RemoveContainer" containerID="aeb47ad0e1a6ab9d71c5c23188f45a87d4579aee6e0d77d78be8c7c18a512803" Apr 02 14:01:02 crc kubenswrapper[4732]: I0402 14:01:02.484219 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6488b8fdcd-j9s62" Apr 02 14:01:02 crc kubenswrapper[4732]: I0402 14:01:02.487492 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d7613bfc-a605-4485-b771-242a65e30df8","Type":"ContainerStarted","Data":"62c171b334403fe8e6bed0df0d8a0d532c8bfe77474c5963bfc36c9f2518e788"} Apr 02 14:01:02 crc kubenswrapper[4732]: I0402 14:01:02.516310 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.516291644 podStartE2EDuration="3.516291644s" podCreationTimestamp="2026-04-02 14:00:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 14:01:02.515255436 +0000 UTC m=+1419.419663019" watchObservedRunningTime="2026-04-02 14:01:02.516291644 +0000 UTC m=+1419.420699197" Apr 02 14:01:02 crc kubenswrapper[4732]: I0402 14:01:02.520285 4732 scope.go:117] "RemoveContainer" containerID="3a555b820f07523abb2da70694769e55897c6dcea2fadc09fa51cb7d48986781" Apr 02 14:01:02 crc kubenswrapper[4732]: I0402 14:01:02.534118 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6488b8fdcd-j9s62"] Apr 02 14:01:02 crc kubenswrapper[4732]: I0402 14:01:02.540824 4732 scope.go:117] "RemoveContainer" containerID="aeb47ad0e1a6ab9d71c5c23188f45a87d4579aee6e0d77d78be8c7c18a512803" Apr 02 14:01:02 crc kubenswrapper[4732]: E0402 14:01:02.541279 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aeb47ad0e1a6ab9d71c5c23188f45a87d4579aee6e0d77d78be8c7c18a512803\": container with ID starting with aeb47ad0e1a6ab9d71c5c23188f45a87d4579aee6e0d77d78be8c7c18a512803 not found: ID does not exist" containerID="aeb47ad0e1a6ab9d71c5c23188f45a87d4579aee6e0d77d78be8c7c18a512803" Apr 02 14:01:02 crc kubenswrapper[4732]: I0402 14:01:02.541315 4732 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aeb47ad0e1a6ab9d71c5c23188f45a87d4579aee6e0d77d78be8c7c18a512803"} err="failed to get container status \"aeb47ad0e1a6ab9d71c5c23188f45a87d4579aee6e0d77d78be8c7c18a512803\": rpc error: code = NotFound desc = could not find container \"aeb47ad0e1a6ab9d71c5c23188f45a87d4579aee6e0d77d78be8c7c18a512803\": container with ID starting with aeb47ad0e1a6ab9d71c5c23188f45a87d4579aee6e0d77d78be8c7c18a512803 not found: ID does not exist" Apr 02 14:01:02 crc kubenswrapper[4732]: I0402 14:01:02.541341 4732 scope.go:117] "RemoveContainer" containerID="3a555b820f07523abb2da70694769e55897c6dcea2fadc09fa51cb7d48986781" Apr 02 14:01:02 crc kubenswrapper[4732]: E0402 14:01:02.541695 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a555b820f07523abb2da70694769e55897c6dcea2fadc09fa51cb7d48986781\": container with ID starting with 3a555b820f07523abb2da70694769e55897c6dcea2fadc09fa51cb7d48986781 not found: ID does not exist" containerID="3a555b820f07523abb2da70694769e55897c6dcea2fadc09fa51cb7d48986781" Apr 02 14:01:02 crc kubenswrapper[4732]: I0402 14:01:02.541721 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a555b820f07523abb2da70694769e55897c6dcea2fadc09fa51cb7d48986781"} err="failed to get container status \"3a555b820f07523abb2da70694769e55897c6dcea2fadc09fa51cb7d48986781\": rpc error: code = NotFound desc = could not find container \"3a555b820f07523abb2da70694769e55897c6dcea2fadc09fa51cb7d48986781\": container with ID starting with 3a555b820f07523abb2da70694769e55897c6dcea2fadc09fa51cb7d48986781 not found: ID does not exist" Apr 02 14:01:02 crc kubenswrapper[4732]: I0402 14:01:02.543664 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6488b8fdcd-j9s62"] Apr 02 14:01:02 crc kubenswrapper[4732]: I0402 14:01:02.691322 4732 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c0b34b5-36cf-4f5d-91f2-618ff3a41adb" path="/var/lib/kubelet/pods/4c0b34b5-36cf-4f5d-91f2-618ff3a41adb/volumes" Apr 02 14:01:04 crc kubenswrapper[4732]: I0402 14:01:04.247383 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Apr 02 14:01:04 crc kubenswrapper[4732]: I0402 14:01:04.510481 4732 generic.go:334] "Generic (PLEG): container finished" podID="61cd5173-b5d3-4cd7-a8ea-e4300054f364" containerID="bbce644b1d5dc055ad2bf893ee8a95ccaf16cd8af4e6543b531916768af477c9" exitCode=0 Apr 02 14:01:04 crc kubenswrapper[4732]: I0402 14:01:04.510636 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29585641-zldph" event={"ID":"61cd5173-b5d3-4cd7-a8ea-e4300054f364","Type":"ContainerDied","Data":"bbce644b1d5dc055ad2bf893ee8a95ccaf16cd8af4e6543b531916768af477c9"} Apr 02 14:01:04 crc kubenswrapper[4732]: I0402 14:01:04.807714 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Apr 02 14:01:04 crc kubenswrapper[4732]: I0402 14:01:04.919994 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-d4c8876f7-592x4" Apr 02 14:01:05 crc kubenswrapper[4732]: I0402 14:01:05.925954 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29585641-zldph" Apr 02 14:01:05 crc kubenswrapper[4732]: I0402 14:01:05.993734 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61cd5173-b5d3-4cd7-a8ea-e4300054f364-config-data\") pod \"61cd5173-b5d3-4cd7-a8ea-e4300054f364\" (UID: \"61cd5173-b5d3-4cd7-a8ea-e4300054f364\") " Apr 02 14:01:05 crc kubenswrapper[4732]: I0402 14:01:05.993879 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61cd5173-b5d3-4cd7-a8ea-e4300054f364-combined-ca-bundle\") pod \"61cd5173-b5d3-4cd7-a8ea-e4300054f364\" (UID: \"61cd5173-b5d3-4cd7-a8ea-e4300054f364\") " Apr 02 14:01:05 crc kubenswrapper[4732]: I0402 14:01:05.993937 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fml4k\" (UniqueName: \"kubernetes.io/projected/61cd5173-b5d3-4cd7-a8ea-e4300054f364-kube-api-access-fml4k\") pod \"61cd5173-b5d3-4cd7-a8ea-e4300054f364\" (UID: \"61cd5173-b5d3-4cd7-a8ea-e4300054f364\") " Apr 02 14:01:05 crc kubenswrapper[4732]: I0402 14:01:05.993998 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/61cd5173-b5d3-4cd7-a8ea-e4300054f364-fernet-keys\") pod \"61cd5173-b5d3-4cd7-a8ea-e4300054f364\" (UID: \"61cd5173-b5d3-4cd7-a8ea-e4300054f364\") " Apr 02 14:01:06 crc kubenswrapper[4732]: I0402 14:01:06.001828 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61cd5173-b5d3-4cd7-a8ea-e4300054f364-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "61cd5173-b5d3-4cd7-a8ea-e4300054f364" (UID: "61cd5173-b5d3-4cd7-a8ea-e4300054f364"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:01:06 crc kubenswrapper[4732]: I0402 14:01:06.004794 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61cd5173-b5d3-4cd7-a8ea-e4300054f364-kube-api-access-fml4k" (OuterVolumeSpecName: "kube-api-access-fml4k") pod "61cd5173-b5d3-4cd7-a8ea-e4300054f364" (UID: "61cd5173-b5d3-4cd7-a8ea-e4300054f364"). InnerVolumeSpecName "kube-api-access-fml4k". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:01:06 crc kubenswrapper[4732]: I0402 14:01:06.059030 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61cd5173-b5d3-4cd7-a8ea-e4300054f364-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "61cd5173-b5d3-4cd7-a8ea-e4300054f364" (UID: "61cd5173-b5d3-4cd7-a8ea-e4300054f364"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:01:06 crc kubenswrapper[4732]: I0402 14:01:06.063748 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61cd5173-b5d3-4cd7-a8ea-e4300054f364-config-data" (OuterVolumeSpecName: "config-data") pod "61cd5173-b5d3-4cd7-a8ea-e4300054f364" (UID: "61cd5173-b5d3-4cd7-a8ea-e4300054f364"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:01:06 crc kubenswrapper[4732]: I0402 14:01:06.096106 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61cd5173-b5d3-4cd7-a8ea-e4300054f364-config-data\") on node \"crc\" DevicePath \"\"" Apr 02 14:01:06 crc kubenswrapper[4732]: I0402 14:01:06.096155 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61cd5173-b5d3-4cd7-a8ea-e4300054f364-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 02 14:01:06 crc kubenswrapper[4732]: I0402 14:01:06.096168 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fml4k\" (UniqueName: \"kubernetes.io/projected/61cd5173-b5d3-4cd7-a8ea-e4300054f364-kube-api-access-fml4k\") on node \"crc\" DevicePath \"\"" Apr 02 14:01:06 crc kubenswrapper[4732]: I0402 14:01:06.096179 4732 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/61cd5173-b5d3-4cd7-a8ea-e4300054f364-fernet-keys\") on node \"crc\" DevicePath \"\"" Apr 02 14:01:06 crc kubenswrapper[4732]: I0402 14:01:06.533293 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29585641-zldph" event={"ID":"61cd5173-b5d3-4cd7-a8ea-e4300054f364","Type":"ContainerDied","Data":"93d96bcafaec09d84be8e4f1592fd5620ed2a1b33ac0eb5820af16d96d8f5022"} Apr 02 14:01:06 crc kubenswrapper[4732]: I0402 14:01:06.533647 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93d96bcafaec09d84be8e4f1592fd5620ed2a1b33ac0eb5820af16d96d8f5022" Apr 02 14:01:06 crc kubenswrapper[4732]: I0402 14:01:06.533327 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29585641-zldph" Apr 02 14:01:06 crc kubenswrapper[4732]: I0402 14:01:06.596744 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-59fb764b6d-vml5x" podUID="80fe4580-a48f-4cba-b3b3-d6ccbc7f0525" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.149:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.149:8443: connect: connection refused" Apr 02 14:01:10 crc kubenswrapper[4732]: I0402 14:01:10.025982 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Apr 02 14:01:10 crc kubenswrapper[4732]: E0402 14:01:10.027016 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c0b34b5-36cf-4f5d-91f2-618ff3a41adb" containerName="barbican-api" Apr 02 14:01:10 crc kubenswrapper[4732]: I0402 14:01:10.027033 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c0b34b5-36cf-4f5d-91f2-618ff3a41adb" containerName="barbican-api" Apr 02 14:01:10 crc kubenswrapper[4732]: E0402 14:01:10.027057 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61cd5173-b5d3-4cd7-a8ea-e4300054f364" containerName="keystone-cron" Apr 02 14:01:10 crc kubenswrapper[4732]: I0402 14:01:10.027065 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="61cd5173-b5d3-4cd7-a8ea-e4300054f364" containerName="keystone-cron" Apr 02 14:01:10 crc kubenswrapper[4732]: E0402 14:01:10.027079 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c0b34b5-36cf-4f5d-91f2-618ff3a41adb" containerName="barbican-api-log" Apr 02 14:01:10 crc kubenswrapper[4732]: I0402 14:01:10.027087 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c0b34b5-36cf-4f5d-91f2-618ff3a41adb" containerName="barbican-api-log" Apr 02 14:01:10 crc kubenswrapper[4732]: I0402 14:01:10.027280 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c0b34b5-36cf-4f5d-91f2-618ff3a41adb" containerName="barbican-api" Apr 02 
14:01:10 crc kubenswrapper[4732]: I0402 14:01:10.027297 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c0b34b5-36cf-4f5d-91f2-618ff3a41adb" containerName="barbican-api-log" Apr 02 14:01:10 crc kubenswrapper[4732]: I0402 14:01:10.027315 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="61cd5173-b5d3-4cd7-a8ea-e4300054f364" containerName="keystone-cron" Apr 02 14:01:10 crc kubenswrapper[4732]: I0402 14:01:10.028005 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Apr 02 14:01:10 crc kubenswrapper[4732]: I0402 14:01:10.037022 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Apr 02 14:01:10 crc kubenswrapper[4732]: I0402 14:01:10.038175 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Apr 02 14:01:10 crc kubenswrapper[4732]: I0402 14:01:10.038355 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-4lkrh" Apr 02 14:01:10 crc kubenswrapper[4732]: I0402 14:01:10.047547 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Apr 02 14:01:10 crc kubenswrapper[4732]: I0402 14:01:10.175251 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/66ae86e8-597c-4fdb-b0da-283cf37afba2-openstack-config\") pod \"openstackclient\" (UID: \"66ae86e8-597c-4fdb-b0da-283cf37afba2\") " pod="openstack/openstackclient" Apr 02 14:01:10 crc kubenswrapper[4732]: I0402 14:01:10.175887 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98hpn\" (UniqueName: \"kubernetes.io/projected/66ae86e8-597c-4fdb-b0da-283cf37afba2-kube-api-access-98hpn\") pod \"openstackclient\" (UID: \"66ae86e8-597c-4fdb-b0da-283cf37afba2\") " 
pod="openstack/openstackclient"
Apr 02 14:01:10 crc kubenswrapper[4732]: I0402 14:01:10.176045 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/66ae86e8-597c-4fdb-b0da-283cf37afba2-openstack-config-secret\") pod \"openstackclient\" (UID: \"66ae86e8-597c-4fdb-b0da-283cf37afba2\") " pod="openstack/openstackclient"
Apr 02 14:01:10 crc kubenswrapper[4732]: I0402 14:01:10.176243 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66ae86e8-597c-4fdb-b0da-283cf37afba2-combined-ca-bundle\") pod \"openstackclient\" (UID: \"66ae86e8-597c-4fdb-b0da-283cf37afba2\") " pod="openstack/openstackclient"
Apr 02 14:01:10 crc kubenswrapper[4732]: I0402 14:01:10.277276 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Apr 02 14:01:10 crc kubenswrapper[4732]: I0402 14:01:10.278420 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/66ae86e8-597c-4fdb-b0da-283cf37afba2-openstack-config\") pod \"openstackclient\" (UID: \"66ae86e8-597c-4fdb-b0da-283cf37afba2\") " pod="openstack/openstackclient"
Apr 02 14:01:10 crc kubenswrapper[4732]: I0402 14:01:10.278661 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98hpn\" (UniqueName: \"kubernetes.io/projected/66ae86e8-597c-4fdb-b0da-283cf37afba2-kube-api-access-98hpn\") pod \"openstackclient\" (UID: \"66ae86e8-597c-4fdb-b0da-283cf37afba2\") " pod="openstack/openstackclient"
Apr 02 14:01:10 crc kubenswrapper[4732]: I0402 14:01:10.278724 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/66ae86e8-597c-4fdb-b0da-283cf37afba2-openstack-config-secret\") pod \"openstackclient\" (UID: \"66ae86e8-597c-4fdb-b0da-283cf37afba2\") " pod="openstack/openstackclient"
Apr 02 14:01:10 crc kubenswrapper[4732]: I0402 14:01:10.278868 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66ae86e8-597c-4fdb-b0da-283cf37afba2-combined-ca-bundle\") pod \"openstackclient\" (UID: \"66ae86e8-597c-4fdb-b0da-283cf37afba2\") " pod="openstack/openstackclient"
Apr 02 14:01:10 crc kubenswrapper[4732]: I0402 14:01:10.279332 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/66ae86e8-597c-4fdb-b0da-283cf37afba2-openstack-config\") pod \"openstackclient\" (UID: \"66ae86e8-597c-4fdb-b0da-283cf37afba2\") " pod="openstack/openstackclient"
Apr 02 14:01:10 crc kubenswrapper[4732]: I0402 14:01:10.286850 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/66ae86e8-597c-4fdb-b0da-283cf37afba2-openstack-config-secret\") pod \"openstackclient\" (UID: \"66ae86e8-597c-4fdb-b0da-283cf37afba2\") " pod="openstack/openstackclient"
Apr 02 14:01:10 crc kubenswrapper[4732]: I0402 14:01:10.286983 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66ae86e8-597c-4fdb-b0da-283cf37afba2-combined-ca-bundle\") pod \"openstackclient\" (UID: \"66ae86e8-597c-4fdb-b0da-283cf37afba2\") " pod="openstack/openstackclient"
Apr 02 14:01:10 crc kubenswrapper[4732]: I0402 14:01:10.298891 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98hpn\" (UniqueName: \"kubernetes.io/projected/66ae86e8-597c-4fdb-b0da-283cf37afba2-kube-api-access-98hpn\") pod \"openstackclient\" (UID: \"66ae86e8-597c-4fdb-b0da-283cf37afba2\") " pod="openstack/openstackclient"
Apr 02 14:01:10 crc kubenswrapper[4732]: I0402 14:01:10.346604 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Apr 02 14:01:10 crc kubenswrapper[4732]: I0402 14:01:10.844062 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.185101 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-9f57ff6c-7m8sr"]
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.186975 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-9f57ff6c-7m8sr"
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.189433 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc"
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.189881 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.195124 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc"
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.202332 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-9f57ff6c-7m8sr"]
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.276089 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.277291 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c0e53540-98b3-463a-9611-a48c2fbfc0f5" containerName="ceilometer-central-agent" containerID="cri-o://f9dbf40bc89d07703caf265aa216a0d8f8682ac26fc7209514eb887f20e06559" gracePeriod=30
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.277639 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c0e53540-98b3-463a-9611-a48c2fbfc0f5" containerName="ceilometer-notification-agent" containerID="cri-o://70f43e416f820f0faed0922b41d00581677129965f3e933afec756d385dec3cf" gracePeriod=30
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.277653 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c0e53540-98b3-463a-9611-a48c2fbfc0f5" containerName="proxy-httpd" containerID="cri-o://e9c0400338b7d02a8e2c26227681c2de2c1fcd12163690172f04be090f9a7be0" gracePeriod=30
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.277631 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c0e53540-98b3-463a-9611-a48c2fbfc0f5" containerName="sg-core" containerID="cri-o://bdd7bd8126ffb3128ae660abacfacf4439445b786a5f93e6bf789109a2cab1eb" gracePeriod=30
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.290859 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="c0e53540-98b3-463a-9611-a48c2fbfc0f5" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.165:3000/\": EOF"
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.302129 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f6ffca1-ce91-4e20-8cbc-38a3eab1616e-combined-ca-bundle\") pod \"swift-proxy-9f57ff6c-7m8sr\" (UID: \"7f6ffca1-ce91-4e20-8cbc-38a3eab1616e\") " pod="openstack/swift-proxy-9f57ff6c-7m8sr"
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.302225 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f6ffca1-ce91-4e20-8cbc-38a3eab1616e-public-tls-certs\") pod \"swift-proxy-9f57ff6c-7m8sr\" (UID: \"7f6ffca1-ce91-4e20-8cbc-38a3eab1616e\") " pod="openstack/swift-proxy-9f57ff6c-7m8sr"
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.302293 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f6ffca1-ce91-4e20-8cbc-38a3eab1616e-log-httpd\") pod \"swift-proxy-9f57ff6c-7m8sr\" (UID: \"7f6ffca1-ce91-4e20-8cbc-38a3eab1616e\") " pod="openstack/swift-proxy-9f57ff6c-7m8sr"
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.302337 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fn2bm\" (UniqueName: \"kubernetes.io/projected/7f6ffca1-ce91-4e20-8cbc-38a3eab1616e-kube-api-access-fn2bm\") pod \"swift-proxy-9f57ff6c-7m8sr\" (UID: \"7f6ffca1-ce91-4e20-8cbc-38a3eab1616e\") " pod="openstack/swift-proxy-9f57ff6c-7m8sr"
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.302391 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f6ffca1-ce91-4e20-8cbc-38a3eab1616e-internal-tls-certs\") pod \"swift-proxy-9f57ff6c-7m8sr\" (UID: \"7f6ffca1-ce91-4e20-8cbc-38a3eab1616e\") " pod="openstack/swift-proxy-9f57ff6c-7m8sr"
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.302435 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f6ffca1-ce91-4e20-8cbc-38a3eab1616e-config-data\") pod \"swift-proxy-9f57ff6c-7m8sr\" (UID: \"7f6ffca1-ce91-4e20-8cbc-38a3eab1616e\") " pod="openstack/swift-proxy-9f57ff6c-7m8sr"
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.302456 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7f6ffca1-ce91-4e20-8cbc-38a3eab1616e-etc-swift\") pod \"swift-proxy-9f57ff6c-7m8sr\" (UID: \"7f6ffca1-ce91-4e20-8cbc-38a3eab1616e\") " pod="openstack/swift-proxy-9f57ff6c-7m8sr"
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.302504 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f6ffca1-ce91-4e20-8cbc-38a3eab1616e-run-httpd\") pod \"swift-proxy-9f57ff6c-7m8sr\" (UID: \"7f6ffca1-ce91-4e20-8cbc-38a3eab1616e\") " pod="openstack/swift-proxy-9f57ff6c-7m8sr"
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.404745 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f6ffca1-ce91-4e20-8cbc-38a3eab1616e-public-tls-certs\") pod \"swift-proxy-9f57ff6c-7m8sr\" (UID: \"7f6ffca1-ce91-4e20-8cbc-38a3eab1616e\") " pod="openstack/swift-proxy-9f57ff6c-7m8sr"
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.404832 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f6ffca1-ce91-4e20-8cbc-38a3eab1616e-log-httpd\") pod \"swift-proxy-9f57ff6c-7m8sr\" (UID: \"7f6ffca1-ce91-4e20-8cbc-38a3eab1616e\") " pod="openstack/swift-proxy-9f57ff6c-7m8sr"
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.404872 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fn2bm\" (UniqueName: \"kubernetes.io/projected/7f6ffca1-ce91-4e20-8cbc-38a3eab1616e-kube-api-access-fn2bm\") pod \"swift-proxy-9f57ff6c-7m8sr\" (UID: \"7f6ffca1-ce91-4e20-8cbc-38a3eab1616e\") " pod="openstack/swift-proxy-9f57ff6c-7m8sr"
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.404922 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f6ffca1-ce91-4e20-8cbc-38a3eab1616e-internal-tls-certs\") pod \"swift-proxy-9f57ff6c-7m8sr\" (UID: \"7f6ffca1-ce91-4e20-8cbc-38a3eab1616e\") " pod="openstack/swift-proxy-9f57ff6c-7m8sr"
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.404963 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f6ffca1-ce91-4e20-8cbc-38a3eab1616e-config-data\") pod \"swift-proxy-9f57ff6c-7m8sr\" (UID: \"7f6ffca1-ce91-4e20-8cbc-38a3eab1616e\") " pod="openstack/swift-proxy-9f57ff6c-7m8sr"
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.404983 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7f6ffca1-ce91-4e20-8cbc-38a3eab1616e-etc-swift\") pod \"swift-proxy-9f57ff6c-7m8sr\" (UID: \"7f6ffca1-ce91-4e20-8cbc-38a3eab1616e\") " pod="openstack/swift-proxy-9f57ff6c-7m8sr"
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.405022 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f6ffca1-ce91-4e20-8cbc-38a3eab1616e-run-httpd\") pod \"swift-proxy-9f57ff6c-7m8sr\" (UID: \"7f6ffca1-ce91-4e20-8cbc-38a3eab1616e\") " pod="openstack/swift-proxy-9f57ff6c-7m8sr"
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.405059 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f6ffca1-ce91-4e20-8cbc-38a3eab1616e-combined-ca-bundle\") pod \"swift-proxy-9f57ff6c-7m8sr\" (UID: \"7f6ffca1-ce91-4e20-8cbc-38a3eab1616e\") " pod="openstack/swift-proxy-9f57ff6c-7m8sr"
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.411484 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f6ffca1-ce91-4e20-8cbc-38a3eab1616e-log-httpd\") pod \"swift-proxy-9f57ff6c-7m8sr\" (UID: \"7f6ffca1-ce91-4e20-8cbc-38a3eab1616e\") " pod="openstack/swift-proxy-9f57ff6c-7m8sr"
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.412989 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f6ffca1-ce91-4e20-8cbc-38a3eab1616e-run-httpd\") pod \"swift-proxy-9f57ff6c-7m8sr\" (UID: \"7f6ffca1-ce91-4e20-8cbc-38a3eab1616e\") " pod="openstack/swift-proxy-9f57ff6c-7m8sr"
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.415546 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f6ffca1-ce91-4e20-8cbc-38a3eab1616e-combined-ca-bundle\") pod \"swift-proxy-9f57ff6c-7m8sr\" (UID: \"7f6ffca1-ce91-4e20-8cbc-38a3eab1616e\") " pod="openstack/swift-proxy-9f57ff6c-7m8sr"
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.417412 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7f6ffca1-ce91-4e20-8cbc-38a3eab1616e-etc-swift\") pod \"swift-proxy-9f57ff6c-7m8sr\" (UID: \"7f6ffca1-ce91-4e20-8cbc-38a3eab1616e\") " pod="openstack/swift-proxy-9f57ff6c-7m8sr"
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.417569 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f6ffca1-ce91-4e20-8cbc-38a3eab1616e-config-data\") pod \"swift-proxy-9f57ff6c-7m8sr\" (UID: \"7f6ffca1-ce91-4e20-8cbc-38a3eab1616e\") " pod="openstack/swift-proxy-9f57ff6c-7m8sr"
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.419509 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f6ffca1-ce91-4e20-8cbc-38a3eab1616e-internal-tls-certs\") pod \"swift-proxy-9f57ff6c-7m8sr\" (UID: \"7f6ffca1-ce91-4e20-8cbc-38a3eab1616e\") " pod="openstack/swift-proxy-9f57ff6c-7m8sr"
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.439441 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f6ffca1-ce91-4e20-8cbc-38a3eab1616e-public-tls-certs\") pod \"swift-proxy-9f57ff6c-7m8sr\" (UID: \"7f6ffca1-ce91-4e20-8cbc-38a3eab1616e\") " pod="openstack/swift-proxy-9f57ff6c-7m8sr"
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.447688 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fn2bm\" (UniqueName: \"kubernetes.io/projected/7f6ffca1-ce91-4e20-8cbc-38a3eab1616e-kube-api-access-fn2bm\") pod \"swift-proxy-9f57ff6c-7m8sr\" (UID: \"7f6ffca1-ce91-4e20-8cbc-38a3eab1616e\") " pod="openstack/swift-proxy-9f57ff6c-7m8sr"
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.510062 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-9f57ff6c-7m8sr"
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.524546 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-gcw8g"]
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.540374 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-gcw8g"
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.544024 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-gcw8g"]
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.609496 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0267b36-acec-4035-9bdd-b19758f45275-operator-scripts\") pod \"nova-api-db-create-gcw8g\" (UID: \"d0267b36-acec-4035-9bdd-b19758f45275\") " pod="openstack/nova-api-db-create-gcw8g"
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.609712 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbvwb\" (UniqueName: \"kubernetes.io/projected/d0267b36-acec-4035-9bdd-b19758f45275-kube-api-access-gbvwb\") pod \"nova-api-db-create-gcw8g\" (UID: \"d0267b36-acec-4035-9bdd-b19758f45275\") " pod="openstack/nova-api-db-create-gcw8g"
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.628327 4732 generic.go:334] "Generic (PLEG): container finished" podID="c0e53540-98b3-463a-9611-a48c2fbfc0f5" containerID="e9c0400338b7d02a8e2c26227681c2de2c1fcd12163690172f04be090f9a7be0" exitCode=0
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.628368 4732 generic.go:334] "Generic (PLEG): container finished" podID="c0e53540-98b3-463a-9611-a48c2fbfc0f5" containerID="bdd7bd8126ffb3128ae660abacfacf4439445b786a5f93e6bf789109a2cab1eb" exitCode=2
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.628446 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c0e53540-98b3-463a-9611-a48c2fbfc0f5","Type":"ContainerDied","Data":"e9c0400338b7d02a8e2c26227681c2de2c1fcd12163690172f04be090f9a7be0"}
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.628480 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c0e53540-98b3-463a-9611-a48c2fbfc0f5","Type":"ContainerDied","Data":"bdd7bd8126ffb3128ae660abacfacf4439445b786a5f93e6bf789109a2cab1eb"}
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.635694 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-2bf58"]
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.637220 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-2bf58"
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.643116 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"66ae86e8-597c-4fdb-b0da-283cf37afba2","Type":"ContainerStarted","Data":"1eb17696fe81b0f75a307b80dc9e2a5c277a63863a7cb75b9049f7d3ebbe3e56"}
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.649791 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-2bf58"]
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.721088 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtv6h\" (UniqueName: \"kubernetes.io/projected/cca116f4-1c5a-4dc9-966d-5033ed344c2f-kube-api-access-gtv6h\") pod \"nova-cell0-db-create-2bf58\" (UID: \"cca116f4-1c5a-4dc9-966d-5033ed344c2f\") " pod="openstack/nova-cell0-db-create-2bf58"
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.721176 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbvwb\" (UniqueName: \"kubernetes.io/projected/d0267b36-acec-4035-9bdd-b19758f45275-kube-api-access-gbvwb\") pod \"nova-api-db-create-gcw8g\" (UID: \"d0267b36-acec-4035-9bdd-b19758f45275\") " pod="openstack/nova-api-db-create-gcw8g"
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.721494 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0267b36-acec-4035-9bdd-b19758f45275-operator-scripts\") pod \"nova-api-db-create-gcw8g\" (UID: \"d0267b36-acec-4035-9bdd-b19758f45275\") " pod="openstack/nova-api-db-create-gcw8g"
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.721523 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cca116f4-1c5a-4dc9-966d-5033ed344c2f-operator-scripts\") pod \"nova-cell0-db-create-2bf58\" (UID: \"cca116f4-1c5a-4dc9-966d-5033ed344c2f\") " pod="openstack/nova-cell0-db-create-2bf58"
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.722180 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0267b36-acec-4035-9bdd-b19758f45275-operator-scripts\") pod \"nova-api-db-create-gcw8g\" (UID: \"d0267b36-acec-4035-9bdd-b19758f45275\") " pod="openstack/nova-api-db-create-gcw8g"
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.744282 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbvwb\" (UniqueName: \"kubernetes.io/projected/d0267b36-acec-4035-9bdd-b19758f45275-kube-api-access-gbvwb\") pod \"nova-api-db-create-gcw8g\" (UID: \"d0267b36-acec-4035-9bdd-b19758f45275\") " pod="openstack/nova-api-db-create-gcw8g"
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.762482 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-1cfc-account-create-update-vtn6k"]
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.764596 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-1cfc-account-create-update-vtn6k"
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.770044 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret"
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.773235 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-1cfc-account-create-update-vtn6k"]
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.780527 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-265md"]
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.781675 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-265md"
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.805525 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-265md"]
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.828646 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3edb7dea-2bf7-4dcc-80e0-54c59916152c-operator-scripts\") pod \"nova-api-1cfc-account-create-update-vtn6k\" (UID: \"3edb7dea-2bf7-4dcc-80e0-54c59916152c\") " pod="openstack/nova-api-1cfc-account-create-update-vtn6k"
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.828686 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cca116f4-1c5a-4dc9-966d-5033ed344c2f-operator-scripts\") pod \"nova-cell0-db-create-2bf58\" (UID: \"cca116f4-1c5a-4dc9-966d-5033ed344c2f\") " pod="openstack/nova-cell0-db-create-2bf58"
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.828736 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fgr9\" (UniqueName: \"kubernetes.io/projected/3edb7dea-2bf7-4dcc-80e0-54c59916152c-kube-api-access-2fgr9\") pod \"nova-api-1cfc-account-create-update-vtn6k\" (UID: \"3edb7dea-2bf7-4dcc-80e0-54c59916152c\") " pod="openstack/nova-api-1cfc-account-create-update-vtn6k"
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.828819 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtv6h\" (UniqueName: \"kubernetes.io/projected/cca116f4-1c5a-4dc9-966d-5033ed344c2f-kube-api-access-gtv6h\") pod \"nova-cell0-db-create-2bf58\" (UID: \"cca116f4-1c5a-4dc9-966d-5033ed344c2f\") " pod="openstack/nova-cell0-db-create-2bf58"
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.829965 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cca116f4-1c5a-4dc9-966d-5033ed344c2f-operator-scripts\") pod \"nova-cell0-db-create-2bf58\" (UID: \"cca116f4-1c5a-4dc9-966d-5033ed344c2f\") " pod="openstack/nova-cell0-db-create-2bf58"
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.859307 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtv6h\" (UniqueName: \"kubernetes.io/projected/cca116f4-1c5a-4dc9-966d-5033ed344c2f-kube-api-access-gtv6h\") pod \"nova-cell0-db-create-2bf58\" (UID: \"cca116f4-1c5a-4dc9-966d-5033ed344c2f\") " pod="openstack/nova-cell0-db-create-2bf58"
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.930739 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3012ae69-b011-46df-a7c0-36845efb7172-operator-scripts\") pod \"nova-cell1-db-create-265md\" (UID: \"3012ae69-b011-46df-a7c0-36845efb7172\") " pod="openstack/nova-cell1-db-create-265md"
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.930788 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3edb7dea-2bf7-4dcc-80e0-54c59916152c-operator-scripts\") pod \"nova-api-1cfc-account-create-update-vtn6k\" (UID: \"3edb7dea-2bf7-4dcc-80e0-54c59916152c\") " pod="openstack/nova-api-1cfc-account-create-update-vtn6k"
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.930836 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fgr9\" (UniqueName: \"kubernetes.io/projected/3edb7dea-2bf7-4dcc-80e0-54c59916152c-kube-api-access-2fgr9\") pod \"nova-api-1cfc-account-create-update-vtn6k\" (UID: \"3edb7dea-2bf7-4dcc-80e0-54c59916152c\") " pod="openstack/nova-api-1cfc-account-create-update-vtn6k"
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.930876 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2l6t9\" (UniqueName: \"kubernetes.io/projected/3012ae69-b011-46df-a7c0-36845efb7172-kube-api-access-2l6t9\") pod \"nova-cell1-db-create-265md\" (UID: \"3012ae69-b011-46df-a7c0-36845efb7172\") " pod="openstack/nova-cell1-db-create-265md"
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.932185 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3edb7dea-2bf7-4dcc-80e0-54c59916152c-operator-scripts\") pod \"nova-api-1cfc-account-create-update-vtn6k\" (UID: \"3edb7dea-2bf7-4dcc-80e0-54c59916152c\") " pod="openstack/nova-api-1cfc-account-create-update-vtn6k"
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.941898 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-e441-account-create-update-7222l"]
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.943229 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-e441-account-create-update-7222l"
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.948993 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret"
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.955507 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-e441-account-create-update-7222l"]
Apr 02 14:01:11 crc kubenswrapper[4732]: I0402 14:01:11.956331 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fgr9\" (UniqueName: \"kubernetes.io/projected/3edb7dea-2bf7-4dcc-80e0-54c59916152c-kube-api-access-2fgr9\") pod \"nova-api-1cfc-account-create-update-vtn6k\" (UID: \"3edb7dea-2bf7-4dcc-80e0-54c59916152c\") " pod="openstack/nova-api-1cfc-account-create-update-vtn6k"
Apr 02 14:01:12 crc kubenswrapper[4732]: I0402 14:01:12.020710 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-gcw8g"
Apr 02 14:01:12 crc kubenswrapper[4732]: I0402 14:01:12.032465 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/58580fa1-1d1a-4c30-9e7a-d0e464c1487f-operator-scripts\") pod \"nova-cell0-e441-account-create-update-7222l\" (UID: \"58580fa1-1d1a-4c30-9e7a-d0e464c1487f\") " pod="openstack/nova-cell0-e441-account-create-update-7222l"
Apr 02 14:01:12 crc kubenswrapper[4732]: I0402 14:01:12.032524 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3012ae69-b011-46df-a7c0-36845efb7172-operator-scripts\") pod \"nova-cell1-db-create-265md\" (UID: \"3012ae69-b011-46df-a7c0-36845efb7172\") " pod="openstack/nova-cell1-db-create-265md"
Apr 02 14:01:12 crc kubenswrapper[4732]: I0402 14:01:12.032702 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2l6t9\" (UniqueName: \"kubernetes.io/projected/3012ae69-b011-46df-a7c0-36845efb7172-kube-api-access-2l6t9\") pod \"nova-cell1-db-create-265md\" (UID: \"3012ae69-b011-46df-a7c0-36845efb7172\") " pod="openstack/nova-cell1-db-create-265md"
Apr 02 14:01:12 crc kubenswrapper[4732]: I0402 14:01:12.032747 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzclj\" (UniqueName: \"kubernetes.io/projected/58580fa1-1d1a-4c30-9e7a-d0e464c1487f-kube-api-access-pzclj\") pod \"nova-cell0-e441-account-create-update-7222l\" (UID: \"58580fa1-1d1a-4c30-9e7a-d0e464c1487f\") " pod="openstack/nova-cell0-e441-account-create-update-7222l"
Apr 02 14:01:12 crc kubenswrapper[4732]: I0402 14:01:12.033605 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3012ae69-b011-46df-a7c0-36845efb7172-operator-scripts\") pod \"nova-cell1-db-create-265md\" (UID: \"3012ae69-b011-46df-a7c0-36845efb7172\") " pod="openstack/nova-cell1-db-create-265md"
Apr 02 14:01:12 crc kubenswrapper[4732]: I0402 14:01:12.054367 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2l6t9\" (UniqueName: \"kubernetes.io/projected/3012ae69-b011-46df-a7c0-36845efb7172-kube-api-access-2l6t9\") pod \"nova-cell1-db-create-265md\" (UID: \"3012ae69-b011-46df-a7c0-36845efb7172\") " pod="openstack/nova-cell1-db-create-265md"
Apr 02 14:01:12 crc kubenswrapper[4732]: I0402 14:01:12.078930 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-2bf58"
Apr 02 14:01:12 crc kubenswrapper[4732]: I0402 14:01:12.131022 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-1cfc-account-create-update-vtn6k"
Apr 02 14:01:12 crc kubenswrapper[4732]: I0402 14:01:12.136748 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/58580fa1-1d1a-4c30-9e7a-d0e464c1487f-operator-scripts\") pod \"nova-cell0-e441-account-create-update-7222l\" (UID: \"58580fa1-1d1a-4c30-9e7a-d0e464c1487f\") " pod="openstack/nova-cell0-e441-account-create-update-7222l"
Apr 02 14:01:12 crc kubenswrapper[4732]: I0402 14:01:12.136873 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzclj\" (UniqueName: \"kubernetes.io/projected/58580fa1-1d1a-4c30-9e7a-d0e464c1487f-kube-api-access-pzclj\") pod \"nova-cell0-e441-account-create-update-7222l\" (UID: \"58580fa1-1d1a-4c30-9e7a-d0e464c1487f\") " pod="openstack/nova-cell0-e441-account-create-update-7222l"
Apr 02 14:01:12 crc kubenswrapper[4732]: I0402 14:01:12.137984 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/58580fa1-1d1a-4c30-9e7a-d0e464c1487f-operator-scripts\") pod \"nova-cell0-e441-account-create-update-7222l\" (UID: \"58580fa1-1d1a-4c30-9e7a-d0e464c1487f\") " pod="openstack/nova-cell0-e441-account-create-update-7222l"
Apr 02 14:01:12 crc kubenswrapper[4732]: I0402 14:01:12.141145 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-e3c0-account-create-update-5k7n4"]
Apr 02 14:01:12 crc kubenswrapper[4732]: I0402 14:01:12.147292 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-e3c0-account-create-update-5k7n4"
Apr 02 14:01:12 crc kubenswrapper[4732]: I0402 14:01:12.153457 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret"
Apr 02 14:01:12 crc kubenswrapper[4732]: I0402 14:01:12.159182 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzclj\" (UniqueName: \"kubernetes.io/projected/58580fa1-1d1a-4c30-9e7a-d0e464c1487f-kube-api-access-pzclj\") pod \"nova-cell0-e441-account-create-update-7222l\" (UID: \"58580fa1-1d1a-4c30-9e7a-d0e464c1487f\") " pod="openstack/nova-cell0-e441-account-create-update-7222l"
Apr 02 14:01:12 crc kubenswrapper[4732]: I0402 14:01:12.160974 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-265md"
Apr 02 14:01:12 crc kubenswrapper[4732]: I0402 14:01:12.164953 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-e3c0-account-create-update-5k7n4"]
Apr 02 14:01:12 crc kubenswrapper[4732]: I0402 14:01:12.239769 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zgtq\" (UniqueName: \"kubernetes.io/projected/99e5e7a5-f66b-45c2-881c-60507bfa4c25-kube-api-access-9zgtq\") pod \"nova-cell1-e3c0-account-create-update-5k7n4\" (UID: \"99e5e7a5-f66b-45c2-881c-60507bfa4c25\") " pod="openstack/nova-cell1-e3c0-account-create-update-5k7n4"
Apr 02 14:01:12 crc kubenswrapper[4732]: I0402 14:01:12.239845 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/99e5e7a5-f66b-45c2-881c-60507bfa4c25-operator-scripts\") pod \"nova-cell1-e3c0-account-create-update-5k7n4\" (UID: \"99e5e7a5-f66b-45c2-881c-60507bfa4c25\") " pod="openstack/nova-cell1-e3c0-account-create-update-5k7n4"
Apr 02 14:01:12 crc kubenswrapper[4732]: I0402 14:01:12.283102 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-e441-account-create-update-7222l"
Apr 02 14:01:12 crc kubenswrapper[4732]: I0402 14:01:12.292310 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5688fc477d-p59pf"
Apr 02 14:01:12 crc kubenswrapper[4732]: I0402 14:01:12.342690 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zgtq\" (UniqueName: \"kubernetes.io/projected/99e5e7a5-f66b-45c2-881c-60507bfa4c25-kube-api-access-9zgtq\") pod \"nova-cell1-e3c0-account-create-update-5k7n4\" (UID: \"99e5e7a5-f66b-45c2-881c-60507bfa4c25\") " pod="openstack/nova-cell1-e3c0-account-create-update-5k7n4"
Apr 02 14:01:12 crc kubenswrapper[4732]: I0402 14:01:12.342795 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/99e5e7a5-f66b-45c2-881c-60507bfa4c25-operator-scripts\") pod \"nova-cell1-e3c0-account-create-update-5k7n4\" (UID: \"99e5e7a5-f66b-45c2-881c-60507bfa4c25\") " pod="openstack/nova-cell1-e3c0-account-create-update-5k7n4"
Apr 02 14:01:12 crc kubenswrapper[4732]: I0402 14:01:12.345357 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/99e5e7a5-f66b-45c2-881c-60507bfa4c25-operator-scripts\") pod \"nova-cell1-e3c0-account-create-update-5k7n4\" (UID: \"99e5e7a5-f66b-45c2-881c-60507bfa4c25\") " pod="openstack/nova-cell1-e3c0-account-create-update-5k7n4"
Apr 02 14:01:12 crc kubenswrapper[4732]: I0402 14:01:12.366797 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zgtq\" (UniqueName: \"kubernetes.io/projected/99e5e7a5-f66b-45c2-881c-60507bfa4c25-kube-api-access-9zgtq\") pod \"nova-cell1-e3c0-account-create-update-5k7n4\" (UID: \"99e5e7a5-f66b-45c2-881c-60507bfa4c25\") " pod="openstack/nova-cell1-e3c0-account-create-update-5k7n4"
Apr 02 14:01:12 crc kubenswrapper[4732]: I0402 14:01:12.372078 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-9f57ff6c-7m8sr"]
Apr 02 14:01:12 crc kubenswrapper[4732]: I0402 14:01:12.372604 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5688fc477d-p59pf"
Apr 02 14:01:12 crc kubenswrapper[4732]: W0402 14:01:12.392600 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f6ffca1_ce91_4e20_8cbc_38a3eab1616e.slice/crio-6084c2ea3240f6557004c86043c52f66574d6979e18f8a4163b683acf0392dbb WatchSource:0}: Error finding container 6084c2ea3240f6557004c86043c52f66574d6979e18f8a4163b683acf0392dbb: Status 404 returned error can't find the container with id 6084c2ea3240f6557004c86043c52f66574d6979e18f8a4163b683acf0392dbb
Apr 02 14:01:12 crc kubenswrapper[4732]: I0402 14:01:12.517674 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-e3c0-account-create-update-5k7n4"
Apr 02 14:01:12 crc kubenswrapper[4732]: I0402 14:01:12.663528 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-9f57ff6c-7m8sr" event={"ID":"7f6ffca1-ce91-4e20-8cbc-38a3eab1616e","Type":"ContainerStarted","Data":"6084c2ea3240f6557004c86043c52f66574d6979e18f8a4163b683acf0392dbb"}
Apr 02 14:01:12 crc kubenswrapper[4732]: I0402 14:01:12.673152 4732 generic.go:334] "Generic (PLEG): container finished" podID="c0e53540-98b3-463a-9611-a48c2fbfc0f5" containerID="70f43e416f820f0faed0922b41d00581677129965f3e933afec756d385dec3cf" exitCode=0
Apr 02 14:01:12 crc kubenswrapper[4732]: I0402 14:01:12.673183 4732 generic.go:334] "Generic (PLEG): container finished" podID="c0e53540-98b3-463a-9611-a48c2fbfc0f5" containerID="f9dbf40bc89d07703caf265aa216a0d8f8682ac26fc7209514eb887f20e06559" exitCode=0
Apr 02 14:01:12 crc kubenswrapper[4732]: I0402 14:01:12.673552 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c0e53540-98b3-463a-9611-a48c2fbfc0f5","Type":"ContainerDied","Data":"70f43e416f820f0faed0922b41d00581677129965f3e933afec756d385dec3cf"}
Apr 02 14:01:12 crc kubenswrapper[4732]: I0402 14:01:12.673703 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c0e53540-98b3-463a-9611-a48c2fbfc0f5","Type":"ContainerDied","Data":"f9dbf40bc89d07703caf265aa216a0d8f8682ac26fc7209514eb887f20e06559"}
Apr 02 14:01:12 crc kubenswrapper[4732]: I0402 14:01:12.974596 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-gcw8g"]
Apr 02 14:01:12 crc kubenswrapper[4732]: I0402 14:01:12.995882 4732 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/ceilometer-0" Apr 02 14:01:13 crc kubenswrapper[4732]: I0402 14:01:13.087196 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0e53540-98b3-463a-9611-a48c2fbfc0f5-combined-ca-bundle\") pod \"c0e53540-98b3-463a-9611-a48c2fbfc0f5\" (UID: \"c0e53540-98b3-463a-9611-a48c2fbfc0f5\") " Apr 02 14:01:13 crc kubenswrapper[4732]: I0402 14:01:13.087260 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0e53540-98b3-463a-9611-a48c2fbfc0f5-scripts\") pod \"c0e53540-98b3-463a-9611-a48c2fbfc0f5\" (UID: \"c0e53540-98b3-463a-9611-a48c2fbfc0f5\") " Apr 02 14:01:13 crc kubenswrapper[4732]: I0402 14:01:13.087327 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kx4kr\" (UniqueName: \"kubernetes.io/projected/c0e53540-98b3-463a-9611-a48c2fbfc0f5-kube-api-access-kx4kr\") pod \"c0e53540-98b3-463a-9611-a48c2fbfc0f5\" (UID: \"c0e53540-98b3-463a-9611-a48c2fbfc0f5\") " Apr 02 14:01:13 crc kubenswrapper[4732]: I0402 14:01:13.087381 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0e53540-98b3-463a-9611-a48c2fbfc0f5-config-data\") pod \"c0e53540-98b3-463a-9611-a48c2fbfc0f5\" (UID: \"c0e53540-98b3-463a-9611-a48c2fbfc0f5\") " Apr 02 14:01:13 crc kubenswrapper[4732]: I0402 14:01:13.087418 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c0e53540-98b3-463a-9611-a48c2fbfc0f5-run-httpd\") pod \"c0e53540-98b3-463a-9611-a48c2fbfc0f5\" (UID: \"c0e53540-98b3-463a-9611-a48c2fbfc0f5\") " Apr 02 14:01:13 crc kubenswrapper[4732]: I0402 14:01:13.087440 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/c0e53540-98b3-463a-9611-a48c2fbfc0f5-sg-core-conf-yaml\") pod \"c0e53540-98b3-463a-9611-a48c2fbfc0f5\" (UID: \"c0e53540-98b3-463a-9611-a48c2fbfc0f5\") " Apr 02 14:01:13 crc kubenswrapper[4732]: I0402 14:01:13.087497 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c0e53540-98b3-463a-9611-a48c2fbfc0f5-log-httpd\") pod \"c0e53540-98b3-463a-9611-a48c2fbfc0f5\" (UID: \"c0e53540-98b3-463a-9611-a48c2fbfc0f5\") " Apr 02 14:01:13 crc kubenswrapper[4732]: I0402 14:01:13.088590 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0e53540-98b3-463a-9611-a48c2fbfc0f5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c0e53540-98b3-463a-9611-a48c2fbfc0f5" (UID: "c0e53540-98b3-463a-9611-a48c2fbfc0f5"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 14:01:13 crc kubenswrapper[4732]: I0402 14:01:13.088856 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0e53540-98b3-463a-9611-a48c2fbfc0f5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c0e53540-98b3-463a-9611-a48c2fbfc0f5" (UID: "c0e53540-98b3-463a-9611-a48c2fbfc0f5"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 14:01:13 crc kubenswrapper[4732]: I0402 14:01:13.096113 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0e53540-98b3-463a-9611-a48c2fbfc0f5-scripts" (OuterVolumeSpecName: "scripts") pod "c0e53540-98b3-463a-9611-a48c2fbfc0f5" (UID: "c0e53540-98b3-463a-9611-a48c2fbfc0f5"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:01:13 crc kubenswrapper[4732]: I0402 14:01:13.100700 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0e53540-98b3-463a-9611-a48c2fbfc0f5-kube-api-access-kx4kr" (OuterVolumeSpecName: "kube-api-access-kx4kr") pod "c0e53540-98b3-463a-9611-a48c2fbfc0f5" (UID: "c0e53540-98b3-463a-9611-a48c2fbfc0f5"). InnerVolumeSpecName "kube-api-access-kx4kr". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:01:13 crc kubenswrapper[4732]: I0402 14:01:13.165810 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0e53540-98b3-463a-9611-a48c2fbfc0f5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c0e53540-98b3-463a-9611-a48c2fbfc0f5" (UID: "c0e53540-98b3-463a-9611-a48c2fbfc0f5"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:01:13 crc kubenswrapper[4732]: I0402 14:01:13.190879 4732 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c0e53540-98b3-463a-9611-a48c2fbfc0f5-run-httpd\") on node \"crc\" DevicePath \"\"" Apr 02 14:01:13 crc kubenswrapper[4732]: I0402 14:01:13.190912 4732 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c0e53540-98b3-463a-9611-a48c2fbfc0f5-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Apr 02 14:01:13 crc kubenswrapper[4732]: I0402 14:01:13.190923 4732 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c0e53540-98b3-463a-9611-a48c2fbfc0f5-log-httpd\") on node \"crc\" DevicePath \"\"" Apr 02 14:01:13 crc kubenswrapper[4732]: I0402 14:01:13.190933 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0e53540-98b3-463a-9611-a48c2fbfc0f5-scripts\") on node \"crc\" DevicePath \"\"" 
Apr 02 14:01:13 crc kubenswrapper[4732]: I0402 14:01:13.190947 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kx4kr\" (UniqueName: \"kubernetes.io/projected/c0e53540-98b3-463a-9611-a48c2fbfc0f5-kube-api-access-kx4kr\") on node \"crc\" DevicePath \"\"" Apr 02 14:01:13 crc kubenswrapper[4732]: I0402 14:01:13.236589 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-265md"] Apr 02 14:01:13 crc kubenswrapper[4732]: I0402 14:01:13.245226 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-2bf58"] Apr 02 14:01:13 crc kubenswrapper[4732]: W0402 14:01:13.285861 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3012ae69_b011_46df_a7c0_36845efb7172.slice/crio-719eb9aa0d23a8eeb62eb1832780e9f0659f936ef6d65f7b139d9c6ecff3f83e WatchSource:0}: Error finding container 719eb9aa0d23a8eeb62eb1832780e9f0659f936ef6d65f7b139d9c6ecff3f83e: Status 404 returned error can't find the container with id 719eb9aa0d23a8eeb62eb1832780e9f0659f936ef6d65f7b139d9c6ecff3f83e Apr 02 14:01:13 crc kubenswrapper[4732]: I0402 14:01:13.341361 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0e53540-98b3-463a-9611-a48c2fbfc0f5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c0e53540-98b3-463a-9611-a48c2fbfc0f5" (UID: "c0e53540-98b3-463a-9611-a48c2fbfc0f5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:01:13 crc kubenswrapper[4732]: I0402 14:01:13.395829 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0e53540-98b3-463a-9611-a48c2fbfc0f5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 02 14:01:13 crc kubenswrapper[4732]: I0402 14:01:13.407323 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0e53540-98b3-463a-9611-a48c2fbfc0f5-config-data" (OuterVolumeSpecName: "config-data") pod "c0e53540-98b3-463a-9611-a48c2fbfc0f5" (UID: "c0e53540-98b3-463a-9611-a48c2fbfc0f5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:01:13 crc kubenswrapper[4732]: I0402 14:01:13.480852 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-1cfc-account-create-update-vtn6k"] Apr 02 14:01:13 crc kubenswrapper[4732]: I0402 14:01:13.497152 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0e53540-98b3-463a-9611-a48c2fbfc0f5-config-data\") on node \"crc\" DevicePath \"\"" Apr 02 14:01:13 crc kubenswrapper[4732]: I0402 14:01:13.502560 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-e441-account-create-update-7222l"] Apr 02 14:01:13 crc kubenswrapper[4732]: W0402 14:01:13.553048 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod58580fa1_1d1a_4c30_9e7a_d0e464c1487f.slice/crio-3df6c708d0b571e95f3cd5c4b55b6ae57bc79c00bd349831f04064c5c52f3003 WatchSource:0}: Error finding container 3df6c708d0b571e95f3cd5c4b55b6ae57bc79c00bd349831f04064c5c52f3003: Status 404 returned error can't find the container with id 3df6c708d0b571e95f3cd5c4b55b6ae57bc79c00bd349831f04064c5c52f3003 Apr 02 14:01:13 crc kubenswrapper[4732]: I0402 14:01:13.678889 4732 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-e3c0-account-create-update-5k7n4"] Apr 02 14:01:13 crc kubenswrapper[4732]: I0402 14:01:13.718812 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-2bf58" event={"ID":"cca116f4-1c5a-4dc9-966d-5033ed344c2f","Type":"ContainerStarted","Data":"ea2ef4b788fd8804b3c46379526db7e699f02f17786d9c350448c18d925997eb"} Apr 02 14:01:13 crc kubenswrapper[4732]: I0402 14:01:13.737015 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-265md" event={"ID":"3012ae69-b011-46df-a7c0-36845efb7172","Type":"ContainerStarted","Data":"719eb9aa0d23a8eeb62eb1832780e9f0659f936ef6d65f7b139d9c6ecff3f83e"} Apr 02 14:01:13 crc kubenswrapper[4732]: I0402 14:01:13.766676 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c0e53540-98b3-463a-9611-a48c2fbfc0f5","Type":"ContainerDied","Data":"e992f20926ac9ac17ef5bd416178fca892141381cc8ab3ce4a581e7afd28138f"} Apr 02 14:01:13 crc kubenswrapper[4732]: I0402 14:01:13.766725 4732 scope.go:117] "RemoveContainer" containerID="e9c0400338b7d02a8e2c26227681c2de2c1fcd12163690172f04be090f9a7be0" Apr 02 14:01:13 crc kubenswrapper[4732]: I0402 14:01:13.766832 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Apr 02 14:01:13 crc kubenswrapper[4732]: I0402 14:01:13.776802 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1cfc-account-create-update-vtn6k" event={"ID":"3edb7dea-2bf7-4dcc-80e0-54c59916152c","Type":"ContainerStarted","Data":"6202919de170483e4a0ad9ca5e6b24c78f293e5db46cdef825b4c86c72368ae8"} Apr 02 14:01:13 crc kubenswrapper[4732]: I0402 14:01:13.779213 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-e441-account-create-update-7222l" event={"ID":"58580fa1-1d1a-4c30-9e7a-d0e464c1487f","Type":"ContainerStarted","Data":"3df6c708d0b571e95f3cd5c4b55b6ae57bc79c00bd349831f04064c5c52f3003"} Apr 02 14:01:13 crc kubenswrapper[4732]: I0402 14:01:13.781770 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-gcw8g" event={"ID":"d0267b36-acec-4035-9bdd-b19758f45275","Type":"ContainerStarted","Data":"03f75e719d435b7310f14addd9962e52eccfb55e128e4bae6bf461c6b5c968bb"} Apr 02 14:01:13 crc kubenswrapper[4732]: I0402 14:01:13.781802 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-gcw8g" event={"ID":"d0267b36-acec-4035-9bdd-b19758f45275","Type":"ContainerStarted","Data":"cbe04223b2e294c1f3eb94bbf4774416843b9047e0d0573a2e971c2f2ebf0cd7"} Apr 02 14:01:13 crc kubenswrapper[4732]: I0402 14:01:13.808565 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-9f57ff6c-7m8sr" event={"ID":"7f6ffca1-ce91-4e20-8cbc-38a3eab1616e","Type":"ContainerStarted","Data":"525bcea9dc3b6c2fe5be3c2c8aa7a723426576edf5b76f64b52af4f961038201"} Apr 02 14:01:13 crc kubenswrapper[4732]: I0402 14:01:13.980665 4732 scope.go:117] "RemoveContainer" containerID="bdd7bd8126ffb3128ae660abacfacf4439445b786a5f93e6bf789109a2cab1eb" Apr 02 14:01:14 crc kubenswrapper[4732]: I0402 14:01:14.025286 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Apr 02 14:01:14 crc 
kubenswrapper[4732]: I0402 14:01:14.049410 4732 scope.go:117] "RemoveContainer" containerID="70f43e416f820f0faed0922b41d00581677129965f3e933afec756d385dec3cf" Apr 02 14:01:14 crc kubenswrapper[4732]: I0402 14:01:14.061052 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Apr 02 14:01:14 crc kubenswrapper[4732]: I0402 14:01:14.081991 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Apr 02 14:01:14 crc kubenswrapper[4732]: E0402 14:01:14.082476 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0e53540-98b3-463a-9611-a48c2fbfc0f5" containerName="ceilometer-central-agent" Apr 02 14:01:14 crc kubenswrapper[4732]: I0402 14:01:14.082491 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0e53540-98b3-463a-9611-a48c2fbfc0f5" containerName="ceilometer-central-agent" Apr 02 14:01:14 crc kubenswrapper[4732]: E0402 14:01:14.082514 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0e53540-98b3-463a-9611-a48c2fbfc0f5" containerName="sg-core" Apr 02 14:01:14 crc kubenswrapper[4732]: I0402 14:01:14.082522 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0e53540-98b3-463a-9611-a48c2fbfc0f5" containerName="sg-core" Apr 02 14:01:14 crc kubenswrapper[4732]: E0402 14:01:14.082547 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0e53540-98b3-463a-9611-a48c2fbfc0f5" containerName="proxy-httpd" Apr 02 14:01:14 crc kubenswrapper[4732]: I0402 14:01:14.082555 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0e53540-98b3-463a-9611-a48c2fbfc0f5" containerName="proxy-httpd" Apr 02 14:01:14 crc kubenswrapper[4732]: E0402 14:01:14.082566 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0e53540-98b3-463a-9611-a48c2fbfc0f5" containerName="ceilometer-notification-agent" Apr 02 14:01:14 crc kubenswrapper[4732]: I0402 14:01:14.082575 4732 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c0e53540-98b3-463a-9611-a48c2fbfc0f5" containerName="ceilometer-notification-agent" Apr 02 14:01:14 crc kubenswrapper[4732]: I0402 14:01:14.082851 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0e53540-98b3-463a-9611-a48c2fbfc0f5" containerName="ceilometer-central-agent" Apr 02 14:01:14 crc kubenswrapper[4732]: I0402 14:01:14.082872 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0e53540-98b3-463a-9611-a48c2fbfc0f5" containerName="sg-core" Apr 02 14:01:14 crc kubenswrapper[4732]: I0402 14:01:14.082890 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0e53540-98b3-463a-9611-a48c2fbfc0f5" containerName="ceilometer-notification-agent" Apr 02 14:01:14 crc kubenswrapper[4732]: I0402 14:01:14.082909 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0e53540-98b3-463a-9611-a48c2fbfc0f5" containerName="proxy-httpd" Apr 02 14:01:14 crc kubenswrapper[4732]: I0402 14:01:14.086369 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Apr 02 14:01:14 crc kubenswrapper[4732]: I0402 14:01:14.090096 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Apr 02 14:01:14 crc kubenswrapper[4732]: I0402 14:01:14.096647 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Apr 02 14:01:14 crc kubenswrapper[4732]: I0402 14:01:14.106870 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Apr 02 14:01:14 crc kubenswrapper[4732]: I0402 14:01:14.112408 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b76e6f0e-76e4-4b53-8697-ea1680abc0f4-log-httpd\") pod \"ceilometer-0\" (UID: \"b76e6f0e-76e4-4b53-8697-ea1680abc0f4\") " pod="openstack/ceilometer-0" Apr 02 14:01:14 crc kubenswrapper[4732]: I0402 14:01:14.112488 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fr678\" (UniqueName: \"kubernetes.io/projected/b76e6f0e-76e4-4b53-8697-ea1680abc0f4-kube-api-access-fr678\") pod \"ceilometer-0\" (UID: \"b76e6f0e-76e4-4b53-8697-ea1680abc0f4\") " pod="openstack/ceilometer-0" Apr 02 14:01:14 crc kubenswrapper[4732]: I0402 14:01:14.112586 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b76e6f0e-76e4-4b53-8697-ea1680abc0f4-scripts\") pod \"ceilometer-0\" (UID: \"b76e6f0e-76e4-4b53-8697-ea1680abc0f4\") " pod="openstack/ceilometer-0" Apr 02 14:01:14 crc kubenswrapper[4732]: I0402 14:01:14.112670 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b76e6f0e-76e4-4b53-8697-ea1680abc0f4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b76e6f0e-76e4-4b53-8697-ea1680abc0f4\") " 
pod="openstack/ceilometer-0" Apr 02 14:01:14 crc kubenswrapper[4732]: I0402 14:01:14.112720 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b76e6f0e-76e4-4b53-8697-ea1680abc0f4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b76e6f0e-76e4-4b53-8697-ea1680abc0f4\") " pod="openstack/ceilometer-0" Apr 02 14:01:14 crc kubenswrapper[4732]: I0402 14:01:14.112747 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b76e6f0e-76e4-4b53-8697-ea1680abc0f4-config-data\") pod \"ceilometer-0\" (UID: \"b76e6f0e-76e4-4b53-8697-ea1680abc0f4\") " pod="openstack/ceilometer-0" Apr 02 14:01:14 crc kubenswrapper[4732]: I0402 14:01:14.112771 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b76e6f0e-76e4-4b53-8697-ea1680abc0f4-run-httpd\") pod \"ceilometer-0\" (UID: \"b76e6f0e-76e4-4b53-8697-ea1680abc0f4\") " pod="openstack/ceilometer-0" Apr 02 14:01:14 crc kubenswrapper[4732]: I0402 14:01:14.121085 4732 scope.go:117] "RemoveContainer" containerID="f9dbf40bc89d07703caf265aa216a0d8f8682ac26fc7209514eb887f20e06559" Apr 02 14:01:14 crc kubenswrapper[4732]: I0402 14:01:14.219186 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b76e6f0e-76e4-4b53-8697-ea1680abc0f4-scripts\") pod \"ceilometer-0\" (UID: \"b76e6f0e-76e4-4b53-8697-ea1680abc0f4\") " pod="openstack/ceilometer-0" Apr 02 14:01:14 crc kubenswrapper[4732]: I0402 14:01:14.219348 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b76e6f0e-76e4-4b53-8697-ea1680abc0f4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b76e6f0e-76e4-4b53-8697-ea1680abc0f4\") " 
pod="openstack/ceilometer-0" Apr 02 14:01:14 crc kubenswrapper[4732]: I0402 14:01:14.219446 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b76e6f0e-76e4-4b53-8697-ea1680abc0f4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b76e6f0e-76e4-4b53-8697-ea1680abc0f4\") " pod="openstack/ceilometer-0" Apr 02 14:01:14 crc kubenswrapper[4732]: I0402 14:01:14.219473 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b76e6f0e-76e4-4b53-8697-ea1680abc0f4-config-data\") pod \"ceilometer-0\" (UID: \"b76e6f0e-76e4-4b53-8697-ea1680abc0f4\") " pod="openstack/ceilometer-0" Apr 02 14:01:14 crc kubenswrapper[4732]: I0402 14:01:14.219499 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b76e6f0e-76e4-4b53-8697-ea1680abc0f4-run-httpd\") pod \"ceilometer-0\" (UID: \"b76e6f0e-76e4-4b53-8697-ea1680abc0f4\") " pod="openstack/ceilometer-0" Apr 02 14:01:14 crc kubenswrapper[4732]: I0402 14:01:14.219873 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b76e6f0e-76e4-4b53-8697-ea1680abc0f4-log-httpd\") pod \"ceilometer-0\" (UID: \"b76e6f0e-76e4-4b53-8697-ea1680abc0f4\") " pod="openstack/ceilometer-0" Apr 02 14:01:14 crc kubenswrapper[4732]: I0402 14:01:14.219965 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fr678\" (UniqueName: \"kubernetes.io/projected/b76e6f0e-76e4-4b53-8697-ea1680abc0f4-kube-api-access-fr678\") pod \"ceilometer-0\" (UID: \"b76e6f0e-76e4-4b53-8697-ea1680abc0f4\") " pod="openstack/ceilometer-0" Apr 02 14:01:14 crc kubenswrapper[4732]: I0402 14:01:14.221795 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/b76e6f0e-76e4-4b53-8697-ea1680abc0f4-log-httpd\") pod \"ceilometer-0\" (UID: \"b76e6f0e-76e4-4b53-8697-ea1680abc0f4\") " pod="openstack/ceilometer-0" Apr 02 14:01:14 crc kubenswrapper[4732]: I0402 14:01:14.222472 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b76e6f0e-76e4-4b53-8697-ea1680abc0f4-run-httpd\") pod \"ceilometer-0\" (UID: \"b76e6f0e-76e4-4b53-8697-ea1680abc0f4\") " pod="openstack/ceilometer-0" Apr 02 14:01:14 crc kubenswrapper[4732]: I0402 14:01:14.227652 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b76e6f0e-76e4-4b53-8697-ea1680abc0f4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b76e6f0e-76e4-4b53-8697-ea1680abc0f4\") " pod="openstack/ceilometer-0" Apr 02 14:01:14 crc kubenswrapper[4732]: I0402 14:01:14.229913 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b76e6f0e-76e4-4b53-8697-ea1680abc0f4-scripts\") pod \"ceilometer-0\" (UID: \"b76e6f0e-76e4-4b53-8697-ea1680abc0f4\") " pod="openstack/ceilometer-0" Apr 02 14:01:14 crc kubenswrapper[4732]: I0402 14:01:14.234868 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b76e6f0e-76e4-4b53-8697-ea1680abc0f4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b76e6f0e-76e4-4b53-8697-ea1680abc0f4\") " pod="openstack/ceilometer-0" Apr 02 14:01:14 crc kubenswrapper[4732]: I0402 14:01:14.235504 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b76e6f0e-76e4-4b53-8697-ea1680abc0f4-config-data\") pod \"ceilometer-0\" (UID: \"b76e6f0e-76e4-4b53-8697-ea1680abc0f4\") " pod="openstack/ceilometer-0" Apr 02 14:01:14 crc kubenswrapper[4732]: I0402 14:01:14.239892 4732 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-fr678\" (UniqueName: \"kubernetes.io/projected/b76e6f0e-76e4-4b53-8697-ea1680abc0f4-kube-api-access-fr678\") pod \"ceilometer-0\" (UID: \"b76e6f0e-76e4-4b53-8697-ea1680abc0f4\") " pod="openstack/ceilometer-0" Apr 02 14:01:14 crc kubenswrapper[4732]: I0402 14:01:14.415387 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Apr 02 14:01:14 crc kubenswrapper[4732]: I0402 14:01:14.705916 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0e53540-98b3-463a-9611-a48c2fbfc0f5" path="/var/lib/kubelet/pods/c0e53540-98b3-463a-9611-a48c2fbfc0f5/volumes" Apr 02 14:01:14 crc kubenswrapper[4732]: I0402 14:01:14.831209 4732 generic.go:334] "Generic (PLEG): container finished" podID="cca116f4-1c5a-4dc9-966d-5033ed344c2f" containerID="162391e4dc1cc110bb45845cf779d42622d3bba05a9caad043bdcf8f836d3d0c" exitCode=0 Apr 02 14:01:14 crc kubenswrapper[4732]: I0402 14:01:14.831375 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-2bf58" event={"ID":"cca116f4-1c5a-4dc9-966d-5033ed344c2f","Type":"ContainerDied","Data":"162391e4dc1cc110bb45845cf779d42622d3bba05a9caad043bdcf8f836d3d0c"} Apr 02 14:01:14 crc kubenswrapper[4732]: I0402 14:01:14.841860 4732 generic.go:334] "Generic (PLEG): container finished" podID="3012ae69-b011-46df-a7c0-36845efb7172" containerID="d00d5c36cd8aa69862685ba7aa1b5b654c559f5edb60f47093237e2c51c84f61" exitCode=0 Apr 02 14:01:14 crc kubenswrapper[4732]: I0402 14:01:14.842065 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-265md" event={"ID":"3012ae69-b011-46df-a7c0-36845efb7172","Type":"ContainerDied","Data":"d00d5c36cd8aa69862685ba7aa1b5b654c559f5edb60f47093237e2c51c84f61"} Apr 02 14:01:14 crc kubenswrapper[4732]: I0402 14:01:14.846442 4732 generic.go:334] "Generic (PLEG): container finished" podID="3edb7dea-2bf7-4dcc-80e0-54c59916152c" 
containerID="5d42fac2733be04e88650d180b8d8f50e9ee46420d5a97902e599603499b0f1a" exitCode=0
Apr 02 14:01:14 crc kubenswrapper[4732]: I0402 14:01:14.846545 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1cfc-account-create-update-vtn6k" event={"ID":"3edb7dea-2bf7-4dcc-80e0-54c59916152c","Type":"ContainerDied","Data":"5d42fac2733be04e88650d180b8d8f50e9ee46420d5a97902e599603499b0f1a"}
Apr 02 14:01:14 crc kubenswrapper[4732]: I0402 14:01:14.854801 4732 generic.go:334] "Generic (PLEG): container finished" podID="58580fa1-1d1a-4c30-9e7a-d0e464c1487f" containerID="f8f2b2490f72d792e898a25b229cd5c8e8b10d81335981da4b9f5cac497f20a2" exitCode=0
Apr 02 14:01:14 crc kubenswrapper[4732]: I0402 14:01:14.855390 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-e441-account-create-update-7222l" event={"ID":"58580fa1-1d1a-4c30-9e7a-d0e464c1487f","Type":"ContainerDied","Data":"f8f2b2490f72d792e898a25b229cd5c8e8b10d81335981da4b9f5cac497f20a2"}
Apr 02 14:01:14 crc kubenswrapper[4732]: I0402 14:01:14.864037 4732 generic.go:334] "Generic (PLEG): container finished" podID="d0267b36-acec-4035-9bdd-b19758f45275" containerID="03f75e719d435b7310f14addd9962e52eccfb55e128e4bae6bf461c6b5c968bb" exitCode=0
Apr 02 14:01:14 crc kubenswrapper[4732]: I0402 14:01:14.864119 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-gcw8g" event={"ID":"d0267b36-acec-4035-9bdd-b19758f45275","Type":"ContainerDied","Data":"03f75e719d435b7310f14addd9962e52eccfb55e128e4bae6bf461c6b5c968bb"}
Apr 02 14:01:14 crc kubenswrapper[4732]: I0402 14:01:14.872210 4732 generic.go:334] "Generic (PLEG): container finished" podID="99e5e7a5-f66b-45c2-881c-60507bfa4c25" containerID="f4284ba9e6ef759fd112fe94d44b65bc399b40334f37ac83aaf6a6a51d41137f" exitCode=0
Apr 02 14:01:14 crc kubenswrapper[4732]: I0402 14:01:14.872386 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-e3c0-account-create-update-5k7n4" event={"ID":"99e5e7a5-f66b-45c2-881c-60507bfa4c25","Type":"ContainerDied","Data":"f4284ba9e6ef759fd112fe94d44b65bc399b40334f37ac83aaf6a6a51d41137f"}
Apr 02 14:01:14 crc kubenswrapper[4732]: I0402 14:01:14.872407 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-e3c0-account-create-update-5k7n4" event={"ID":"99e5e7a5-f66b-45c2-881c-60507bfa4c25","Type":"ContainerStarted","Data":"3933b8981efbb0cdae010ebabefb04afa5688a3e57e04838d0d32e851ffb0999"}
Apr 02 14:01:14 crc kubenswrapper[4732]: I0402 14:01:14.874486 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-9f57ff6c-7m8sr" event={"ID":"7f6ffca1-ce91-4e20-8cbc-38a3eab1616e","Type":"ContainerStarted","Data":"aef78478005ca1229ed92d7dace1722926c1f49fa6778c12a93cd015eb3f8b2d"}
Apr 02 14:01:14 crc kubenswrapper[4732]: I0402 14:01:14.875334 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-9f57ff6c-7m8sr"
Apr 02 14:01:14 crc kubenswrapper[4732]: I0402 14:01:14.875363 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-9f57ff6c-7m8sr"
Apr 02 14:01:14 crc kubenswrapper[4732]: I0402 14:01:14.924970 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-9f57ff6c-7m8sr" podStartSLOduration=3.924944152 podStartE2EDuration="3.924944152s" podCreationTimestamp="2026-04-02 14:01:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 14:01:14.909424889 +0000 UTC m=+1431.813832452" watchObservedRunningTime="2026-04-02 14:01:14.924944152 +0000 UTC m=+1431.829351705"
Apr 02 14:01:14 crc kubenswrapper[4732]: I0402 14:01:14.977863 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Apr 02 14:01:15 crc kubenswrapper[4732]: I0402 14:01:15.365077 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-gcw8g"
Apr 02 14:01:15 crc kubenswrapper[4732]: I0402 14:01:15.560719 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0267b36-acec-4035-9bdd-b19758f45275-operator-scripts\") pod \"d0267b36-acec-4035-9bdd-b19758f45275\" (UID: \"d0267b36-acec-4035-9bdd-b19758f45275\") "
Apr 02 14:01:15 crc kubenswrapper[4732]: I0402 14:01:15.560900 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbvwb\" (UniqueName: \"kubernetes.io/projected/d0267b36-acec-4035-9bdd-b19758f45275-kube-api-access-gbvwb\") pod \"d0267b36-acec-4035-9bdd-b19758f45275\" (UID: \"d0267b36-acec-4035-9bdd-b19758f45275\") "
Apr 02 14:01:15 crc kubenswrapper[4732]: I0402 14:01:15.562095 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0267b36-acec-4035-9bdd-b19758f45275-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d0267b36-acec-4035-9bdd-b19758f45275" (UID: "d0267b36-acec-4035-9bdd-b19758f45275"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 14:01:15 crc kubenswrapper[4732]: I0402 14:01:15.566070 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0267b36-acec-4035-9bdd-b19758f45275-kube-api-access-gbvwb" (OuterVolumeSpecName: "kube-api-access-gbvwb") pod "d0267b36-acec-4035-9bdd-b19758f45275" (UID: "d0267b36-acec-4035-9bdd-b19758f45275"). InnerVolumeSpecName "kube-api-access-gbvwb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 14:01:15 crc kubenswrapper[4732]: I0402 14:01:15.663630 4732 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0267b36-acec-4035-9bdd-b19758f45275-operator-scripts\") on node \"crc\" DevicePath \"\""
Apr 02 14:01:15 crc kubenswrapper[4732]: I0402 14:01:15.663660 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbvwb\" (UniqueName: \"kubernetes.io/projected/d0267b36-acec-4035-9bdd-b19758f45275-kube-api-access-gbvwb\") on node \"crc\" DevicePath \"\""
Apr 02 14:01:15 crc kubenswrapper[4732]: I0402 14:01:15.892977 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-gcw8g" event={"ID":"d0267b36-acec-4035-9bdd-b19758f45275","Type":"ContainerDied","Data":"cbe04223b2e294c1f3eb94bbf4774416843b9047e0d0573a2e971c2f2ebf0cd7"}
Apr 02 14:01:15 crc kubenswrapper[4732]: I0402 14:01:15.893014 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cbe04223b2e294c1f3eb94bbf4774416843b9047e0d0573a2e971c2f2ebf0cd7"
Apr 02 14:01:15 crc kubenswrapper[4732]: I0402 14:01:15.893078 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-gcw8g"
Apr 02 14:01:15 crc kubenswrapper[4732]: I0402 14:01:15.895700 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b76e6f0e-76e4-4b53-8697-ea1680abc0f4","Type":"ContainerStarted","Data":"fffc6c54f0d0a2b9e101256ca31f190028d24e05834cf71b7da1c6cfe57494ec"}
Apr 02 14:01:15 crc kubenswrapper[4732]: I0402 14:01:15.896443 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b76e6f0e-76e4-4b53-8697-ea1680abc0f4","Type":"ContainerStarted","Data":"986855bf5ac50cdb3b12f50aa1c573f673af8c46baec5fbc1959a983b9c6ed6e"}
Apr 02 14:01:16 crc kubenswrapper[4732]: I0402 14:01:16.408527 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-e441-account-create-update-7222l"
Apr 02 14:01:16 crc kubenswrapper[4732]: I0402 14:01:16.586504 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzclj\" (UniqueName: \"kubernetes.io/projected/58580fa1-1d1a-4c30-9e7a-d0e464c1487f-kube-api-access-pzclj\") pod \"58580fa1-1d1a-4c30-9e7a-d0e464c1487f\" (UID: \"58580fa1-1d1a-4c30-9e7a-d0e464c1487f\") "
Apr 02 14:01:16 crc kubenswrapper[4732]: I0402 14:01:16.596091 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/58580fa1-1d1a-4c30-9e7a-d0e464c1487f-operator-scripts\") pod \"58580fa1-1d1a-4c30-9e7a-d0e464c1487f\" (UID: \"58580fa1-1d1a-4c30-9e7a-d0e464c1487f\") "
Apr 02 14:01:16 crc kubenswrapper[4732]: I0402 14:01:16.592033 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58580fa1-1d1a-4c30-9e7a-d0e464c1487f-kube-api-access-pzclj" (OuterVolumeSpecName: "kube-api-access-pzclj") pod "58580fa1-1d1a-4c30-9e7a-d0e464c1487f" (UID: "58580fa1-1d1a-4c30-9e7a-d0e464c1487f"). InnerVolumeSpecName "kube-api-access-pzclj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 14:01:16 crc kubenswrapper[4732]: I0402 14:01:16.596830 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-59fb764b6d-vml5x" podUID="80fe4580-a48f-4cba-b3b3-d6ccbc7f0525" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.149:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.149:8443: connect: connection refused"
Apr 02 14:01:16 crc kubenswrapper[4732]: I0402 14:01:16.596946 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-59fb764b6d-vml5x"
Apr 02 14:01:16 crc kubenswrapper[4732]: I0402 14:01:16.596990 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzclj\" (UniqueName: \"kubernetes.io/projected/58580fa1-1d1a-4c30-9e7a-d0e464c1487f-kube-api-access-pzclj\") on node \"crc\" DevicePath \"\""
Apr 02 14:01:16 crc kubenswrapper[4732]: I0402 14:01:16.615461 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58580fa1-1d1a-4c30-9e7a-d0e464c1487f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "58580fa1-1d1a-4c30-9e7a-d0e464c1487f" (UID: "58580fa1-1d1a-4c30-9e7a-d0e464c1487f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 14:01:16 crc kubenswrapper[4732]: I0402 14:01:16.702119 4732 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/58580fa1-1d1a-4c30-9e7a-d0e464c1487f-operator-scripts\") on node \"crc\" DevicePath \"\""
Apr 02 14:01:16 crc kubenswrapper[4732]: I0402 14:01:16.793575 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-e3c0-account-create-update-5k7n4"
Apr 02 14:01:16 crc kubenswrapper[4732]: I0402 14:01:16.823487 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-265md"
Apr 02 14:01:16 crc kubenswrapper[4732]: I0402 14:01:16.825518 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-1cfc-account-create-update-vtn6k"
Apr 02 14:01:16 crc kubenswrapper[4732]: I0402 14:01:16.904833 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-265md"
Apr 02 14:01:16 crc kubenswrapper[4732]: I0402 14:01:16.904843 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-265md" event={"ID":"3012ae69-b011-46df-a7c0-36845efb7172","Type":"ContainerDied","Data":"719eb9aa0d23a8eeb62eb1832780e9f0659f936ef6d65f7b139d9c6ecff3f83e"}
Apr 02 14:01:16 crc kubenswrapper[4732]: I0402 14:01:16.904902 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="719eb9aa0d23a8eeb62eb1832780e9f0659f936ef6d65f7b139d9c6ecff3f83e"
Apr 02 14:01:16 crc kubenswrapper[4732]: I0402 14:01:16.906404 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-1cfc-account-create-update-vtn6k"
Apr 02 14:01:16 crc kubenswrapper[4732]: I0402 14:01:16.906423 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1cfc-account-create-update-vtn6k" event={"ID":"3edb7dea-2bf7-4dcc-80e0-54c59916152c","Type":"ContainerDied","Data":"6202919de170483e4a0ad9ca5e6b24c78f293e5db46cdef825b4c86c72368ae8"}
Apr 02 14:01:16 crc kubenswrapper[4732]: I0402 14:01:16.906440 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6202919de170483e4a0ad9ca5e6b24c78f293e5db46cdef825b4c86c72368ae8"
Apr 02 14:01:16 crc kubenswrapper[4732]: I0402 14:01:16.907756 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-e441-account-create-update-7222l"
Apr 02 14:01:16 crc kubenswrapper[4732]: I0402 14:01:16.907752 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-e441-account-create-update-7222l" event={"ID":"58580fa1-1d1a-4c30-9e7a-d0e464c1487f","Type":"ContainerDied","Data":"3df6c708d0b571e95f3cd5c4b55b6ae57bc79c00bd349831f04064c5c52f3003"}
Apr 02 14:01:16 crc kubenswrapper[4732]: I0402 14:01:16.907875 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3df6c708d0b571e95f3cd5c4b55b6ae57bc79c00bd349831f04064c5c52f3003"
Apr 02 14:01:16 crc kubenswrapper[4732]: I0402 14:01:16.909090 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-e3c0-account-create-update-5k7n4"
Apr 02 14:01:16 crc kubenswrapper[4732]: I0402 14:01:16.909200 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-e3c0-account-create-update-5k7n4" event={"ID":"99e5e7a5-f66b-45c2-881c-60507bfa4c25","Type":"ContainerDied","Data":"3933b8981efbb0cdae010ebabefb04afa5688a3e57e04838d0d32e851ffb0999"}
Apr 02 14:01:16 crc kubenswrapper[4732]: I0402 14:01:16.909227 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3933b8981efbb0cdae010ebabefb04afa5688a3e57e04838d0d32e851ffb0999"
Apr 02 14:01:16 crc kubenswrapper[4732]: I0402 14:01:16.909895 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zgtq\" (UniqueName: \"kubernetes.io/projected/99e5e7a5-f66b-45c2-881c-60507bfa4c25-kube-api-access-9zgtq\") pod \"99e5e7a5-f66b-45c2-881c-60507bfa4c25\" (UID: \"99e5e7a5-f66b-45c2-881c-60507bfa4c25\") "
Apr 02 14:01:16 crc kubenswrapper[4732]: I0402 14:01:16.910090 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/99e5e7a5-f66b-45c2-881c-60507bfa4c25-operator-scripts\") pod \"99e5e7a5-f66b-45c2-881c-60507bfa4c25\" (UID: \"99e5e7a5-f66b-45c2-881c-60507bfa4c25\") "
Apr 02 14:01:16 crc kubenswrapper[4732]: I0402 14:01:16.910850 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99e5e7a5-f66b-45c2-881c-60507bfa4c25-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "99e5e7a5-f66b-45c2-881c-60507bfa4c25" (UID: "99e5e7a5-f66b-45c2-881c-60507bfa4c25"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 14:01:16 crc kubenswrapper[4732]: I0402 14:01:16.919372 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99e5e7a5-f66b-45c2-881c-60507bfa4c25-kube-api-access-9zgtq" (OuterVolumeSpecName: "kube-api-access-9zgtq") pod "99e5e7a5-f66b-45c2-881c-60507bfa4c25" (UID: "99e5e7a5-f66b-45c2-881c-60507bfa4c25"). InnerVolumeSpecName "kube-api-access-9zgtq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 14:01:17 crc kubenswrapper[4732]: I0402 14:01:17.011996 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2l6t9\" (UniqueName: \"kubernetes.io/projected/3012ae69-b011-46df-a7c0-36845efb7172-kube-api-access-2l6t9\") pod \"3012ae69-b011-46df-a7c0-36845efb7172\" (UID: \"3012ae69-b011-46df-a7c0-36845efb7172\") "
Apr 02 14:01:17 crc kubenswrapper[4732]: I0402 14:01:17.012404 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2fgr9\" (UniqueName: \"kubernetes.io/projected/3edb7dea-2bf7-4dcc-80e0-54c59916152c-kube-api-access-2fgr9\") pod \"3edb7dea-2bf7-4dcc-80e0-54c59916152c\" (UID: \"3edb7dea-2bf7-4dcc-80e0-54c59916152c\") "
Apr 02 14:01:17 crc kubenswrapper[4732]: I0402 14:01:17.012511 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3edb7dea-2bf7-4dcc-80e0-54c59916152c-operator-scripts\") pod \"3edb7dea-2bf7-4dcc-80e0-54c59916152c\" (UID: \"3edb7dea-2bf7-4dcc-80e0-54c59916152c\") "
Apr 02 14:01:17 crc kubenswrapper[4732]: I0402 14:01:17.012627 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3012ae69-b011-46df-a7c0-36845efb7172-operator-scripts\") pod \"3012ae69-b011-46df-a7c0-36845efb7172\" (UID: \"3012ae69-b011-46df-a7c0-36845efb7172\") "
Apr 02 14:01:17 crc kubenswrapper[4732]: I0402 14:01:17.013499 4732 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/99e5e7a5-f66b-45c2-881c-60507bfa4c25-operator-scripts\") on node \"crc\" DevicePath \"\""
Apr 02 14:01:17 crc kubenswrapper[4732]: I0402 14:01:17.013525 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zgtq\" (UniqueName: \"kubernetes.io/projected/99e5e7a5-f66b-45c2-881c-60507bfa4c25-kube-api-access-9zgtq\") on node \"crc\" DevicePath \"\""
Apr 02 14:01:17 crc kubenswrapper[4732]: I0402 14:01:17.018821 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3edb7dea-2bf7-4dcc-80e0-54c59916152c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3edb7dea-2bf7-4dcc-80e0-54c59916152c" (UID: "3edb7dea-2bf7-4dcc-80e0-54c59916152c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 14:01:17 crc kubenswrapper[4732]: I0402 14:01:17.019132 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3012ae69-b011-46df-a7c0-36845efb7172-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3012ae69-b011-46df-a7c0-36845efb7172" (UID: "3012ae69-b011-46df-a7c0-36845efb7172"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 14:01:17 crc kubenswrapper[4732]: I0402 14:01:17.027179 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3012ae69-b011-46df-a7c0-36845efb7172-kube-api-access-2l6t9" (OuterVolumeSpecName: "kube-api-access-2l6t9") pod "3012ae69-b011-46df-a7c0-36845efb7172" (UID: "3012ae69-b011-46df-a7c0-36845efb7172"). InnerVolumeSpecName "kube-api-access-2l6t9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 14:01:17 crc kubenswrapper[4732]: I0402 14:01:17.028835 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3edb7dea-2bf7-4dcc-80e0-54c59916152c-kube-api-access-2fgr9" (OuterVolumeSpecName: "kube-api-access-2fgr9") pod "3edb7dea-2bf7-4dcc-80e0-54c59916152c" (UID: "3edb7dea-2bf7-4dcc-80e0-54c59916152c"). InnerVolumeSpecName "kube-api-access-2fgr9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 14:01:17 crc kubenswrapper[4732]: I0402 14:01:17.070734 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-2bf58"
Apr 02 14:01:17 crc kubenswrapper[4732]: I0402 14:01:17.119010 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cca116f4-1c5a-4dc9-966d-5033ed344c2f-operator-scripts\") pod \"cca116f4-1c5a-4dc9-966d-5033ed344c2f\" (UID: \"cca116f4-1c5a-4dc9-966d-5033ed344c2f\") "
Apr 02 14:01:17 crc kubenswrapper[4732]: I0402 14:01:17.119439 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2l6t9\" (UniqueName: \"kubernetes.io/projected/3012ae69-b011-46df-a7c0-36845efb7172-kube-api-access-2l6t9\") on node \"crc\" DevicePath \"\""
Apr 02 14:01:17 crc kubenswrapper[4732]: I0402 14:01:17.119459 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2fgr9\" (UniqueName: \"kubernetes.io/projected/3edb7dea-2bf7-4dcc-80e0-54c59916152c-kube-api-access-2fgr9\") on node \"crc\" DevicePath \"\""
Apr 02 14:01:17 crc kubenswrapper[4732]: I0402 14:01:17.119468 4732 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3edb7dea-2bf7-4dcc-80e0-54c59916152c-operator-scripts\") on node \"crc\" DevicePath \"\""
Apr 02 14:01:17 crc kubenswrapper[4732]: I0402 14:01:17.119477 4732 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3012ae69-b011-46df-a7c0-36845efb7172-operator-scripts\") on node \"crc\" DevicePath \"\""
Apr 02 14:01:17 crc kubenswrapper[4732]: I0402 14:01:17.120034 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cca116f4-1c5a-4dc9-966d-5033ed344c2f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cca116f4-1c5a-4dc9-966d-5033ed344c2f" (UID: "cca116f4-1c5a-4dc9-966d-5033ed344c2f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 14:01:17 crc kubenswrapper[4732]: I0402 14:01:17.220486 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtv6h\" (UniqueName: \"kubernetes.io/projected/cca116f4-1c5a-4dc9-966d-5033ed344c2f-kube-api-access-gtv6h\") pod \"cca116f4-1c5a-4dc9-966d-5033ed344c2f\" (UID: \"cca116f4-1c5a-4dc9-966d-5033ed344c2f\") "
Apr 02 14:01:17 crc kubenswrapper[4732]: I0402 14:01:17.221134 4732 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cca116f4-1c5a-4dc9-966d-5033ed344c2f-operator-scripts\") on node \"crc\" DevicePath \"\""
Apr 02 14:01:17 crc kubenswrapper[4732]: I0402 14:01:17.225826 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cca116f4-1c5a-4dc9-966d-5033ed344c2f-kube-api-access-gtv6h" (OuterVolumeSpecName: "kube-api-access-gtv6h") pod "cca116f4-1c5a-4dc9-966d-5033ed344c2f" (UID: "cca116f4-1c5a-4dc9-966d-5033ed344c2f"). InnerVolumeSpecName "kube-api-access-gtv6h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 14:01:17 crc kubenswrapper[4732]: I0402 14:01:17.323016 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtv6h\" (UniqueName: \"kubernetes.io/projected/cca116f4-1c5a-4dc9-966d-5033ed344c2f-kube-api-access-gtv6h\") on node \"crc\" DevicePath \"\""
Apr 02 14:01:17 crc kubenswrapper[4732]: I0402 14:01:17.920518 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b76e6f0e-76e4-4b53-8697-ea1680abc0f4","Type":"ContainerStarted","Data":"3aeebf741ee9b2f7f48baefcc3e99161bf878dded7f109b31f25faf929cfd71f"}
Apr 02 14:01:17 crc kubenswrapper[4732]: I0402 14:01:17.920586 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b76e6f0e-76e4-4b53-8697-ea1680abc0f4","Type":"ContainerStarted","Data":"0a5a63f8c0e6eb5a5c2e690c7ae380c865fd500f3f56c6502da61a164145d9de"}
Apr 02 14:01:17 crc kubenswrapper[4732]: I0402 14:01:17.923019 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-2bf58" event={"ID":"cca116f4-1c5a-4dc9-966d-5033ed344c2f","Type":"ContainerDied","Data":"ea2ef4b788fd8804b3c46379526db7e699f02f17786d9c350448c18d925997eb"}
Apr 02 14:01:17 crc kubenswrapper[4732]: I0402 14:01:17.923049 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea2ef4b788fd8804b3c46379526db7e699f02f17786d9c350448c18d925997eb"
Apr 02 14:01:17 crc kubenswrapper[4732]: I0402 14:01:17.923121 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-2bf58"
Apr 02 14:01:18 crc kubenswrapper[4732]: I0402 14:01:18.934638 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-58f8f59779-9rrsx"
Apr 02 14:01:19 crc kubenswrapper[4732]: I0402 14:01:19.009169 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-cd55d6846-4zs9k"]
Apr 02 14:01:19 crc kubenswrapper[4732]: I0402 14:01:19.009759 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-cd55d6846-4zs9k" podUID="16168989-9c80-47a7-92ea-8be3984e5d99" containerName="neutron-api" containerID="cri-o://0fc3be86c5bbdc911a2aa0733cc4678d1d83c286c3a20005f7c7cdbcde9da39b" gracePeriod=30
Apr 02 14:01:19 crc kubenswrapper[4732]: I0402 14:01:19.009891 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-cd55d6846-4zs9k" podUID="16168989-9c80-47a7-92ea-8be3984e5d99" containerName="neutron-httpd" containerID="cri-o://3429e76b65d136138b088fd6ee1f204b388fa132a9e8c6d42c421ee5d3126c70" gracePeriod=30
Apr 02 14:01:19 crc kubenswrapper[4732]: I0402 14:01:19.943367 4732 generic.go:334] "Generic (PLEG): container finished" podID="16168989-9c80-47a7-92ea-8be3984e5d99" containerID="3429e76b65d136138b088fd6ee1f204b388fa132a9e8c6d42c421ee5d3126c70" exitCode=0
Apr 02 14:01:19 crc kubenswrapper[4732]: I0402 14:01:19.943414 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cd55d6846-4zs9k" event={"ID":"16168989-9c80-47a7-92ea-8be3984e5d99","Type":"ContainerDied","Data":"3429e76b65d136138b088fd6ee1f204b388fa132a9e8c6d42c421ee5d3126c70"}
Apr 02 14:01:21 crc kubenswrapper[4732]: I0402 14:01:21.519070 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-9f57ff6c-7m8sr"
Apr 02 14:01:21 crc kubenswrapper[4732]: I0402 14:01:21.520847 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-9f57ff6c-7m8sr"
Apr 02 14:01:22 crc kubenswrapper[4732]: I0402 14:01:22.165188 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-lcftj"]
Apr 02 14:01:22 crc kubenswrapper[4732]: E0402 14:01:22.165629 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99e5e7a5-f66b-45c2-881c-60507bfa4c25" containerName="mariadb-account-create-update"
Apr 02 14:01:22 crc kubenswrapper[4732]: I0402 14:01:22.165647 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="99e5e7a5-f66b-45c2-881c-60507bfa4c25" containerName="mariadb-account-create-update"
Apr 02 14:01:22 crc kubenswrapper[4732]: E0402 14:01:22.165665 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58580fa1-1d1a-4c30-9e7a-d0e464c1487f" containerName="mariadb-account-create-update"
Apr 02 14:01:22 crc kubenswrapper[4732]: I0402 14:01:22.165672 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="58580fa1-1d1a-4c30-9e7a-d0e464c1487f" containerName="mariadb-account-create-update"
Apr 02 14:01:22 crc kubenswrapper[4732]: E0402 14:01:22.165681 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0267b36-acec-4035-9bdd-b19758f45275" containerName="mariadb-database-create"
Apr 02 14:01:22 crc kubenswrapper[4732]: I0402 14:01:22.165689 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0267b36-acec-4035-9bdd-b19758f45275" containerName="mariadb-database-create"
Apr 02 14:01:22 crc kubenswrapper[4732]: E0402 14:01:22.165699 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3edb7dea-2bf7-4dcc-80e0-54c59916152c" containerName="mariadb-account-create-update"
Apr 02 14:01:22 crc kubenswrapper[4732]: I0402 14:01:22.165705 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="3edb7dea-2bf7-4dcc-80e0-54c59916152c" containerName="mariadb-account-create-update"
Apr 02 14:01:22 crc kubenswrapper[4732]: E0402 14:01:22.165722 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cca116f4-1c5a-4dc9-966d-5033ed344c2f" containerName="mariadb-database-create"
Apr 02 14:01:22 crc kubenswrapper[4732]: I0402 14:01:22.165729 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="cca116f4-1c5a-4dc9-966d-5033ed344c2f" containerName="mariadb-database-create"
Apr 02 14:01:22 crc kubenswrapper[4732]: E0402 14:01:22.165740 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3012ae69-b011-46df-a7c0-36845efb7172" containerName="mariadb-database-create"
Apr 02 14:01:22 crc kubenswrapper[4732]: I0402 14:01:22.165748 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="3012ae69-b011-46df-a7c0-36845efb7172" containerName="mariadb-database-create"
Apr 02 14:01:22 crc kubenswrapper[4732]: I0402 14:01:22.165916 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="58580fa1-1d1a-4c30-9e7a-d0e464c1487f" containerName="mariadb-account-create-update"
Apr 02 14:01:22 crc kubenswrapper[4732]: I0402 14:01:22.165931 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="3edb7dea-2bf7-4dcc-80e0-54c59916152c" containerName="mariadb-account-create-update"
Apr 02 14:01:22 crc kubenswrapper[4732]: I0402 14:01:22.165942 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="99e5e7a5-f66b-45c2-881c-60507bfa4c25" containerName="mariadb-account-create-update"
Apr 02 14:01:22 crc kubenswrapper[4732]: I0402 14:01:22.165952 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0267b36-acec-4035-9bdd-b19758f45275" containerName="mariadb-database-create"
Apr 02 14:01:22 crc kubenswrapper[4732]: I0402 14:01:22.165963 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="3012ae69-b011-46df-a7c0-36845efb7172" containerName="mariadb-database-create"
Apr 02 14:01:22 crc kubenswrapper[4732]: I0402 14:01:22.165972 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="cca116f4-1c5a-4dc9-966d-5033ed344c2f" containerName="mariadb-database-create"
Apr 02 14:01:22 crc kubenswrapper[4732]: I0402 14:01:22.166558 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-lcftj"
Apr 02 14:01:22 crc kubenswrapper[4732]: I0402 14:01:22.168454 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts"
Apr 02 14:01:22 crc kubenswrapper[4732]: I0402 14:01:22.168850 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Apr 02 14:01:22 crc kubenswrapper[4732]: I0402 14:01:22.168985 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-kxxgt"
Apr 02 14:01:22 crc kubenswrapper[4732]: I0402 14:01:22.174456 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-lcftj"]
Apr 02 14:01:22 crc kubenswrapper[4732]: I0402 14:01:22.236367 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d186e696-8cc5-4dac-b6cb-b9a5530bc57e-scripts\") pod \"nova-cell0-conductor-db-sync-lcftj\" (UID: \"d186e696-8cc5-4dac-b6cb-b9a5530bc57e\") " pod="openstack/nova-cell0-conductor-db-sync-lcftj"
Apr 02 14:01:22 crc kubenswrapper[4732]: I0402 14:01:22.236977 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rt8kp\" (UniqueName: \"kubernetes.io/projected/d186e696-8cc5-4dac-b6cb-b9a5530bc57e-kube-api-access-rt8kp\") pod \"nova-cell0-conductor-db-sync-lcftj\" (UID: \"d186e696-8cc5-4dac-b6cb-b9a5530bc57e\") " pod="openstack/nova-cell0-conductor-db-sync-lcftj"
Apr 02 14:01:22 crc kubenswrapper[4732]: I0402 14:01:22.237208 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d186e696-8cc5-4dac-b6cb-b9a5530bc57e-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-lcftj\" (UID: \"d186e696-8cc5-4dac-b6cb-b9a5530bc57e\") " pod="openstack/nova-cell0-conductor-db-sync-lcftj"
Apr 02 14:01:22 crc kubenswrapper[4732]: I0402 14:01:22.237265 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d186e696-8cc5-4dac-b6cb-b9a5530bc57e-config-data\") pod \"nova-cell0-conductor-db-sync-lcftj\" (UID: \"d186e696-8cc5-4dac-b6cb-b9a5530bc57e\") " pod="openstack/nova-cell0-conductor-db-sync-lcftj"
Apr 02 14:01:22 crc kubenswrapper[4732]: I0402 14:01:22.339276 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d186e696-8cc5-4dac-b6cb-b9a5530bc57e-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-lcftj\" (UID: \"d186e696-8cc5-4dac-b6cb-b9a5530bc57e\") " pod="openstack/nova-cell0-conductor-db-sync-lcftj"
Apr 02 14:01:22 crc kubenswrapper[4732]: I0402 14:01:22.339563 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d186e696-8cc5-4dac-b6cb-b9a5530bc57e-config-data\") pod \"nova-cell0-conductor-db-sync-lcftj\" (UID: \"d186e696-8cc5-4dac-b6cb-b9a5530bc57e\") " pod="openstack/nova-cell0-conductor-db-sync-lcftj"
Apr 02 14:01:22 crc kubenswrapper[4732]: I0402 14:01:22.339678 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d186e696-8cc5-4dac-b6cb-b9a5530bc57e-scripts\") pod \"nova-cell0-conductor-db-sync-lcftj\" (UID: \"d186e696-8cc5-4dac-b6cb-b9a5530bc57e\") " pod="openstack/nova-cell0-conductor-db-sync-lcftj"
Apr 02 14:01:22 crc kubenswrapper[4732]: I0402 14:01:22.339803 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rt8kp\" (UniqueName: \"kubernetes.io/projected/d186e696-8cc5-4dac-b6cb-b9a5530bc57e-kube-api-access-rt8kp\") pod \"nova-cell0-conductor-db-sync-lcftj\" (UID: \"d186e696-8cc5-4dac-b6cb-b9a5530bc57e\") " pod="openstack/nova-cell0-conductor-db-sync-lcftj"
Apr 02 14:01:22 crc kubenswrapper[4732]: I0402 14:01:22.344999 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d186e696-8cc5-4dac-b6cb-b9a5530bc57e-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-lcftj\" (UID: \"d186e696-8cc5-4dac-b6cb-b9a5530bc57e\") " pod="openstack/nova-cell0-conductor-db-sync-lcftj"
Apr 02 14:01:22 crc kubenswrapper[4732]: I0402 14:01:22.351135 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d186e696-8cc5-4dac-b6cb-b9a5530bc57e-scripts\") pod \"nova-cell0-conductor-db-sync-lcftj\" (UID: \"d186e696-8cc5-4dac-b6cb-b9a5530bc57e\") " pod="openstack/nova-cell0-conductor-db-sync-lcftj"
Apr 02 14:01:22 crc kubenswrapper[4732]: I0402 14:01:22.374371 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rt8kp\" (UniqueName: \"kubernetes.io/projected/d186e696-8cc5-4dac-b6cb-b9a5530bc57e-kube-api-access-rt8kp\") pod \"nova-cell0-conductor-db-sync-lcftj\" (UID: \"d186e696-8cc5-4dac-b6cb-b9a5530bc57e\") " pod="openstack/nova-cell0-conductor-db-sync-lcftj"
Apr 02 14:01:22 crc kubenswrapper[4732]: I0402 14:01:22.376496 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d186e696-8cc5-4dac-b6cb-b9a5530bc57e-config-data\") pod \"nova-cell0-conductor-db-sync-lcftj\" (UID: \"d186e696-8cc5-4dac-b6cb-b9a5530bc57e\") " pod="openstack/nova-cell0-conductor-db-sync-lcftj"
Apr 02 14:01:22 crc kubenswrapper[4732]: I0402 14:01:22.511163 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-lcftj"
Apr 02 14:01:23 crc kubenswrapper[4732]: I0402 14:01:23.994753 4732 generic.go:334] "Generic (PLEG): container finished" podID="80fe4580-a48f-4cba-b3b3-d6ccbc7f0525" containerID="aecec17f0e79f52981ef4808d1800fe67aab89d44b00ed1c7e216149c5ee7fc8" exitCode=137
Apr 02 14:01:23 crc kubenswrapper[4732]: I0402 14:01:23.995068 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-59fb764b6d-vml5x" event={"ID":"80fe4580-a48f-4cba-b3b3-d6ccbc7f0525","Type":"ContainerDied","Data":"aecec17f0e79f52981ef4808d1800fe67aab89d44b00ed1c7e216149c5ee7fc8"}
Apr 02 14:01:25 crc kubenswrapper[4732]: I0402 14:01:25.016215 4732 generic.go:334] "Generic (PLEG): container finished" podID="16168989-9c80-47a7-92ea-8be3984e5d99" containerID="0fc3be86c5bbdc911a2aa0733cc4678d1d83c286c3a20005f7c7cdbcde9da39b" exitCode=0
Apr 02 14:01:25 crc kubenswrapper[4732]: I0402 14:01:25.016579 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cd55d6846-4zs9k" event={"ID":"16168989-9c80-47a7-92ea-8be3984e5d99","Type":"ContainerDied","Data":"0fc3be86c5bbdc911a2aa0733cc4678d1d83c286c3a20005f7c7cdbcde9da39b"}
Apr 02 14:01:26 crc kubenswrapper[4732]: I0402 14:01:26.684026 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-59fb764b6d-vml5x"
Apr 02 14:01:26 crc kubenswrapper[4732]: I0402 14:01:26.828160 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80fe4580-a48f-4cba-b3b3-d6ccbc7f0525-combined-ca-bundle\") pod \"80fe4580-a48f-4cba-b3b3-d6ccbc7f0525\" (UID: \"80fe4580-a48f-4cba-b3b3-d6ccbc7f0525\") "
Apr 02 14:01:26 crc kubenswrapper[4732]: I0402 14:01:26.828285 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80fe4580-a48f-4cba-b3b3-d6ccbc7f0525-logs\") pod \"80fe4580-a48f-4cba-b3b3-d6ccbc7f0525\" (UID: \"80fe4580-a48f-4cba-b3b3-d6ccbc7f0525\") "
Apr 02 14:01:26 crc kubenswrapper[4732]: I0402 14:01:26.828371 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/80fe4580-a48f-4cba-b3b3-d6ccbc7f0525-config-data\") pod \"80fe4580-a48f-4cba-b3b3-d6ccbc7f0525\" (UID: \"80fe4580-a48f-4cba-b3b3-d6ccbc7f0525\") "
Apr 02 14:01:26 crc kubenswrapper[4732]: I0402 14:01:26.828386 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/80fe4580-a48f-4cba-b3b3-d6ccbc7f0525-horizon-tls-certs\") pod \"80fe4580-a48f-4cba-b3b3-d6ccbc7f0525\" (UID: \"80fe4580-a48f-4cba-b3b3-d6ccbc7f0525\") "
Apr 02 14:01:26 crc kubenswrapper[4732]: I0402 14:01:26.828461 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/80fe4580-a48f-4cba-b3b3-d6ccbc7f0525-scripts\") pod \"80fe4580-a48f-4cba-b3b3-d6ccbc7f0525\" (UID: \"80fe4580-a48f-4cba-b3b3-d6ccbc7f0525\") "
Apr 02 14:01:26 crc kubenswrapper[4732]: I0402 14:01:26.828545 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName:
\"kubernetes.io/secret/80fe4580-a48f-4cba-b3b3-d6ccbc7f0525-horizon-secret-key\") pod \"80fe4580-a48f-4cba-b3b3-d6ccbc7f0525\" (UID: \"80fe4580-a48f-4cba-b3b3-d6ccbc7f0525\") " Apr 02 14:01:26 crc kubenswrapper[4732]: I0402 14:01:26.828648 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgkzv\" (UniqueName: \"kubernetes.io/projected/80fe4580-a48f-4cba-b3b3-d6ccbc7f0525-kube-api-access-sgkzv\") pod \"80fe4580-a48f-4cba-b3b3-d6ccbc7f0525\" (UID: \"80fe4580-a48f-4cba-b3b3-d6ccbc7f0525\") " Apr 02 14:01:26 crc kubenswrapper[4732]: I0402 14:01:26.829997 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80fe4580-a48f-4cba-b3b3-d6ccbc7f0525-logs" (OuterVolumeSpecName: "logs") pod "80fe4580-a48f-4cba-b3b3-d6ccbc7f0525" (UID: "80fe4580-a48f-4cba-b3b3-d6ccbc7f0525"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 14:01:26 crc kubenswrapper[4732]: I0402 14:01:26.834849 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80fe4580-a48f-4cba-b3b3-d6ccbc7f0525-kube-api-access-sgkzv" (OuterVolumeSpecName: "kube-api-access-sgkzv") pod "80fe4580-a48f-4cba-b3b3-d6ccbc7f0525" (UID: "80fe4580-a48f-4cba-b3b3-d6ccbc7f0525"). InnerVolumeSpecName "kube-api-access-sgkzv". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:01:26 crc kubenswrapper[4732]: I0402 14:01:26.834942 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80fe4580-a48f-4cba-b3b3-d6ccbc7f0525-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "80fe4580-a48f-4cba-b3b3-d6ccbc7f0525" (UID: "80fe4580-a48f-4cba-b3b3-d6ccbc7f0525"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:01:26 crc kubenswrapper[4732]: I0402 14:01:26.869569 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80fe4580-a48f-4cba-b3b3-d6ccbc7f0525-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "80fe4580-a48f-4cba-b3b3-d6ccbc7f0525" (UID: "80fe4580-a48f-4cba-b3b3-d6ccbc7f0525"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:01:26 crc kubenswrapper[4732]: I0402 14:01:26.894310 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80fe4580-a48f-4cba-b3b3-d6ccbc7f0525-scripts" (OuterVolumeSpecName: "scripts") pod "80fe4580-a48f-4cba-b3b3-d6ccbc7f0525" (UID: "80fe4580-a48f-4cba-b3b3-d6ccbc7f0525"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 14:01:26 crc kubenswrapper[4732]: I0402 14:01:26.897581 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80fe4580-a48f-4cba-b3b3-d6ccbc7f0525-config-data" (OuterVolumeSpecName: "config-data") pod "80fe4580-a48f-4cba-b3b3-d6ccbc7f0525" (UID: "80fe4580-a48f-4cba-b3b3-d6ccbc7f0525"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 14:01:26 crc kubenswrapper[4732]: I0402 14:01:26.923853 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80fe4580-a48f-4cba-b3b3-d6ccbc7f0525-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "80fe4580-a48f-4cba-b3b3-d6ccbc7f0525" (UID: "80fe4580-a48f-4cba-b3b3-d6ccbc7f0525"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:01:26 crc kubenswrapper[4732]: I0402 14:01:26.932510 4732 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/80fe4580-a48f-4cba-b3b3-d6ccbc7f0525-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Apr 02 14:01:26 crc kubenswrapper[4732]: I0402 14:01:26.932544 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgkzv\" (UniqueName: \"kubernetes.io/projected/80fe4580-a48f-4cba-b3b3-d6ccbc7f0525-kube-api-access-sgkzv\") on node \"crc\" DevicePath \"\"" Apr 02 14:01:26 crc kubenswrapper[4732]: I0402 14:01:26.932558 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80fe4580-a48f-4cba-b3b3-d6ccbc7f0525-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 02 14:01:26 crc kubenswrapper[4732]: I0402 14:01:26.932567 4732 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80fe4580-a48f-4cba-b3b3-d6ccbc7f0525-logs\") on node \"crc\" DevicePath \"\"" Apr 02 14:01:26 crc kubenswrapper[4732]: I0402 14:01:26.932578 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/80fe4580-a48f-4cba-b3b3-d6ccbc7f0525-config-data\") on node \"crc\" DevicePath \"\"" Apr 02 14:01:26 crc kubenswrapper[4732]: I0402 14:01:26.932586 4732 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/80fe4580-a48f-4cba-b3b3-d6ccbc7f0525-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Apr 02 14:01:26 crc kubenswrapper[4732]: I0402 14:01:26.932594 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/80fe4580-a48f-4cba-b3b3-d6ccbc7f0525-scripts\") on node \"crc\" DevicePath \"\"" Apr 02 14:01:27 crc kubenswrapper[4732]: I0402 14:01:27.013074 4732 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-cd55d6846-4zs9k" Apr 02 14:01:27 crc kubenswrapper[4732]: W0402 14:01:27.027063 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd186e696_8cc5_4dac_b6cb_b9a5530bc57e.slice/crio-347e5e89c757fde507b0826d2d1c8d62cc3150ca7b345a4b1865fa73c190dc15 WatchSource:0}: Error finding container 347e5e89c757fde507b0826d2d1c8d62cc3150ca7b345a4b1865fa73c190dc15: Status 404 returned error can't find the container with id 347e5e89c757fde507b0826d2d1c8d62cc3150ca7b345a4b1865fa73c190dc15 Apr 02 14:01:27 crc kubenswrapper[4732]: I0402 14:01:27.028392 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-lcftj"] Apr 02 14:01:27 crc kubenswrapper[4732]: I0402 14:01:27.047382 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-59fb764b6d-vml5x" event={"ID":"80fe4580-a48f-4cba-b3b3-d6ccbc7f0525","Type":"ContainerDied","Data":"4b1706da63a3a81d63b698d942bdeeacfb33d97d83cb7dc15eb16115530f9f12"} Apr 02 14:01:27 crc kubenswrapper[4732]: I0402 14:01:27.047433 4732 scope.go:117] "RemoveContainer" containerID="56cb1e14c01780829e3d908dfe3c04fea17b711fbfdaa9357cab714feddecfb5" Apr 02 14:01:27 crc kubenswrapper[4732]: I0402 14:01:27.047575 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-59fb764b6d-vml5x" Apr 02 14:01:27 crc kubenswrapper[4732]: I0402 14:01:27.053423 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cd55d6846-4zs9k" event={"ID":"16168989-9c80-47a7-92ea-8be3984e5d99","Type":"ContainerDied","Data":"54feea172d4cc2765cc10a2c9f6f31c474a9ea49b5f141adcab410b0f18eb69b"} Apr 02 14:01:27 crc kubenswrapper[4732]: I0402 14:01:27.053508 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-cd55d6846-4zs9k" Apr 02 14:01:27 crc kubenswrapper[4732]: I0402 14:01:27.056294 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b76e6f0e-76e4-4b53-8697-ea1680abc0f4","Type":"ContainerStarted","Data":"605e0f293cae9b39f89f388e139e7fdf2999e783ba85854dbe98f39d274a268c"} Apr 02 14:01:27 crc kubenswrapper[4732]: I0402 14:01:27.056948 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Apr 02 14:01:27 crc kubenswrapper[4732]: I0402 14:01:27.059690 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"66ae86e8-597c-4fdb-b0da-283cf37afba2","Type":"ContainerStarted","Data":"27d3c59fca64727f57a3ae64161310e7b0693be0ef9c2cca6bdda6bad96198b2"} Apr 02 14:01:27 crc kubenswrapper[4732]: I0402 14:01:27.085451 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=8.25176483 podStartE2EDuration="13.085432305s" podCreationTimestamp="2026-04-02 14:01:14 +0000 UTC" firstStartedPulling="2026-04-02 14:01:14.989459401 +0000 UTC m=+1431.893866954" lastFinishedPulling="2026-04-02 14:01:19.823126876 +0000 UTC m=+1436.727534429" observedRunningTime="2026-04-02 14:01:27.082913446 +0000 UTC m=+1443.987321009" watchObservedRunningTime="2026-04-02 14:01:27.085432305 +0000 UTC m=+1443.989839878" Apr 02 14:01:27 crc kubenswrapper[4732]: I0402 14:01:27.122115 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.435260185 podStartE2EDuration="18.122099624s" podCreationTimestamp="2026-04-02 14:01:09 +0000 UTC" firstStartedPulling="2026-04-02 14:01:10.855161936 +0000 UTC m=+1427.759569499" lastFinishedPulling="2026-04-02 14:01:26.542001385 +0000 UTC m=+1443.446408938" observedRunningTime="2026-04-02 14:01:27.105324357 +0000 UTC m=+1444.009731920" 
watchObservedRunningTime="2026-04-02 14:01:27.122099624 +0000 UTC m=+1444.026507177" Apr 02 14:01:27 crc kubenswrapper[4732]: I0402 14:01:27.141186 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-59fb764b6d-vml5x"] Apr 02 14:01:27 crc kubenswrapper[4732]: I0402 14:01:27.146976 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16168989-9c80-47a7-92ea-8be3984e5d99-combined-ca-bundle\") pod \"16168989-9c80-47a7-92ea-8be3984e5d99\" (UID: \"16168989-9c80-47a7-92ea-8be3984e5d99\") " Apr 02 14:01:27 crc kubenswrapper[4732]: I0402 14:01:27.147038 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/16168989-9c80-47a7-92ea-8be3984e5d99-ovndb-tls-certs\") pod \"16168989-9c80-47a7-92ea-8be3984e5d99\" (UID: \"16168989-9c80-47a7-92ea-8be3984e5d99\") " Apr 02 14:01:27 crc kubenswrapper[4732]: I0402 14:01:27.147164 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/16168989-9c80-47a7-92ea-8be3984e5d99-config\") pod \"16168989-9c80-47a7-92ea-8be3984e5d99\" (UID: \"16168989-9c80-47a7-92ea-8be3984e5d99\") " Apr 02 14:01:27 crc kubenswrapper[4732]: I0402 14:01:27.147225 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/16168989-9c80-47a7-92ea-8be3984e5d99-httpd-config\") pod \"16168989-9c80-47a7-92ea-8be3984e5d99\" (UID: \"16168989-9c80-47a7-92ea-8be3984e5d99\") " Apr 02 14:01:27 crc kubenswrapper[4732]: I0402 14:01:27.147289 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwj2d\" (UniqueName: \"kubernetes.io/projected/16168989-9c80-47a7-92ea-8be3984e5d99-kube-api-access-dwj2d\") pod \"16168989-9c80-47a7-92ea-8be3984e5d99\" (UID: 
\"16168989-9c80-47a7-92ea-8be3984e5d99\") " Apr 02 14:01:27 crc kubenswrapper[4732]: I0402 14:01:27.151114 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16168989-9c80-47a7-92ea-8be3984e5d99-kube-api-access-dwj2d" (OuterVolumeSpecName: "kube-api-access-dwj2d") pod "16168989-9c80-47a7-92ea-8be3984e5d99" (UID: "16168989-9c80-47a7-92ea-8be3984e5d99"). InnerVolumeSpecName "kube-api-access-dwj2d". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:01:27 crc kubenswrapper[4732]: I0402 14:01:27.152441 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-59fb764b6d-vml5x"] Apr 02 14:01:27 crc kubenswrapper[4732]: I0402 14:01:27.157703 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16168989-9c80-47a7-92ea-8be3984e5d99-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "16168989-9c80-47a7-92ea-8be3984e5d99" (UID: "16168989-9c80-47a7-92ea-8be3984e5d99"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:01:27 crc kubenswrapper[4732]: I0402 14:01:27.222994 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16168989-9c80-47a7-92ea-8be3984e5d99-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "16168989-9c80-47a7-92ea-8be3984e5d99" (UID: "16168989-9c80-47a7-92ea-8be3984e5d99"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:01:27 crc kubenswrapper[4732]: I0402 14:01:27.229121 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16168989-9c80-47a7-92ea-8be3984e5d99-config" (OuterVolumeSpecName: "config") pod "16168989-9c80-47a7-92ea-8be3984e5d99" (UID: "16168989-9c80-47a7-92ea-8be3984e5d99"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:01:27 crc kubenswrapper[4732]: I0402 14:01:27.242809 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16168989-9c80-47a7-92ea-8be3984e5d99-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "16168989-9c80-47a7-92ea-8be3984e5d99" (UID: "16168989-9c80-47a7-92ea-8be3984e5d99"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:01:27 crc kubenswrapper[4732]: I0402 14:01:27.249776 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwj2d\" (UniqueName: \"kubernetes.io/projected/16168989-9c80-47a7-92ea-8be3984e5d99-kube-api-access-dwj2d\") on node \"crc\" DevicePath \"\"" Apr 02 14:01:27 crc kubenswrapper[4732]: I0402 14:01:27.249806 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16168989-9c80-47a7-92ea-8be3984e5d99-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 02 14:01:27 crc kubenswrapper[4732]: I0402 14:01:27.249815 4732 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/16168989-9c80-47a7-92ea-8be3984e5d99-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Apr 02 14:01:27 crc kubenswrapper[4732]: I0402 14:01:27.249823 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/16168989-9c80-47a7-92ea-8be3984e5d99-config\") on node \"crc\" DevicePath \"\"" Apr 02 14:01:27 crc kubenswrapper[4732]: I0402 14:01:27.249833 4732 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/16168989-9c80-47a7-92ea-8be3984e5d99-httpd-config\") on node \"crc\" DevicePath \"\"" Apr 02 14:01:27 crc kubenswrapper[4732]: I0402 14:01:27.260556 4732 scope.go:117] "RemoveContainer" 
containerID="aecec17f0e79f52981ef4808d1800fe67aab89d44b00ed1c7e216149c5ee7fc8" Apr 02 14:01:27 crc kubenswrapper[4732]: I0402 14:01:27.284435 4732 scope.go:117] "RemoveContainer" containerID="3429e76b65d136138b088fd6ee1f204b388fa132a9e8c6d42c421ee5d3126c70" Apr 02 14:01:27 crc kubenswrapper[4732]: I0402 14:01:27.365608 4732 scope.go:117] "RemoveContainer" containerID="0fc3be86c5bbdc911a2aa0733cc4678d1d83c286c3a20005f7c7cdbcde9da39b" Apr 02 14:01:27 crc kubenswrapper[4732]: I0402 14:01:27.390900 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-cd55d6846-4zs9k"] Apr 02 14:01:27 crc kubenswrapper[4732]: I0402 14:01:27.399583 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-cd55d6846-4zs9k"] Apr 02 14:01:28 crc kubenswrapper[4732]: I0402 14:01:28.073713 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-lcftj" event={"ID":"d186e696-8cc5-4dac-b6cb-b9a5530bc57e","Type":"ContainerStarted","Data":"347e5e89c757fde507b0826d2d1c8d62cc3150ca7b345a4b1865fa73c190dc15"} Apr 02 14:01:28 crc kubenswrapper[4732]: I0402 14:01:28.651840 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Apr 02 14:01:28 crc kubenswrapper[4732]: I0402 14:01:28.690513 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16168989-9c80-47a7-92ea-8be3984e5d99" path="/var/lib/kubelet/pods/16168989-9c80-47a7-92ea-8be3984e5d99/volumes" Apr 02 14:01:28 crc kubenswrapper[4732]: I0402 14:01:28.691217 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80fe4580-a48f-4cba-b3b3-d6ccbc7f0525" path="/var/lib/kubelet/pods/80fe4580-a48f-4cba-b3b3-d6ccbc7f0525/volumes" Apr 02 14:01:30 crc kubenswrapper[4732]: I0402 14:01:30.100852 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b76e6f0e-76e4-4b53-8697-ea1680abc0f4" containerName="ceilometer-central-agent" 
containerID="cri-o://fffc6c54f0d0a2b9e101256ca31f190028d24e05834cf71b7da1c6cfe57494ec" gracePeriod=30 Apr 02 14:01:30 crc kubenswrapper[4732]: I0402 14:01:30.100912 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b76e6f0e-76e4-4b53-8697-ea1680abc0f4" containerName="ceilometer-notification-agent" containerID="cri-o://0a5a63f8c0e6eb5a5c2e690c7ae380c865fd500f3f56c6502da61a164145d9de" gracePeriod=30 Apr 02 14:01:30 crc kubenswrapper[4732]: I0402 14:01:30.100917 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b76e6f0e-76e4-4b53-8697-ea1680abc0f4" containerName="proxy-httpd" containerID="cri-o://605e0f293cae9b39f89f388e139e7fdf2999e783ba85854dbe98f39d274a268c" gracePeriod=30 Apr 02 14:01:30 crc kubenswrapper[4732]: I0402 14:01:30.100920 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b76e6f0e-76e4-4b53-8697-ea1680abc0f4" containerName="sg-core" containerID="cri-o://3aeebf741ee9b2f7f48baefcc3e99161bf878dded7f109b31f25faf929cfd71f" gracePeriod=30 Apr 02 14:01:31 crc kubenswrapper[4732]: I0402 14:01:31.115342 4732 generic.go:334] "Generic (PLEG): container finished" podID="b76e6f0e-76e4-4b53-8697-ea1680abc0f4" containerID="605e0f293cae9b39f89f388e139e7fdf2999e783ba85854dbe98f39d274a268c" exitCode=0 Apr 02 14:01:31 crc kubenswrapper[4732]: I0402 14:01:31.115375 4732 generic.go:334] "Generic (PLEG): container finished" podID="b76e6f0e-76e4-4b53-8697-ea1680abc0f4" containerID="3aeebf741ee9b2f7f48baefcc3e99161bf878dded7f109b31f25faf929cfd71f" exitCode=2 Apr 02 14:01:31 crc kubenswrapper[4732]: I0402 14:01:31.115383 4732 generic.go:334] "Generic (PLEG): container finished" podID="b76e6f0e-76e4-4b53-8697-ea1680abc0f4" containerID="0a5a63f8c0e6eb5a5c2e690c7ae380c865fd500f3f56c6502da61a164145d9de" exitCode=0 Apr 02 14:01:31 crc kubenswrapper[4732]: I0402 14:01:31.115390 4732 generic.go:334] 
"Generic (PLEG): container finished" podID="b76e6f0e-76e4-4b53-8697-ea1680abc0f4" containerID="fffc6c54f0d0a2b9e101256ca31f190028d24e05834cf71b7da1c6cfe57494ec" exitCode=0 Apr 02 14:01:31 crc kubenswrapper[4732]: I0402 14:01:31.115413 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b76e6f0e-76e4-4b53-8697-ea1680abc0f4","Type":"ContainerDied","Data":"605e0f293cae9b39f89f388e139e7fdf2999e783ba85854dbe98f39d274a268c"} Apr 02 14:01:31 crc kubenswrapper[4732]: I0402 14:01:31.115646 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b76e6f0e-76e4-4b53-8697-ea1680abc0f4","Type":"ContainerDied","Data":"3aeebf741ee9b2f7f48baefcc3e99161bf878dded7f109b31f25faf929cfd71f"} Apr 02 14:01:31 crc kubenswrapper[4732]: I0402 14:01:31.115665 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b76e6f0e-76e4-4b53-8697-ea1680abc0f4","Type":"ContainerDied","Data":"0a5a63f8c0e6eb5a5c2e690c7ae380c865fd500f3f56c6502da61a164145d9de"} Apr 02 14:01:31 crc kubenswrapper[4732]: I0402 14:01:31.115677 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b76e6f0e-76e4-4b53-8697-ea1680abc0f4","Type":"ContainerDied","Data":"fffc6c54f0d0a2b9e101256ca31f190028d24e05834cf71b7da1c6cfe57494ec"} Apr 02 14:01:31 crc kubenswrapper[4732]: I0402 14:01:31.597880 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-59fb764b6d-vml5x" podUID="80fe4580-a48f-4cba-b3b3-d6ccbc7f0525" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.149:8443/dashboard/auth/login/?next=/dashboard/\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Apr 02 14:01:34 crc kubenswrapper[4732]: I0402 14:01:34.765541 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Apr 02 14:01:34 crc 
kubenswrapper[4732]: I0402 14:01:34.766196 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="1fbd853a-4252-4cf9-a5f3-a79c7360a62c" containerName="glance-log" containerID="cri-o://af0f0cd140e2c2dea25a93d14d04b37a18f2e24ae31a55b2beb619dc7d51e799" gracePeriod=30 Apr 02 14:01:34 crc kubenswrapper[4732]: I0402 14:01:34.766251 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="1fbd853a-4252-4cf9-a5f3-a79c7360a62c" containerName="glance-httpd" containerID="cri-o://4fbd4e38d326424f02f37bb618d5c6d08b882e6c122760e527d6ba58f02e1e92" gracePeriod=30 Apr 02 14:01:35 crc kubenswrapper[4732]: I0402 14:01:35.183123 4732 generic.go:334] "Generic (PLEG): container finished" podID="1fbd853a-4252-4cf9-a5f3-a79c7360a62c" containerID="af0f0cd140e2c2dea25a93d14d04b37a18f2e24ae31a55b2beb619dc7d51e799" exitCode=143 Apr 02 14:01:35 crc kubenswrapper[4732]: I0402 14:01:35.183175 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1fbd853a-4252-4cf9-a5f3-a79c7360a62c","Type":"ContainerDied","Data":"af0f0cd140e2c2dea25a93d14d04b37a18f2e24ae31a55b2beb619dc7d51e799"} Apr 02 14:01:35 crc kubenswrapper[4732]: I0402 14:01:35.570517 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Apr 02 14:01:35 crc kubenswrapper[4732]: I0402 14:01:35.571057 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="3aed3c4d-3173-407f-9a70-c20ef18a554d" containerName="glance-log" containerID="cri-o://64b90e5513c61a0a240c665e8829faf848278958fab87319d195ef359146e53b" gracePeriod=30 Apr 02 14:01:35 crc kubenswrapper[4732]: I0402 14:01:35.571306 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" 
podUID="3aed3c4d-3173-407f-9a70-c20ef18a554d" containerName="glance-httpd" containerID="cri-o://7c190a4e465759d8b097e15e3ca8d2d640bd547fb325eb7b0705d436799a0337" gracePeriod=30 Apr 02 14:01:35 crc kubenswrapper[4732]: I0402 14:01:35.790066 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Apr 02 14:01:35 crc kubenswrapper[4732]: I0402 14:01:35.922330 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b76e6f0e-76e4-4b53-8697-ea1680abc0f4-config-data\") pod \"b76e6f0e-76e4-4b53-8697-ea1680abc0f4\" (UID: \"b76e6f0e-76e4-4b53-8697-ea1680abc0f4\") " Apr 02 14:01:35 crc kubenswrapper[4732]: I0402 14:01:35.922396 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b76e6f0e-76e4-4b53-8697-ea1680abc0f4-sg-core-conf-yaml\") pod \"b76e6f0e-76e4-4b53-8697-ea1680abc0f4\" (UID: \"b76e6f0e-76e4-4b53-8697-ea1680abc0f4\") " Apr 02 14:01:35 crc kubenswrapper[4732]: I0402 14:01:35.922447 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b76e6f0e-76e4-4b53-8697-ea1680abc0f4-scripts\") pod \"b76e6f0e-76e4-4b53-8697-ea1680abc0f4\" (UID: \"b76e6f0e-76e4-4b53-8697-ea1680abc0f4\") " Apr 02 14:01:35 crc kubenswrapper[4732]: I0402 14:01:35.922480 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b76e6f0e-76e4-4b53-8697-ea1680abc0f4-log-httpd\") pod \"b76e6f0e-76e4-4b53-8697-ea1680abc0f4\" (UID: \"b76e6f0e-76e4-4b53-8697-ea1680abc0f4\") " Apr 02 14:01:35 crc kubenswrapper[4732]: I0402 14:01:35.922514 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b76e6f0e-76e4-4b53-8697-ea1680abc0f4-run-httpd\") pod 
\"b76e6f0e-76e4-4b53-8697-ea1680abc0f4\" (UID: \"b76e6f0e-76e4-4b53-8697-ea1680abc0f4\") " Apr 02 14:01:35 crc kubenswrapper[4732]: I0402 14:01:35.922532 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b76e6f0e-76e4-4b53-8697-ea1680abc0f4-combined-ca-bundle\") pod \"b76e6f0e-76e4-4b53-8697-ea1680abc0f4\" (UID: \"b76e6f0e-76e4-4b53-8697-ea1680abc0f4\") " Apr 02 14:01:35 crc kubenswrapper[4732]: I0402 14:01:35.922608 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fr678\" (UniqueName: \"kubernetes.io/projected/b76e6f0e-76e4-4b53-8697-ea1680abc0f4-kube-api-access-fr678\") pod \"b76e6f0e-76e4-4b53-8697-ea1680abc0f4\" (UID: \"b76e6f0e-76e4-4b53-8697-ea1680abc0f4\") " Apr 02 14:01:35 crc kubenswrapper[4732]: I0402 14:01:35.923054 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b76e6f0e-76e4-4b53-8697-ea1680abc0f4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b76e6f0e-76e4-4b53-8697-ea1680abc0f4" (UID: "b76e6f0e-76e4-4b53-8697-ea1680abc0f4"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 14:01:35 crc kubenswrapper[4732]: I0402 14:01:35.923194 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b76e6f0e-76e4-4b53-8697-ea1680abc0f4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b76e6f0e-76e4-4b53-8697-ea1680abc0f4" (UID: "b76e6f0e-76e4-4b53-8697-ea1680abc0f4"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 14:01:35 crc kubenswrapper[4732]: I0402 14:01:35.926775 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b76e6f0e-76e4-4b53-8697-ea1680abc0f4-scripts" (OuterVolumeSpecName: "scripts") pod "b76e6f0e-76e4-4b53-8697-ea1680abc0f4" (UID: "b76e6f0e-76e4-4b53-8697-ea1680abc0f4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:01:35 crc kubenswrapper[4732]: I0402 14:01:35.926834 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b76e6f0e-76e4-4b53-8697-ea1680abc0f4-kube-api-access-fr678" (OuterVolumeSpecName: "kube-api-access-fr678") pod "b76e6f0e-76e4-4b53-8697-ea1680abc0f4" (UID: "b76e6f0e-76e4-4b53-8697-ea1680abc0f4"). InnerVolumeSpecName "kube-api-access-fr678". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:01:35 crc kubenswrapper[4732]: I0402 14:01:35.976811 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b76e6f0e-76e4-4b53-8697-ea1680abc0f4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b76e6f0e-76e4-4b53-8697-ea1680abc0f4" (UID: "b76e6f0e-76e4-4b53-8697-ea1680abc0f4"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:01:36 crc kubenswrapper[4732]: I0402 14:01:36.024783 4732 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b76e6f0e-76e4-4b53-8697-ea1680abc0f4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Apr 02 14:01:36 crc kubenswrapper[4732]: I0402 14:01:36.024819 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b76e6f0e-76e4-4b53-8697-ea1680abc0f4-scripts\") on node \"crc\" DevicePath \"\"" Apr 02 14:01:36 crc kubenswrapper[4732]: I0402 14:01:36.024829 4732 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b76e6f0e-76e4-4b53-8697-ea1680abc0f4-log-httpd\") on node \"crc\" DevicePath \"\"" Apr 02 14:01:36 crc kubenswrapper[4732]: I0402 14:01:36.024837 4732 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b76e6f0e-76e4-4b53-8697-ea1680abc0f4-run-httpd\") on node \"crc\" DevicePath \"\"" Apr 02 14:01:36 crc kubenswrapper[4732]: I0402 14:01:36.024847 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fr678\" (UniqueName: \"kubernetes.io/projected/b76e6f0e-76e4-4b53-8697-ea1680abc0f4-kube-api-access-fr678\") on node \"crc\" DevicePath \"\"" Apr 02 14:01:36 crc kubenswrapper[4732]: I0402 14:01:36.049762 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b76e6f0e-76e4-4b53-8697-ea1680abc0f4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b76e6f0e-76e4-4b53-8697-ea1680abc0f4" (UID: "b76e6f0e-76e4-4b53-8697-ea1680abc0f4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:01:36 crc kubenswrapper[4732]: I0402 14:01:36.049851 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b76e6f0e-76e4-4b53-8697-ea1680abc0f4-config-data" (OuterVolumeSpecName: "config-data") pod "b76e6f0e-76e4-4b53-8697-ea1680abc0f4" (UID: "b76e6f0e-76e4-4b53-8697-ea1680abc0f4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:01:36 crc kubenswrapper[4732]: I0402 14:01:36.126251 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b76e6f0e-76e4-4b53-8697-ea1680abc0f4-config-data\") on node \"crc\" DevicePath \"\"" Apr 02 14:01:36 crc kubenswrapper[4732]: I0402 14:01:36.126289 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b76e6f0e-76e4-4b53-8697-ea1680abc0f4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 02 14:01:36 crc kubenswrapper[4732]: I0402 14:01:36.205253 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b76e6f0e-76e4-4b53-8697-ea1680abc0f4","Type":"ContainerDied","Data":"986855bf5ac50cdb3b12f50aa1c573f673af8c46baec5fbc1959a983b9c6ed6e"} Apr 02 14:01:36 crc kubenswrapper[4732]: I0402 14:01:36.205286 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Apr 02 14:01:36 crc kubenswrapper[4732]: I0402 14:01:36.205315 4732 scope.go:117] "RemoveContainer" containerID="605e0f293cae9b39f89f388e139e7fdf2999e783ba85854dbe98f39d274a268c" Apr 02 14:01:36 crc kubenswrapper[4732]: I0402 14:01:36.217285 4732 generic.go:334] "Generic (PLEG): container finished" podID="3aed3c4d-3173-407f-9a70-c20ef18a554d" containerID="64b90e5513c61a0a240c665e8829faf848278958fab87319d195ef359146e53b" exitCode=143 Apr 02 14:01:36 crc kubenswrapper[4732]: I0402 14:01:36.217368 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3aed3c4d-3173-407f-9a70-c20ef18a554d","Type":"ContainerDied","Data":"64b90e5513c61a0a240c665e8829faf848278958fab87319d195ef359146e53b"} Apr 02 14:01:36 crc kubenswrapper[4732]: I0402 14:01:36.222113 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-lcftj" event={"ID":"d186e696-8cc5-4dac-b6cb-b9a5530bc57e","Type":"ContainerStarted","Data":"6bafa11a5f6f06ce2459f12612464717263a6a689d0a14c463caf4b10f096b3e"} Apr 02 14:01:36 crc kubenswrapper[4732]: I0402 14:01:36.249354 4732 scope.go:117] "RemoveContainer" containerID="3aeebf741ee9b2f7f48baefcc3e99161bf878dded7f109b31f25faf929cfd71f" Apr 02 14:01:36 crc kubenswrapper[4732]: I0402 14:01:36.252968 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Apr 02 14:01:36 crc kubenswrapper[4732]: I0402 14:01:36.268107 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Apr 02 14:01:36 crc kubenswrapper[4732]: I0402 14:01:36.277566 4732 scope.go:117] "RemoveContainer" containerID="0a5a63f8c0e6eb5a5c2e690c7ae380c865fd500f3f56c6502da61a164145d9de" Apr 02 14:01:36 crc kubenswrapper[4732]: I0402 14:01:36.279250 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Apr 02 14:01:36 crc kubenswrapper[4732]: E0402 14:01:36.279794 4732 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b76e6f0e-76e4-4b53-8697-ea1680abc0f4" containerName="proxy-httpd" Apr 02 14:01:36 crc kubenswrapper[4732]: I0402 14:01:36.279812 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="b76e6f0e-76e4-4b53-8697-ea1680abc0f4" containerName="proxy-httpd" Apr 02 14:01:36 crc kubenswrapper[4732]: E0402 14:01:36.279837 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b76e6f0e-76e4-4b53-8697-ea1680abc0f4" containerName="ceilometer-notification-agent" Apr 02 14:01:36 crc kubenswrapper[4732]: I0402 14:01:36.279847 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="b76e6f0e-76e4-4b53-8697-ea1680abc0f4" containerName="ceilometer-notification-agent" Apr 02 14:01:36 crc kubenswrapper[4732]: E0402 14:01:36.279870 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b76e6f0e-76e4-4b53-8697-ea1680abc0f4" containerName="sg-core" Apr 02 14:01:36 crc kubenswrapper[4732]: I0402 14:01:36.279878 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="b76e6f0e-76e4-4b53-8697-ea1680abc0f4" containerName="sg-core" Apr 02 14:01:36 crc kubenswrapper[4732]: E0402 14:01:36.279887 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80fe4580-a48f-4cba-b3b3-d6ccbc7f0525" containerName="horizon" Apr 02 14:01:36 crc kubenswrapper[4732]: I0402 14:01:36.279894 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="80fe4580-a48f-4cba-b3b3-d6ccbc7f0525" containerName="horizon" Apr 02 14:01:36 crc kubenswrapper[4732]: E0402 14:01:36.279917 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80fe4580-a48f-4cba-b3b3-d6ccbc7f0525" containerName="horizon-log" Apr 02 14:01:36 crc kubenswrapper[4732]: I0402 14:01:36.279925 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="80fe4580-a48f-4cba-b3b3-d6ccbc7f0525" containerName="horizon-log" Apr 02 14:01:36 crc kubenswrapper[4732]: E0402 14:01:36.279935 4732 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="16168989-9c80-47a7-92ea-8be3984e5d99" containerName="neutron-httpd" Apr 02 14:01:36 crc kubenswrapper[4732]: I0402 14:01:36.279912 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-lcftj" podStartSLOduration=5.575576992 podStartE2EDuration="14.279894338s" podCreationTimestamp="2026-04-02 14:01:22 +0000 UTC" firstStartedPulling="2026-04-02 14:01:27.029740407 +0000 UTC m=+1443.934147960" lastFinishedPulling="2026-04-02 14:01:35.734057753 +0000 UTC m=+1452.638465306" observedRunningTime="2026-04-02 14:01:36.263376668 +0000 UTC m=+1453.167784231" watchObservedRunningTime="2026-04-02 14:01:36.279894338 +0000 UTC m=+1453.184301891" Apr 02 14:01:36 crc kubenswrapper[4732]: I0402 14:01:36.279943 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="16168989-9c80-47a7-92ea-8be3984e5d99" containerName="neutron-httpd" Apr 02 14:01:36 crc kubenswrapper[4732]: E0402 14:01:36.280152 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16168989-9c80-47a7-92ea-8be3984e5d99" containerName="neutron-api" Apr 02 14:01:36 crc kubenswrapper[4732]: I0402 14:01:36.280173 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="16168989-9c80-47a7-92ea-8be3984e5d99" containerName="neutron-api" Apr 02 14:01:36 crc kubenswrapper[4732]: E0402 14:01:36.280203 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b76e6f0e-76e4-4b53-8697-ea1680abc0f4" containerName="ceilometer-central-agent" Apr 02 14:01:36 crc kubenswrapper[4732]: I0402 14:01:36.280213 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="b76e6f0e-76e4-4b53-8697-ea1680abc0f4" containerName="ceilometer-central-agent" Apr 02 14:01:36 crc kubenswrapper[4732]: I0402 14:01:36.280575 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="b76e6f0e-76e4-4b53-8697-ea1680abc0f4" containerName="proxy-httpd" Apr 02 14:01:36 crc kubenswrapper[4732]: I0402 
14:01:36.280589 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="b76e6f0e-76e4-4b53-8697-ea1680abc0f4" containerName="sg-core" Apr 02 14:01:36 crc kubenswrapper[4732]: I0402 14:01:36.280625 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="80fe4580-a48f-4cba-b3b3-d6ccbc7f0525" containerName="horizon" Apr 02 14:01:36 crc kubenswrapper[4732]: I0402 14:01:36.280645 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="16168989-9c80-47a7-92ea-8be3984e5d99" containerName="neutron-api" Apr 02 14:01:36 crc kubenswrapper[4732]: I0402 14:01:36.280657 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="16168989-9c80-47a7-92ea-8be3984e5d99" containerName="neutron-httpd" Apr 02 14:01:36 crc kubenswrapper[4732]: I0402 14:01:36.280665 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="80fe4580-a48f-4cba-b3b3-d6ccbc7f0525" containerName="horizon-log" Apr 02 14:01:36 crc kubenswrapper[4732]: I0402 14:01:36.280674 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="b76e6f0e-76e4-4b53-8697-ea1680abc0f4" containerName="ceilometer-central-agent" Apr 02 14:01:36 crc kubenswrapper[4732]: I0402 14:01:36.280682 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="b76e6f0e-76e4-4b53-8697-ea1680abc0f4" containerName="ceilometer-notification-agent" Apr 02 14:01:36 crc kubenswrapper[4732]: I0402 14:01:36.282475 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Apr 02 14:01:36 crc kubenswrapper[4732]: I0402 14:01:36.285138 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Apr 02 14:01:36 crc kubenswrapper[4732]: I0402 14:01:36.285368 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Apr 02 14:01:36 crc kubenswrapper[4732]: I0402 14:01:36.303056 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Apr 02 14:01:36 crc kubenswrapper[4732]: I0402 14:01:36.322420 4732 scope.go:117] "RemoveContainer" containerID="fffc6c54f0d0a2b9e101256ca31f190028d24e05834cf71b7da1c6cfe57494ec" Apr 02 14:01:36 crc kubenswrapper[4732]: I0402 14:01:36.442343 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1248b5d3-77aa-4d5b-9311-3481f2cfbde6-scripts\") pod \"ceilometer-0\" (UID: \"1248b5d3-77aa-4d5b-9311-3481f2cfbde6\") " pod="openstack/ceilometer-0" Apr 02 14:01:36 crc kubenswrapper[4732]: I0402 14:01:36.442427 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1248b5d3-77aa-4d5b-9311-3481f2cfbde6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1248b5d3-77aa-4d5b-9311-3481f2cfbde6\") " pod="openstack/ceilometer-0" Apr 02 14:01:36 crc kubenswrapper[4732]: I0402 14:01:36.442473 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1248b5d3-77aa-4d5b-9311-3481f2cfbde6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1248b5d3-77aa-4d5b-9311-3481f2cfbde6\") " pod="openstack/ceilometer-0" Apr 02 14:01:36 crc kubenswrapper[4732]: I0402 14:01:36.442508 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/1248b5d3-77aa-4d5b-9311-3481f2cfbde6-config-data\") pod \"ceilometer-0\" (UID: \"1248b5d3-77aa-4d5b-9311-3481f2cfbde6\") " pod="openstack/ceilometer-0" Apr 02 14:01:36 crc kubenswrapper[4732]: I0402 14:01:36.443178 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ft5wj\" (UniqueName: \"kubernetes.io/projected/1248b5d3-77aa-4d5b-9311-3481f2cfbde6-kube-api-access-ft5wj\") pod \"ceilometer-0\" (UID: \"1248b5d3-77aa-4d5b-9311-3481f2cfbde6\") " pod="openstack/ceilometer-0" Apr 02 14:01:36 crc kubenswrapper[4732]: I0402 14:01:36.443325 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1248b5d3-77aa-4d5b-9311-3481f2cfbde6-log-httpd\") pod \"ceilometer-0\" (UID: \"1248b5d3-77aa-4d5b-9311-3481f2cfbde6\") " pod="openstack/ceilometer-0" Apr 02 14:01:36 crc kubenswrapper[4732]: I0402 14:01:36.443471 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1248b5d3-77aa-4d5b-9311-3481f2cfbde6-run-httpd\") pod \"ceilometer-0\" (UID: \"1248b5d3-77aa-4d5b-9311-3481f2cfbde6\") " pod="openstack/ceilometer-0" Apr 02 14:01:36 crc kubenswrapper[4732]: I0402 14:01:36.545738 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1248b5d3-77aa-4d5b-9311-3481f2cfbde6-run-httpd\") pod \"ceilometer-0\" (UID: \"1248b5d3-77aa-4d5b-9311-3481f2cfbde6\") " pod="openstack/ceilometer-0" Apr 02 14:01:36 crc kubenswrapper[4732]: I0402 14:01:36.546069 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1248b5d3-77aa-4d5b-9311-3481f2cfbde6-scripts\") pod \"ceilometer-0\" (UID: \"1248b5d3-77aa-4d5b-9311-3481f2cfbde6\") " 
pod="openstack/ceilometer-0" Apr 02 14:01:36 crc kubenswrapper[4732]: I0402 14:01:36.546123 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1248b5d3-77aa-4d5b-9311-3481f2cfbde6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1248b5d3-77aa-4d5b-9311-3481f2cfbde6\") " pod="openstack/ceilometer-0" Apr 02 14:01:36 crc kubenswrapper[4732]: I0402 14:01:36.546189 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1248b5d3-77aa-4d5b-9311-3481f2cfbde6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1248b5d3-77aa-4d5b-9311-3481f2cfbde6\") " pod="openstack/ceilometer-0" Apr 02 14:01:36 crc kubenswrapper[4732]: I0402 14:01:36.546232 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1248b5d3-77aa-4d5b-9311-3481f2cfbde6-config-data\") pod \"ceilometer-0\" (UID: \"1248b5d3-77aa-4d5b-9311-3481f2cfbde6\") " pod="openstack/ceilometer-0" Apr 02 14:01:36 crc kubenswrapper[4732]: I0402 14:01:36.546255 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ft5wj\" (UniqueName: \"kubernetes.io/projected/1248b5d3-77aa-4d5b-9311-3481f2cfbde6-kube-api-access-ft5wj\") pod \"ceilometer-0\" (UID: \"1248b5d3-77aa-4d5b-9311-3481f2cfbde6\") " pod="openstack/ceilometer-0" Apr 02 14:01:36 crc kubenswrapper[4732]: I0402 14:01:36.546311 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1248b5d3-77aa-4d5b-9311-3481f2cfbde6-log-httpd\") pod \"ceilometer-0\" (UID: \"1248b5d3-77aa-4d5b-9311-3481f2cfbde6\") " pod="openstack/ceilometer-0" Apr 02 14:01:36 crc kubenswrapper[4732]: I0402 14:01:36.546410 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/1248b5d3-77aa-4d5b-9311-3481f2cfbde6-run-httpd\") pod \"ceilometer-0\" (UID: \"1248b5d3-77aa-4d5b-9311-3481f2cfbde6\") " pod="openstack/ceilometer-0" Apr 02 14:01:36 crc kubenswrapper[4732]: I0402 14:01:36.546775 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1248b5d3-77aa-4d5b-9311-3481f2cfbde6-log-httpd\") pod \"ceilometer-0\" (UID: \"1248b5d3-77aa-4d5b-9311-3481f2cfbde6\") " pod="openstack/ceilometer-0" Apr 02 14:01:36 crc kubenswrapper[4732]: I0402 14:01:36.550779 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1248b5d3-77aa-4d5b-9311-3481f2cfbde6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1248b5d3-77aa-4d5b-9311-3481f2cfbde6\") " pod="openstack/ceilometer-0" Apr 02 14:01:36 crc kubenswrapper[4732]: I0402 14:01:36.551280 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1248b5d3-77aa-4d5b-9311-3481f2cfbde6-config-data\") pod \"ceilometer-0\" (UID: \"1248b5d3-77aa-4d5b-9311-3481f2cfbde6\") " pod="openstack/ceilometer-0" Apr 02 14:01:36 crc kubenswrapper[4732]: I0402 14:01:36.552552 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1248b5d3-77aa-4d5b-9311-3481f2cfbde6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1248b5d3-77aa-4d5b-9311-3481f2cfbde6\") " pod="openstack/ceilometer-0" Apr 02 14:01:36 crc kubenswrapper[4732]: I0402 14:01:36.568645 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ft5wj\" (UniqueName: \"kubernetes.io/projected/1248b5d3-77aa-4d5b-9311-3481f2cfbde6-kube-api-access-ft5wj\") pod \"ceilometer-0\" (UID: \"1248b5d3-77aa-4d5b-9311-3481f2cfbde6\") " pod="openstack/ceilometer-0" Apr 02 14:01:36 crc kubenswrapper[4732]: I0402 14:01:36.571360 4732 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1248b5d3-77aa-4d5b-9311-3481f2cfbde6-scripts\") pod \"ceilometer-0\" (UID: \"1248b5d3-77aa-4d5b-9311-3481f2cfbde6\") " pod="openstack/ceilometer-0" Apr 02 14:01:36 crc kubenswrapper[4732]: I0402 14:01:36.602637 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Apr 02 14:01:36 crc kubenswrapper[4732]: I0402 14:01:36.692842 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b76e6f0e-76e4-4b53-8697-ea1680abc0f4" path="/var/lib/kubelet/pods/b76e6f0e-76e4-4b53-8697-ea1680abc0f4/volumes" Apr 02 14:01:36 crc kubenswrapper[4732]: I0402 14:01:36.705738 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Apr 02 14:01:37 crc kubenswrapper[4732]: I0402 14:01:37.101050 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Apr 02 14:01:37 crc kubenswrapper[4732]: I0402 14:01:37.268902 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1248b5d3-77aa-4d5b-9311-3481f2cfbde6","Type":"ContainerStarted","Data":"c9f83fceae95825f61117bdfa17af70ddba5b5af0d63f3a4acbcd093a395899f"} Apr 02 14:01:38 crc kubenswrapper[4732]: I0402 14:01:38.284777 4732 generic.go:334] "Generic (PLEG): container finished" podID="1fbd853a-4252-4cf9-a5f3-a79c7360a62c" containerID="4fbd4e38d326424f02f37bb618d5c6d08b882e6c122760e527d6ba58f02e1e92" exitCode=0 Apr 02 14:01:38 crc kubenswrapper[4732]: I0402 14:01:38.284838 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1fbd853a-4252-4cf9-a5f3-a79c7360a62c","Type":"ContainerDied","Data":"4fbd4e38d326424f02f37bb618d5c6d08b882e6c122760e527d6ba58f02e1e92"} Apr 02 14:01:38 crc kubenswrapper[4732]: I0402 14:01:38.288758 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"1248b5d3-77aa-4d5b-9311-3481f2cfbde6","Type":"ContainerStarted","Data":"ddbda13604c8ef351162bf5a9a09766f98206d7f6c7634213de5b12cf4b2fc76"} Apr 02 14:01:38 crc kubenswrapper[4732]: I0402 14:01:38.343775 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Apr 02 14:01:38 crc kubenswrapper[4732]: I0402 14:01:38.481155 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fbd853a-4252-4cf9-a5f3-a79c7360a62c-combined-ca-bundle\") pod \"1fbd853a-4252-4cf9-a5f3-a79c7360a62c\" (UID: \"1fbd853a-4252-4cf9-a5f3-a79c7360a62c\") " Apr 02 14:01:38 crc kubenswrapper[4732]: I0402 14:01:38.481678 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fbd853a-4252-4cf9-a5f3-a79c7360a62c-public-tls-certs\") pod \"1fbd853a-4252-4cf9-a5f3-a79c7360a62c\" (UID: \"1fbd853a-4252-4cf9-a5f3-a79c7360a62c\") " Apr 02 14:01:38 crc kubenswrapper[4732]: I0402 14:01:38.481758 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1fbd853a-4252-4cf9-a5f3-a79c7360a62c-httpd-run\") pod \"1fbd853a-4252-4cf9-a5f3-a79c7360a62c\" (UID: \"1fbd853a-4252-4cf9-a5f3-a79c7360a62c\") " Apr 02 14:01:38 crc kubenswrapper[4732]: I0402 14:01:38.481832 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"1fbd853a-4252-4cf9-a5f3-a79c7360a62c\" (UID: \"1fbd853a-4252-4cf9-a5f3-a79c7360a62c\") " Apr 02 14:01:38 crc kubenswrapper[4732]: I0402 14:01:38.481897 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fbd853a-4252-4cf9-a5f3-a79c7360a62c-scripts\") pod 
\"1fbd853a-4252-4cf9-a5f3-a79c7360a62c\" (UID: \"1fbd853a-4252-4cf9-a5f3-a79c7360a62c\") " Apr 02 14:01:38 crc kubenswrapper[4732]: I0402 14:01:38.481941 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7b6qg\" (UniqueName: \"kubernetes.io/projected/1fbd853a-4252-4cf9-a5f3-a79c7360a62c-kube-api-access-7b6qg\") pod \"1fbd853a-4252-4cf9-a5f3-a79c7360a62c\" (UID: \"1fbd853a-4252-4cf9-a5f3-a79c7360a62c\") " Apr 02 14:01:38 crc kubenswrapper[4732]: I0402 14:01:38.481986 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fbd853a-4252-4cf9-a5f3-a79c7360a62c-config-data\") pod \"1fbd853a-4252-4cf9-a5f3-a79c7360a62c\" (UID: \"1fbd853a-4252-4cf9-a5f3-a79c7360a62c\") " Apr 02 14:01:38 crc kubenswrapper[4732]: I0402 14:01:38.482013 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fbd853a-4252-4cf9-a5f3-a79c7360a62c-logs\") pod \"1fbd853a-4252-4cf9-a5f3-a79c7360a62c\" (UID: \"1fbd853a-4252-4cf9-a5f3-a79c7360a62c\") " Apr 02 14:01:38 crc kubenswrapper[4732]: I0402 14:01:38.482951 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fbd853a-4252-4cf9-a5f3-a79c7360a62c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "1fbd853a-4252-4cf9-a5f3-a79c7360a62c" (UID: "1fbd853a-4252-4cf9-a5f3-a79c7360a62c"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 14:01:38 crc kubenswrapper[4732]: I0402 14:01:38.483134 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fbd853a-4252-4cf9-a5f3-a79c7360a62c-logs" (OuterVolumeSpecName: "logs") pod "1fbd853a-4252-4cf9-a5f3-a79c7360a62c" (UID: "1fbd853a-4252-4cf9-a5f3-a79c7360a62c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 14:01:38 crc kubenswrapper[4732]: I0402 14:01:38.486766 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "1fbd853a-4252-4cf9-a5f3-a79c7360a62c" (UID: "1fbd853a-4252-4cf9-a5f3-a79c7360a62c"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Apr 02 14:01:38 crc kubenswrapper[4732]: I0402 14:01:38.487815 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fbd853a-4252-4cf9-a5f3-a79c7360a62c-kube-api-access-7b6qg" (OuterVolumeSpecName: "kube-api-access-7b6qg") pod "1fbd853a-4252-4cf9-a5f3-a79c7360a62c" (UID: "1fbd853a-4252-4cf9-a5f3-a79c7360a62c"). InnerVolumeSpecName "kube-api-access-7b6qg". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:01:38 crc kubenswrapper[4732]: I0402 14:01:38.490194 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fbd853a-4252-4cf9-a5f3-a79c7360a62c-scripts" (OuterVolumeSpecName: "scripts") pod "1fbd853a-4252-4cf9-a5f3-a79c7360a62c" (UID: "1fbd853a-4252-4cf9-a5f3-a79c7360a62c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:01:38 crc kubenswrapper[4732]: I0402 14:01:38.511632 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fbd853a-4252-4cf9-a5f3-a79c7360a62c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1fbd853a-4252-4cf9-a5f3-a79c7360a62c" (UID: "1fbd853a-4252-4cf9-a5f3-a79c7360a62c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:01:38 crc kubenswrapper[4732]: I0402 14:01:38.533862 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fbd853a-4252-4cf9-a5f3-a79c7360a62c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1fbd853a-4252-4cf9-a5f3-a79c7360a62c" (UID: "1fbd853a-4252-4cf9-a5f3-a79c7360a62c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:01:38 crc kubenswrapper[4732]: I0402 14:01:38.533976 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fbd853a-4252-4cf9-a5f3-a79c7360a62c-config-data" (OuterVolumeSpecName: "config-data") pod "1fbd853a-4252-4cf9-a5f3-a79c7360a62c" (UID: "1fbd853a-4252-4cf9-a5f3-a79c7360a62c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:01:38 crc kubenswrapper[4732]: I0402 14:01:38.583998 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fbd853a-4252-4cf9-a5f3-a79c7360a62c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 02 14:01:38 crc kubenswrapper[4732]: I0402 14:01:38.584030 4732 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fbd853a-4252-4cf9-a5f3-a79c7360a62c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Apr 02 14:01:38 crc kubenswrapper[4732]: I0402 14:01:38.584040 4732 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1fbd853a-4252-4cf9-a5f3-a79c7360a62c-httpd-run\") on node \"crc\" DevicePath \"\"" Apr 02 14:01:38 crc kubenswrapper[4732]: I0402 14:01:38.584069 4732 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Apr 02 14:01:38 crc 
kubenswrapper[4732]: I0402 14:01:38.584078 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fbd853a-4252-4cf9-a5f3-a79c7360a62c-scripts\") on node \"crc\" DevicePath \"\"" Apr 02 14:01:38 crc kubenswrapper[4732]: I0402 14:01:38.584087 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7b6qg\" (UniqueName: \"kubernetes.io/projected/1fbd853a-4252-4cf9-a5f3-a79c7360a62c-kube-api-access-7b6qg\") on node \"crc\" DevicePath \"\"" Apr 02 14:01:38 crc kubenswrapper[4732]: I0402 14:01:38.584097 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fbd853a-4252-4cf9-a5f3-a79c7360a62c-config-data\") on node \"crc\" DevicePath \"\"" Apr 02 14:01:38 crc kubenswrapper[4732]: I0402 14:01:38.584106 4732 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fbd853a-4252-4cf9-a5f3-a79c7360a62c-logs\") on node \"crc\" DevicePath \"\"" Apr 02 14:01:38 crc kubenswrapper[4732]: I0402 14:01:38.608779 4732 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Apr 02 14:01:38 crc kubenswrapper[4732]: I0402 14:01:38.685264 4732 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.264469 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.302729 4732 generic.go:334] "Generic (PLEG): container finished" podID="3aed3c4d-3173-407f-9a70-c20ef18a554d" containerID="7c190a4e465759d8b097e15e3ca8d2d640bd547fb325eb7b0705d436799a0337" exitCode=0 Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.302810 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.302822 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3aed3c4d-3173-407f-9a70-c20ef18a554d","Type":"ContainerDied","Data":"7c190a4e465759d8b097e15e3ca8d2d640bd547fb325eb7b0705d436799a0337"} Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.304075 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3aed3c4d-3173-407f-9a70-c20ef18a554d","Type":"ContainerDied","Data":"f29a05790ed5f6792add4a7193cc0c21631d0f9cd2272be4e822f3df3619efa6"} Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.304117 4732 scope.go:117] "RemoveContainer" containerID="7c190a4e465759d8b097e15e3ca8d2d640bd547fb325eb7b0705d436799a0337" Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.314853 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1248b5d3-77aa-4d5b-9311-3481f2cfbde6","Type":"ContainerStarted","Data":"41f88a7b9959ef3cf4dedfd030cc84dd85e78f7fd2632b538abdc89d1edd11c6"} Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.314907 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1248b5d3-77aa-4d5b-9311-3481f2cfbde6","Type":"ContainerStarted","Data":"5e432d61851798b7733da9aeef480c9eae81af07a601737682affb42dd9ce147"} Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.318492 4732 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1fbd853a-4252-4cf9-a5f3-a79c7360a62c","Type":"ContainerDied","Data":"f11b23500df71f2522d7c74a4aa7b3e662134098c846722429819d5f8798e6ca"}
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.318624 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.351418 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.358479 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.372789 4732 scope.go:117] "RemoveContainer" containerID="64b90e5513c61a0a240c665e8829faf848278958fab87319d195ef359146e53b"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.401180 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3aed3c4d-3173-407f-9a70-c20ef18a554d-logs\") pod \"3aed3c4d-3173-407f-9a70-c20ef18a554d\" (UID: \"3aed3c4d-3173-407f-9a70-c20ef18a554d\") "
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.401277 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3aed3c4d-3173-407f-9a70-c20ef18a554d-internal-tls-certs\") pod \"3aed3c4d-3173-407f-9a70-c20ef18a554d\" (UID: \"3aed3c4d-3173-407f-9a70-c20ef18a554d\") "
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.401310 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3aed3c4d-3173-407f-9a70-c20ef18a554d-config-data\") pod \"3aed3c4d-3173-407f-9a70-c20ef18a554d\" (UID: \"3aed3c4d-3173-407f-9a70-c20ef18a554d\") "
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.401348 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"3aed3c4d-3173-407f-9a70-c20ef18a554d\" (UID: \"3aed3c4d-3173-407f-9a70-c20ef18a554d\") "
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.401420 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3aed3c4d-3173-407f-9a70-c20ef18a554d-httpd-run\") pod \"3aed3c4d-3173-407f-9a70-c20ef18a554d\" (UID: \"3aed3c4d-3173-407f-9a70-c20ef18a554d\") "
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.401468 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3aed3c4d-3173-407f-9a70-c20ef18a554d-scripts\") pod \"3aed3c4d-3173-407f-9a70-c20ef18a554d\" (UID: \"3aed3c4d-3173-407f-9a70-c20ef18a554d\") "
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.401566 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3aed3c4d-3173-407f-9a70-c20ef18a554d-combined-ca-bundle\") pod \"3aed3c4d-3173-407f-9a70-c20ef18a554d\" (UID: \"3aed3c4d-3173-407f-9a70-c20ef18a554d\") "
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.401701 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9dfz\" (UniqueName: \"kubernetes.io/projected/3aed3c4d-3173-407f-9a70-c20ef18a554d-kube-api-access-g9dfz\") pod \"3aed3c4d-3173-407f-9a70-c20ef18a554d\" (UID: \"3aed3c4d-3173-407f-9a70-c20ef18a554d\") "
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.402949 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3aed3c4d-3173-407f-9a70-c20ef18a554d-logs" (OuterVolumeSpecName: "logs") pod "3aed3c4d-3173-407f-9a70-c20ef18a554d" (UID: "3aed3c4d-3173-407f-9a70-c20ef18a554d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.403132 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3aed3c4d-3173-407f-9a70-c20ef18a554d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "3aed3c4d-3173-407f-9a70-c20ef18a554d" (UID: "3aed3c4d-3173-407f-9a70-c20ef18a554d"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.418983 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3aed3c4d-3173-407f-9a70-c20ef18a554d-kube-api-access-g9dfz" (OuterVolumeSpecName: "kube-api-access-g9dfz") pod "3aed3c4d-3173-407f-9a70-c20ef18a554d" (UID: "3aed3c4d-3173-407f-9a70-c20ef18a554d"). InnerVolumeSpecName "kube-api-access-g9dfz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.419107 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "3aed3c4d-3173-407f-9a70-c20ef18a554d" (UID: "3aed3c4d-3173-407f-9a70-c20ef18a554d"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.419891 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3aed3c4d-3173-407f-9a70-c20ef18a554d-scripts" (OuterVolumeSpecName: "scripts") pod "3aed3c4d-3173-407f-9a70-c20ef18a554d" (UID: "3aed3c4d-3173-407f-9a70-c20ef18a554d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.425734 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Apr 02 14:01:39 crc kubenswrapper[4732]: E0402 14:01:39.426179 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fbd853a-4252-4cf9-a5f3-a79c7360a62c" containerName="glance-httpd"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.426200 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fbd853a-4252-4cf9-a5f3-a79c7360a62c" containerName="glance-httpd"
Apr 02 14:01:39 crc kubenswrapper[4732]: E0402 14:01:39.426212 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fbd853a-4252-4cf9-a5f3-a79c7360a62c" containerName="glance-log"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.426220 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fbd853a-4252-4cf9-a5f3-a79c7360a62c" containerName="glance-log"
Apr 02 14:01:39 crc kubenswrapper[4732]: E0402 14:01:39.426238 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3aed3c4d-3173-407f-9a70-c20ef18a554d" containerName="glance-httpd"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.426247 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="3aed3c4d-3173-407f-9a70-c20ef18a554d" containerName="glance-httpd"
Apr 02 14:01:39 crc kubenswrapper[4732]: E0402 14:01:39.426281 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3aed3c4d-3173-407f-9a70-c20ef18a554d" containerName="glance-log"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.426290 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="3aed3c4d-3173-407f-9a70-c20ef18a554d" containerName="glance-log"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.426512 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="3aed3c4d-3173-407f-9a70-c20ef18a554d" containerName="glance-httpd"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.426534 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fbd853a-4252-4cf9-a5f3-a79c7360a62c" containerName="glance-log"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.426551 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fbd853a-4252-4cf9-a5f3-a79c7360a62c" containerName="glance-httpd"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.426564 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="3aed3c4d-3173-407f-9a70-c20ef18a554d" containerName="glance-log"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.427750 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.440294 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.461769 4732 scope.go:117] "RemoveContainer" containerID="7c190a4e465759d8b097e15e3ca8d2d640bd547fb325eb7b0705d436799a0337"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.475599 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.477696 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Apr 02 14:01:39 crc kubenswrapper[4732]: E0402 14:01:39.482842 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c190a4e465759d8b097e15e3ca8d2d640bd547fb325eb7b0705d436799a0337\": container with ID starting with 7c190a4e465759d8b097e15e3ca8d2d640bd547fb325eb7b0705d436799a0337 not found: ID does not exist" containerID="7c190a4e465759d8b097e15e3ca8d2d640bd547fb325eb7b0705d436799a0337"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.482899 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c190a4e465759d8b097e15e3ca8d2d640bd547fb325eb7b0705d436799a0337"} err="failed to get container status \"7c190a4e465759d8b097e15e3ca8d2d640bd547fb325eb7b0705d436799a0337\": rpc error: code = NotFound desc = could not find container \"7c190a4e465759d8b097e15e3ca8d2d640bd547fb325eb7b0705d436799a0337\": container with ID starting with 7c190a4e465759d8b097e15e3ca8d2d640bd547fb325eb7b0705d436799a0337 not found: ID does not exist"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.482936 4732 scope.go:117] "RemoveContainer" containerID="64b90e5513c61a0a240c665e8829faf848278958fab87319d195ef359146e53b"
Apr 02 14:01:39 crc kubenswrapper[4732]: E0402 14:01:39.487922 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64b90e5513c61a0a240c665e8829faf848278958fab87319d195ef359146e53b\": container with ID starting with 64b90e5513c61a0a240c665e8829faf848278958fab87319d195ef359146e53b not found: ID does not exist" containerID="64b90e5513c61a0a240c665e8829faf848278958fab87319d195ef359146e53b"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.488047 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64b90e5513c61a0a240c665e8829faf848278958fab87319d195ef359146e53b"} err="failed to get container status \"64b90e5513c61a0a240c665e8829faf848278958fab87319d195ef359146e53b\": rpc error: code = NotFound desc = could not find container \"64b90e5513c61a0a240c665e8829faf848278958fab87319d195ef359146e53b\": container with ID starting with 64b90e5513c61a0a240c665e8829faf848278958fab87319d195ef359146e53b not found: ID does not exist"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.488144 4732 scope.go:117] "RemoveContainer" containerID="4fbd4e38d326424f02f37bb618d5c6d08b882e6c122760e527d6ba58f02e1e92"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.508871 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3aed3c4d-3173-407f-9a70-c20ef18a554d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3aed3c4d-3173-407f-9a70-c20ef18a554d" (UID: "3aed3c4d-3173-407f-9a70-c20ef18a554d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.510088 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9dfz\" (UniqueName: \"kubernetes.io/projected/3aed3c4d-3173-407f-9a70-c20ef18a554d-kube-api-access-g9dfz\") on node \"crc\" DevicePath \"\""
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.510112 4732 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3aed3c4d-3173-407f-9a70-c20ef18a554d-logs\") on node \"crc\" DevicePath \"\""
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.510133 4732 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" "
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.510144 4732 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3aed3c4d-3173-407f-9a70-c20ef18a554d-httpd-run\") on node \"crc\" DevicePath \"\""
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.510155 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3aed3c4d-3173-407f-9a70-c20ef18a554d-scripts\") on node \"crc\" DevicePath \"\""
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.510163 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3aed3c4d-3173-407f-9a70-c20ef18a554d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.564071 4732 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.574988 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3aed3c4d-3173-407f-9a70-c20ef18a554d-config-data" (OuterVolumeSpecName: "config-data") pod "3aed3c4d-3173-407f-9a70-c20ef18a554d" (UID: "3aed3c4d-3173-407f-9a70-c20ef18a554d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.575833 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3aed3c4d-3173-407f-9a70-c20ef18a554d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3aed3c4d-3173-407f-9a70-c20ef18a554d" (UID: "3aed3c4d-3173-407f-9a70-c20ef18a554d"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.612567 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j956l\" (UniqueName: \"kubernetes.io/projected/2bbb407d-51c0-4cca-99c6-9436acda495d-kube-api-access-j956l\") pod \"glance-default-external-api-0\" (UID: \"2bbb407d-51c0-4cca-99c6-9436acda495d\") " pod="openstack/glance-default-external-api-0"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.612695 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bbb407d-51c0-4cca-99c6-9436acda495d-config-data\") pod \"glance-default-external-api-0\" (UID: \"2bbb407d-51c0-4cca-99c6-9436acda495d\") " pod="openstack/glance-default-external-api-0"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.612719 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"2bbb407d-51c0-4cca-99c6-9436acda495d\") " pod="openstack/glance-default-external-api-0"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.612736 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bbb407d-51c0-4cca-99c6-9436acda495d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2bbb407d-51c0-4cca-99c6-9436acda495d\") " pod="openstack/glance-default-external-api-0"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.612759 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bbb407d-51c0-4cca-99c6-9436acda495d-scripts\") pod \"glance-default-external-api-0\" (UID: \"2bbb407d-51c0-4cca-99c6-9436acda495d\") " pod="openstack/glance-default-external-api-0"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.612784 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bbb407d-51c0-4cca-99c6-9436acda495d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2bbb407d-51c0-4cca-99c6-9436acda495d\") " pod="openstack/glance-default-external-api-0"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.612858 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2bbb407d-51c0-4cca-99c6-9436acda495d-logs\") pod \"glance-default-external-api-0\" (UID: \"2bbb407d-51c0-4cca-99c6-9436acda495d\") " pod="openstack/glance-default-external-api-0"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.612879 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2bbb407d-51c0-4cca-99c6-9436acda495d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2bbb407d-51c0-4cca-99c6-9436acda495d\") " pod="openstack/glance-default-external-api-0"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.612924 4732 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3aed3c4d-3173-407f-9a70-c20ef18a554d-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.612934 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3aed3c4d-3173-407f-9a70-c20ef18a554d-config-data\") on node \"crc\" DevicePath \"\""
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.612944 4732 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\""
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.632944 4732 scope.go:117] "RemoveContainer" containerID="af0f0cd140e2c2dea25a93d14d04b37a18f2e24ae31a55b2beb619dc7d51e799"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.714454 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j956l\" (UniqueName: \"kubernetes.io/projected/2bbb407d-51c0-4cca-99c6-9436acda495d-kube-api-access-j956l\") pod \"glance-default-external-api-0\" (UID: \"2bbb407d-51c0-4cca-99c6-9436acda495d\") " pod="openstack/glance-default-external-api-0"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.714548 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bbb407d-51c0-4cca-99c6-9436acda495d-config-data\") pod \"glance-default-external-api-0\" (UID: \"2bbb407d-51c0-4cca-99c6-9436acda495d\") " pod="openstack/glance-default-external-api-0"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.714572 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"2bbb407d-51c0-4cca-99c6-9436acda495d\") " pod="openstack/glance-default-external-api-0"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.714594 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bbb407d-51c0-4cca-99c6-9436acda495d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2bbb407d-51c0-4cca-99c6-9436acda495d\") " pod="openstack/glance-default-external-api-0"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.714633 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bbb407d-51c0-4cca-99c6-9436acda495d-scripts\") pod \"glance-default-external-api-0\" (UID: \"2bbb407d-51c0-4cca-99c6-9436acda495d\") " pod="openstack/glance-default-external-api-0"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.714663 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bbb407d-51c0-4cca-99c6-9436acda495d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2bbb407d-51c0-4cca-99c6-9436acda495d\") " pod="openstack/glance-default-external-api-0"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.714842 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2bbb407d-51c0-4cca-99c6-9436acda495d-logs\") pod \"glance-default-external-api-0\" (UID: \"2bbb407d-51c0-4cca-99c6-9436acda495d\") " pod="openstack/glance-default-external-api-0"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.714876 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2bbb407d-51c0-4cca-99c6-9436acda495d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2bbb407d-51c0-4cca-99c6-9436acda495d\") " pod="openstack/glance-default-external-api-0"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.715395 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2bbb407d-51c0-4cca-99c6-9436acda495d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2bbb407d-51c0-4cca-99c6-9436acda495d\") " pod="openstack/glance-default-external-api-0"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.715636 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2bbb407d-51c0-4cca-99c6-9436acda495d-logs\") pod \"glance-default-external-api-0\" (UID: \"2bbb407d-51c0-4cca-99c6-9436acda495d\") " pod="openstack/glance-default-external-api-0"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.716244 4732 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"2bbb407d-51c0-4cca-99c6-9436acda495d\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.721410 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bbb407d-51c0-4cca-99c6-9436acda495d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2bbb407d-51c0-4cca-99c6-9436acda495d\") " pod="openstack/glance-default-external-api-0"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.722488 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bbb407d-51c0-4cca-99c6-9436acda495d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2bbb407d-51c0-4cca-99c6-9436acda495d\") " pod="openstack/glance-default-external-api-0"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.727022 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bbb407d-51c0-4cca-99c6-9436acda495d-config-data\") pod \"glance-default-external-api-0\" (UID: \"2bbb407d-51c0-4cca-99c6-9436acda495d\") " pod="openstack/glance-default-external-api-0"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.728282 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bbb407d-51c0-4cca-99c6-9436acda495d-scripts\") pod \"glance-default-external-api-0\" (UID: \"2bbb407d-51c0-4cca-99c6-9436acda495d\") " pod="openstack/glance-default-external-api-0"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.729973 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.753668 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j956l\" (UniqueName: \"kubernetes.io/projected/2bbb407d-51c0-4cca-99c6-9436acda495d-kube-api-access-j956l\") pod \"glance-default-external-api-0\" (UID: \"2bbb407d-51c0-4cca-99c6-9436acda495d\") " pod="openstack/glance-default-external-api-0"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.766080 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.782500 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"2bbb407d-51c0-4cca-99c6-9436acda495d\") " pod="openstack/glance-default-external-api-0"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.800187 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.802018 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.810569 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.810835 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.812798 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.848442 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8724b48c-9ac7-43a2-8d27-7d16056387ca-logs\") pod \"glance-default-internal-api-0\" (UID: \"8724b48c-9ac7-43a2-8d27-7d16056387ca\") " pod="openstack/glance-default-internal-api-0"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.848490 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqgv8\" (UniqueName: \"kubernetes.io/projected/8724b48c-9ac7-43a2-8d27-7d16056387ca-kube-api-access-cqgv8\") pod \"glance-default-internal-api-0\" (UID: \"8724b48c-9ac7-43a2-8d27-7d16056387ca\") " pod="openstack/glance-default-internal-api-0"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.848515 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8724b48c-9ac7-43a2-8d27-7d16056387ca-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8724b48c-9ac7-43a2-8d27-7d16056387ca\") " pod="openstack/glance-default-internal-api-0"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.848565 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"8724b48c-9ac7-43a2-8d27-7d16056387ca\") " pod="openstack/glance-default-internal-api-0"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.848585 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8724b48c-9ac7-43a2-8d27-7d16056387ca-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8724b48c-9ac7-43a2-8d27-7d16056387ca\") " pod="openstack/glance-default-internal-api-0"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.848606 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8724b48c-9ac7-43a2-8d27-7d16056387ca-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8724b48c-9ac7-43a2-8d27-7d16056387ca\") " pod="openstack/glance-default-internal-api-0"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.848639 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8724b48c-9ac7-43a2-8d27-7d16056387ca-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8724b48c-9ac7-43a2-8d27-7d16056387ca\") " pod="openstack/glance-default-internal-api-0"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.848700 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8724b48c-9ac7-43a2-8d27-7d16056387ca-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8724b48c-9ac7-43a2-8d27-7d16056387ca\") " pod="openstack/glance-default-internal-api-0"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.951038 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"8724b48c-9ac7-43a2-8d27-7d16056387ca\") " pod="openstack/glance-default-internal-api-0"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.951101 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8724b48c-9ac7-43a2-8d27-7d16056387ca-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8724b48c-9ac7-43a2-8d27-7d16056387ca\") " pod="openstack/glance-default-internal-api-0"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.951130 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8724b48c-9ac7-43a2-8d27-7d16056387ca-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8724b48c-9ac7-43a2-8d27-7d16056387ca\") " pod="openstack/glance-default-internal-api-0"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.951168 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8724b48c-9ac7-43a2-8d27-7d16056387ca-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8724b48c-9ac7-43a2-8d27-7d16056387ca\") " pod="openstack/glance-default-internal-api-0"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.951239 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8724b48c-9ac7-43a2-8d27-7d16056387ca-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8724b48c-9ac7-43a2-8d27-7d16056387ca\") " pod="openstack/glance-default-internal-api-0"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.951248 4732 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"8724b48c-9ac7-43a2-8d27-7d16056387ca\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.951544 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8724b48c-9ac7-43a2-8d27-7d16056387ca-logs\") pod \"glance-default-internal-api-0\" (UID: \"8724b48c-9ac7-43a2-8d27-7d16056387ca\") " pod="openstack/glance-default-internal-api-0"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.951570 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqgv8\" (UniqueName: \"kubernetes.io/projected/8724b48c-9ac7-43a2-8d27-7d16056387ca-kube-api-access-cqgv8\") pod \"glance-default-internal-api-0\" (UID: \"8724b48c-9ac7-43a2-8d27-7d16056387ca\") " pod="openstack/glance-default-internal-api-0"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.951591 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8724b48c-9ac7-43a2-8d27-7d16056387ca-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8724b48c-9ac7-43a2-8d27-7d16056387ca\") " pod="openstack/glance-default-internal-api-0"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.952395 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8724b48c-9ac7-43a2-8d27-7d16056387ca-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8724b48c-9ac7-43a2-8d27-7d16056387ca\") " pod="openstack/glance-default-internal-api-0"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.952473 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8724b48c-9ac7-43a2-8d27-7d16056387ca-logs\") pod \"glance-default-internal-api-0\" (UID: \"8724b48c-9ac7-43a2-8d27-7d16056387ca\") " pod="openstack/glance-default-internal-api-0"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.956486 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8724b48c-9ac7-43a2-8d27-7d16056387ca-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8724b48c-9ac7-43a2-8d27-7d16056387ca\") " pod="openstack/glance-default-internal-api-0"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.957298 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8724b48c-9ac7-43a2-8d27-7d16056387ca-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8724b48c-9ac7-43a2-8d27-7d16056387ca\") " pod="openstack/glance-default-internal-api-0"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.957779 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8724b48c-9ac7-43a2-8d27-7d16056387ca-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8724b48c-9ac7-43a2-8d27-7d16056387ca\") " pod="openstack/glance-default-internal-api-0"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.958630 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8724b48c-9ac7-43a2-8d27-7d16056387ca-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8724b48c-9ac7-43a2-8d27-7d16056387ca\") " pod="openstack/glance-default-internal-api-0"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.974867 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqgv8\" (UniqueName: \"kubernetes.io/projected/8724b48c-9ac7-43a2-8d27-7d16056387ca-kube-api-access-cqgv8\") pod \"glance-default-internal-api-0\" (UID: \"8724b48c-9ac7-43a2-8d27-7d16056387ca\") " pod="openstack/glance-default-internal-api-0"
Apr 02 14:01:39 crc kubenswrapper[4732]: I0402 14:01:39.977313 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"8724b48c-9ac7-43a2-8d27-7d16056387ca\") " pod="openstack/glance-default-internal-api-0"
Apr 02 14:01:40 crc kubenswrapper[4732]: I0402 14:01:40.078905 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Apr 02 14:01:40 crc kubenswrapper[4732]: I0402 14:01:40.184106 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Apr 02 14:01:40 crc kubenswrapper[4732]: I0402 14:01:40.648457 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Apr 02 14:01:40 crc kubenswrapper[4732]: W0402 14:01:40.673944 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2bbb407d_51c0_4cca_99c6_9436acda495d.slice/crio-c36a161678cfd9af48acea50c885a2f9281ab83f256c9f0c0a88d0b02b9ab5bc WatchSource:0}: Error finding container c36a161678cfd9af48acea50c885a2f9281ab83f256c9f0c0a88d0b02b9ab5bc: Status 404 returned error can't find the container with id c36a161678cfd9af48acea50c885a2f9281ab83f256c9f0c0a88d0b02b9ab5bc
Apr 02 14:01:40 crc kubenswrapper[4732]: I0402 14:01:40.712073 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fbd853a-4252-4cf9-a5f3-a79c7360a62c" path="/var/lib/kubelet/pods/1fbd853a-4252-4cf9-a5f3-a79c7360a62c/volumes"
Apr 02 14:01:40 crc kubenswrapper[4732]: I0402 14:01:40.712959 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3aed3c4d-3173-407f-9a70-c20ef18a554d" path="/var/lib/kubelet/pods/3aed3c4d-3173-407f-9a70-c20ef18a554d/volumes"
Apr 02 14:01:40
crc kubenswrapper[4732]: I0402 14:01:40.753776 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Apr 02 14:01:40 crc kubenswrapper[4732]: W0402 14:01:40.769775 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8724b48c_9ac7_43a2_8d27_7d16056387ca.slice/crio-184e6a8b8d2d728a1f9e008401299a10a989ea676a0ceb94fae7250feb318072 WatchSource:0}: Error finding container 184e6a8b8d2d728a1f9e008401299a10a989ea676a0ceb94fae7250feb318072: Status 404 returned error can't find the container with id 184e6a8b8d2d728a1f9e008401299a10a989ea676a0ceb94fae7250feb318072 Apr 02 14:01:41 crc kubenswrapper[4732]: I0402 14:01:41.351607 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2bbb407d-51c0-4cca-99c6-9436acda495d","Type":"ContainerStarted","Data":"c36a161678cfd9af48acea50c885a2f9281ab83f256c9f0c0a88d0b02b9ab5bc"} Apr 02 14:01:41 crc kubenswrapper[4732]: I0402 14:01:41.353495 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8724b48c-9ac7-43a2-8d27-7d16056387ca","Type":"ContainerStarted","Data":"184e6a8b8d2d728a1f9e008401299a10a989ea676a0ceb94fae7250feb318072"} Apr 02 14:01:41 crc kubenswrapper[4732]: I0402 14:01:41.357875 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1248b5d3-77aa-4d5b-9311-3481f2cfbde6","Type":"ContainerStarted","Data":"9d4640df4df0902f12ed980e9a16d4204f01a0b2fc31c15a35950122cffd7650"} Apr 02 14:01:41 crc kubenswrapper[4732]: I0402 14:01:41.358040 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1248b5d3-77aa-4d5b-9311-3481f2cfbde6" containerName="ceilometer-central-agent" containerID="cri-o://ddbda13604c8ef351162bf5a9a09766f98206d7f6c7634213de5b12cf4b2fc76" gracePeriod=30 Apr 02 14:01:41 crc 
kubenswrapper[4732]: I0402 14:01:41.358222 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1248b5d3-77aa-4d5b-9311-3481f2cfbde6" containerName="sg-core" containerID="cri-o://41f88a7b9959ef3cf4dedfd030cc84dd85e78f7fd2632b538abdc89d1edd11c6" gracePeriod=30 Apr 02 14:01:41 crc kubenswrapper[4732]: I0402 14:01:41.358319 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1248b5d3-77aa-4d5b-9311-3481f2cfbde6" containerName="proxy-httpd" containerID="cri-o://9d4640df4df0902f12ed980e9a16d4204f01a0b2fc31c15a35950122cffd7650" gracePeriod=30 Apr 02 14:01:41 crc kubenswrapper[4732]: I0402 14:01:41.358409 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Apr 02 14:01:41 crc kubenswrapper[4732]: I0402 14:01:41.358460 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1248b5d3-77aa-4d5b-9311-3481f2cfbde6" containerName="ceilometer-notification-agent" containerID="cri-o://5e432d61851798b7733da9aeef480c9eae81af07a601737682affb42dd9ce147" gracePeriod=30 Apr 02 14:01:41 crc kubenswrapper[4732]: I0402 14:01:41.383756 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.465425046 podStartE2EDuration="5.383599862s" podCreationTimestamp="2026-04-02 14:01:36 +0000 UTC" firstStartedPulling="2026-04-02 14:01:37.10633444 +0000 UTC m=+1454.010741993" lastFinishedPulling="2026-04-02 14:01:41.024509246 +0000 UTC m=+1457.928916809" observedRunningTime="2026-04-02 14:01:41.382866602 +0000 UTC m=+1458.287274175" watchObservedRunningTime="2026-04-02 14:01:41.383599862 +0000 UTC m=+1458.288007415" Apr 02 14:01:42 crc kubenswrapper[4732]: I0402 14:01:42.369859 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"8724b48c-9ac7-43a2-8d27-7d16056387ca","Type":"ContainerStarted","Data":"6d8ac3a443577f38f2b933af2fa88233811cce7cf3f3abab3fc77f2ab4a8d519"} Apr 02 14:01:42 crc kubenswrapper[4732]: I0402 14:01:42.370453 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8724b48c-9ac7-43a2-8d27-7d16056387ca","Type":"ContainerStarted","Data":"339d5b913e7961e2fb4873f1a44e3116158830afe777903c4e5e8e5d43f30144"} Apr 02 14:01:42 crc kubenswrapper[4732]: I0402 14:01:42.372280 4732 generic.go:334] "Generic (PLEG): container finished" podID="1248b5d3-77aa-4d5b-9311-3481f2cfbde6" containerID="41f88a7b9959ef3cf4dedfd030cc84dd85e78f7fd2632b538abdc89d1edd11c6" exitCode=2 Apr 02 14:01:42 crc kubenswrapper[4732]: I0402 14:01:42.372311 4732 generic.go:334] "Generic (PLEG): container finished" podID="1248b5d3-77aa-4d5b-9311-3481f2cfbde6" containerID="5e432d61851798b7733da9aeef480c9eae81af07a601737682affb42dd9ce147" exitCode=0 Apr 02 14:01:42 crc kubenswrapper[4732]: I0402 14:01:42.372354 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1248b5d3-77aa-4d5b-9311-3481f2cfbde6","Type":"ContainerDied","Data":"41f88a7b9959ef3cf4dedfd030cc84dd85e78f7fd2632b538abdc89d1edd11c6"} Apr 02 14:01:42 crc kubenswrapper[4732]: I0402 14:01:42.372380 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1248b5d3-77aa-4d5b-9311-3481f2cfbde6","Type":"ContainerDied","Data":"5e432d61851798b7733da9aeef480c9eae81af07a601737682affb42dd9ce147"} Apr 02 14:01:42 crc kubenswrapper[4732]: I0402 14:01:42.374192 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2bbb407d-51c0-4cca-99c6-9436acda495d","Type":"ContainerStarted","Data":"a382e380ad614e5ee341bc0080450971b188be1f4a6d6f2c3c86d012d43a0291"} Apr 02 14:01:42 crc kubenswrapper[4732]: I0402 14:01:42.374221 4732 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2bbb407d-51c0-4cca-99c6-9436acda495d","Type":"ContainerStarted","Data":"fbc74ef8bd8c85dfa175c2dbae3cef90fa2ae24cce6409074711193d18375348"} Apr 02 14:01:42 crc kubenswrapper[4732]: I0402 14:01:42.397327 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.397309867 podStartE2EDuration="3.397309867s" podCreationTimestamp="2026-04-02 14:01:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 14:01:42.390992695 +0000 UTC m=+1459.295400258" watchObservedRunningTime="2026-04-02 14:01:42.397309867 +0000 UTC m=+1459.301717420" Apr 02 14:01:42 crc kubenswrapper[4732]: I0402 14:01:42.425135 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.425112005 podStartE2EDuration="3.425112005s" podCreationTimestamp="2026-04-02 14:01:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 14:01:42.421131237 +0000 UTC m=+1459.325538800" watchObservedRunningTime="2026-04-02 14:01:42.425112005 +0000 UTC m=+1459.329519558" Apr 02 14:01:42 crc kubenswrapper[4732]: I0402 14:01:42.953110 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="c0e53540-98b3-463a-9611-a48c2fbfc0f5" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.165:3000/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Apr 02 14:01:50 crc kubenswrapper[4732]: I0402 14:01:50.079286 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Apr 02 14:01:50 crc kubenswrapper[4732]: I0402 14:01:50.080211 4732 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Apr 02 14:01:50 crc kubenswrapper[4732]: I0402 14:01:50.115466 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Apr 02 14:01:50 crc kubenswrapper[4732]: I0402 14:01:50.123787 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Apr 02 14:01:50 crc kubenswrapper[4732]: I0402 14:01:50.185937 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Apr 02 14:01:50 crc kubenswrapper[4732]: I0402 14:01:50.186299 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Apr 02 14:01:50 crc kubenswrapper[4732]: I0402 14:01:50.218190 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Apr 02 14:01:50 crc kubenswrapper[4732]: I0402 14:01:50.228643 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Apr 02 14:01:50 crc kubenswrapper[4732]: I0402 14:01:50.474451 4732 generic.go:334] "Generic (PLEG): container finished" podID="1248b5d3-77aa-4d5b-9311-3481f2cfbde6" containerID="ddbda13604c8ef351162bf5a9a09766f98206d7f6c7634213de5b12cf4b2fc76" exitCode=0 Apr 02 14:01:50 crc kubenswrapper[4732]: I0402 14:01:50.474515 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1248b5d3-77aa-4d5b-9311-3481f2cfbde6","Type":"ContainerDied","Data":"ddbda13604c8ef351162bf5a9a09766f98206d7f6c7634213de5b12cf4b2fc76"} Apr 02 14:01:50 crc kubenswrapper[4732]: I0402 14:01:50.475138 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Apr 02 14:01:50 crc kubenswrapper[4732]: I0402 14:01:50.475165 4732 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Apr 02 14:01:50 crc kubenswrapper[4732]: I0402 14:01:50.475180 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Apr 02 14:01:50 crc kubenswrapper[4732]: I0402 14:01:50.475189 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Apr 02 14:01:52 crc kubenswrapper[4732]: I0402 14:01:52.526472 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Apr 02 14:01:52 crc kubenswrapper[4732]: I0402 14:01:52.526948 4732 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 02 14:01:52 crc kubenswrapper[4732]: I0402 14:01:52.527860 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Apr 02 14:01:52 crc kubenswrapper[4732]: I0402 14:01:52.693000 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Apr 02 14:01:52 crc kubenswrapper[4732]: I0402 14:01:52.693096 4732 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 02 14:01:52 crc kubenswrapper[4732]: I0402 14:01:52.739108 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Apr 02 14:01:54 crc kubenswrapper[4732]: E0402 14:01:54.793550 4732 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd186e696_8cc5_4dac_b6cb_b9a5530bc57e.slice/crio-6bafa11a5f6f06ce2459f12612464717263a6a689d0a14c463caf4b10f096b3e.scope\": RecentStats: unable to find data in memory cache]" Apr 02 14:01:55 crc kubenswrapper[4732]: I0402 14:01:55.520745 4732 generic.go:334] "Generic (PLEG): 
container finished" podID="d186e696-8cc5-4dac-b6cb-b9a5530bc57e" containerID="6bafa11a5f6f06ce2459f12612464717263a6a689d0a14c463caf4b10f096b3e" exitCode=0 Apr 02 14:01:55 crc kubenswrapper[4732]: I0402 14:01:55.520849 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-lcftj" event={"ID":"d186e696-8cc5-4dac-b6cb-b9a5530bc57e","Type":"ContainerDied","Data":"6bafa11a5f6f06ce2459f12612464717263a6a689d0a14c463caf4b10f096b3e"} Apr 02 14:01:56 crc kubenswrapper[4732]: I0402 14:01:56.860547 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-lcftj" Apr 02 14:01:56 crc kubenswrapper[4732]: I0402 14:01:56.961049 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rt8kp\" (UniqueName: \"kubernetes.io/projected/d186e696-8cc5-4dac-b6cb-b9a5530bc57e-kube-api-access-rt8kp\") pod \"d186e696-8cc5-4dac-b6cb-b9a5530bc57e\" (UID: \"d186e696-8cc5-4dac-b6cb-b9a5530bc57e\") " Apr 02 14:01:56 crc kubenswrapper[4732]: I0402 14:01:56.961163 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d186e696-8cc5-4dac-b6cb-b9a5530bc57e-scripts\") pod \"d186e696-8cc5-4dac-b6cb-b9a5530bc57e\" (UID: \"d186e696-8cc5-4dac-b6cb-b9a5530bc57e\") " Apr 02 14:01:56 crc kubenswrapper[4732]: I0402 14:01:56.961306 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d186e696-8cc5-4dac-b6cb-b9a5530bc57e-combined-ca-bundle\") pod \"d186e696-8cc5-4dac-b6cb-b9a5530bc57e\" (UID: \"d186e696-8cc5-4dac-b6cb-b9a5530bc57e\") " Apr 02 14:01:56 crc kubenswrapper[4732]: I0402 14:01:56.961342 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d186e696-8cc5-4dac-b6cb-b9a5530bc57e-config-data\") pod 
\"d186e696-8cc5-4dac-b6cb-b9a5530bc57e\" (UID: \"d186e696-8cc5-4dac-b6cb-b9a5530bc57e\") " Apr 02 14:01:56 crc kubenswrapper[4732]: I0402 14:01:56.968964 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d186e696-8cc5-4dac-b6cb-b9a5530bc57e-scripts" (OuterVolumeSpecName: "scripts") pod "d186e696-8cc5-4dac-b6cb-b9a5530bc57e" (UID: "d186e696-8cc5-4dac-b6cb-b9a5530bc57e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:01:56 crc kubenswrapper[4732]: I0402 14:01:56.969108 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d186e696-8cc5-4dac-b6cb-b9a5530bc57e-kube-api-access-rt8kp" (OuterVolumeSpecName: "kube-api-access-rt8kp") pod "d186e696-8cc5-4dac-b6cb-b9a5530bc57e" (UID: "d186e696-8cc5-4dac-b6cb-b9a5530bc57e"). InnerVolumeSpecName "kube-api-access-rt8kp". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:01:56 crc kubenswrapper[4732]: I0402 14:01:56.988966 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d186e696-8cc5-4dac-b6cb-b9a5530bc57e-config-data" (OuterVolumeSpecName: "config-data") pod "d186e696-8cc5-4dac-b6cb-b9a5530bc57e" (UID: "d186e696-8cc5-4dac-b6cb-b9a5530bc57e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:01:56 crc kubenswrapper[4732]: I0402 14:01:56.991975 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d186e696-8cc5-4dac-b6cb-b9a5530bc57e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d186e696-8cc5-4dac-b6cb-b9a5530bc57e" (UID: "d186e696-8cc5-4dac-b6cb-b9a5530bc57e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:01:57 crc kubenswrapper[4732]: I0402 14:01:57.063535 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rt8kp\" (UniqueName: \"kubernetes.io/projected/d186e696-8cc5-4dac-b6cb-b9a5530bc57e-kube-api-access-rt8kp\") on node \"crc\" DevicePath \"\"" Apr 02 14:01:57 crc kubenswrapper[4732]: I0402 14:01:57.063621 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d186e696-8cc5-4dac-b6cb-b9a5530bc57e-scripts\") on node \"crc\" DevicePath \"\"" Apr 02 14:01:57 crc kubenswrapper[4732]: I0402 14:01:57.063636 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d186e696-8cc5-4dac-b6cb-b9a5530bc57e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 02 14:01:57 crc kubenswrapper[4732]: I0402 14:01:57.063645 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d186e696-8cc5-4dac-b6cb-b9a5530bc57e-config-data\") on node \"crc\" DevicePath \"\"" Apr 02 14:01:57 crc kubenswrapper[4732]: I0402 14:01:57.543429 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-lcftj" event={"ID":"d186e696-8cc5-4dac-b6cb-b9a5530bc57e","Type":"ContainerDied","Data":"347e5e89c757fde507b0826d2d1c8d62cc3150ca7b345a4b1865fa73c190dc15"} Apr 02 14:01:57 crc kubenswrapper[4732]: I0402 14:01:57.543757 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="347e5e89c757fde507b0826d2d1c8d62cc3150ca7b345a4b1865fa73c190dc15" Apr 02 14:01:57 crc kubenswrapper[4732]: I0402 14:01:57.543491 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-lcftj" Apr 02 14:01:57 crc kubenswrapper[4732]: I0402 14:01:57.640647 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Apr 02 14:01:57 crc kubenswrapper[4732]: E0402 14:01:57.641104 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d186e696-8cc5-4dac-b6cb-b9a5530bc57e" containerName="nova-cell0-conductor-db-sync" Apr 02 14:01:57 crc kubenswrapper[4732]: I0402 14:01:57.641128 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="d186e696-8cc5-4dac-b6cb-b9a5530bc57e" containerName="nova-cell0-conductor-db-sync" Apr 02 14:01:57 crc kubenswrapper[4732]: I0402 14:01:57.641458 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="d186e696-8cc5-4dac-b6cb-b9a5530bc57e" containerName="nova-cell0-conductor-db-sync" Apr 02 14:01:57 crc kubenswrapper[4732]: I0402 14:01:57.642178 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Apr 02 14:01:57 crc kubenswrapper[4732]: I0402 14:01:57.646793 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Apr 02 14:01:57 crc kubenswrapper[4732]: I0402 14:01:57.646793 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-kxxgt" Apr 02 14:01:57 crc kubenswrapper[4732]: I0402 14:01:57.660440 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Apr 02 14:01:57 crc kubenswrapper[4732]: I0402 14:01:57.777046 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knrkw\" (UniqueName: \"kubernetes.io/projected/4f075437-481e-4e75-9a85-081d0a0ae448-kube-api-access-knrkw\") pod \"nova-cell0-conductor-0\" (UID: \"4f075437-481e-4e75-9a85-081d0a0ae448\") " pod="openstack/nova-cell0-conductor-0" Apr 02 14:01:57 crc 
kubenswrapper[4732]: I0402 14:01:57.777214 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f075437-481e-4e75-9a85-081d0a0ae448-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"4f075437-481e-4e75-9a85-081d0a0ae448\") " pod="openstack/nova-cell0-conductor-0" Apr 02 14:01:57 crc kubenswrapper[4732]: I0402 14:01:57.777272 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f075437-481e-4e75-9a85-081d0a0ae448-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"4f075437-481e-4e75-9a85-081d0a0ae448\") " pod="openstack/nova-cell0-conductor-0" Apr 02 14:01:57 crc kubenswrapper[4732]: I0402 14:01:57.878776 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f075437-481e-4e75-9a85-081d0a0ae448-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"4f075437-481e-4e75-9a85-081d0a0ae448\") " pod="openstack/nova-cell0-conductor-0" Apr 02 14:01:57 crc kubenswrapper[4732]: I0402 14:01:57.878858 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knrkw\" (UniqueName: \"kubernetes.io/projected/4f075437-481e-4e75-9a85-081d0a0ae448-kube-api-access-knrkw\") pod \"nova-cell0-conductor-0\" (UID: \"4f075437-481e-4e75-9a85-081d0a0ae448\") " pod="openstack/nova-cell0-conductor-0" Apr 02 14:01:57 crc kubenswrapper[4732]: I0402 14:01:57.878939 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f075437-481e-4e75-9a85-081d0a0ae448-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"4f075437-481e-4e75-9a85-081d0a0ae448\") " pod="openstack/nova-cell0-conductor-0" Apr 02 14:01:57 crc kubenswrapper[4732]: I0402 14:01:57.884024 4732 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f075437-481e-4e75-9a85-081d0a0ae448-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"4f075437-481e-4e75-9a85-081d0a0ae448\") " pod="openstack/nova-cell0-conductor-0" Apr 02 14:01:57 crc kubenswrapper[4732]: I0402 14:01:57.889875 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f075437-481e-4e75-9a85-081d0a0ae448-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"4f075437-481e-4e75-9a85-081d0a0ae448\") " pod="openstack/nova-cell0-conductor-0" Apr 02 14:01:57 crc kubenswrapper[4732]: I0402 14:01:57.896050 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knrkw\" (UniqueName: \"kubernetes.io/projected/4f075437-481e-4e75-9a85-081d0a0ae448-kube-api-access-knrkw\") pod \"nova-cell0-conductor-0\" (UID: \"4f075437-481e-4e75-9a85-081d0a0ae448\") " pod="openstack/nova-cell0-conductor-0" Apr 02 14:01:57 crc kubenswrapper[4732]: I0402 14:01:57.958241 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Apr 02 14:01:58 crc kubenswrapper[4732]: I0402 14:01:58.401920 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Apr 02 14:01:58 crc kubenswrapper[4732]: I0402 14:01:58.574720 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"4f075437-481e-4e75-9a85-081d0a0ae448","Type":"ContainerStarted","Data":"687fddf59c927f359774951167846cc29afb6351602f4d0ecbbc587eb29e1488"} Apr 02 14:01:58 crc kubenswrapper[4732]: I0402 14:01:58.999094 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Apr 02 14:01:59 crc kubenswrapper[4732]: I0402 14:01:59.585648 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"4f075437-481e-4e75-9a85-081d0a0ae448","Type":"ContainerStarted","Data":"813df6979af898c53fe62751d376e6fa9acbf0c42d87e8d987333c16cb709bc3"} Apr 02 14:01:59 crc kubenswrapper[4732]: I0402 14:01:59.585803 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Apr 02 14:01:59 crc kubenswrapper[4732]: I0402 14:01:59.607511 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.607495921 podStartE2EDuration="2.607495921s" podCreationTimestamp="2026-04-02 14:01:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 14:01:59.6019709 +0000 UTC m=+1476.506378463" watchObservedRunningTime="2026-04-02 14:01:59.607495921 +0000 UTC m=+1476.511903474" Apr 02 14:02:00 crc kubenswrapper[4732]: I0402 14:02:00.130834 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29585642-b58hp"] Apr 02 14:02:00 crc kubenswrapper[4732]: I0402 14:02:00.132121 4732 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585642-b58hp"
Apr 02 14:02:00 crc kubenswrapper[4732]: I0402 14:02:00.135198 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Apr 02 14:02:00 crc kubenswrapper[4732]: I0402 14:02:00.136410 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Apr 02 14:02:00 crc kubenswrapper[4732]: I0402 14:02:00.136570 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-69g42"
Apr 02 14:02:00 crc kubenswrapper[4732]: I0402 14:02:00.154086 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585642-b58hp"]
Apr 02 14:02:00 crc kubenswrapper[4732]: I0402 14:02:00.219539 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bj7gt\" (UniqueName: \"kubernetes.io/projected/c6f8ac3c-1daf-481a-bbb8-2ecbeb7bc44f-kube-api-access-bj7gt\") pod \"auto-csr-approver-29585642-b58hp\" (UID: \"c6f8ac3c-1daf-481a-bbb8-2ecbeb7bc44f\") " pod="openshift-infra/auto-csr-approver-29585642-b58hp"
Apr 02 14:02:00 crc kubenswrapper[4732]: I0402 14:02:00.321968 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bj7gt\" (UniqueName: \"kubernetes.io/projected/c6f8ac3c-1daf-481a-bbb8-2ecbeb7bc44f-kube-api-access-bj7gt\") pod \"auto-csr-approver-29585642-b58hp\" (UID: \"c6f8ac3c-1daf-481a-bbb8-2ecbeb7bc44f\") " pod="openshift-infra/auto-csr-approver-29585642-b58hp"
Apr 02 14:02:00 crc kubenswrapper[4732]: I0402 14:02:00.341070 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bj7gt\" (UniqueName: \"kubernetes.io/projected/c6f8ac3c-1daf-481a-bbb8-2ecbeb7bc44f-kube-api-access-bj7gt\") pod \"auto-csr-approver-29585642-b58hp\" (UID: \"c6f8ac3c-1daf-481a-bbb8-2ecbeb7bc44f\") " pod="openshift-infra/auto-csr-approver-29585642-b58hp"
Apr 02 14:02:00 crc kubenswrapper[4732]: I0402 14:02:00.462241 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585642-b58hp"
Apr 02 14:02:00 crc kubenswrapper[4732]: I0402 14:02:00.592552 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="4f075437-481e-4e75-9a85-081d0a0ae448" containerName="nova-cell0-conductor-conductor" containerID="cri-o://813df6979af898c53fe62751d376e6fa9acbf0c42d87e8d987333c16cb709bc3" gracePeriod=30
Apr 02 14:02:00 crc kubenswrapper[4732]: I0402 14:02:00.894500 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585642-b58hp"]
Apr 02 14:02:00 crc kubenswrapper[4732]: W0402 14:02:00.914171 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc6f8ac3c_1daf_481a_bbb8_2ecbeb7bc44f.slice/crio-9afa08ad831bea0a8252a3bd3d38c626c2886b17d2ab4853769711dfe5f03fe6 WatchSource:0}: Error finding container 9afa08ad831bea0a8252a3bd3d38c626c2886b17d2ab4853769711dfe5f03fe6: Status 404 returned error can't find the container with id 9afa08ad831bea0a8252a3bd3d38c626c2886b17d2ab4853769711dfe5f03fe6
Apr 02 14:02:01 crc kubenswrapper[4732]: I0402 14:02:01.601000 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585642-b58hp" event={"ID":"c6f8ac3c-1daf-481a-bbb8-2ecbeb7bc44f","Type":"ContainerStarted","Data":"9afa08ad831bea0a8252a3bd3d38c626c2886b17d2ab4853769711dfe5f03fe6"}
Apr 02 14:02:02 crc kubenswrapper[4732]: I0402 14:02:02.612039 4732 generic.go:334] "Generic (PLEG): container finished" podID="c6f8ac3c-1daf-481a-bbb8-2ecbeb7bc44f" containerID="023d1061732a239ce49a7c0ad8b8c1320c94212598a48b9f7bd92b00be54b302" exitCode=0
Apr 02 14:02:02 crc kubenswrapper[4732]: I0402 14:02:02.612089 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585642-b58hp" event={"ID":"c6f8ac3c-1daf-481a-bbb8-2ecbeb7bc44f","Type":"ContainerDied","Data":"023d1061732a239ce49a7c0ad8b8c1320c94212598a48b9f7bd92b00be54b302"}
Apr 02 14:02:03 crc kubenswrapper[4732]: I0402 14:02:03.977521 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585642-b58hp"
Apr 02 14:02:04 crc kubenswrapper[4732]: I0402 14:02:04.086543 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bj7gt\" (UniqueName: \"kubernetes.io/projected/c6f8ac3c-1daf-481a-bbb8-2ecbeb7bc44f-kube-api-access-bj7gt\") pod \"c6f8ac3c-1daf-481a-bbb8-2ecbeb7bc44f\" (UID: \"c6f8ac3c-1daf-481a-bbb8-2ecbeb7bc44f\") "
Apr 02 14:02:04 crc kubenswrapper[4732]: I0402 14:02:04.091157 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6f8ac3c-1daf-481a-bbb8-2ecbeb7bc44f-kube-api-access-bj7gt" (OuterVolumeSpecName: "kube-api-access-bj7gt") pod "c6f8ac3c-1daf-481a-bbb8-2ecbeb7bc44f" (UID: "c6f8ac3c-1daf-481a-bbb8-2ecbeb7bc44f"). InnerVolumeSpecName "kube-api-access-bj7gt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 14:02:04 crc kubenswrapper[4732]: I0402 14:02:04.189366 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bj7gt\" (UniqueName: \"kubernetes.io/projected/c6f8ac3c-1daf-481a-bbb8-2ecbeb7bc44f-kube-api-access-bj7gt\") on node \"crc\" DevicePath \"\""
Apr 02 14:02:04 crc kubenswrapper[4732]: I0402 14:02:04.632586 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585642-b58hp" event={"ID":"c6f8ac3c-1daf-481a-bbb8-2ecbeb7bc44f","Type":"ContainerDied","Data":"9afa08ad831bea0a8252a3bd3d38c626c2886b17d2ab4853769711dfe5f03fe6"}
Apr 02 14:02:04 crc kubenswrapper[4732]: I0402 14:02:04.632651 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9afa08ad831bea0a8252a3bd3d38c626c2886b17d2ab4853769711dfe5f03fe6"
Apr 02 14:02:04 crc kubenswrapper[4732]: I0402 14:02:04.632648 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585642-b58hp"
Apr 02 14:02:05 crc kubenswrapper[4732]: I0402 14:02:05.046017 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29585636-nrxqm"]
Apr 02 14:02:05 crc kubenswrapper[4732]: I0402 14:02:05.054245 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29585636-nrxqm"]
Apr 02 14:02:06 crc kubenswrapper[4732]: I0402 14:02:06.607526 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="1248b5d3-77aa-4d5b-9311-3481f2cfbde6" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 02 14:02:06 crc kubenswrapper[4732]: I0402 14:02:06.691187 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62eb3313-3f3d-4378-9427-ed1985cffffe" path="/var/lib/kubelet/pods/62eb3313-3f3d-4378-9427-ed1985cffffe/volumes"
Apr 02 14:02:07 crc kubenswrapper[4732]: E0402 14:02:07.961075 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="813df6979af898c53fe62751d376e6fa9acbf0c42d87e8d987333c16cb709bc3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Apr 02 14:02:07 crc kubenswrapper[4732]: E0402 14:02:07.962873 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="813df6979af898c53fe62751d376e6fa9acbf0c42d87e8d987333c16cb709bc3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Apr 02 14:02:07 crc kubenswrapper[4732]: E0402 14:02:07.964077 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="813df6979af898c53fe62751d376e6fa9acbf0c42d87e8d987333c16cb709bc3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Apr 02 14:02:07 crc kubenswrapper[4732]: E0402 14:02:07.964126 4732 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="4f075437-481e-4e75-9a85-081d0a0ae448" containerName="nova-cell0-conductor-conductor"
Apr 02 14:02:11 crc kubenswrapper[4732]: I0402 14:02:11.702438 4732 generic.go:334] "Generic (PLEG): container finished" podID="1248b5d3-77aa-4d5b-9311-3481f2cfbde6" containerID="9d4640df4df0902f12ed980e9a16d4204f01a0b2fc31c15a35950122cffd7650" exitCode=137
Apr 02 14:02:11 crc kubenswrapper[4732]: I0402 14:02:11.702664 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1248b5d3-77aa-4d5b-9311-3481f2cfbde6","Type":"ContainerDied","Data":"9d4640df4df0902f12ed980e9a16d4204f01a0b2fc31c15a35950122cffd7650"}
Apr 02 14:02:12 crc kubenswrapper[4732]: I0402 14:02:12.288192 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Apr 02 14:02:12 crc kubenswrapper[4732]: I0402 14:02:12.333829 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ft5wj\" (UniqueName: \"kubernetes.io/projected/1248b5d3-77aa-4d5b-9311-3481f2cfbde6-kube-api-access-ft5wj\") pod \"1248b5d3-77aa-4d5b-9311-3481f2cfbde6\" (UID: \"1248b5d3-77aa-4d5b-9311-3481f2cfbde6\") "
Apr 02 14:02:12 crc kubenswrapper[4732]: I0402 14:02:12.333895 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1248b5d3-77aa-4d5b-9311-3481f2cfbde6-config-data\") pod \"1248b5d3-77aa-4d5b-9311-3481f2cfbde6\" (UID: \"1248b5d3-77aa-4d5b-9311-3481f2cfbde6\") "
Apr 02 14:02:12 crc kubenswrapper[4732]: I0402 14:02:12.333934 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1248b5d3-77aa-4d5b-9311-3481f2cfbde6-combined-ca-bundle\") pod \"1248b5d3-77aa-4d5b-9311-3481f2cfbde6\" (UID: \"1248b5d3-77aa-4d5b-9311-3481f2cfbde6\") "
Apr 02 14:02:12 crc kubenswrapper[4732]: I0402 14:02:12.334024 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1248b5d3-77aa-4d5b-9311-3481f2cfbde6-log-httpd\") pod \"1248b5d3-77aa-4d5b-9311-3481f2cfbde6\" (UID: \"1248b5d3-77aa-4d5b-9311-3481f2cfbde6\") "
Apr 02 14:02:12 crc kubenswrapper[4732]: I0402 14:02:12.334089 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1248b5d3-77aa-4d5b-9311-3481f2cfbde6-run-httpd\") pod \"1248b5d3-77aa-4d5b-9311-3481f2cfbde6\" (UID: \"1248b5d3-77aa-4d5b-9311-3481f2cfbde6\") "
Apr 02 14:02:12 crc kubenswrapper[4732]: I0402 14:02:12.334128 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1248b5d3-77aa-4d5b-9311-3481f2cfbde6-sg-core-conf-yaml\") pod \"1248b5d3-77aa-4d5b-9311-3481f2cfbde6\" (UID: \"1248b5d3-77aa-4d5b-9311-3481f2cfbde6\") "
Apr 02 14:02:12 crc kubenswrapper[4732]: I0402 14:02:12.334150 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1248b5d3-77aa-4d5b-9311-3481f2cfbde6-scripts\") pod \"1248b5d3-77aa-4d5b-9311-3481f2cfbde6\" (UID: \"1248b5d3-77aa-4d5b-9311-3481f2cfbde6\") "
Apr 02 14:02:12 crc kubenswrapper[4732]: I0402 14:02:12.334796 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1248b5d3-77aa-4d5b-9311-3481f2cfbde6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1248b5d3-77aa-4d5b-9311-3481f2cfbde6" (UID: "1248b5d3-77aa-4d5b-9311-3481f2cfbde6"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 02 14:02:12 crc kubenswrapper[4732]: I0402 14:02:12.335031 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1248b5d3-77aa-4d5b-9311-3481f2cfbde6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1248b5d3-77aa-4d5b-9311-3481f2cfbde6" (UID: "1248b5d3-77aa-4d5b-9311-3481f2cfbde6"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 02 14:02:12 crc kubenswrapper[4732]: I0402 14:02:12.340174 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1248b5d3-77aa-4d5b-9311-3481f2cfbde6-kube-api-access-ft5wj" (OuterVolumeSpecName: "kube-api-access-ft5wj") pod "1248b5d3-77aa-4d5b-9311-3481f2cfbde6" (UID: "1248b5d3-77aa-4d5b-9311-3481f2cfbde6"). InnerVolumeSpecName "kube-api-access-ft5wj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 14:02:12 crc kubenswrapper[4732]: I0402 14:02:12.348845 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1248b5d3-77aa-4d5b-9311-3481f2cfbde6-scripts" (OuterVolumeSpecName: "scripts") pod "1248b5d3-77aa-4d5b-9311-3481f2cfbde6" (UID: "1248b5d3-77aa-4d5b-9311-3481f2cfbde6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 14:02:12 crc kubenswrapper[4732]: I0402 14:02:12.365936 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1248b5d3-77aa-4d5b-9311-3481f2cfbde6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1248b5d3-77aa-4d5b-9311-3481f2cfbde6" (UID: "1248b5d3-77aa-4d5b-9311-3481f2cfbde6"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 14:02:12 crc kubenswrapper[4732]: I0402 14:02:12.428138 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1248b5d3-77aa-4d5b-9311-3481f2cfbde6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1248b5d3-77aa-4d5b-9311-3481f2cfbde6" (UID: "1248b5d3-77aa-4d5b-9311-3481f2cfbde6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 14:02:12 crc kubenswrapper[4732]: I0402 14:02:12.436314 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ft5wj\" (UniqueName: \"kubernetes.io/projected/1248b5d3-77aa-4d5b-9311-3481f2cfbde6-kube-api-access-ft5wj\") on node \"crc\" DevicePath \"\""
Apr 02 14:02:12 crc kubenswrapper[4732]: I0402 14:02:12.436349 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1248b5d3-77aa-4d5b-9311-3481f2cfbde6-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Apr 02 14:02:12 crc kubenswrapper[4732]: I0402 14:02:12.436362 4732 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1248b5d3-77aa-4d5b-9311-3481f2cfbde6-log-httpd\") on node \"crc\" DevicePath \"\""
Apr 02 14:02:12 crc kubenswrapper[4732]: I0402 14:02:12.436374 4732 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1248b5d3-77aa-4d5b-9311-3481f2cfbde6-run-httpd\") on node \"crc\" DevicePath \"\""
Apr 02 14:02:12 crc kubenswrapper[4732]: I0402 14:02:12.436385 4732 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1248b5d3-77aa-4d5b-9311-3481f2cfbde6-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Apr 02 14:02:12 crc kubenswrapper[4732]: I0402 14:02:12.436395 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1248b5d3-77aa-4d5b-9311-3481f2cfbde6-scripts\") on node \"crc\" DevicePath \"\""
Apr 02 14:02:12 crc kubenswrapper[4732]: I0402 14:02:12.442269 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1248b5d3-77aa-4d5b-9311-3481f2cfbde6-config-data" (OuterVolumeSpecName: "config-data") pod "1248b5d3-77aa-4d5b-9311-3481f2cfbde6" (UID: "1248b5d3-77aa-4d5b-9311-3481f2cfbde6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 14:02:12 crc kubenswrapper[4732]: I0402 14:02:12.538329 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1248b5d3-77aa-4d5b-9311-3481f2cfbde6-config-data\") on node \"crc\" DevicePath \"\""
Apr 02 14:02:12 crc kubenswrapper[4732]: I0402 14:02:12.719409 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1248b5d3-77aa-4d5b-9311-3481f2cfbde6","Type":"ContainerDied","Data":"c9f83fceae95825f61117bdfa17af70ddba5b5af0d63f3a4acbcd093a395899f"}
Apr 02 14:02:12 crc kubenswrapper[4732]: I0402 14:02:12.719470 4732 scope.go:117] "RemoveContainer" containerID="9d4640df4df0902f12ed980e9a16d4204f01a0b2fc31c15a35950122cffd7650"
Apr 02 14:02:12 crc kubenswrapper[4732]: I0402 14:02:12.719484 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Apr 02 14:02:12 crc kubenswrapper[4732]: I0402 14:02:12.747694 4732 scope.go:117] "RemoveContainer" containerID="41f88a7b9959ef3cf4dedfd030cc84dd85e78f7fd2632b538abdc89d1edd11c6"
Apr 02 14:02:12 crc kubenswrapper[4732]: I0402 14:02:12.749015 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Apr 02 14:02:12 crc kubenswrapper[4732]: I0402 14:02:12.758655 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Apr 02 14:02:12 crc kubenswrapper[4732]: I0402 14:02:12.769815 4732 scope.go:117] "RemoveContainer" containerID="5e432d61851798b7733da9aeef480c9eae81af07a601737682affb42dd9ce147"
Apr 02 14:02:12 crc kubenswrapper[4732]: I0402 14:02:12.791094 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Apr 02 14:02:12 crc kubenswrapper[4732]: E0402 14:02:12.791641 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1248b5d3-77aa-4d5b-9311-3481f2cfbde6" containerName="sg-core"
Apr 02 14:02:12 crc kubenswrapper[4732]: I0402 14:02:12.791673 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="1248b5d3-77aa-4d5b-9311-3481f2cfbde6" containerName="sg-core"
Apr 02 14:02:12 crc kubenswrapper[4732]: E0402 14:02:12.791691 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1248b5d3-77aa-4d5b-9311-3481f2cfbde6" containerName="proxy-httpd"
Apr 02 14:02:12 crc kubenswrapper[4732]: I0402 14:02:12.791699 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="1248b5d3-77aa-4d5b-9311-3481f2cfbde6" containerName="proxy-httpd"
Apr 02 14:02:12 crc kubenswrapper[4732]: E0402 14:02:12.791738 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1248b5d3-77aa-4d5b-9311-3481f2cfbde6" containerName="ceilometer-notification-agent"
Apr 02 14:02:12 crc kubenswrapper[4732]: I0402 14:02:12.791747 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="1248b5d3-77aa-4d5b-9311-3481f2cfbde6" containerName="ceilometer-notification-agent"
Apr 02 14:02:12 crc kubenswrapper[4732]: E0402 14:02:12.791762 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6f8ac3c-1daf-481a-bbb8-2ecbeb7bc44f" containerName="oc"
Apr 02 14:02:12 crc kubenswrapper[4732]: I0402 14:02:12.791770 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6f8ac3c-1daf-481a-bbb8-2ecbeb7bc44f" containerName="oc"
Apr 02 14:02:12 crc kubenswrapper[4732]: E0402 14:02:12.791785 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1248b5d3-77aa-4d5b-9311-3481f2cfbde6" containerName="ceilometer-central-agent"
Apr 02 14:02:12 crc kubenswrapper[4732]: I0402 14:02:12.791796 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="1248b5d3-77aa-4d5b-9311-3481f2cfbde6" containerName="ceilometer-central-agent"
Apr 02 14:02:12 crc kubenswrapper[4732]: I0402 14:02:12.792052 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="1248b5d3-77aa-4d5b-9311-3481f2cfbde6" containerName="proxy-httpd"
Apr 02 14:02:12 crc kubenswrapper[4732]: I0402 14:02:12.792082 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="1248b5d3-77aa-4d5b-9311-3481f2cfbde6" containerName="ceilometer-notification-agent"
Apr 02 14:02:12 crc kubenswrapper[4732]: I0402 14:02:12.792102 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6f8ac3c-1daf-481a-bbb8-2ecbeb7bc44f" containerName="oc"
Apr 02 14:02:12 crc kubenswrapper[4732]: I0402 14:02:12.792118 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="1248b5d3-77aa-4d5b-9311-3481f2cfbde6" containerName="sg-core"
Apr 02 14:02:12 crc kubenswrapper[4732]: I0402 14:02:12.792129 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="1248b5d3-77aa-4d5b-9311-3481f2cfbde6" containerName="ceilometer-central-agent"
Apr 02 14:02:12 crc kubenswrapper[4732]: I0402 14:02:12.794473 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Apr 02 14:02:12 crc kubenswrapper[4732]: I0402 14:02:12.800784 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Apr 02 14:02:12 crc kubenswrapper[4732]: I0402 14:02:12.801466 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Apr 02 14:02:12 crc kubenswrapper[4732]: I0402 14:02:12.809077 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Apr 02 14:02:12 crc kubenswrapper[4732]: I0402 14:02:12.809659 4732 scope.go:117] "RemoveContainer" containerID="ddbda13604c8ef351162bf5a9a09766f98206d7f6c7634213de5b12cf4b2fc76"
Apr 02 14:02:12 crc kubenswrapper[4732]: I0402 14:02:12.845570 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fca45d1b-97a6-44fe-a519-29ed5ab029da-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fca45d1b-97a6-44fe-a519-29ed5ab029da\") " pod="openstack/ceilometer-0"
Apr 02 14:02:12 crc kubenswrapper[4732]: I0402 14:02:12.845641 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fca45d1b-97a6-44fe-a519-29ed5ab029da-run-httpd\") pod \"ceilometer-0\" (UID: \"fca45d1b-97a6-44fe-a519-29ed5ab029da\") " pod="openstack/ceilometer-0"
Apr 02 14:02:12 crc kubenswrapper[4732]: I0402 14:02:12.845676 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fca45d1b-97a6-44fe-a519-29ed5ab029da-scripts\") pod \"ceilometer-0\" (UID: \"fca45d1b-97a6-44fe-a519-29ed5ab029da\") " pod="openstack/ceilometer-0"
Apr 02 14:02:12 crc kubenswrapper[4732]: I0402 14:02:12.845698 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhhps\" (UniqueName: \"kubernetes.io/projected/fca45d1b-97a6-44fe-a519-29ed5ab029da-kube-api-access-nhhps\") pod \"ceilometer-0\" (UID: \"fca45d1b-97a6-44fe-a519-29ed5ab029da\") " pod="openstack/ceilometer-0"
Apr 02 14:02:12 crc kubenswrapper[4732]: I0402 14:02:12.845724 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fca45d1b-97a6-44fe-a519-29ed5ab029da-config-data\") pod \"ceilometer-0\" (UID: \"fca45d1b-97a6-44fe-a519-29ed5ab029da\") " pod="openstack/ceilometer-0"
Apr 02 14:02:12 crc kubenswrapper[4732]: I0402 14:02:12.845784 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fca45d1b-97a6-44fe-a519-29ed5ab029da-log-httpd\") pod \"ceilometer-0\" (UID: \"fca45d1b-97a6-44fe-a519-29ed5ab029da\") " pod="openstack/ceilometer-0"
Apr 02 14:02:12 crc kubenswrapper[4732]: I0402 14:02:12.845824 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fca45d1b-97a6-44fe-a519-29ed5ab029da-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fca45d1b-97a6-44fe-a519-29ed5ab029da\") " pod="openstack/ceilometer-0"
Apr 02 14:02:12 crc kubenswrapper[4732]: I0402 14:02:12.947885 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fca45d1b-97a6-44fe-a519-29ed5ab029da-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fca45d1b-97a6-44fe-a519-29ed5ab029da\") " pod="openstack/ceilometer-0"
Apr 02 14:02:12 crc kubenswrapper[4732]: I0402 14:02:12.947957 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fca45d1b-97a6-44fe-a519-29ed5ab029da-run-httpd\") pod \"ceilometer-0\" (UID: \"fca45d1b-97a6-44fe-a519-29ed5ab029da\") " pod="openstack/ceilometer-0"
Apr 02 14:02:12 crc kubenswrapper[4732]: I0402 14:02:12.947986 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fca45d1b-97a6-44fe-a519-29ed5ab029da-scripts\") pod \"ceilometer-0\" (UID: \"fca45d1b-97a6-44fe-a519-29ed5ab029da\") " pod="openstack/ceilometer-0"
Apr 02 14:02:12 crc kubenswrapper[4732]: I0402 14:02:12.948005 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhhps\" (UniqueName: \"kubernetes.io/projected/fca45d1b-97a6-44fe-a519-29ed5ab029da-kube-api-access-nhhps\") pod \"ceilometer-0\" (UID: \"fca45d1b-97a6-44fe-a519-29ed5ab029da\") " pod="openstack/ceilometer-0"
Apr 02 14:02:12 crc kubenswrapper[4732]: I0402 14:02:12.948024 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fca45d1b-97a6-44fe-a519-29ed5ab029da-config-data\") pod \"ceilometer-0\" (UID: \"fca45d1b-97a6-44fe-a519-29ed5ab029da\") " pod="openstack/ceilometer-0"
Apr 02 14:02:12 crc kubenswrapper[4732]: I0402 14:02:12.948064 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fca45d1b-97a6-44fe-a519-29ed5ab029da-log-httpd\") pod \"ceilometer-0\" (UID: \"fca45d1b-97a6-44fe-a519-29ed5ab029da\") " pod="openstack/ceilometer-0"
Apr 02 14:02:12 crc kubenswrapper[4732]: I0402 14:02:12.948087 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fca45d1b-97a6-44fe-a519-29ed5ab029da-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fca45d1b-97a6-44fe-a519-29ed5ab029da\") " pod="openstack/ceilometer-0"
Apr 02 14:02:12 crc kubenswrapper[4732]: I0402 14:02:12.949075 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fca45d1b-97a6-44fe-a519-29ed5ab029da-run-httpd\") pod \"ceilometer-0\" (UID: \"fca45d1b-97a6-44fe-a519-29ed5ab029da\") " pod="openstack/ceilometer-0"
Apr 02 14:02:12 crc kubenswrapper[4732]: I0402 14:02:12.949389 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fca45d1b-97a6-44fe-a519-29ed5ab029da-log-httpd\") pod \"ceilometer-0\" (UID: \"fca45d1b-97a6-44fe-a519-29ed5ab029da\") " pod="openstack/ceilometer-0"
Apr 02 14:02:12 crc kubenswrapper[4732]: I0402 14:02:12.952182 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fca45d1b-97a6-44fe-a519-29ed5ab029da-scripts\") pod \"ceilometer-0\" (UID: \"fca45d1b-97a6-44fe-a519-29ed5ab029da\") " pod="openstack/ceilometer-0"
Apr 02 14:02:12 crc kubenswrapper[4732]: I0402 14:02:12.958347 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fca45d1b-97a6-44fe-a519-29ed5ab029da-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fca45d1b-97a6-44fe-a519-29ed5ab029da\") " pod="openstack/ceilometer-0"
Apr 02 14:02:12 crc kubenswrapper[4732]: I0402 14:02:12.958702 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fca45d1b-97a6-44fe-a519-29ed5ab029da-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fca45d1b-97a6-44fe-a519-29ed5ab029da\") " pod="openstack/ceilometer-0"
Apr 02 14:02:12 crc kubenswrapper[4732]: I0402 14:02:12.962360 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fca45d1b-97a6-44fe-a519-29ed5ab029da-config-data\") pod \"ceilometer-0\" (UID: \"fca45d1b-97a6-44fe-a519-29ed5ab029da\") " pod="openstack/ceilometer-0"
Apr 02 14:02:12 crc kubenswrapper[4732]: E0402 14:02:12.964937 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="813df6979af898c53fe62751d376e6fa9acbf0c42d87e8d987333c16cb709bc3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Apr 02 14:02:12 crc kubenswrapper[4732]: I0402 14:02:12.966258 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhhps\" (UniqueName: \"kubernetes.io/projected/fca45d1b-97a6-44fe-a519-29ed5ab029da-kube-api-access-nhhps\") pod \"ceilometer-0\" (UID: \"fca45d1b-97a6-44fe-a519-29ed5ab029da\") " pod="openstack/ceilometer-0"
Apr 02 14:02:12 crc kubenswrapper[4732]: E0402 14:02:12.966578 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="813df6979af898c53fe62751d376e6fa9acbf0c42d87e8d987333c16cb709bc3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Apr 02 14:02:12 crc kubenswrapper[4732]: E0402 14:02:12.968224 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="813df6979af898c53fe62751d376e6fa9acbf0c42d87e8d987333c16cb709bc3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Apr 02 14:02:12 crc kubenswrapper[4732]: E0402 14:02:12.968270 4732 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="4f075437-481e-4e75-9a85-081d0a0ae448" containerName="nova-cell0-conductor-conductor"
Apr 02 14:02:13 crc kubenswrapper[4732]: I0402 14:02:13.125206 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Apr 02 14:02:14 crc kubenswrapper[4732]: I0402 14:02:14.106713 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Apr 02 14:02:14 crc kubenswrapper[4732]: W0402 14:02:14.107685 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfca45d1b_97a6_44fe_a519_29ed5ab029da.slice/crio-6f7087a937493a10bdd9322df7d42a5eece85137858f4b0393017c50cccef1f8 WatchSource:0}: Error finding container 6f7087a937493a10bdd9322df7d42a5eece85137858f4b0393017c50cccef1f8: Status 404 returned error can't find the container with id 6f7087a937493a10bdd9322df7d42a5eece85137858f4b0393017c50cccef1f8
Apr 02 14:02:14 crc kubenswrapper[4732]: I0402 14:02:14.110102 4732 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 02 14:02:14 crc kubenswrapper[4732]: I0402 14:02:14.691797 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1248b5d3-77aa-4d5b-9311-3481f2cfbde6" path="/var/lib/kubelet/pods/1248b5d3-77aa-4d5b-9311-3481f2cfbde6/volumes"
Apr 02 14:02:14 crc kubenswrapper[4732]: I0402 14:02:14.742067 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fca45d1b-97a6-44fe-a519-29ed5ab029da","Type":"ContainerStarted","Data":"6f7087a937493a10bdd9322df7d42a5eece85137858f4b0393017c50cccef1f8"}
Apr 02 14:02:15 crc kubenswrapper[4732]: I0402 14:02:15.753739 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fca45d1b-97a6-44fe-a519-29ed5ab029da","Type":"ContainerStarted","Data":"86b15c0a6e6bd7171b13d682658a8d0df48f366861a13932771f854240085848"}
Apr 02 14:02:15 crc kubenswrapper[4732]: I0402 14:02:15.754240 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fca45d1b-97a6-44fe-a519-29ed5ab029da","Type":"ContainerStarted","Data":"7aff1bfaa21127765fbf36fc46affb20b40da03b4c2fc027f0287fff9b4e86a0"}
Apr 02 14:02:16 crc kubenswrapper[4732]: I0402 14:02:16.763531 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fca45d1b-97a6-44fe-a519-29ed5ab029da","Type":"ContainerStarted","Data":"5efa4fba6157a758990fc184bbe56eb1978c503a2bea7de2970f2c392a51e001"}
Apr 02 14:02:17 crc kubenswrapper[4732]: E0402 14:02:17.961416 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="813df6979af898c53fe62751d376e6fa9acbf0c42d87e8d987333c16cb709bc3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Apr 02 14:02:17 crc kubenswrapper[4732]: E0402 14:02:17.962829 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="813df6979af898c53fe62751d376e6fa9acbf0c42d87e8d987333c16cb709bc3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Apr 02 14:02:17 crc kubenswrapper[4732]: E0402 14:02:17.964040 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="813df6979af898c53fe62751d376e6fa9acbf0c42d87e8d987333c16cb709bc3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Apr 02 14:02:17 crc kubenswrapper[4732]: E0402 14:02:17.964111 4732 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="4f075437-481e-4e75-9a85-081d0a0ae448" containerName="nova-cell0-conductor-conductor"
Apr 02 14:02:18 crc kubenswrapper[4732]: I0402 14:02:18.802723 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fca45d1b-97a6-44fe-a519-29ed5ab029da","Type":"ContainerStarted","Data":"344ad458c644e8d4d3dba5022a0d33beae96bde7071f592d8c4991568275cf8b"}
Apr 02 14:02:18 crc kubenswrapper[4732]: I0402 14:02:18.803074 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Apr 02 14:02:18 crc kubenswrapper[4732]: I0402 14:02:18.826813 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.057756505 podStartE2EDuration="6.826798076s" podCreationTimestamp="2026-04-02 14:02:12 +0000 UTC" firstStartedPulling="2026-04-02 14:02:14.109849422 +0000 UTC m=+1491.014256975" lastFinishedPulling="2026-04-02 14:02:17.878890993 +0000 UTC m=+1494.783298546" observedRunningTime="2026-04-02 14:02:18.825801888 +0000 UTC m=+1495.730209451" watchObservedRunningTime="2026-04-02 14:02:18.826798076 +0000 UTC m=+1495.731205629"
Apr 02 14:02:22 crc kubenswrapper[4732]: E0402 14:02:22.960340 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="813df6979af898c53fe62751d376e6fa9acbf0c42d87e8d987333c16cb709bc3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Apr 02 14:02:22 crc kubenswrapper[4732]: E0402 14:02:22.962107 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="813df6979af898c53fe62751d376e6fa9acbf0c42d87e8d987333c16cb709bc3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Apr 02 14:02:22 crc kubenswrapper[4732]: E0402 14:02:22.963334 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="813df6979af898c53fe62751d376e6fa9acbf0c42d87e8d987333c16cb709bc3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Apr 02 14:02:22 crc kubenswrapper[4732]: E0402 14:02:22.963379 4732 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="4f075437-481e-4e75-9a85-081d0a0ae448" containerName="nova-cell0-conductor-conductor"
Apr 02 14:02:27 crc kubenswrapper[4732]: E0402 14:02:27.961605 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="813df6979af898c53fe62751d376e6fa9acbf0c42d87e8d987333c16cb709bc3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Apr 02 14:02:27 crc kubenswrapper[4732]: E0402 14:02:27.964943 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec
PID: container is stopping, stdout: , stderr: , exit code -1" containerID="813df6979af898c53fe62751d376e6fa9acbf0c42d87e8d987333c16cb709bc3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Apr 02 14:02:27 crc kubenswrapper[4732]: E0402 14:02:27.966576 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="813df6979af898c53fe62751d376e6fa9acbf0c42d87e8d987333c16cb709bc3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Apr 02 14:02:27 crc kubenswrapper[4732]: E0402 14:02:27.966728 4732 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="4f075437-481e-4e75-9a85-081d0a0ae448" containerName="nova-cell0-conductor-conductor" Apr 02 14:02:30 crc kubenswrapper[4732]: I0402 14:02:30.955632 4732 generic.go:334] "Generic (PLEG): container finished" podID="4f075437-481e-4e75-9a85-081d0a0ae448" containerID="813df6979af898c53fe62751d376e6fa9acbf0c42d87e8d987333c16cb709bc3" exitCode=137 Apr 02 14:02:30 crc kubenswrapper[4732]: I0402 14:02:30.955793 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"4f075437-481e-4e75-9a85-081d0a0ae448","Type":"ContainerDied","Data":"813df6979af898c53fe62751d376e6fa9acbf0c42d87e8d987333c16cb709bc3"} Apr 02 14:02:30 crc kubenswrapper[4732]: I0402 14:02:30.956646 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"4f075437-481e-4e75-9a85-081d0a0ae448","Type":"ContainerDied","Data":"687fddf59c927f359774951167846cc29afb6351602f4d0ecbbc587eb29e1488"} Apr 02 14:02:30 crc kubenswrapper[4732]: I0402 14:02:30.956664 4732 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="687fddf59c927f359774951167846cc29afb6351602f4d0ecbbc587eb29e1488" Apr 02 14:02:30 crc kubenswrapper[4732]: I0402 14:02:30.980194 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Apr 02 14:02:30 crc kubenswrapper[4732]: I0402 14:02:30.994369 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f075437-481e-4e75-9a85-081d0a0ae448-config-data\") pod \"4f075437-481e-4e75-9a85-081d0a0ae448\" (UID: \"4f075437-481e-4e75-9a85-081d0a0ae448\") " Apr 02 14:02:30 crc kubenswrapper[4732]: I0402 14:02:30.995194 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knrkw\" (UniqueName: \"kubernetes.io/projected/4f075437-481e-4e75-9a85-081d0a0ae448-kube-api-access-knrkw\") pod \"4f075437-481e-4e75-9a85-081d0a0ae448\" (UID: \"4f075437-481e-4e75-9a85-081d0a0ae448\") " Apr 02 14:02:30 crc kubenswrapper[4732]: I0402 14:02:30.995331 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f075437-481e-4e75-9a85-081d0a0ae448-combined-ca-bundle\") pod \"4f075437-481e-4e75-9a85-081d0a0ae448\" (UID: \"4f075437-481e-4e75-9a85-081d0a0ae448\") " Apr 02 14:02:31 crc kubenswrapper[4732]: I0402 14:02:31.003548 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f075437-481e-4e75-9a85-081d0a0ae448-kube-api-access-knrkw" (OuterVolumeSpecName: "kube-api-access-knrkw") pod "4f075437-481e-4e75-9a85-081d0a0ae448" (UID: "4f075437-481e-4e75-9a85-081d0a0ae448"). InnerVolumeSpecName "kube-api-access-knrkw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:02:31 crc kubenswrapper[4732]: I0402 14:02:31.032224 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f075437-481e-4e75-9a85-081d0a0ae448-config-data" (OuterVolumeSpecName: "config-data") pod "4f075437-481e-4e75-9a85-081d0a0ae448" (UID: "4f075437-481e-4e75-9a85-081d0a0ae448"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:02:31 crc kubenswrapper[4732]: I0402 14:02:31.043123 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f075437-481e-4e75-9a85-081d0a0ae448-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4f075437-481e-4e75-9a85-081d0a0ae448" (UID: "4f075437-481e-4e75-9a85-081d0a0ae448"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:02:31 crc kubenswrapper[4732]: I0402 14:02:31.103783 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f075437-481e-4e75-9a85-081d0a0ae448-config-data\") on node \"crc\" DevicePath \"\"" Apr 02 14:02:31 crc kubenswrapper[4732]: I0402 14:02:31.103816 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-knrkw\" (UniqueName: \"kubernetes.io/projected/4f075437-481e-4e75-9a85-081d0a0ae448-kube-api-access-knrkw\") on node \"crc\" DevicePath \"\"" Apr 02 14:02:31 crc kubenswrapper[4732]: I0402 14:02:31.103826 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f075437-481e-4e75-9a85-081d0a0ae448-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 02 14:02:31 crc kubenswrapper[4732]: I0402 14:02:31.965886 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Apr 02 14:02:31 crc kubenswrapper[4732]: I0402 14:02:31.996274 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Apr 02 14:02:32 crc kubenswrapper[4732]: I0402 14:02:32.006127 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Apr 02 14:02:32 crc kubenswrapper[4732]: I0402 14:02:32.028257 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Apr 02 14:02:32 crc kubenswrapper[4732]: E0402 14:02:32.028922 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f075437-481e-4e75-9a85-081d0a0ae448" containerName="nova-cell0-conductor-conductor" Apr 02 14:02:32 crc kubenswrapper[4732]: I0402 14:02:32.028949 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f075437-481e-4e75-9a85-081d0a0ae448" containerName="nova-cell0-conductor-conductor" Apr 02 14:02:32 crc kubenswrapper[4732]: I0402 14:02:32.029168 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f075437-481e-4e75-9a85-081d0a0ae448" containerName="nova-cell0-conductor-conductor" Apr 02 14:02:32 crc kubenswrapper[4732]: I0402 14:02:32.029944 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Apr 02 14:02:32 crc kubenswrapper[4732]: I0402 14:02:32.032422 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-kxxgt" Apr 02 14:02:32 crc kubenswrapper[4732]: I0402 14:02:32.033240 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Apr 02 14:02:32 crc kubenswrapper[4732]: I0402 14:02:32.042868 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Apr 02 14:02:32 crc kubenswrapper[4732]: I0402 14:02:32.131008 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5f7b2b4-c3da-49e6-b873-c2937dc27bbf-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"b5f7b2b4-c3da-49e6-b873-c2937dc27bbf\") " pod="openstack/nova-cell0-conductor-0" Apr 02 14:02:32 crc kubenswrapper[4732]: I0402 14:02:32.131075 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvcps\" (UniqueName: \"kubernetes.io/projected/b5f7b2b4-c3da-49e6-b873-c2937dc27bbf-kube-api-access-wvcps\") pod \"nova-cell0-conductor-0\" (UID: \"b5f7b2b4-c3da-49e6-b873-c2937dc27bbf\") " pod="openstack/nova-cell0-conductor-0" Apr 02 14:02:32 crc kubenswrapper[4732]: I0402 14:02:32.131153 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5f7b2b4-c3da-49e6-b873-c2937dc27bbf-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"b5f7b2b4-c3da-49e6-b873-c2937dc27bbf\") " pod="openstack/nova-cell0-conductor-0" Apr 02 14:02:32 crc kubenswrapper[4732]: I0402 14:02:32.233324 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b5f7b2b4-c3da-49e6-b873-c2937dc27bbf-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"b5f7b2b4-c3da-49e6-b873-c2937dc27bbf\") " pod="openstack/nova-cell0-conductor-0" Apr 02 14:02:32 crc kubenswrapper[4732]: I0402 14:02:32.233401 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvcps\" (UniqueName: \"kubernetes.io/projected/b5f7b2b4-c3da-49e6-b873-c2937dc27bbf-kube-api-access-wvcps\") pod \"nova-cell0-conductor-0\" (UID: \"b5f7b2b4-c3da-49e6-b873-c2937dc27bbf\") " pod="openstack/nova-cell0-conductor-0" Apr 02 14:02:32 crc kubenswrapper[4732]: I0402 14:02:32.233476 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5f7b2b4-c3da-49e6-b873-c2937dc27bbf-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"b5f7b2b4-c3da-49e6-b873-c2937dc27bbf\") " pod="openstack/nova-cell0-conductor-0" Apr 02 14:02:32 crc kubenswrapper[4732]: I0402 14:02:32.238887 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5f7b2b4-c3da-49e6-b873-c2937dc27bbf-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"b5f7b2b4-c3da-49e6-b873-c2937dc27bbf\") " pod="openstack/nova-cell0-conductor-0" Apr 02 14:02:32 crc kubenswrapper[4732]: I0402 14:02:32.239745 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5f7b2b4-c3da-49e6-b873-c2937dc27bbf-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"b5f7b2b4-c3da-49e6-b873-c2937dc27bbf\") " pod="openstack/nova-cell0-conductor-0" Apr 02 14:02:32 crc kubenswrapper[4732]: I0402 14:02:32.261008 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvcps\" (UniqueName: \"kubernetes.io/projected/b5f7b2b4-c3da-49e6-b873-c2937dc27bbf-kube-api-access-wvcps\") pod \"nova-cell0-conductor-0\" 
(UID: \"b5f7b2b4-c3da-49e6-b873-c2937dc27bbf\") " pod="openstack/nova-cell0-conductor-0" Apr 02 14:02:32 crc kubenswrapper[4732]: I0402 14:02:32.351441 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Apr 02 14:02:32 crc kubenswrapper[4732]: I0402 14:02:32.694274 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f075437-481e-4e75-9a85-081d0a0ae448" path="/var/lib/kubelet/pods/4f075437-481e-4e75-9a85-081d0a0ae448/volumes" Apr 02 14:02:32 crc kubenswrapper[4732]: W0402 14:02:32.793222 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5f7b2b4_c3da_49e6_b873_c2937dc27bbf.slice/crio-97b64fb0f54b03714dafcf2416729e3cf17bfc8cd1dd09f01b1fd122a902a4d4 WatchSource:0}: Error finding container 97b64fb0f54b03714dafcf2416729e3cf17bfc8cd1dd09f01b1fd122a902a4d4: Status 404 returned error can't find the container with id 97b64fb0f54b03714dafcf2416729e3cf17bfc8cd1dd09f01b1fd122a902a4d4 Apr 02 14:02:32 crc kubenswrapper[4732]: I0402 14:02:32.794744 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Apr 02 14:02:32 crc kubenswrapper[4732]: I0402 14:02:32.975609 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"b5f7b2b4-c3da-49e6-b873-c2937dc27bbf","Type":"ContainerStarted","Data":"53ca53bf7b483b1e1df50110fc9c68c39bab82ab921de9ad3f98488b42fb49c2"} Apr 02 14:02:32 crc kubenswrapper[4732]: I0402 14:02:32.975674 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"b5f7b2b4-c3da-49e6-b873-c2937dc27bbf","Type":"ContainerStarted","Data":"97b64fb0f54b03714dafcf2416729e3cf17bfc8cd1dd09f01b1fd122a902a4d4"} Apr 02 14:02:32 crc kubenswrapper[4732]: I0402 14:02:32.975890 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-cell0-conductor-0" Apr 02 14:02:32 crc kubenswrapper[4732]: I0402 14:02:32.994807 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=0.994784994 podStartE2EDuration="994.784994ms" podCreationTimestamp="2026-04-02 14:02:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 14:02:32.990631831 +0000 UTC m=+1509.895039394" watchObservedRunningTime="2026-04-02 14:02:32.994784994 +0000 UTC m=+1509.899192557" Apr 02 14:02:37 crc kubenswrapper[4732]: I0402 14:02:37.376696 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Apr 02 14:02:37 crc kubenswrapper[4732]: I0402 14:02:37.820079 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-9pqhd"] Apr 02 14:02:37 crc kubenswrapper[4732]: I0402 14:02:37.821480 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-9pqhd" Apr 02 14:02:37 crc kubenswrapper[4732]: I0402 14:02:37.824098 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Apr 02 14:02:37 crc kubenswrapper[4732]: I0402 14:02:37.824539 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Apr 02 14:02:37 crc kubenswrapper[4732]: I0402 14:02:37.835145 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d730389e-11ac-4fa7-86d7-efa07afdbe08-scripts\") pod \"nova-cell0-cell-mapping-9pqhd\" (UID: \"d730389e-11ac-4fa7-86d7-efa07afdbe08\") " pod="openstack/nova-cell0-cell-mapping-9pqhd" Apr 02 14:02:37 crc kubenswrapper[4732]: I0402 14:02:37.835209 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d730389e-11ac-4fa7-86d7-efa07afdbe08-config-data\") pod \"nova-cell0-cell-mapping-9pqhd\" (UID: \"d730389e-11ac-4fa7-86d7-efa07afdbe08\") " pod="openstack/nova-cell0-cell-mapping-9pqhd" Apr 02 14:02:37 crc kubenswrapper[4732]: I0402 14:02:37.835307 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d730389e-11ac-4fa7-86d7-efa07afdbe08-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-9pqhd\" (UID: \"d730389e-11ac-4fa7-86d7-efa07afdbe08\") " pod="openstack/nova-cell0-cell-mapping-9pqhd" Apr 02 14:02:37 crc kubenswrapper[4732]: I0402 14:02:37.835333 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgpwb\" (UniqueName: \"kubernetes.io/projected/d730389e-11ac-4fa7-86d7-efa07afdbe08-kube-api-access-rgpwb\") pod \"nova-cell0-cell-mapping-9pqhd\" (UID: \"d730389e-11ac-4fa7-86d7-efa07afdbe08\") 
" pod="openstack/nova-cell0-cell-mapping-9pqhd" Apr 02 14:02:37 crc kubenswrapper[4732]: I0402 14:02:37.837803 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-9pqhd"] Apr 02 14:02:37 crc kubenswrapper[4732]: I0402 14:02:37.937492 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d730389e-11ac-4fa7-86d7-efa07afdbe08-config-data\") pod \"nova-cell0-cell-mapping-9pqhd\" (UID: \"d730389e-11ac-4fa7-86d7-efa07afdbe08\") " pod="openstack/nova-cell0-cell-mapping-9pqhd" Apr 02 14:02:37 crc kubenswrapper[4732]: I0402 14:02:37.937692 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d730389e-11ac-4fa7-86d7-efa07afdbe08-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-9pqhd\" (UID: \"d730389e-11ac-4fa7-86d7-efa07afdbe08\") " pod="openstack/nova-cell0-cell-mapping-9pqhd" Apr 02 14:02:37 crc kubenswrapper[4732]: I0402 14:02:37.937733 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgpwb\" (UniqueName: \"kubernetes.io/projected/d730389e-11ac-4fa7-86d7-efa07afdbe08-kube-api-access-rgpwb\") pod \"nova-cell0-cell-mapping-9pqhd\" (UID: \"d730389e-11ac-4fa7-86d7-efa07afdbe08\") " pod="openstack/nova-cell0-cell-mapping-9pqhd" Apr 02 14:02:37 crc kubenswrapper[4732]: I0402 14:02:37.937810 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d730389e-11ac-4fa7-86d7-efa07afdbe08-scripts\") pod \"nova-cell0-cell-mapping-9pqhd\" (UID: \"d730389e-11ac-4fa7-86d7-efa07afdbe08\") " pod="openstack/nova-cell0-cell-mapping-9pqhd" Apr 02 14:02:37 crc kubenswrapper[4732]: I0402 14:02:37.944117 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d730389e-11ac-4fa7-86d7-efa07afdbe08-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-9pqhd\" (UID: \"d730389e-11ac-4fa7-86d7-efa07afdbe08\") " pod="openstack/nova-cell0-cell-mapping-9pqhd" Apr 02 14:02:37 crc kubenswrapper[4732]: I0402 14:02:37.944674 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d730389e-11ac-4fa7-86d7-efa07afdbe08-config-data\") pod \"nova-cell0-cell-mapping-9pqhd\" (UID: \"d730389e-11ac-4fa7-86d7-efa07afdbe08\") " pod="openstack/nova-cell0-cell-mapping-9pqhd" Apr 02 14:02:37 crc kubenswrapper[4732]: I0402 14:02:37.971974 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d730389e-11ac-4fa7-86d7-efa07afdbe08-scripts\") pod \"nova-cell0-cell-mapping-9pqhd\" (UID: \"d730389e-11ac-4fa7-86d7-efa07afdbe08\") " pod="openstack/nova-cell0-cell-mapping-9pqhd" Apr 02 14:02:37 crc kubenswrapper[4732]: I0402 14:02:37.984327 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgpwb\" (UniqueName: \"kubernetes.io/projected/d730389e-11ac-4fa7-86d7-efa07afdbe08-kube-api-access-rgpwb\") pod \"nova-cell0-cell-mapping-9pqhd\" (UID: \"d730389e-11ac-4fa7-86d7-efa07afdbe08\") " pod="openstack/nova-cell0-cell-mapping-9pqhd" Apr 02 14:02:37 crc kubenswrapper[4732]: I0402 14:02:37.988682 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Apr 02 14:02:37 crc kubenswrapper[4732]: I0402 14:02:37.990651 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Apr 02 14:02:37 crc kubenswrapper[4732]: I0402 14:02:37.996162 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.008557 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.038026 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.039642 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.040034 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83dc4cc3-ce54-4685-9485-16526fec666a-config-data\") pod \"nova-api-0\" (UID: \"83dc4cc3-ce54-4685-9485-16526fec666a\") " pod="openstack/nova-api-0" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.040197 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83dc4cc3-ce54-4685-9485-16526fec666a-logs\") pod \"nova-api-0\" (UID: \"83dc4cc3-ce54-4685-9485-16526fec666a\") " pod="openstack/nova-api-0" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.040270 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83dc4cc3-ce54-4685-9485-16526fec666a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"83dc4cc3-ce54-4685-9485-16526fec666a\") " pod="openstack/nova-api-0" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.040345 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdvkg\" (UniqueName: 
\"kubernetes.io/projected/83dc4cc3-ce54-4685-9485-16526fec666a-kube-api-access-kdvkg\") pod \"nova-api-0\" (UID: \"83dc4cc3-ce54-4685-9485-16526fec666a\") " pod="openstack/nova-api-0" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.041479 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.051356 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.127659 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.136266 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.141660 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.142861 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83dc4cc3-ce54-4685-9485-16526fec666a-logs\") pod \"nova-api-0\" (UID: \"83dc4cc3-ce54-4685-9485-16526fec666a\") " pod="openstack/nova-api-0" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.142923 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83dc4cc3-ce54-4685-9485-16526fec666a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"83dc4cc3-ce54-4685-9485-16526fec666a\") " pod="openstack/nova-api-0" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.143025 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdvkg\" (UniqueName: \"kubernetes.io/projected/83dc4cc3-ce54-4685-9485-16526fec666a-kube-api-access-kdvkg\") pod \"nova-api-0\" (UID: 
\"83dc4cc3-ce54-4685-9485-16526fec666a\") " pod="openstack/nova-api-0" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.143051 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83dc4cc3-ce54-4685-9485-16526fec666a-config-data\") pod \"nova-api-0\" (UID: \"83dc4cc3-ce54-4685-9485-16526fec666a\") " pod="openstack/nova-api-0" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.144101 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83dc4cc3-ce54-4685-9485-16526fec666a-logs\") pod \"nova-api-0\" (UID: \"83dc4cc3-ce54-4685-9485-16526fec666a\") " pod="openstack/nova-api-0" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.150371 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-9pqhd" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.162727 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83dc4cc3-ce54-4685-9485-16526fec666a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"83dc4cc3-ce54-4685-9485-16526fec666a\") " pod="openstack/nova-api-0" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.185999 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83dc4cc3-ce54-4685-9485-16526fec666a-config-data\") pod \"nova-api-0\" (UID: \"83dc4cc3-ce54-4685-9485-16526fec666a\") " pod="openstack/nova-api-0" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.190960 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdvkg\" (UniqueName: \"kubernetes.io/projected/83dc4cc3-ce54-4685-9485-16526fec666a-kube-api-access-kdvkg\") pod \"nova-api-0\" (UID: \"83dc4cc3-ce54-4685-9485-16526fec666a\") " pod="openstack/nova-api-0" Apr 02 14:02:38 crc 
kubenswrapper[4732]: I0402 14:02:38.224671 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.238742 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.240261 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.246124 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.248906 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4khv2\" (UniqueName: \"kubernetes.io/projected/6ef800e5-24ee-4bc3-9248-d49d0c4d5abf-kube-api-access-4khv2\") pod \"nova-cell1-novncproxy-0\" (UID: \"6ef800e5-24ee-4bc3-9248-d49d0c4d5abf\") " pod="openstack/nova-cell1-novncproxy-0" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.248963 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvzws\" (UniqueName: \"kubernetes.io/projected/3c4a2d1b-e579-4e87-ba09-500ce4425847-kube-api-access-nvzws\") pod \"nova-metadata-0\" (UID: \"3c4a2d1b-e579-4e87-ba09-500ce4425847\") " pod="openstack/nova-metadata-0" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.249017 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47153a66-ef48-4bb2-bd19-2a5d5d506d0d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"47153a66-ef48-4bb2-bd19-2a5d5d506d0d\") " pod="openstack/nova-scheduler-0" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.249048 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/3c4a2d1b-e579-4e87-ba09-500ce4425847-config-data\") pod \"nova-metadata-0\" (UID: \"3c4a2d1b-e579-4e87-ba09-500ce4425847\") " pod="openstack/nova-metadata-0" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.249076 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c4a2d1b-e579-4e87-ba09-500ce4425847-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3c4a2d1b-e579-4e87-ba09-500ce4425847\") " pod="openstack/nova-metadata-0" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.249124 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss849\" (UniqueName: \"kubernetes.io/projected/47153a66-ef48-4bb2-bd19-2a5d5d506d0d-kube-api-access-ss849\") pod \"nova-scheduler-0\" (UID: \"47153a66-ef48-4bb2-bd19-2a5d5d506d0d\") " pod="openstack/nova-scheduler-0" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.249216 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c4a2d1b-e579-4e87-ba09-500ce4425847-logs\") pod \"nova-metadata-0\" (UID: \"3c4a2d1b-e579-4e87-ba09-500ce4425847\") " pod="openstack/nova-metadata-0" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.249269 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47153a66-ef48-4bb2-bd19-2a5d5d506d0d-config-data\") pod \"nova-scheduler-0\" (UID: \"47153a66-ef48-4bb2-bd19-2a5d5d506d0d\") " pod="openstack/nova-scheduler-0" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.249380 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ef800e5-24ee-4bc3-9248-d49d0c4d5abf-config-data\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"6ef800e5-24ee-4bc3-9248-d49d0c4d5abf\") " pod="openstack/nova-cell1-novncproxy-0" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.249412 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ef800e5-24ee-4bc3-9248-d49d0c4d5abf-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6ef800e5-24ee-4bc3-9248-d49d0c4d5abf\") " pod="openstack/nova-cell1-novncproxy-0" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.316682 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.350133 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ef800e5-24ee-4bc3-9248-d49d0c4d5abf-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6ef800e5-24ee-4bc3-9248-d49d0c4d5abf\") " pod="openstack/nova-cell1-novncproxy-0" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.350178 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ef800e5-24ee-4bc3-9248-d49d0c4d5abf-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6ef800e5-24ee-4bc3-9248-d49d0c4d5abf\") " pod="openstack/nova-cell1-novncproxy-0" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.350226 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4khv2\" (UniqueName: \"kubernetes.io/projected/6ef800e5-24ee-4bc3-9248-d49d0c4d5abf-kube-api-access-4khv2\") pod \"nova-cell1-novncproxy-0\" (UID: \"6ef800e5-24ee-4bc3-9248-d49d0c4d5abf\") " pod="openstack/nova-cell1-novncproxy-0" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.350256 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvzws\" 
(UniqueName: \"kubernetes.io/projected/3c4a2d1b-e579-4e87-ba09-500ce4425847-kube-api-access-nvzws\") pod \"nova-metadata-0\" (UID: \"3c4a2d1b-e579-4e87-ba09-500ce4425847\") " pod="openstack/nova-metadata-0" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.350286 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47153a66-ef48-4bb2-bd19-2a5d5d506d0d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"47153a66-ef48-4bb2-bd19-2a5d5d506d0d\") " pod="openstack/nova-scheduler-0" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.350311 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c4a2d1b-e579-4e87-ba09-500ce4425847-config-data\") pod \"nova-metadata-0\" (UID: \"3c4a2d1b-e579-4e87-ba09-500ce4425847\") " pod="openstack/nova-metadata-0" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.350338 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c4a2d1b-e579-4e87-ba09-500ce4425847-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3c4a2d1b-e579-4e87-ba09-500ce4425847\") " pod="openstack/nova-metadata-0" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.350382 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ss849\" (UniqueName: \"kubernetes.io/projected/47153a66-ef48-4bb2-bd19-2a5d5d506d0d-kube-api-access-ss849\") pod \"nova-scheduler-0\" (UID: \"47153a66-ef48-4bb2-bd19-2a5d5d506d0d\") " pod="openstack/nova-scheduler-0" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.350426 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c4a2d1b-e579-4e87-ba09-500ce4425847-logs\") pod \"nova-metadata-0\" (UID: \"3c4a2d1b-e579-4e87-ba09-500ce4425847\") " 
pod="openstack/nova-metadata-0" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.350448 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47153a66-ef48-4bb2-bd19-2a5d5d506d0d-config-data\") pod \"nova-scheduler-0\" (UID: \"47153a66-ef48-4bb2-bd19-2a5d5d506d0d\") " pod="openstack/nova-scheduler-0" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.353918 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47153a66-ef48-4bb2-bd19-2a5d5d506d0d-config-data\") pod \"nova-scheduler-0\" (UID: \"47153a66-ef48-4bb2-bd19-2a5d5d506d0d\") " pod="openstack/nova-scheduler-0" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.355899 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ef800e5-24ee-4bc3-9248-d49d0c4d5abf-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6ef800e5-24ee-4bc3-9248-d49d0c4d5abf\") " pod="openstack/nova-cell1-novncproxy-0" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.356743 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c4a2d1b-e579-4e87-ba09-500ce4425847-logs\") pod \"nova-metadata-0\" (UID: \"3c4a2d1b-e579-4e87-ba09-500ce4425847\") " pod="openstack/nova-metadata-0" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.366250 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c4a2d1b-e579-4e87-ba09-500ce4425847-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3c4a2d1b-e579-4e87-ba09-500ce4425847\") " pod="openstack/nova-metadata-0" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.366419 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-tjr4k"] Apr 02 14:02:38 crc kubenswrapper[4732]: 
I0402 14:02:38.368141 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47153a66-ef48-4bb2-bd19-2a5d5d506d0d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"47153a66-ef48-4bb2-bd19-2a5d5d506d0d\") " pod="openstack/nova-scheduler-0" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.368872 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-tjr4k" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.369248 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ef800e5-24ee-4bc3-9248-d49d0c4d5abf-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6ef800e5-24ee-4bc3-9248-d49d0c4d5abf\") " pod="openstack/nova-cell1-novncproxy-0" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.378781 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4khv2\" (UniqueName: \"kubernetes.io/projected/6ef800e5-24ee-4bc3-9248-d49d0c4d5abf-kube-api-access-4khv2\") pod \"nova-cell1-novncproxy-0\" (UID: \"6ef800e5-24ee-4bc3-9248-d49d0c4d5abf\") " pod="openstack/nova-cell1-novncproxy-0" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.388227 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss849\" (UniqueName: \"kubernetes.io/projected/47153a66-ef48-4bb2-bd19-2a5d5d506d0d-kube-api-access-ss849\") pod \"nova-scheduler-0\" (UID: \"47153a66-ef48-4bb2-bd19-2a5d5d506d0d\") " pod="openstack/nova-scheduler-0" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.388281 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvzws\" (UniqueName: \"kubernetes.io/projected/3c4a2d1b-e579-4e87-ba09-500ce4425847-kube-api-access-nvzws\") pod \"nova-metadata-0\" (UID: \"3c4a2d1b-e579-4e87-ba09-500ce4425847\") " 
pod="openstack/nova-metadata-0" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.388864 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c4a2d1b-e579-4e87-ba09-500ce4425847-config-data\") pod \"nova-metadata-0\" (UID: \"3c4a2d1b-e579-4e87-ba09-500ce4425847\") " pod="openstack/nova-metadata-0" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.388973 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.393214 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-tjr4k"] Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.423956 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.452368 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bdbfa898-0999-49da-9194-ff7bf15c955a-dns-svc\") pod \"dnsmasq-dns-865f5d856f-tjr4k\" (UID: \"bdbfa898-0999-49da-9194-ff7bf15c955a\") " pod="openstack/dnsmasq-dns-865f5d856f-tjr4k" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.452487 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bdbfa898-0999-49da-9194-ff7bf15c955a-ovsdbserver-sb\") pod \"dnsmasq-dns-865f5d856f-tjr4k\" (UID: \"bdbfa898-0999-49da-9194-ff7bf15c955a\") " pod="openstack/dnsmasq-dns-865f5d856f-tjr4k" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.452572 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bdbfa898-0999-49da-9194-ff7bf15c955a-ovsdbserver-nb\") pod 
\"dnsmasq-dns-865f5d856f-tjr4k\" (UID: \"bdbfa898-0999-49da-9194-ff7bf15c955a\") " pod="openstack/dnsmasq-dns-865f5d856f-tjr4k" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.452607 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8996\" (UniqueName: \"kubernetes.io/projected/bdbfa898-0999-49da-9194-ff7bf15c955a-kube-api-access-l8996\") pod \"dnsmasq-dns-865f5d856f-tjr4k\" (UID: \"bdbfa898-0999-49da-9194-ff7bf15c955a\") " pod="openstack/dnsmasq-dns-865f5d856f-tjr4k" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.452657 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdbfa898-0999-49da-9194-ff7bf15c955a-config\") pod \"dnsmasq-dns-865f5d856f-tjr4k\" (UID: \"bdbfa898-0999-49da-9194-ff7bf15c955a\") " pod="openstack/dnsmasq-dns-865f5d856f-tjr4k" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.452677 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bdbfa898-0999-49da-9194-ff7bf15c955a-dns-swift-storage-0\") pod \"dnsmasq-dns-865f5d856f-tjr4k\" (UID: \"bdbfa898-0999-49da-9194-ff7bf15c955a\") " pod="openstack/dnsmasq-dns-865f5d856f-tjr4k" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.553918 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bdbfa898-0999-49da-9194-ff7bf15c955a-ovsdbserver-sb\") pod \"dnsmasq-dns-865f5d856f-tjr4k\" (UID: \"bdbfa898-0999-49da-9194-ff7bf15c955a\") " pod="openstack/dnsmasq-dns-865f5d856f-tjr4k" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.553994 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/bdbfa898-0999-49da-9194-ff7bf15c955a-ovsdbserver-nb\") pod \"dnsmasq-dns-865f5d856f-tjr4k\" (UID: \"bdbfa898-0999-49da-9194-ff7bf15c955a\") " pod="openstack/dnsmasq-dns-865f5d856f-tjr4k" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.554021 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8996\" (UniqueName: \"kubernetes.io/projected/bdbfa898-0999-49da-9194-ff7bf15c955a-kube-api-access-l8996\") pod \"dnsmasq-dns-865f5d856f-tjr4k\" (UID: \"bdbfa898-0999-49da-9194-ff7bf15c955a\") " pod="openstack/dnsmasq-dns-865f5d856f-tjr4k" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.554051 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdbfa898-0999-49da-9194-ff7bf15c955a-config\") pod \"dnsmasq-dns-865f5d856f-tjr4k\" (UID: \"bdbfa898-0999-49da-9194-ff7bf15c955a\") " pod="openstack/dnsmasq-dns-865f5d856f-tjr4k" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.554071 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bdbfa898-0999-49da-9194-ff7bf15c955a-dns-swift-storage-0\") pod \"dnsmasq-dns-865f5d856f-tjr4k\" (UID: \"bdbfa898-0999-49da-9194-ff7bf15c955a\") " pod="openstack/dnsmasq-dns-865f5d856f-tjr4k" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.554118 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bdbfa898-0999-49da-9194-ff7bf15c955a-dns-svc\") pod \"dnsmasq-dns-865f5d856f-tjr4k\" (UID: \"bdbfa898-0999-49da-9194-ff7bf15c955a\") " pod="openstack/dnsmasq-dns-865f5d856f-tjr4k" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.555422 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdbfa898-0999-49da-9194-ff7bf15c955a-config\") pod 
\"dnsmasq-dns-865f5d856f-tjr4k\" (UID: \"bdbfa898-0999-49da-9194-ff7bf15c955a\") " pod="openstack/dnsmasq-dns-865f5d856f-tjr4k" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.555506 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bdbfa898-0999-49da-9194-ff7bf15c955a-ovsdbserver-sb\") pod \"dnsmasq-dns-865f5d856f-tjr4k\" (UID: \"bdbfa898-0999-49da-9194-ff7bf15c955a\") " pod="openstack/dnsmasq-dns-865f5d856f-tjr4k" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.555541 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bdbfa898-0999-49da-9194-ff7bf15c955a-dns-svc\") pod \"dnsmasq-dns-865f5d856f-tjr4k\" (UID: \"bdbfa898-0999-49da-9194-ff7bf15c955a\") " pod="openstack/dnsmasq-dns-865f5d856f-tjr4k" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.555590 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bdbfa898-0999-49da-9194-ff7bf15c955a-ovsdbserver-nb\") pod \"dnsmasq-dns-865f5d856f-tjr4k\" (UID: \"bdbfa898-0999-49da-9194-ff7bf15c955a\") " pod="openstack/dnsmasq-dns-865f5d856f-tjr4k" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.556059 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bdbfa898-0999-49da-9194-ff7bf15c955a-dns-swift-storage-0\") pod \"dnsmasq-dns-865f5d856f-tjr4k\" (UID: \"bdbfa898-0999-49da-9194-ff7bf15c955a\") " pod="openstack/dnsmasq-dns-865f5d856f-tjr4k" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.573361 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8996\" (UniqueName: \"kubernetes.io/projected/bdbfa898-0999-49da-9194-ff7bf15c955a-kube-api-access-l8996\") pod \"dnsmasq-dns-865f5d856f-tjr4k\" (UID: \"bdbfa898-0999-49da-9194-ff7bf15c955a\") " 
pod="openstack/dnsmasq-dns-865f5d856f-tjr4k" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.641273 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.665330 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.742062 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-tjr4k" Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.789369 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-9pqhd"] Apr 02 14:02:38 crc kubenswrapper[4732]: I0402 14:02:38.975720 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Apr 02 14:02:39 crc kubenswrapper[4732]: I0402 14:02:39.045051 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-9pqhd" event={"ID":"d730389e-11ac-4fa7-86d7-efa07afdbe08","Type":"ContainerStarted","Data":"13987ae8669abfd7ae8edbe870112408c39d97aada631737512a26e14fa02f74"} Apr 02 14:02:39 crc kubenswrapper[4732]: I0402 14:02:39.047161 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"83dc4cc3-ce54-4685-9485-16526fec666a","Type":"ContainerStarted","Data":"b22ff01a4115ec8d827d4983552cabe5598c5257ef0680386b4351fd026739e6"} Apr 02 14:02:39 crc kubenswrapper[4732]: I0402 14:02:39.099473 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bbht6"] Apr 02 14:02:39 crc kubenswrapper[4732]: I0402 14:02:39.100889 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-bbht6" Apr 02 14:02:39 crc kubenswrapper[4732]: I0402 14:02:39.102917 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Apr 02 14:02:39 crc kubenswrapper[4732]: I0402 14:02:39.105180 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Apr 02 14:02:39 crc kubenswrapper[4732]: I0402 14:02:39.120453 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bbht6"] Apr 02 14:02:39 crc kubenswrapper[4732]: I0402 14:02:39.151229 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Apr 02 14:02:39 crc kubenswrapper[4732]: I0402 14:02:39.174640 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83eb790e-b902-4dd9-bcf8-352d5675fbce-scripts\") pod \"nova-cell1-conductor-db-sync-bbht6\" (UID: \"83eb790e-b902-4dd9-bcf8-352d5675fbce\") " pod="openstack/nova-cell1-conductor-db-sync-bbht6" Apr 02 14:02:39 crc kubenswrapper[4732]: I0402 14:02:39.174745 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f5ln\" (UniqueName: \"kubernetes.io/projected/83eb790e-b902-4dd9-bcf8-352d5675fbce-kube-api-access-4f5ln\") pod \"nova-cell1-conductor-db-sync-bbht6\" (UID: \"83eb790e-b902-4dd9-bcf8-352d5675fbce\") " pod="openstack/nova-cell1-conductor-db-sync-bbht6" Apr 02 14:02:39 crc kubenswrapper[4732]: I0402 14:02:39.174783 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83eb790e-b902-4dd9-bcf8-352d5675fbce-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-bbht6\" (UID: \"83eb790e-b902-4dd9-bcf8-352d5675fbce\") " 
pod="openstack/nova-cell1-conductor-db-sync-bbht6" Apr 02 14:02:39 crc kubenswrapper[4732]: I0402 14:02:39.174919 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83eb790e-b902-4dd9-bcf8-352d5675fbce-config-data\") pod \"nova-cell1-conductor-db-sync-bbht6\" (UID: \"83eb790e-b902-4dd9-bcf8-352d5675fbce\") " pod="openstack/nova-cell1-conductor-db-sync-bbht6" Apr 02 14:02:39 crc kubenswrapper[4732]: I0402 14:02:39.237213 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Apr 02 14:02:39 crc kubenswrapper[4732]: W0402 14:02:39.247818 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c4a2d1b_e579_4e87_ba09_500ce4425847.slice/crio-cae47488dee349aa10f2244848c455aa964ebb37046b689230c00ba8c8d2fbfd WatchSource:0}: Error finding container cae47488dee349aa10f2244848c455aa964ebb37046b689230c00ba8c8d2fbfd: Status 404 returned error can't find the container with id cae47488dee349aa10f2244848c455aa964ebb37046b689230c00ba8c8d2fbfd Apr 02 14:02:39 crc kubenswrapper[4732]: W0402 14:02:39.268587 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47153a66_ef48_4bb2_bd19_2a5d5d506d0d.slice/crio-d7dcd11dc4f24241bd222304325f19af954ec9664501d8e6e7970c611cdc89f8 WatchSource:0}: Error finding container d7dcd11dc4f24241bd222304325f19af954ec9664501d8e6e7970c611cdc89f8: Status 404 returned error can't find the container with id d7dcd11dc4f24241bd222304325f19af954ec9664501d8e6e7970c611cdc89f8 Apr 02 14:02:39 crc kubenswrapper[4732]: I0402 14:02:39.269420 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Apr 02 14:02:39 crc kubenswrapper[4732]: I0402 14:02:39.276701 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/83eb790e-b902-4dd9-bcf8-352d5675fbce-config-data\") pod \"nova-cell1-conductor-db-sync-bbht6\" (UID: \"83eb790e-b902-4dd9-bcf8-352d5675fbce\") " pod="openstack/nova-cell1-conductor-db-sync-bbht6" Apr 02 14:02:39 crc kubenswrapper[4732]: I0402 14:02:39.276750 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83eb790e-b902-4dd9-bcf8-352d5675fbce-scripts\") pod \"nova-cell1-conductor-db-sync-bbht6\" (UID: \"83eb790e-b902-4dd9-bcf8-352d5675fbce\") " pod="openstack/nova-cell1-conductor-db-sync-bbht6" Apr 02 14:02:39 crc kubenswrapper[4732]: I0402 14:02:39.276798 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4f5ln\" (UniqueName: \"kubernetes.io/projected/83eb790e-b902-4dd9-bcf8-352d5675fbce-kube-api-access-4f5ln\") pod \"nova-cell1-conductor-db-sync-bbht6\" (UID: \"83eb790e-b902-4dd9-bcf8-352d5675fbce\") " pod="openstack/nova-cell1-conductor-db-sync-bbht6" Apr 02 14:02:39 crc kubenswrapper[4732]: I0402 14:02:39.276818 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83eb790e-b902-4dd9-bcf8-352d5675fbce-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-bbht6\" (UID: \"83eb790e-b902-4dd9-bcf8-352d5675fbce\") " pod="openstack/nova-cell1-conductor-db-sync-bbht6" Apr 02 14:02:39 crc kubenswrapper[4732]: I0402 14:02:39.279276 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-tjr4k"] Apr 02 14:02:39 crc kubenswrapper[4732]: I0402 14:02:39.281220 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83eb790e-b902-4dd9-bcf8-352d5675fbce-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-bbht6\" (UID: \"83eb790e-b902-4dd9-bcf8-352d5675fbce\") " 
pod="openstack/nova-cell1-conductor-db-sync-bbht6" Apr 02 14:02:39 crc kubenswrapper[4732]: I0402 14:02:39.281952 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83eb790e-b902-4dd9-bcf8-352d5675fbce-scripts\") pod \"nova-cell1-conductor-db-sync-bbht6\" (UID: \"83eb790e-b902-4dd9-bcf8-352d5675fbce\") " pod="openstack/nova-cell1-conductor-db-sync-bbht6" Apr 02 14:02:39 crc kubenswrapper[4732]: W0402 14:02:39.283562 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbdbfa898_0999_49da_9194_ff7bf15c955a.slice/crio-163b2e846d09b9a03cf17ea555e6647944ddc665e524767fe6e4f21f700331ea WatchSource:0}: Error finding container 163b2e846d09b9a03cf17ea555e6647944ddc665e524767fe6e4f21f700331ea: Status 404 returned error can't find the container with id 163b2e846d09b9a03cf17ea555e6647944ddc665e524767fe6e4f21f700331ea Apr 02 14:02:39 crc kubenswrapper[4732]: I0402 14:02:39.284328 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83eb790e-b902-4dd9-bcf8-352d5675fbce-config-data\") pod \"nova-cell1-conductor-db-sync-bbht6\" (UID: \"83eb790e-b902-4dd9-bcf8-352d5675fbce\") " pod="openstack/nova-cell1-conductor-db-sync-bbht6" Apr 02 14:02:39 crc kubenswrapper[4732]: I0402 14:02:39.292442 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4f5ln\" (UniqueName: \"kubernetes.io/projected/83eb790e-b902-4dd9-bcf8-352d5675fbce-kube-api-access-4f5ln\") pod \"nova-cell1-conductor-db-sync-bbht6\" (UID: \"83eb790e-b902-4dd9-bcf8-352d5675fbce\") " pod="openstack/nova-cell1-conductor-db-sync-bbht6" Apr 02 14:02:39 crc kubenswrapper[4732]: I0402 14:02:39.426179 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-bbht6" Apr 02 14:02:39 crc kubenswrapper[4732]: I0402 14:02:39.956342 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bbht6"] Apr 02 14:02:40 crc kubenswrapper[4732]: I0402 14:02:40.077386 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"47153a66-ef48-4bb2-bd19-2a5d5d506d0d","Type":"ContainerStarted","Data":"d7dcd11dc4f24241bd222304325f19af954ec9664501d8e6e7970c611cdc89f8"} Apr 02 14:02:40 crc kubenswrapper[4732]: I0402 14:02:40.082049 4732 generic.go:334] "Generic (PLEG): container finished" podID="bdbfa898-0999-49da-9194-ff7bf15c955a" containerID="6130e8c315cc1739ad437865979ed3ee0d0c4838e16bad5d65a0e855b6764253" exitCode=0 Apr 02 14:02:40 crc kubenswrapper[4732]: I0402 14:02:40.082900 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-tjr4k" event={"ID":"bdbfa898-0999-49da-9194-ff7bf15c955a","Type":"ContainerDied","Data":"6130e8c315cc1739ad437865979ed3ee0d0c4838e16bad5d65a0e855b6764253"} Apr 02 14:02:40 crc kubenswrapper[4732]: I0402 14:02:40.082924 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-tjr4k" event={"ID":"bdbfa898-0999-49da-9194-ff7bf15c955a","Type":"ContainerStarted","Data":"163b2e846d09b9a03cf17ea555e6647944ddc665e524767fe6e4f21f700331ea"} Apr 02 14:02:40 crc kubenswrapper[4732]: I0402 14:02:40.099155 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6ef800e5-24ee-4bc3-9248-d49d0c4d5abf","Type":"ContainerStarted","Data":"4b7130d03936e105938902e0b6ac210b937df3795a189e842cf5a56230e3e5a4"} Apr 02 14:02:40 crc kubenswrapper[4732]: I0402 14:02:40.104687 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-bbht6" 
event={"ID":"83eb790e-b902-4dd9-bcf8-352d5675fbce","Type":"ContainerStarted","Data":"7d109ec8c6dc4eec325e698aaf28f793acab9c6ae737d79fce4f139be2a261ef"} Apr 02 14:02:40 crc kubenswrapper[4732]: I0402 14:02:40.112796 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3c4a2d1b-e579-4e87-ba09-500ce4425847","Type":"ContainerStarted","Data":"cae47488dee349aa10f2244848c455aa964ebb37046b689230c00ba8c8d2fbfd"} Apr 02 14:02:40 crc kubenswrapper[4732]: I0402 14:02:40.115487 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-9pqhd" event={"ID":"d730389e-11ac-4fa7-86d7-efa07afdbe08","Type":"ContainerStarted","Data":"5b83c75a78e02d4e4d6ae2df76d22ab2277c64bc8ccdd02e965158b68d5659fc"} Apr 02 14:02:40 crc kubenswrapper[4732]: I0402 14:02:40.175933 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-9pqhd" podStartSLOduration=3.175913202 podStartE2EDuration="3.175913202s" podCreationTimestamp="2026-04-02 14:02:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 14:02:40.145114342 +0000 UTC m=+1517.049521905" watchObservedRunningTime="2026-04-02 14:02:40.175913202 +0000 UTC m=+1517.080320765" Apr 02 14:02:41 crc kubenswrapper[4732]: I0402 14:02:41.127694 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-tjr4k" event={"ID":"bdbfa898-0999-49da-9194-ff7bf15c955a","Type":"ContainerStarted","Data":"7683803f1b4e4c9804e6849c443a9edba0c3c79982d1204a8622cb6bb94f982e"} Apr 02 14:02:41 crc kubenswrapper[4732]: I0402 14:02:41.128181 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-865f5d856f-tjr4k" Apr 02 14:02:41 crc kubenswrapper[4732]: I0402 14:02:41.129085 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-conductor-db-sync-bbht6" event={"ID":"83eb790e-b902-4dd9-bcf8-352d5675fbce","Type":"ContainerStarted","Data":"34479c920540c29d54b8c53909b3d77dfbd492832314b1dd4444ac9221c2967c"} Apr 02 14:02:41 crc kubenswrapper[4732]: I0402 14:02:41.161787 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-865f5d856f-tjr4k" podStartSLOduration=3.161764787 podStartE2EDuration="3.161764787s" podCreationTimestamp="2026-04-02 14:02:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 14:02:41.145182175 +0000 UTC m=+1518.049589738" watchObservedRunningTime="2026-04-02 14:02:41.161764787 +0000 UTC m=+1518.066172340" Apr 02 14:02:41 crc kubenswrapper[4732]: I0402 14:02:41.180325 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-bbht6" podStartSLOduration=2.180307392 podStartE2EDuration="2.180307392s" podCreationTimestamp="2026-04-02 14:02:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 14:02:41.170964458 +0000 UTC m=+1518.075372011" watchObservedRunningTime="2026-04-02 14:02:41.180307392 +0000 UTC m=+1518.084714945" Apr 02 14:02:41 crc kubenswrapper[4732]: I0402 14:02:41.940336 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Apr 02 14:02:41 crc kubenswrapper[4732]: I0402 14:02:41.955329 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Apr 02 14:02:43 crc kubenswrapper[4732]: I0402 14:02:43.158485 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"47153a66-ef48-4bb2-bd19-2a5d5d506d0d","Type":"ContainerStarted","Data":"5ffe28f8aefa671a7ac13f382f3d71e9b4eea196e320ebc47f2a1aa238b09e7d"} Apr 02 14:02:43 crc 
kubenswrapper[4732]: I0402 14:02:43.163844 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6ef800e5-24ee-4bc3-9248-d49d0c4d5abf","Type":"ContainerStarted","Data":"d6cfb4c0e731123f2293e13519cfa7bf6a97b2a7e3c4356605cb7ac754ecbaea"}
Apr 02 14:02:43 crc kubenswrapper[4732]: I0402 14:02:43.164018 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="6ef800e5-24ee-4bc3-9248-d49d0c4d5abf" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://d6cfb4c0e731123f2293e13519cfa7bf6a97b2a7e3c4356605cb7ac754ecbaea" gracePeriod=30
Apr 02 14:02:43 crc kubenswrapper[4732]: I0402 14:02:43.164138 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Apr 02 14:02:43 crc kubenswrapper[4732]: I0402 14:02:43.171172 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"83dc4cc3-ce54-4685-9485-16526fec666a","Type":"ContainerStarted","Data":"2bf3c865c9d3f80d52215bceb5acce60796625666c1002a703a1e1c4608f9b78"}
Apr 02 14:02:43 crc kubenswrapper[4732]: I0402 14:02:43.177257 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3c4a2d1b-e579-4e87-ba09-500ce4425847","Type":"ContainerStarted","Data":"f3e22d0e897b95b5740fcdb3a1c3455efd87eff8aed4915bccc350ae6dd0228e"}
Apr 02 14:02:43 crc kubenswrapper[4732]: I0402 14:02:43.184418 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.8631645780000001 podStartE2EDuration="5.184401847s" podCreationTimestamp="2026-04-02 14:02:38 +0000 UTC" firstStartedPulling="2026-04-02 14:02:39.270668512 +0000 UTC m=+1516.175076065" lastFinishedPulling="2026-04-02 14:02:42.591905791 +0000 UTC m=+1519.496313334" observedRunningTime="2026-04-02 14:02:43.181339414 +0000 UTC m=+1520.085746987" watchObservedRunningTime="2026-04-02 14:02:43.184401847 +0000 UTC m=+1520.088809400"
Apr 02 14:02:43 crc kubenswrapper[4732]: I0402 14:02:43.203823 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.717288693 podStartE2EDuration="6.203804046s" podCreationTimestamp="2026-04-02 14:02:37 +0000 UTC" firstStartedPulling="2026-04-02 14:02:39.16196189 +0000 UTC m=+1516.066369443" lastFinishedPulling="2026-04-02 14:02:42.648477233 +0000 UTC m=+1519.552884796" observedRunningTime="2026-04-02 14:02:43.198509162 +0000 UTC m=+1520.102916725" watchObservedRunningTime="2026-04-02 14:02:43.203804046 +0000 UTC m=+1520.108211599"
Apr 02 14:02:43 crc kubenswrapper[4732]: I0402 14:02:43.425687 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Apr 02 14:02:43 crc kubenswrapper[4732]: I0402 14:02:43.644720 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Apr 02 14:02:44 crc kubenswrapper[4732]: I0402 14:02:44.188056 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"83dc4cc3-ce54-4685-9485-16526fec666a","Type":"ContainerStarted","Data":"de458a620751559cc0ae0bdde07ff19ef2327c161a36efa23aae8c84591fb109"}
Apr 02 14:02:44 crc kubenswrapper[4732]: I0402 14:02:44.191055 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3c4a2d1b-e579-4e87-ba09-500ce4425847" containerName="nova-metadata-log" containerID="cri-o://f3e22d0e897b95b5740fcdb3a1c3455efd87eff8aed4915bccc350ae6dd0228e" gracePeriod=30
Apr 02 14:02:44 crc kubenswrapper[4732]: I0402 14:02:44.191501 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3c4a2d1b-e579-4e87-ba09-500ce4425847" containerName="nova-metadata-metadata" containerID="cri-o://ab86c38511d9f5c8f5bce4411b1a12986bbe3a6b1674d763bd3cf2e14e4d4377" gracePeriod=30
Apr 02 14:02:44 crc kubenswrapper[4732]: I0402 14:02:44.191440 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3c4a2d1b-e579-4e87-ba09-500ce4425847","Type":"ContainerStarted","Data":"ab86c38511d9f5c8f5bce4411b1a12986bbe3a6b1674d763bd3cf2e14e4d4377"}
Apr 02 14:02:44 crc kubenswrapper[4732]: I0402 14:02:44.227011 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.614285427 podStartE2EDuration="7.226967818s" podCreationTimestamp="2026-04-02 14:02:37 +0000 UTC" firstStartedPulling="2026-04-02 14:02:38.978605173 +0000 UTC m=+1515.883012726" lastFinishedPulling="2026-04-02 14:02:42.591287564 +0000 UTC m=+1519.495695117" observedRunningTime="2026-04-02 14:02:44.216498943 +0000 UTC m=+1521.120906516" watchObservedRunningTime="2026-04-02 14:02:44.226967818 +0000 UTC m=+1521.131375381"
Apr 02 14:02:44 crc kubenswrapper[4732]: I0402 14:02:44.246895 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.8994556080000002 podStartE2EDuration="6.246714886s" podCreationTimestamp="2026-04-02 14:02:38 +0000 UTC" firstStartedPulling="2026-04-02 14:02:39.249242628 +0000 UTC m=+1516.153650181" lastFinishedPulling="2026-04-02 14:02:42.596501916 +0000 UTC m=+1519.500909459" observedRunningTime="2026-04-02 14:02:44.237546737 +0000 UTC m=+1521.141954310" watchObservedRunningTime="2026-04-02 14:02:44.246714886 +0000 UTC m=+1521.151122439"
Apr 02 14:02:44 crc kubenswrapper[4732]: I0402 14:02:44.800550 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Apr 02 14:02:44 crc kubenswrapper[4732]: I0402 14:02:44.919798 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c4a2d1b-e579-4e87-ba09-500ce4425847-logs\") pod \"3c4a2d1b-e579-4e87-ba09-500ce4425847\" (UID: \"3c4a2d1b-e579-4e87-ba09-500ce4425847\") "
Apr 02 14:02:44 crc kubenswrapper[4732]: I0402 14:02:44.920058 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c4a2d1b-e579-4e87-ba09-500ce4425847-config-data\") pod \"3c4a2d1b-e579-4e87-ba09-500ce4425847\" (UID: \"3c4a2d1b-e579-4e87-ba09-500ce4425847\") "
Apr 02 14:02:44 crc kubenswrapper[4732]: I0402 14:02:44.920158 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c4a2d1b-e579-4e87-ba09-500ce4425847-combined-ca-bundle\") pod \"3c4a2d1b-e579-4e87-ba09-500ce4425847\" (UID: \"3c4a2d1b-e579-4e87-ba09-500ce4425847\") "
Apr 02 14:02:44 crc kubenswrapper[4732]: I0402 14:02:44.920194 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvzws\" (UniqueName: \"kubernetes.io/projected/3c4a2d1b-e579-4e87-ba09-500ce4425847-kube-api-access-nvzws\") pod \"3c4a2d1b-e579-4e87-ba09-500ce4425847\" (UID: \"3c4a2d1b-e579-4e87-ba09-500ce4425847\") "
Apr 02 14:02:44 crc kubenswrapper[4732]: I0402 14:02:44.920212 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c4a2d1b-e579-4e87-ba09-500ce4425847-logs" (OuterVolumeSpecName: "logs") pod "3c4a2d1b-e579-4e87-ba09-500ce4425847" (UID: "3c4a2d1b-e579-4e87-ba09-500ce4425847"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 02 14:02:44 crc kubenswrapper[4732]: I0402 14:02:44.920720 4732 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c4a2d1b-e579-4e87-ba09-500ce4425847-logs\") on node \"crc\" DevicePath \"\""
Apr 02 14:02:44 crc kubenswrapper[4732]: I0402 14:02:44.936941 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c4a2d1b-e579-4e87-ba09-500ce4425847-kube-api-access-nvzws" (OuterVolumeSpecName: "kube-api-access-nvzws") pod "3c4a2d1b-e579-4e87-ba09-500ce4425847" (UID: "3c4a2d1b-e579-4e87-ba09-500ce4425847"). InnerVolumeSpecName "kube-api-access-nvzws". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 14:02:44 crc kubenswrapper[4732]: I0402 14:02:44.949245 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c4a2d1b-e579-4e87-ba09-500ce4425847-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3c4a2d1b-e579-4e87-ba09-500ce4425847" (UID: "3c4a2d1b-e579-4e87-ba09-500ce4425847"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 14:02:44 crc kubenswrapper[4732]: I0402 14:02:44.949359 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c4a2d1b-e579-4e87-ba09-500ce4425847-config-data" (OuterVolumeSpecName: "config-data") pod "3c4a2d1b-e579-4e87-ba09-500ce4425847" (UID: "3c4a2d1b-e579-4e87-ba09-500ce4425847"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 14:02:45 crc kubenswrapper[4732]: I0402 14:02:45.023480 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c4a2d1b-e579-4e87-ba09-500ce4425847-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Apr 02 14:02:45 crc kubenswrapper[4732]: I0402 14:02:45.023565 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvzws\" (UniqueName: \"kubernetes.io/projected/3c4a2d1b-e579-4e87-ba09-500ce4425847-kube-api-access-nvzws\") on node \"crc\" DevicePath \"\""
Apr 02 14:02:45 crc kubenswrapper[4732]: I0402 14:02:45.023578 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c4a2d1b-e579-4e87-ba09-500ce4425847-config-data\") on node \"crc\" DevicePath \"\""
Apr 02 14:02:45 crc kubenswrapper[4732]: I0402 14:02:45.204997 4732 generic.go:334] "Generic (PLEG): container finished" podID="3c4a2d1b-e579-4e87-ba09-500ce4425847" containerID="ab86c38511d9f5c8f5bce4411b1a12986bbe3a6b1674d763bd3cf2e14e4d4377" exitCode=0
Apr 02 14:02:45 crc kubenswrapper[4732]: I0402 14:02:45.205025 4732 generic.go:334] "Generic (PLEG): container finished" podID="3c4a2d1b-e579-4e87-ba09-500ce4425847" containerID="f3e22d0e897b95b5740fcdb3a1c3455efd87eff8aed4915bccc350ae6dd0228e" exitCode=143
Apr 02 14:02:45 crc kubenswrapper[4732]: I0402 14:02:45.205560 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Apr 02 14:02:45 crc kubenswrapper[4732]: I0402 14:02:45.205847 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3c4a2d1b-e579-4e87-ba09-500ce4425847","Type":"ContainerDied","Data":"ab86c38511d9f5c8f5bce4411b1a12986bbe3a6b1674d763bd3cf2e14e4d4377"}
Apr 02 14:02:45 crc kubenswrapper[4732]: I0402 14:02:45.205902 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3c4a2d1b-e579-4e87-ba09-500ce4425847","Type":"ContainerDied","Data":"f3e22d0e897b95b5740fcdb3a1c3455efd87eff8aed4915bccc350ae6dd0228e"}
Apr 02 14:02:45 crc kubenswrapper[4732]: I0402 14:02:45.205921 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3c4a2d1b-e579-4e87-ba09-500ce4425847","Type":"ContainerDied","Data":"cae47488dee349aa10f2244848c455aa964ebb37046b689230c00ba8c8d2fbfd"}
Apr 02 14:02:45 crc kubenswrapper[4732]: I0402 14:02:45.205940 4732 scope.go:117] "RemoveContainer" containerID="ab86c38511d9f5c8f5bce4411b1a12986bbe3a6b1674d763bd3cf2e14e4d4377"
Apr 02 14:02:45 crc kubenswrapper[4732]: I0402 14:02:45.247682 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Apr 02 14:02:45 crc kubenswrapper[4732]: I0402 14:02:45.261689 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Apr 02 14:02:45 crc kubenswrapper[4732]: I0402 14:02:45.277487 4732 scope.go:117] "RemoveContainer" containerID="f3e22d0e897b95b5740fcdb3a1c3455efd87eff8aed4915bccc350ae6dd0228e"
Apr 02 14:02:45 crc kubenswrapper[4732]: I0402 14:02:45.287671 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Apr 02 14:02:45 crc kubenswrapper[4732]: E0402 14:02:45.288250 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c4a2d1b-e579-4e87-ba09-500ce4425847" containerName="nova-metadata-log"
Apr 02 14:02:45 crc kubenswrapper[4732]: I0402 14:02:45.288275 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c4a2d1b-e579-4e87-ba09-500ce4425847" containerName="nova-metadata-log"
Apr 02 14:02:45 crc kubenswrapper[4732]: E0402 14:02:45.288290 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c4a2d1b-e579-4e87-ba09-500ce4425847" containerName="nova-metadata-metadata"
Apr 02 14:02:45 crc kubenswrapper[4732]: I0402 14:02:45.288298 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c4a2d1b-e579-4e87-ba09-500ce4425847" containerName="nova-metadata-metadata"
Apr 02 14:02:45 crc kubenswrapper[4732]: I0402 14:02:45.288559 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c4a2d1b-e579-4e87-ba09-500ce4425847" containerName="nova-metadata-metadata"
Apr 02 14:02:45 crc kubenswrapper[4732]: I0402 14:02:45.288581 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c4a2d1b-e579-4e87-ba09-500ce4425847" containerName="nova-metadata-log"
Apr 02 14:02:45 crc kubenswrapper[4732]: I0402 14:02:45.289781 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Apr 02 14:02:45 crc kubenswrapper[4732]: I0402 14:02:45.294106 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Apr 02 14:02:45 crc kubenswrapper[4732]: I0402 14:02:45.294342 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Apr 02 14:02:45 crc kubenswrapper[4732]: I0402 14:02:45.322520 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Apr 02 14:02:45 crc kubenswrapper[4732]: I0402 14:02:45.329971 4732 scope.go:117] "RemoveContainer" containerID="ab86c38511d9f5c8f5bce4411b1a12986bbe3a6b1674d763bd3cf2e14e4d4377"
Apr 02 14:02:45 crc kubenswrapper[4732]: I0402 14:02:45.330924 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45d54bbf-4dbe-46ee-89bd-d2602ef090cd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"45d54bbf-4dbe-46ee-89bd-d2602ef090cd\") " pod="openstack/nova-metadata-0"
Apr 02 14:02:45 crc kubenswrapper[4732]: I0402 14:02:45.331172 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/45d54bbf-4dbe-46ee-89bd-d2602ef090cd-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"45d54bbf-4dbe-46ee-89bd-d2602ef090cd\") " pod="openstack/nova-metadata-0"
Apr 02 14:02:45 crc kubenswrapper[4732]: I0402 14:02:45.331352 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45d54bbf-4dbe-46ee-89bd-d2602ef090cd-logs\") pod \"nova-metadata-0\" (UID: \"45d54bbf-4dbe-46ee-89bd-d2602ef090cd\") " pod="openstack/nova-metadata-0"
Apr 02 14:02:45 crc kubenswrapper[4732]: I0402 14:02:45.331423 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45d54bbf-4dbe-46ee-89bd-d2602ef090cd-config-data\") pod \"nova-metadata-0\" (UID: \"45d54bbf-4dbe-46ee-89bd-d2602ef090cd\") " pod="openstack/nova-metadata-0"
Apr 02 14:02:45 crc kubenswrapper[4732]: I0402 14:02:45.331456 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9fqd\" (UniqueName: \"kubernetes.io/projected/45d54bbf-4dbe-46ee-89bd-d2602ef090cd-kube-api-access-t9fqd\") pod \"nova-metadata-0\" (UID: \"45d54bbf-4dbe-46ee-89bd-d2602ef090cd\") " pod="openstack/nova-metadata-0"
Apr 02 14:02:45 crc kubenswrapper[4732]: E0402 14:02:45.331445 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab86c38511d9f5c8f5bce4411b1a12986bbe3a6b1674d763bd3cf2e14e4d4377\": container with ID starting with ab86c38511d9f5c8f5bce4411b1a12986bbe3a6b1674d763bd3cf2e14e4d4377 not found: ID does not exist" containerID="ab86c38511d9f5c8f5bce4411b1a12986bbe3a6b1674d763bd3cf2e14e4d4377"
Apr 02 14:02:45 crc kubenswrapper[4732]: I0402 14:02:45.331582 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab86c38511d9f5c8f5bce4411b1a12986bbe3a6b1674d763bd3cf2e14e4d4377"} err="failed to get container status \"ab86c38511d9f5c8f5bce4411b1a12986bbe3a6b1674d763bd3cf2e14e4d4377\": rpc error: code = NotFound desc = could not find container \"ab86c38511d9f5c8f5bce4411b1a12986bbe3a6b1674d763bd3cf2e14e4d4377\": container with ID starting with ab86c38511d9f5c8f5bce4411b1a12986bbe3a6b1674d763bd3cf2e14e4d4377 not found: ID does not exist"
Apr 02 14:02:45 crc kubenswrapper[4732]: I0402 14:02:45.331742 4732 scope.go:117] "RemoveContainer" containerID="f3e22d0e897b95b5740fcdb3a1c3455efd87eff8aed4915bccc350ae6dd0228e"
Apr 02 14:02:45 crc kubenswrapper[4732]: E0402 14:02:45.332153 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3e22d0e897b95b5740fcdb3a1c3455efd87eff8aed4915bccc350ae6dd0228e\": container with ID starting with f3e22d0e897b95b5740fcdb3a1c3455efd87eff8aed4915bccc350ae6dd0228e not found: ID does not exist" containerID="f3e22d0e897b95b5740fcdb3a1c3455efd87eff8aed4915bccc350ae6dd0228e"
Apr 02 14:02:45 crc kubenswrapper[4732]: I0402 14:02:45.332189 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3e22d0e897b95b5740fcdb3a1c3455efd87eff8aed4915bccc350ae6dd0228e"} err="failed to get container status \"f3e22d0e897b95b5740fcdb3a1c3455efd87eff8aed4915bccc350ae6dd0228e\": rpc error: code = NotFound desc = could not find container \"f3e22d0e897b95b5740fcdb3a1c3455efd87eff8aed4915bccc350ae6dd0228e\": container with ID starting with f3e22d0e897b95b5740fcdb3a1c3455efd87eff8aed4915bccc350ae6dd0228e not found: ID does not exist"
Apr 02 14:02:45 crc kubenswrapper[4732]: I0402 14:02:45.332211 4732 scope.go:117] "RemoveContainer" containerID="ab86c38511d9f5c8f5bce4411b1a12986bbe3a6b1674d763bd3cf2e14e4d4377"
Apr 02 14:02:45 crc kubenswrapper[4732]: I0402 14:02:45.332425 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab86c38511d9f5c8f5bce4411b1a12986bbe3a6b1674d763bd3cf2e14e4d4377"} err="failed to get container status \"ab86c38511d9f5c8f5bce4411b1a12986bbe3a6b1674d763bd3cf2e14e4d4377\": rpc error: code = NotFound desc = could not find container \"ab86c38511d9f5c8f5bce4411b1a12986bbe3a6b1674d763bd3cf2e14e4d4377\": container with ID starting with ab86c38511d9f5c8f5bce4411b1a12986bbe3a6b1674d763bd3cf2e14e4d4377 not found: ID does not exist"
Apr 02 14:02:45 crc kubenswrapper[4732]: I0402 14:02:45.332444 4732 scope.go:117] "RemoveContainer" containerID="f3e22d0e897b95b5740fcdb3a1c3455efd87eff8aed4915bccc350ae6dd0228e"
Apr 02 14:02:45 crc kubenswrapper[4732]: I0402 14:02:45.332710 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3e22d0e897b95b5740fcdb3a1c3455efd87eff8aed4915bccc350ae6dd0228e"} err="failed to get container status \"f3e22d0e897b95b5740fcdb3a1c3455efd87eff8aed4915bccc350ae6dd0228e\": rpc error: code = NotFound desc = could not find container \"f3e22d0e897b95b5740fcdb3a1c3455efd87eff8aed4915bccc350ae6dd0228e\": container with ID starting with f3e22d0e897b95b5740fcdb3a1c3455efd87eff8aed4915bccc350ae6dd0228e not found: ID does not exist"
Apr 02 14:02:45 crc kubenswrapper[4732]: I0402 14:02:45.433775 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45d54bbf-4dbe-46ee-89bd-d2602ef090cd-logs\") pod \"nova-metadata-0\" (UID: \"45d54bbf-4dbe-46ee-89bd-d2602ef090cd\") " pod="openstack/nova-metadata-0"
Apr 02 14:02:45 crc kubenswrapper[4732]: I0402 14:02:45.433854 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45d54bbf-4dbe-46ee-89bd-d2602ef090cd-config-data\") pod \"nova-metadata-0\" (UID: \"45d54bbf-4dbe-46ee-89bd-d2602ef090cd\") " pod="openstack/nova-metadata-0"
Apr 02 14:02:45 crc kubenswrapper[4732]: I0402 14:02:45.433902 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9fqd\" (UniqueName: \"kubernetes.io/projected/45d54bbf-4dbe-46ee-89bd-d2602ef090cd-kube-api-access-t9fqd\") pod \"nova-metadata-0\" (UID: \"45d54bbf-4dbe-46ee-89bd-d2602ef090cd\") " pod="openstack/nova-metadata-0"
Apr 02 14:02:45 crc kubenswrapper[4732]: I0402 14:02:45.433933 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45d54bbf-4dbe-46ee-89bd-d2602ef090cd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"45d54bbf-4dbe-46ee-89bd-d2602ef090cd\") " pod="openstack/nova-metadata-0"
Apr 02 14:02:45 crc kubenswrapper[4732]: I0402 14:02:45.434030 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/45d54bbf-4dbe-46ee-89bd-d2602ef090cd-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"45d54bbf-4dbe-46ee-89bd-d2602ef090cd\") " pod="openstack/nova-metadata-0"
Apr 02 14:02:45 crc kubenswrapper[4732]: I0402 14:02:45.434730 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45d54bbf-4dbe-46ee-89bd-d2602ef090cd-logs\") pod \"nova-metadata-0\" (UID: \"45d54bbf-4dbe-46ee-89bd-d2602ef090cd\") " pod="openstack/nova-metadata-0"
Apr 02 14:02:45 crc kubenswrapper[4732]: I0402 14:02:45.452568 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/45d54bbf-4dbe-46ee-89bd-d2602ef090cd-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"45d54bbf-4dbe-46ee-89bd-d2602ef090cd\") " pod="openstack/nova-metadata-0"
Apr 02 14:02:45 crc kubenswrapper[4732]: I0402 14:02:45.452774 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45d54bbf-4dbe-46ee-89bd-d2602ef090cd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"45d54bbf-4dbe-46ee-89bd-d2602ef090cd\") " pod="openstack/nova-metadata-0"
Apr 02 14:02:45 crc kubenswrapper[4732]: I0402 14:02:45.453406 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45d54bbf-4dbe-46ee-89bd-d2602ef090cd-config-data\") pod \"nova-metadata-0\" (UID: \"45d54bbf-4dbe-46ee-89bd-d2602ef090cd\") " pod="openstack/nova-metadata-0"
Apr 02 14:02:45 crc kubenswrapper[4732]: I0402 14:02:45.461538 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9fqd\" (UniqueName: \"kubernetes.io/projected/45d54bbf-4dbe-46ee-89bd-d2602ef090cd-kube-api-access-t9fqd\") pod \"nova-metadata-0\" (UID: \"45d54bbf-4dbe-46ee-89bd-d2602ef090cd\") " pod="openstack/nova-metadata-0"
Apr 02 14:02:45 crc kubenswrapper[4732]: I0402 14:02:45.630755 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Apr 02 14:02:46 crc kubenswrapper[4732]: I0402 14:02:46.126549 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Apr 02 14:02:46 crc kubenswrapper[4732]: W0402 14:02:46.132688 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45d54bbf_4dbe_46ee_89bd_d2602ef090cd.slice/crio-cb015fcf9de5aaccbc08a5da0d7d7ccd6e75d2d76b46e0fff8a80710f3d8a38f WatchSource:0}: Error finding container cb015fcf9de5aaccbc08a5da0d7d7ccd6e75d2d76b46e0fff8a80710f3d8a38f: Status 404 returned error can't find the container with id cb015fcf9de5aaccbc08a5da0d7d7ccd6e75d2d76b46e0fff8a80710f3d8a38f
Apr 02 14:02:46 crc kubenswrapper[4732]: I0402 14:02:46.226723 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"45d54bbf-4dbe-46ee-89bd-d2602ef090cd","Type":"ContainerStarted","Data":"cb015fcf9de5aaccbc08a5da0d7d7ccd6e75d2d76b46e0fff8a80710f3d8a38f"}
Apr 02 14:02:46 crc kubenswrapper[4732]: I0402 14:02:46.700763 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c4a2d1b-e579-4e87-ba09-500ce4425847" path="/var/lib/kubelet/pods/3c4a2d1b-e579-4e87-ba09-500ce4425847/volumes"
Apr 02 14:02:47 crc kubenswrapper[4732]: I0402 14:02:47.240426 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"45d54bbf-4dbe-46ee-89bd-d2602ef090cd","Type":"ContainerStarted","Data":"9a8d5cd81ddb35179544af3285f9f4a52c4c9d2595ceb694dd497e9730aa404a"}
Apr 02 14:02:47 crc kubenswrapper[4732]: I0402 14:02:47.240798 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"45d54bbf-4dbe-46ee-89bd-d2602ef090cd","Type":"ContainerStarted","Data":"dcaf3130ea12eea717edb06e8aec1d7d2f11a219ae73c4bf90021b91d6677f79"}
Apr 02 14:02:47 crc kubenswrapper[4732]: I0402 14:02:47.465243 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.465222387 podStartE2EDuration="2.465222387s" podCreationTimestamp="2026-04-02 14:02:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 14:02:47.263484979 +0000 UTC m=+1524.167892532" watchObservedRunningTime="2026-04-02 14:02:47.465222387 +0000 UTC m=+1524.369629940"
Apr 02 14:02:47 crc kubenswrapper[4732]: I0402 14:02:47.474190 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Apr 02 14:02:47 crc kubenswrapper[4732]: I0402 14:02:47.474424 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="ac229e50-5412-4f18-be3c-4a364b95dcf2" containerName="kube-state-metrics" containerID="cri-o://07e46c22b4938ab8c21eaa09b4679a3900fc1e4c9b8afef0dd97cc3226fa06e8" gracePeriod=30
Apr 02 14:02:48 crc kubenswrapper[4732]: I0402 14:02:48.015247 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Apr 02 14:02:48 crc kubenswrapper[4732]: I0402 14:02:48.105006 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwbkx\" (UniqueName: \"kubernetes.io/projected/ac229e50-5412-4f18-be3c-4a364b95dcf2-kube-api-access-hwbkx\") pod \"ac229e50-5412-4f18-be3c-4a364b95dcf2\" (UID: \"ac229e50-5412-4f18-be3c-4a364b95dcf2\") "
Apr 02 14:02:48 crc kubenswrapper[4732]: I0402 14:02:48.118640 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac229e50-5412-4f18-be3c-4a364b95dcf2-kube-api-access-hwbkx" (OuterVolumeSpecName: "kube-api-access-hwbkx") pod "ac229e50-5412-4f18-be3c-4a364b95dcf2" (UID: "ac229e50-5412-4f18-be3c-4a364b95dcf2"). InnerVolumeSpecName "kube-api-access-hwbkx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 14:02:48 crc kubenswrapper[4732]: I0402 14:02:48.207099 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwbkx\" (UniqueName: \"kubernetes.io/projected/ac229e50-5412-4f18-be3c-4a364b95dcf2-kube-api-access-hwbkx\") on node \"crc\" DevicePath \"\""
Apr 02 14:02:48 crc kubenswrapper[4732]: I0402 14:02:48.250441 4732 generic.go:334] "Generic (PLEG): container finished" podID="83eb790e-b902-4dd9-bcf8-352d5675fbce" containerID="34479c920540c29d54b8c53909b3d77dfbd492832314b1dd4444ac9221c2967c" exitCode=0
Apr 02 14:02:48 crc kubenswrapper[4732]: I0402 14:02:48.250523 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-bbht6" event={"ID":"83eb790e-b902-4dd9-bcf8-352d5675fbce","Type":"ContainerDied","Data":"34479c920540c29d54b8c53909b3d77dfbd492832314b1dd4444ac9221c2967c"}
Apr 02 14:02:48 crc kubenswrapper[4732]: I0402 14:02:48.253098 4732 generic.go:334] "Generic (PLEG): container finished" podID="ac229e50-5412-4f18-be3c-4a364b95dcf2" containerID="07e46c22b4938ab8c21eaa09b4679a3900fc1e4c9b8afef0dd97cc3226fa06e8" exitCode=2
Apr 02 14:02:48 crc kubenswrapper[4732]: I0402 14:02:48.253149 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Apr 02 14:02:48 crc kubenswrapper[4732]: I0402 14:02:48.253144 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ac229e50-5412-4f18-be3c-4a364b95dcf2","Type":"ContainerDied","Data":"07e46c22b4938ab8c21eaa09b4679a3900fc1e4c9b8afef0dd97cc3226fa06e8"}
Apr 02 14:02:48 crc kubenswrapper[4732]: I0402 14:02:48.253390 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ac229e50-5412-4f18-be3c-4a364b95dcf2","Type":"ContainerDied","Data":"4169df95bae865b1b2cfc81f7735b1271d2d0e0bdd8fdd18cd6f2115f95ff22e"}
Apr 02 14:02:48 crc kubenswrapper[4732]: I0402 14:02:48.253421 4732 scope.go:117] "RemoveContainer" containerID="07e46c22b4938ab8c21eaa09b4679a3900fc1e4c9b8afef0dd97cc3226fa06e8"
Apr 02 14:02:48 crc kubenswrapper[4732]: I0402 14:02:48.254943 4732 generic.go:334] "Generic (PLEG): container finished" podID="d730389e-11ac-4fa7-86d7-efa07afdbe08" containerID="5b83c75a78e02d4e4d6ae2df76d22ab2277c64bc8ccdd02e965158b68d5659fc" exitCode=0
Apr 02 14:02:48 crc kubenswrapper[4732]: I0402 14:02:48.255015 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-9pqhd" event={"ID":"d730389e-11ac-4fa7-86d7-efa07afdbe08","Type":"ContainerDied","Data":"5b83c75a78e02d4e4d6ae2df76d22ab2277c64bc8ccdd02e965158b68d5659fc"}
Apr 02 14:02:48 crc kubenswrapper[4732]: I0402 14:02:48.280220 4732 scope.go:117] "RemoveContainer" containerID="07e46c22b4938ab8c21eaa09b4679a3900fc1e4c9b8afef0dd97cc3226fa06e8"
Apr 02 14:02:48 crc kubenswrapper[4732]: E0402 14:02:48.280791 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07e46c22b4938ab8c21eaa09b4679a3900fc1e4c9b8afef0dd97cc3226fa06e8\": container with ID starting with 07e46c22b4938ab8c21eaa09b4679a3900fc1e4c9b8afef0dd97cc3226fa06e8 not found: ID does not exist" containerID="07e46c22b4938ab8c21eaa09b4679a3900fc1e4c9b8afef0dd97cc3226fa06e8"
Apr 02 14:02:48 crc kubenswrapper[4732]: I0402 14:02:48.280819 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07e46c22b4938ab8c21eaa09b4679a3900fc1e4c9b8afef0dd97cc3226fa06e8"} err="failed to get container status \"07e46c22b4938ab8c21eaa09b4679a3900fc1e4c9b8afef0dd97cc3226fa06e8\": rpc error: code = NotFound desc = could not find container \"07e46c22b4938ab8c21eaa09b4679a3900fc1e4c9b8afef0dd97cc3226fa06e8\": container with ID starting with 07e46c22b4938ab8c21eaa09b4679a3900fc1e4c9b8afef0dd97cc3226fa06e8 not found: ID does not exist"
Apr 02 14:02:48 crc kubenswrapper[4732]: I0402 14:02:48.299920 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Apr 02 14:02:48 crc kubenswrapper[4732]: I0402 14:02:48.307892 4732 scope.go:117] "RemoveContainer" containerID="bf88c2f3afd541dea90d3b34e2a4249e0c2216413015300809f098f9a9c8ecd1"
Apr 02 14:02:48 crc kubenswrapper[4732]: I0402 14:02:48.319123 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"]
Apr 02 14:02:48 crc kubenswrapper[4732]: I0402 14:02:48.338025 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Apr 02 14:02:48 crc kubenswrapper[4732]: E0402 14:02:48.338500 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac229e50-5412-4f18-be3c-4a364b95dcf2" containerName="kube-state-metrics"
Apr 02 14:02:48 crc kubenswrapper[4732]: I0402 14:02:48.338521 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac229e50-5412-4f18-be3c-4a364b95dcf2" containerName="kube-state-metrics"
Apr 02 14:02:48 crc kubenswrapper[4732]: I0402 14:02:48.338730 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac229e50-5412-4f18-be3c-4a364b95dcf2" containerName="kube-state-metrics"
Apr 02 14:02:48 crc kubenswrapper[4732]: I0402 14:02:48.339408 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Apr 02 14:02:48 crc kubenswrapper[4732]: I0402 14:02:48.342943 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc"
Apr 02 14:02:48 crc kubenswrapper[4732]: I0402 14:02:48.343374 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config"
Apr 02 14:02:48 crc kubenswrapper[4732]: I0402 14:02:48.354443 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Apr 02 14:02:48 crc kubenswrapper[4732]: I0402 14:02:48.389701 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Apr 02 14:02:48 crc kubenswrapper[4732]: I0402 14:02:48.389755 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Apr 02 14:02:48 crc kubenswrapper[4732]: I0402 14:02:48.414173 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/96308a9a-b137-4d84-a470-74395c7a5d60-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"96308a9a-b137-4d84-a470-74395c7a5d60\") " pod="openstack/kube-state-metrics-0"
Apr 02 14:02:48 crc kubenswrapper[4732]: I0402 14:02:48.414823 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cvqb\" (UniqueName: \"kubernetes.io/projected/96308a9a-b137-4d84-a470-74395c7a5d60-kube-api-access-6cvqb\") pod \"kube-state-metrics-0\" (UID: \"96308a9a-b137-4d84-a470-74395c7a5d60\") " pod="openstack/kube-state-metrics-0"
Apr 02 14:02:48 crc kubenswrapper[4732]: I0402 14:02:48.414933 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96308a9a-b137-4d84-a470-74395c7a5d60-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"96308a9a-b137-4d84-a470-74395c7a5d60\") " pod="openstack/kube-state-metrics-0"
Apr 02 14:02:48 crc kubenswrapper[4732]: I0402 14:02:48.415281 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/96308a9a-b137-4d84-a470-74395c7a5d60-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"96308a9a-b137-4d84-a470-74395c7a5d60\") " pod="openstack/kube-state-metrics-0"
Apr 02 14:02:48 crc kubenswrapper[4732]: I0402 14:02:48.521987 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/96308a9a-b137-4d84-a470-74395c7a5d60-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"96308a9a-b137-4d84-a470-74395c7a5d60\") " pod="openstack/kube-state-metrics-0"
Apr 02 14:02:48 crc kubenswrapper[4732]: I0402 14:02:48.522164 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cvqb\" (UniqueName: \"kubernetes.io/projected/96308a9a-b137-4d84-a470-74395c7a5d60-kube-api-access-6cvqb\") pod \"kube-state-metrics-0\" (UID: \"96308a9a-b137-4d84-a470-74395c7a5d60\") " pod="openstack/kube-state-metrics-0"
Apr 02 14:02:48 crc kubenswrapper[4732]: I0402 14:02:48.522205 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96308a9a-b137-4d84-a470-74395c7a5d60-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"96308a9a-b137-4d84-a470-74395c7a5d60\") " pod="openstack/kube-state-metrics-0"
Apr 02 14:02:48 crc kubenswrapper[4732]: I0402 14:02:48.522372 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/96308a9a-b137-4d84-a470-74395c7a5d60-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"96308a9a-b137-4d84-a470-74395c7a5d60\") " pod="openstack/kube-state-metrics-0"
Apr 02 14:02:48 crc kubenswrapper[4732]: I0402 14:02:48.545736 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/96308a9a-b137-4d84-a470-74395c7a5d60-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"96308a9a-b137-4d84-a470-74395c7a5d60\") " pod="openstack/kube-state-metrics-0"
Apr 02 14:02:48 crc kubenswrapper[4732]: I0402 14:02:48.545880 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96308a9a-b137-4d84-a470-74395c7a5d60-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"96308a9a-b137-4d84-a470-74395c7a5d60\") " pod="openstack/kube-state-metrics-0"
Apr 02 14:02:48 crc kubenswrapper[4732]: I0402 14:02:48.549144 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/96308a9a-b137-4d84-a470-74395c7a5d60-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"96308a9a-b137-4d84-a470-74395c7a5d60\") " pod="openstack/kube-state-metrics-0"
Apr 02 14:02:48 crc kubenswrapper[4732]: I0402 14:02:48.553813 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cvqb\" (UniqueName: \"kubernetes.io/projected/96308a9a-b137-4d84-a470-74395c7a5d60-kube-api-access-6cvqb\") pod \"kube-state-metrics-0\" (UID: \"96308a9a-b137-4d84-a470-74395c7a5d60\") " pod="openstack/kube-state-metrics-0"
Apr 02 14:02:48 crc kubenswrapper[4732]: I0402 14:02:48.642299 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy"
pod="openstack/nova-scheduler-0" Apr 02 14:02:48 crc kubenswrapper[4732]: I0402 14:02:48.673680 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Apr 02 14:02:48 crc kubenswrapper[4732]: I0402 14:02:48.689521 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Apr 02 14:02:48 crc kubenswrapper[4732]: I0402 14:02:48.691086 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac229e50-5412-4f18-be3c-4a364b95dcf2" path="/var/lib/kubelet/pods/ac229e50-5412-4f18-be3c-4a364b95dcf2/volumes" Apr 02 14:02:48 crc kubenswrapper[4732]: I0402 14:02:48.743756 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-865f5d856f-tjr4k" Apr 02 14:02:48 crc kubenswrapper[4732]: I0402 14:02:48.815492 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-kk8pk"] Apr 02 14:02:48 crc kubenswrapper[4732]: I0402 14:02:48.815741 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bb4fc677f-kk8pk" podUID="56849534-b6ee-4c76-a706-22af33d61018" containerName="dnsmasq-dns" containerID="cri-o://fc38836c00961eba57500c9612eea7c44644cb5293906ff4ac62e9cc85f02adf" gracePeriod=10 Apr 02 14:02:49 crc kubenswrapper[4732]: I0402 14:02:49.176841 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Apr 02 14:02:49 crc kubenswrapper[4732]: I0402 14:02:49.277710 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"96308a9a-b137-4d84-a470-74395c7a5d60","Type":"ContainerStarted","Data":"daf3bbbaa41356c8ee7000275428489ba2edeef2c76e248f5473544535542603"} Apr 02 14:02:49 crc kubenswrapper[4732]: I0402 14:02:49.281282 4732 generic.go:334] "Generic (PLEG): container finished" podID="56849534-b6ee-4c76-a706-22af33d61018" 
containerID="fc38836c00961eba57500c9612eea7c44644cb5293906ff4ac62e9cc85f02adf" exitCode=0 Apr 02 14:02:49 crc kubenswrapper[4732]: I0402 14:02:49.281343 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-kk8pk" event={"ID":"56849534-b6ee-4c76-a706-22af33d61018","Type":"ContainerDied","Data":"fc38836c00961eba57500c9612eea7c44644cb5293906ff4ac62e9cc85f02adf"} Apr 02 14:02:49 crc kubenswrapper[4732]: I0402 14:02:49.321886 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Apr 02 14:02:49 crc kubenswrapper[4732]: I0402 14:02:49.470397 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-kk8pk" Apr 02 14:02:49 crc kubenswrapper[4732]: I0402 14:02:49.479716 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="83dc4cc3-ce54-4685-9485-16526fec666a" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.192:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Apr 02 14:02:49 crc kubenswrapper[4732]: I0402 14:02:49.479955 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="83dc4cc3-ce54-4685-9485-16526fec666a" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.192:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Apr 02 14:02:49 crc kubenswrapper[4732]: I0402 14:02:49.554521 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/56849534-b6ee-4c76-a706-22af33d61018-dns-swift-storage-0\") pod \"56849534-b6ee-4c76-a706-22af33d61018\" (UID: \"56849534-b6ee-4c76-a706-22af33d61018\") " Apr 02 14:02:49 crc kubenswrapper[4732]: I0402 14:02:49.554783 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/56849534-b6ee-4c76-a706-22af33d61018-ovsdbserver-sb\") pod \"56849534-b6ee-4c76-a706-22af33d61018\" (UID: \"56849534-b6ee-4c76-a706-22af33d61018\") " Apr 02 14:02:49 crc kubenswrapper[4732]: I0402 14:02:49.554827 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56849534-b6ee-4c76-a706-22af33d61018-dns-svc\") pod \"56849534-b6ee-4c76-a706-22af33d61018\" (UID: \"56849534-b6ee-4c76-a706-22af33d61018\") " Apr 02 14:02:49 crc kubenswrapper[4732]: I0402 14:02:49.554852 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/56849534-b6ee-4c76-a706-22af33d61018-ovsdbserver-nb\") pod \"56849534-b6ee-4c76-a706-22af33d61018\" (UID: \"56849534-b6ee-4c76-a706-22af33d61018\") " Apr 02 14:02:49 crc kubenswrapper[4732]: I0402 14:02:49.554915 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56849534-b6ee-4c76-a706-22af33d61018-config\") pod \"56849534-b6ee-4c76-a706-22af33d61018\" (UID: \"56849534-b6ee-4c76-a706-22af33d61018\") " Apr 02 14:02:49 crc kubenswrapper[4732]: I0402 14:02:49.554971 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhdbx\" (UniqueName: \"kubernetes.io/projected/56849534-b6ee-4c76-a706-22af33d61018-kube-api-access-zhdbx\") pod \"56849534-b6ee-4c76-a706-22af33d61018\" (UID: \"56849534-b6ee-4c76-a706-22af33d61018\") " Apr 02 14:02:49 crc kubenswrapper[4732]: I0402 14:02:49.575857 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56849534-b6ee-4c76-a706-22af33d61018-kube-api-access-zhdbx" (OuterVolumeSpecName: "kube-api-access-zhdbx") pod "56849534-b6ee-4c76-a706-22af33d61018" (UID: "56849534-b6ee-4c76-a706-22af33d61018"). 
InnerVolumeSpecName "kube-api-access-zhdbx". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:02:49 crc kubenswrapper[4732]: I0402 14:02:49.683169 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhdbx\" (UniqueName: \"kubernetes.io/projected/56849534-b6ee-4c76-a706-22af33d61018-kube-api-access-zhdbx\") on node \"crc\" DevicePath \"\"" Apr 02 14:02:49 crc kubenswrapper[4732]: I0402 14:02:49.699699 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56849534-b6ee-4c76-a706-22af33d61018-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "56849534-b6ee-4c76-a706-22af33d61018" (UID: "56849534-b6ee-4c76-a706-22af33d61018"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 14:02:49 crc kubenswrapper[4732]: I0402 14:02:49.714276 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56849534-b6ee-4c76-a706-22af33d61018-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "56849534-b6ee-4c76-a706-22af33d61018" (UID: "56849534-b6ee-4c76-a706-22af33d61018"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 14:02:49 crc kubenswrapper[4732]: I0402 14:02:49.715400 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56849534-b6ee-4c76-a706-22af33d61018-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "56849534-b6ee-4c76-a706-22af33d61018" (UID: "56849534-b6ee-4c76-a706-22af33d61018"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 14:02:49 crc kubenswrapper[4732]: I0402 14:02:49.721952 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-bbht6" Apr 02 14:02:49 crc kubenswrapper[4732]: I0402 14:02:49.780263 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56849534-b6ee-4c76-a706-22af33d61018-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "56849534-b6ee-4c76-a706-22af33d61018" (UID: "56849534-b6ee-4c76-a706-22af33d61018"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 14:02:49 crc kubenswrapper[4732]: I0402 14:02:49.783101 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-9pqhd" Apr 02 14:02:49 crc kubenswrapper[4732]: I0402 14:02:49.783820 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56849534-b6ee-4c76-a706-22af33d61018-config" (OuterVolumeSpecName: "config") pod "56849534-b6ee-4c76-a706-22af33d61018" (UID: "56849534-b6ee-4c76-a706-22af33d61018"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 14:02:49 crc kubenswrapper[4732]: I0402 14:02:49.784806 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/56849534-b6ee-4c76-a706-22af33d61018-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Apr 02 14:02:49 crc kubenswrapper[4732]: I0402 14:02:49.784828 4732 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56849534-b6ee-4c76-a706-22af33d61018-dns-svc\") on node \"crc\" DevicePath \"\"" Apr 02 14:02:49 crc kubenswrapper[4732]: I0402 14:02:49.784838 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/56849534-b6ee-4c76-a706-22af33d61018-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Apr 02 14:02:49 crc kubenswrapper[4732]: I0402 14:02:49.784847 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56849534-b6ee-4c76-a706-22af33d61018-config\") on node \"crc\" DevicePath \"\"" Apr 02 14:02:49 crc kubenswrapper[4732]: I0402 14:02:49.784876 4732 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/56849534-b6ee-4c76-a706-22af33d61018-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Apr 02 14:02:49 crc kubenswrapper[4732]: I0402 14:02:49.887759 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83eb790e-b902-4dd9-bcf8-352d5675fbce-scripts\") pod \"83eb790e-b902-4dd9-bcf8-352d5675fbce\" (UID: \"83eb790e-b902-4dd9-bcf8-352d5675fbce\") " Apr 02 14:02:49 crc kubenswrapper[4732]: I0402 14:02:49.887896 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d730389e-11ac-4fa7-86d7-efa07afdbe08-combined-ca-bundle\") pod 
\"d730389e-11ac-4fa7-86d7-efa07afdbe08\" (UID: \"d730389e-11ac-4fa7-86d7-efa07afdbe08\") " Apr 02 14:02:49 crc kubenswrapper[4732]: I0402 14:02:49.887926 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgpwb\" (UniqueName: \"kubernetes.io/projected/d730389e-11ac-4fa7-86d7-efa07afdbe08-kube-api-access-rgpwb\") pod \"d730389e-11ac-4fa7-86d7-efa07afdbe08\" (UID: \"d730389e-11ac-4fa7-86d7-efa07afdbe08\") " Apr 02 14:02:49 crc kubenswrapper[4732]: I0402 14:02:49.887954 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d730389e-11ac-4fa7-86d7-efa07afdbe08-config-data\") pod \"d730389e-11ac-4fa7-86d7-efa07afdbe08\" (UID: \"d730389e-11ac-4fa7-86d7-efa07afdbe08\") " Apr 02 14:02:49 crc kubenswrapper[4732]: I0402 14:02:49.888011 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4f5ln\" (UniqueName: \"kubernetes.io/projected/83eb790e-b902-4dd9-bcf8-352d5675fbce-kube-api-access-4f5ln\") pod \"83eb790e-b902-4dd9-bcf8-352d5675fbce\" (UID: \"83eb790e-b902-4dd9-bcf8-352d5675fbce\") " Apr 02 14:02:49 crc kubenswrapper[4732]: I0402 14:02:49.888039 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83eb790e-b902-4dd9-bcf8-352d5675fbce-combined-ca-bundle\") pod \"83eb790e-b902-4dd9-bcf8-352d5675fbce\" (UID: \"83eb790e-b902-4dd9-bcf8-352d5675fbce\") " Apr 02 14:02:49 crc kubenswrapper[4732]: I0402 14:02:49.888124 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d730389e-11ac-4fa7-86d7-efa07afdbe08-scripts\") pod \"d730389e-11ac-4fa7-86d7-efa07afdbe08\" (UID: \"d730389e-11ac-4fa7-86d7-efa07afdbe08\") " Apr 02 14:02:49 crc kubenswrapper[4732]: I0402 14:02:49.888169 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83eb790e-b902-4dd9-bcf8-352d5675fbce-config-data\") pod \"83eb790e-b902-4dd9-bcf8-352d5675fbce\" (UID: \"83eb790e-b902-4dd9-bcf8-352d5675fbce\") " Apr 02 14:02:49 crc kubenswrapper[4732]: I0402 14:02:49.905700 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d730389e-11ac-4fa7-86d7-efa07afdbe08-scripts" (OuterVolumeSpecName: "scripts") pod "d730389e-11ac-4fa7-86d7-efa07afdbe08" (UID: "d730389e-11ac-4fa7-86d7-efa07afdbe08"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:02:49 crc kubenswrapper[4732]: I0402 14:02:49.906181 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83eb790e-b902-4dd9-bcf8-352d5675fbce-kube-api-access-4f5ln" (OuterVolumeSpecName: "kube-api-access-4f5ln") pod "83eb790e-b902-4dd9-bcf8-352d5675fbce" (UID: "83eb790e-b902-4dd9-bcf8-352d5675fbce"). InnerVolumeSpecName "kube-api-access-4f5ln". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:02:49 crc kubenswrapper[4732]: I0402 14:02:49.917795 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83eb790e-b902-4dd9-bcf8-352d5675fbce-scripts" (OuterVolumeSpecName: "scripts") pod "83eb790e-b902-4dd9-bcf8-352d5675fbce" (UID: "83eb790e-b902-4dd9-bcf8-352d5675fbce"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:02:49 crc kubenswrapper[4732]: I0402 14:02:49.939835 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d730389e-11ac-4fa7-86d7-efa07afdbe08-kube-api-access-rgpwb" (OuterVolumeSpecName: "kube-api-access-rgpwb") pod "d730389e-11ac-4fa7-86d7-efa07afdbe08" (UID: "d730389e-11ac-4fa7-86d7-efa07afdbe08"). InnerVolumeSpecName "kube-api-access-rgpwb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:02:49 crc kubenswrapper[4732]: I0402 14:02:49.943791 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d730389e-11ac-4fa7-86d7-efa07afdbe08-config-data" (OuterVolumeSpecName: "config-data") pod "d730389e-11ac-4fa7-86d7-efa07afdbe08" (UID: "d730389e-11ac-4fa7-86d7-efa07afdbe08"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:02:49 crc kubenswrapper[4732]: I0402 14:02:49.972023 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83eb790e-b902-4dd9-bcf8-352d5675fbce-config-data" (OuterVolumeSpecName: "config-data") pod "83eb790e-b902-4dd9-bcf8-352d5675fbce" (UID: "83eb790e-b902-4dd9-bcf8-352d5675fbce"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:02:49 crc kubenswrapper[4732]: I0402 14:02:49.992557 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83eb790e-b902-4dd9-bcf8-352d5675fbce-scripts\") on node \"crc\" DevicePath \"\"" Apr 02 14:02:49 crc kubenswrapper[4732]: I0402 14:02:49.992589 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgpwb\" (UniqueName: \"kubernetes.io/projected/d730389e-11ac-4fa7-86d7-efa07afdbe08-kube-api-access-rgpwb\") on node \"crc\" DevicePath \"\"" Apr 02 14:02:49 crc kubenswrapper[4732]: I0402 14:02:49.992601 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d730389e-11ac-4fa7-86d7-efa07afdbe08-config-data\") on node \"crc\" DevicePath \"\"" Apr 02 14:02:49 crc kubenswrapper[4732]: I0402 14:02:49.992623 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4f5ln\" (UniqueName: \"kubernetes.io/projected/83eb790e-b902-4dd9-bcf8-352d5675fbce-kube-api-access-4f5ln\") on node \"crc\" DevicePath \"\"" Apr 02 14:02:49 
crc kubenswrapper[4732]: I0402 14:02:49.992632 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d730389e-11ac-4fa7-86d7-efa07afdbe08-scripts\") on node \"crc\" DevicePath \"\"" Apr 02 14:02:49 crc kubenswrapper[4732]: I0402 14:02:49.992640 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83eb790e-b902-4dd9-bcf8-352d5675fbce-config-data\") on node \"crc\" DevicePath \"\"" Apr 02 14:02:49 crc kubenswrapper[4732]: I0402 14:02:49.994141 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83eb790e-b902-4dd9-bcf8-352d5675fbce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "83eb790e-b902-4dd9-bcf8-352d5675fbce" (UID: "83eb790e-b902-4dd9-bcf8-352d5675fbce"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:02:50 crc kubenswrapper[4732]: I0402 14:02:50.016967 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d730389e-11ac-4fa7-86d7-efa07afdbe08-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d730389e-11ac-4fa7-86d7-efa07afdbe08" (UID: "d730389e-11ac-4fa7-86d7-efa07afdbe08"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:02:50 crc kubenswrapper[4732]: I0402 14:02:50.094245 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d730389e-11ac-4fa7-86d7-efa07afdbe08-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 02 14:02:50 crc kubenswrapper[4732]: I0402 14:02:50.094650 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83eb790e-b902-4dd9-bcf8-352d5675fbce-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 02 14:02:50 crc kubenswrapper[4732]: I0402 14:02:50.106234 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Apr 02 14:02:50 crc kubenswrapper[4732]: I0402 14:02:50.106550 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fca45d1b-97a6-44fe-a519-29ed5ab029da" containerName="ceilometer-central-agent" containerID="cri-o://7aff1bfaa21127765fbf36fc46affb20b40da03b4c2fc027f0287fff9b4e86a0" gracePeriod=30 Apr 02 14:02:50 crc kubenswrapper[4732]: I0402 14:02:50.106633 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fca45d1b-97a6-44fe-a519-29ed5ab029da" containerName="proxy-httpd" containerID="cri-o://344ad458c644e8d4d3dba5022a0d33beae96bde7071f592d8c4991568275cf8b" gracePeriod=30 Apr 02 14:02:50 crc kubenswrapper[4732]: I0402 14:02:50.106712 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fca45d1b-97a6-44fe-a519-29ed5ab029da" containerName="ceilometer-notification-agent" containerID="cri-o://86b15c0a6e6bd7171b13d682658a8d0df48f366861a13932771f854240085848" gracePeriod=30 Apr 02 14:02:50 crc kubenswrapper[4732]: I0402 14:02:50.106692 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="fca45d1b-97a6-44fe-a519-29ed5ab029da" containerName="sg-core" containerID="cri-o://5efa4fba6157a758990fc184bbe56eb1978c503a2bea7de2970f2c392a51e001" gracePeriod=30 Apr 02 14:02:50 crc kubenswrapper[4732]: I0402 14:02:50.291242 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"96308a9a-b137-4d84-a470-74395c7a5d60","Type":"ContainerStarted","Data":"71f31c647a475cfc06505cfbfa47eb8dca163d3380d77fcfc539aaa3a14afd68"} Apr 02 14:02:50 crc kubenswrapper[4732]: I0402 14:02:50.291372 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Apr 02 14:02:50 crc kubenswrapper[4732]: I0402 14:02:50.293652 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-kk8pk" event={"ID":"56849534-b6ee-4c76-a706-22af33d61018","Type":"ContainerDied","Data":"3f109315f5b2612396b6a55ba5afc71c7e59da9514d24bbef834eab5f9134bb6"} Apr 02 14:02:50 crc kubenswrapper[4732]: I0402 14:02:50.294106 4732 scope.go:117] "RemoveContainer" containerID="fc38836c00961eba57500c9612eea7c44644cb5293906ff4ac62e9cc85f02adf" Apr 02 14:02:50 crc kubenswrapper[4732]: I0402 14:02:50.293670 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-kk8pk" Apr 02 14:02:50 crc kubenswrapper[4732]: I0402 14:02:50.295477 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-bbht6" event={"ID":"83eb790e-b902-4dd9-bcf8-352d5675fbce","Type":"ContainerDied","Data":"7d109ec8c6dc4eec325e698aaf28f793acab9c6ae737d79fce4f139be2a261ef"} Apr 02 14:02:50 crc kubenswrapper[4732]: I0402 14:02:50.295501 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d109ec8c6dc4eec325e698aaf28f793acab9c6ae737d79fce4f139be2a261ef" Apr 02 14:02:50 crc kubenswrapper[4732]: I0402 14:02:50.295630 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-bbht6" Apr 02 14:02:50 crc kubenswrapper[4732]: I0402 14:02:50.302478 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-9pqhd" Apr 02 14:02:50 crc kubenswrapper[4732]: I0402 14:02:50.302680 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-9pqhd" event={"ID":"d730389e-11ac-4fa7-86d7-efa07afdbe08","Type":"ContainerDied","Data":"13987ae8669abfd7ae8edbe870112408c39d97aada631737512a26e14fa02f74"} Apr 02 14:02:50 crc kubenswrapper[4732]: I0402 14:02:50.302735 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="13987ae8669abfd7ae8edbe870112408c39d97aada631737512a26e14fa02f74" Apr 02 14:02:50 crc kubenswrapper[4732]: I0402 14:02:50.315735 4732 generic.go:334] "Generic (PLEG): container finished" podID="fca45d1b-97a6-44fe-a519-29ed5ab029da" containerID="5efa4fba6157a758990fc184bbe56eb1978c503a2bea7de2970f2c392a51e001" exitCode=2 Apr 02 14:02:50 crc kubenswrapper[4732]: I0402 14:02:50.315858 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fca45d1b-97a6-44fe-a519-29ed5ab029da","Type":"ContainerDied","Data":"5efa4fba6157a758990fc184bbe56eb1978c503a2bea7de2970f2c392a51e001"} Apr 02 14:02:50 crc kubenswrapper[4732]: I0402 14:02:50.335360 4732 scope.go:117] "RemoveContainer" containerID="95c2e76f85264ee19c5f471b3778dba7420a8e8aea1b68c330f24d9465562e2a" Apr 02 14:02:50 crc kubenswrapper[4732]: I0402 14:02:50.361757 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Apr 02 14:02:50 crc kubenswrapper[4732]: E0402 14:02:50.362285 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56849534-b6ee-4c76-a706-22af33d61018" containerName="dnsmasq-dns" Apr 02 14:02:50 crc kubenswrapper[4732]: I0402 14:02:50.362304 4732 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="56849534-b6ee-4c76-a706-22af33d61018" containerName="dnsmasq-dns" Apr 02 14:02:50 crc kubenswrapper[4732]: E0402 14:02:50.362334 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d730389e-11ac-4fa7-86d7-efa07afdbe08" containerName="nova-manage" Apr 02 14:02:50 crc kubenswrapper[4732]: I0402 14:02:50.362344 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="d730389e-11ac-4fa7-86d7-efa07afdbe08" containerName="nova-manage" Apr 02 14:02:50 crc kubenswrapper[4732]: E0402 14:02:50.362360 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56849534-b6ee-4c76-a706-22af33d61018" containerName="init" Apr 02 14:02:50 crc kubenswrapper[4732]: I0402 14:02:50.362368 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="56849534-b6ee-4c76-a706-22af33d61018" containerName="init" Apr 02 14:02:50 crc kubenswrapper[4732]: E0402 14:02:50.362392 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83eb790e-b902-4dd9-bcf8-352d5675fbce" containerName="nova-cell1-conductor-db-sync" Apr 02 14:02:50 crc kubenswrapper[4732]: I0402 14:02:50.362401 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="83eb790e-b902-4dd9-bcf8-352d5675fbce" containerName="nova-cell1-conductor-db-sync" Apr 02 14:02:50 crc kubenswrapper[4732]: I0402 14:02:50.362650 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="83eb790e-b902-4dd9-bcf8-352d5675fbce" containerName="nova-cell1-conductor-db-sync" Apr 02 14:02:50 crc kubenswrapper[4732]: I0402 14:02:50.362673 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="d730389e-11ac-4fa7-86d7-efa07afdbe08" containerName="nova-manage" Apr 02 14:02:50 crc kubenswrapper[4732]: I0402 14:02:50.362733 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="56849534-b6ee-4c76-a706-22af33d61018" containerName="dnsmasq-dns" Apr 02 14:02:50 crc kubenswrapper[4732]: I0402 14:02:50.363811 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Apr 02 14:02:50 crc kubenswrapper[4732]: I0402 14:02:50.366306 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Apr 02 14:02:50 crc kubenswrapper[4732]: I0402 14:02:50.393738 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.021279593 podStartE2EDuration="2.393720812s" podCreationTimestamp="2026-04-02 14:02:48 +0000 UTC" firstStartedPulling="2026-04-02 14:02:49.190514863 +0000 UTC m=+1526.094922416" lastFinishedPulling="2026-04-02 14:02:49.562956082 +0000 UTC m=+1526.467363635" observedRunningTime="2026-04-02 14:02:50.337915811 +0000 UTC m=+1527.242323384" watchObservedRunningTime="2026-04-02 14:02:50.393720812 +0000 UTC m=+1527.298128365" Apr 02 14:02:50 crc kubenswrapper[4732]: I0402 14:02:50.414661 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Apr 02 14:02:50 crc kubenswrapper[4732]: I0402 14:02:50.423780 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-kk8pk"] Apr 02 14:02:50 crc kubenswrapper[4732]: I0402 14:02:50.432787 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-kk8pk"] Apr 02 14:02:50 crc kubenswrapper[4732]: I0402 14:02:50.502643 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43834d16-35ed-4baa-8292-26a762220c9a-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"43834d16-35ed-4baa-8292-26a762220c9a\") " pod="openstack/nova-cell1-conductor-0" Apr 02 14:02:50 crc kubenswrapper[4732]: I0402 14:02:50.502706 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/43834d16-35ed-4baa-8292-26a762220c9a-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"43834d16-35ed-4baa-8292-26a762220c9a\") " pod="openstack/nova-cell1-conductor-0" Apr 02 14:02:50 crc kubenswrapper[4732]: I0402 14:02:50.502863 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j85sx\" (UniqueName: \"kubernetes.io/projected/43834d16-35ed-4baa-8292-26a762220c9a-kube-api-access-j85sx\") pod \"nova-cell1-conductor-0\" (UID: \"43834d16-35ed-4baa-8292-26a762220c9a\") " pod="openstack/nova-cell1-conductor-0" Apr 02 14:02:50 crc kubenswrapper[4732]: I0402 14:02:50.518216 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Apr 02 14:02:50 crc kubenswrapper[4732]: I0402 14:02:50.518445 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="83dc4cc3-ce54-4685-9485-16526fec666a" containerName="nova-api-log" containerID="cri-o://2bf3c865c9d3f80d52215bceb5acce60796625666c1002a703a1e1c4608f9b78" gracePeriod=30 Apr 02 14:02:50 crc kubenswrapper[4732]: I0402 14:02:50.518518 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="83dc4cc3-ce54-4685-9485-16526fec666a" containerName="nova-api-api" containerID="cri-o://de458a620751559cc0ae0bdde07ff19ef2327c161a36efa23aae8c84591fb109" gracePeriod=30 Apr 02 14:02:50 crc kubenswrapper[4732]: I0402 14:02:50.568220 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Apr 02 14:02:50 crc kubenswrapper[4732]: I0402 14:02:50.581922 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Apr 02 14:02:50 crc kubenswrapper[4732]: I0402 14:02:50.582423 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="45d54bbf-4dbe-46ee-89bd-d2602ef090cd" containerName="nova-metadata-log" 
containerID="cri-o://dcaf3130ea12eea717edb06e8aec1d7d2f11a219ae73c4bf90021b91d6677f79" gracePeriod=30 Apr 02 14:02:50 crc kubenswrapper[4732]: I0402 14:02:50.582396 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="45d54bbf-4dbe-46ee-89bd-d2602ef090cd" containerName="nova-metadata-metadata" containerID="cri-o://9a8d5cd81ddb35179544af3285f9f4a52c4c9d2595ceb694dd497e9730aa404a" gracePeriod=30 Apr 02 14:02:50 crc kubenswrapper[4732]: I0402 14:02:50.604961 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43834d16-35ed-4baa-8292-26a762220c9a-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"43834d16-35ed-4baa-8292-26a762220c9a\") " pod="openstack/nova-cell1-conductor-0" Apr 02 14:02:50 crc kubenswrapper[4732]: I0402 14:02:50.605039 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43834d16-35ed-4baa-8292-26a762220c9a-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"43834d16-35ed-4baa-8292-26a762220c9a\") " pod="openstack/nova-cell1-conductor-0" Apr 02 14:02:50 crc kubenswrapper[4732]: I0402 14:02:50.605215 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j85sx\" (UniqueName: \"kubernetes.io/projected/43834d16-35ed-4baa-8292-26a762220c9a-kube-api-access-j85sx\") pod \"nova-cell1-conductor-0\" (UID: \"43834d16-35ed-4baa-8292-26a762220c9a\") " pod="openstack/nova-cell1-conductor-0" Apr 02 14:02:50 crc kubenswrapper[4732]: I0402 14:02:50.611507 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43834d16-35ed-4baa-8292-26a762220c9a-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"43834d16-35ed-4baa-8292-26a762220c9a\") " pod="openstack/nova-cell1-conductor-0" Apr 02 14:02:50 crc 
kubenswrapper[4732]: I0402 14:02:50.611561 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43834d16-35ed-4baa-8292-26a762220c9a-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"43834d16-35ed-4baa-8292-26a762220c9a\") " pod="openstack/nova-cell1-conductor-0" Apr 02 14:02:50 crc kubenswrapper[4732]: I0402 14:02:50.626681 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j85sx\" (UniqueName: \"kubernetes.io/projected/43834d16-35ed-4baa-8292-26a762220c9a-kube-api-access-j85sx\") pod \"nova-cell1-conductor-0\" (UID: \"43834d16-35ed-4baa-8292-26a762220c9a\") " pod="openstack/nova-cell1-conductor-0" Apr 02 14:02:50 crc kubenswrapper[4732]: I0402 14:02:50.692280 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56849534-b6ee-4c76-a706-22af33d61018" path="/var/lib/kubelet/pods/56849534-b6ee-4c76-a706-22af33d61018/volumes" Apr 02 14:02:50 crc kubenswrapper[4732]: I0402 14:02:50.703423 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Apr 02 14:02:51 crc kubenswrapper[4732]: I0402 14:02:51.182078 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Apr 02 14:02:51 crc kubenswrapper[4732]: I0402 14:02:51.342468 4732 generic.go:334] "Generic (PLEG): container finished" podID="fca45d1b-97a6-44fe-a519-29ed5ab029da" containerID="344ad458c644e8d4d3dba5022a0d33beae96bde7071f592d8c4991568275cf8b" exitCode=0 Apr 02 14:02:51 crc kubenswrapper[4732]: I0402 14:02:51.342499 4732 generic.go:334] "Generic (PLEG): container finished" podID="fca45d1b-97a6-44fe-a519-29ed5ab029da" containerID="7aff1bfaa21127765fbf36fc46affb20b40da03b4c2fc027f0287fff9b4e86a0" exitCode=0 Apr 02 14:02:51 crc kubenswrapper[4732]: I0402 14:02:51.342559 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fca45d1b-97a6-44fe-a519-29ed5ab029da","Type":"ContainerDied","Data":"344ad458c644e8d4d3dba5022a0d33beae96bde7071f592d8c4991568275cf8b"} Apr 02 14:02:51 crc kubenswrapper[4732]: I0402 14:02:51.342586 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fca45d1b-97a6-44fe-a519-29ed5ab029da","Type":"ContainerDied","Data":"7aff1bfaa21127765fbf36fc46affb20b40da03b4c2fc027f0287fff9b4e86a0"} Apr 02 14:02:51 crc kubenswrapper[4732]: I0402 14:02:51.346007 4732 generic.go:334] "Generic (PLEG): container finished" podID="83dc4cc3-ce54-4685-9485-16526fec666a" containerID="2bf3c865c9d3f80d52215bceb5acce60796625666c1002a703a1e1c4608f9b78" exitCode=143 Apr 02 14:02:51 crc kubenswrapper[4732]: I0402 14:02:51.346047 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"83dc4cc3-ce54-4685-9485-16526fec666a","Type":"ContainerDied","Data":"2bf3c865c9d3f80d52215bceb5acce60796625666c1002a703a1e1c4608f9b78"} Apr 02 14:02:51 crc kubenswrapper[4732]: I0402 14:02:51.346698 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Apr 02 14:02:51 crc kubenswrapper[4732]: I0402 14:02:51.350969 4732 generic.go:334] "Generic (PLEG): container finished" podID="45d54bbf-4dbe-46ee-89bd-d2602ef090cd" containerID="9a8d5cd81ddb35179544af3285f9f4a52c4c9d2595ceb694dd497e9730aa404a" exitCode=0 Apr 02 14:02:51 crc kubenswrapper[4732]: I0402 14:02:51.351000 4732 generic.go:334] "Generic (PLEG): container finished" podID="45d54bbf-4dbe-46ee-89bd-d2602ef090cd" containerID="dcaf3130ea12eea717edb06e8aec1d7d2f11a219ae73c4bf90021b91d6677f79" exitCode=143 Apr 02 14:02:51 crc kubenswrapper[4732]: I0402 14:02:51.351031 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"45d54bbf-4dbe-46ee-89bd-d2602ef090cd","Type":"ContainerDied","Data":"9a8d5cd81ddb35179544af3285f9f4a52c4c9d2595ceb694dd497e9730aa404a"} Apr 02 14:02:51 crc kubenswrapper[4732]: I0402 14:02:51.351074 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"45d54bbf-4dbe-46ee-89bd-d2602ef090cd","Type":"ContainerDied","Data":"dcaf3130ea12eea717edb06e8aec1d7d2f11a219ae73c4bf90021b91d6677f79"} Apr 02 14:02:51 crc kubenswrapper[4732]: I0402 14:02:51.351087 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"45d54bbf-4dbe-46ee-89bd-d2602ef090cd","Type":"ContainerDied","Data":"cb015fcf9de5aaccbc08a5da0d7d7ccd6e75d2d76b46e0fff8a80710f3d8a38f"} Apr 02 14:02:51 crc kubenswrapper[4732]: I0402 14:02:51.351105 4732 scope.go:117] "RemoveContainer" containerID="9a8d5cd81ddb35179544af3285f9f4a52c4c9d2595ceb694dd497e9730aa404a" Apr 02 14:02:51 crc kubenswrapper[4732]: I0402 14:02:51.353236 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"43834d16-35ed-4baa-8292-26a762220c9a","Type":"ContainerStarted","Data":"81b78ec8d499237b7eae183825626ccd51e57e2c14aa7f0d73a21747120a7efa"} Apr 02 14:02:51 crc kubenswrapper[4732]: 
I0402 14:02:51.353364 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="47153a66-ef48-4bb2-bd19-2a5d5d506d0d" containerName="nova-scheduler-scheduler" containerID="cri-o://5ffe28f8aefa671a7ac13f382f3d71e9b4eea196e320ebc47f2a1aa238b09e7d" gracePeriod=30 Apr 02 14:02:51 crc kubenswrapper[4732]: I0402 14:02:51.388928 4732 scope.go:117] "RemoveContainer" containerID="dcaf3130ea12eea717edb06e8aec1d7d2f11a219ae73c4bf90021b91d6677f79" Apr 02 14:02:51 crc kubenswrapper[4732]: I0402 14:02:51.419117 4732 scope.go:117] "RemoveContainer" containerID="9a8d5cd81ddb35179544af3285f9f4a52c4c9d2595ceb694dd497e9730aa404a" Apr 02 14:02:51 crc kubenswrapper[4732]: E0402 14:02:51.419590 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a8d5cd81ddb35179544af3285f9f4a52c4c9d2595ceb694dd497e9730aa404a\": container with ID starting with 9a8d5cd81ddb35179544af3285f9f4a52c4c9d2595ceb694dd497e9730aa404a not found: ID does not exist" containerID="9a8d5cd81ddb35179544af3285f9f4a52c4c9d2595ceb694dd497e9730aa404a" Apr 02 14:02:51 crc kubenswrapper[4732]: I0402 14:02:51.419748 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a8d5cd81ddb35179544af3285f9f4a52c4c9d2595ceb694dd497e9730aa404a"} err="failed to get container status \"9a8d5cd81ddb35179544af3285f9f4a52c4c9d2595ceb694dd497e9730aa404a\": rpc error: code = NotFound desc = could not find container \"9a8d5cd81ddb35179544af3285f9f4a52c4c9d2595ceb694dd497e9730aa404a\": container with ID starting with 9a8d5cd81ddb35179544af3285f9f4a52c4c9d2595ceb694dd497e9730aa404a not found: ID does not exist" Apr 02 14:02:51 crc kubenswrapper[4732]: I0402 14:02:51.419774 4732 scope.go:117] "RemoveContainer" containerID="dcaf3130ea12eea717edb06e8aec1d7d2f11a219ae73c4bf90021b91d6677f79" Apr 02 14:02:51 crc kubenswrapper[4732]: E0402 14:02:51.420088 4732 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcaf3130ea12eea717edb06e8aec1d7d2f11a219ae73c4bf90021b91d6677f79\": container with ID starting with dcaf3130ea12eea717edb06e8aec1d7d2f11a219ae73c4bf90021b91d6677f79 not found: ID does not exist" containerID="dcaf3130ea12eea717edb06e8aec1d7d2f11a219ae73c4bf90021b91d6677f79" Apr 02 14:02:51 crc kubenswrapper[4732]: I0402 14:02:51.420144 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcaf3130ea12eea717edb06e8aec1d7d2f11a219ae73c4bf90021b91d6677f79"} err="failed to get container status \"dcaf3130ea12eea717edb06e8aec1d7d2f11a219ae73c4bf90021b91d6677f79\": rpc error: code = NotFound desc = could not find container \"dcaf3130ea12eea717edb06e8aec1d7d2f11a219ae73c4bf90021b91d6677f79\": container with ID starting with dcaf3130ea12eea717edb06e8aec1d7d2f11a219ae73c4bf90021b91d6677f79 not found: ID does not exist" Apr 02 14:02:51 crc kubenswrapper[4732]: I0402 14:02:51.420171 4732 scope.go:117] "RemoveContainer" containerID="9a8d5cd81ddb35179544af3285f9f4a52c4c9d2595ceb694dd497e9730aa404a" Apr 02 14:02:51 crc kubenswrapper[4732]: I0402 14:02:51.420703 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a8d5cd81ddb35179544af3285f9f4a52c4c9d2595ceb694dd497e9730aa404a"} err="failed to get container status \"9a8d5cd81ddb35179544af3285f9f4a52c4c9d2595ceb694dd497e9730aa404a\": rpc error: code = NotFound desc = could not find container \"9a8d5cd81ddb35179544af3285f9f4a52c4c9d2595ceb694dd497e9730aa404a\": container with ID starting with 9a8d5cd81ddb35179544af3285f9f4a52c4c9d2595ceb694dd497e9730aa404a not found: ID does not exist" Apr 02 14:02:51 crc kubenswrapper[4732]: I0402 14:02:51.420795 4732 scope.go:117] "RemoveContainer" containerID="dcaf3130ea12eea717edb06e8aec1d7d2f11a219ae73c4bf90021b91d6677f79" Apr 02 14:02:51 crc kubenswrapper[4732]: I0402 14:02:51.421647 4732 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcaf3130ea12eea717edb06e8aec1d7d2f11a219ae73c4bf90021b91d6677f79"} err="failed to get container status \"dcaf3130ea12eea717edb06e8aec1d7d2f11a219ae73c4bf90021b91d6677f79\": rpc error: code = NotFound desc = could not find container \"dcaf3130ea12eea717edb06e8aec1d7d2f11a219ae73c4bf90021b91d6677f79\": container with ID starting with dcaf3130ea12eea717edb06e8aec1d7d2f11a219ae73c4bf90021b91d6677f79 not found: ID does not exist" Apr 02 14:02:51 crc kubenswrapper[4732]: I0402 14:02:51.435084 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/45d54bbf-4dbe-46ee-89bd-d2602ef090cd-nova-metadata-tls-certs\") pod \"45d54bbf-4dbe-46ee-89bd-d2602ef090cd\" (UID: \"45d54bbf-4dbe-46ee-89bd-d2602ef090cd\") " Apr 02 14:02:51 crc kubenswrapper[4732]: I0402 14:02:51.435320 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45d54bbf-4dbe-46ee-89bd-d2602ef090cd-config-data\") pod \"45d54bbf-4dbe-46ee-89bd-d2602ef090cd\" (UID: \"45d54bbf-4dbe-46ee-89bd-d2602ef090cd\") " Apr 02 14:02:51 crc kubenswrapper[4732]: I0402 14:02:51.435471 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45d54bbf-4dbe-46ee-89bd-d2602ef090cd-logs\") pod \"45d54bbf-4dbe-46ee-89bd-d2602ef090cd\" (UID: \"45d54bbf-4dbe-46ee-89bd-d2602ef090cd\") " Apr 02 14:02:51 crc kubenswrapper[4732]: I0402 14:02:51.435565 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45d54bbf-4dbe-46ee-89bd-d2602ef090cd-combined-ca-bundle\") pod \"45d54bbf-4dbe-46ee-89bd-d2602ef090cd\" (UID: \"45d54bbf-4dbe-46ee-89bd-d2602ef090cd\") " Apr 02 14:02:51 crc kubenswrapper[4732]: I0402 14:02:51.435676 4732 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9fqd\" (UniqueName: \"kubernetes.io/projected/45d54bbf-4dbe-46ee-89bd-d2602ef090cd-kube-api-access-t9fqd\") pod \"45d54bbf-4dbe-46ee-89bd-d2602ef090cd\" (UID: \"45d54bbf-4dbe-46ee-89bd-d2602ef090cd\") " Apr 02 14:02:51 crc kubenswrapper[4732]: I0402 14:02:51.435839 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45d54bbf-4dbe-46ee-89bd-d2602ef090cd-logs" (OuterVolumeSpecName: "logs") pod "45d54bbf-4dbe-46ee-89bd-d2602ef090cd" (UID: "45d54bbf-4dbe-46ee-89bd-d2602ef090cd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 14:02:51 crc kubenswrapper[4732]: I0402 14:02:51.436473 4732 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45d54bbf-4dbe-46ee-89bd-d2602ef090cd-logs\") on node \"crc\" DevicePath \"\"" Apr 02 14:02:51 crc kubenswrapper[4732]: I0402 14:02:51.442358 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45d54bbf-4dbe-46ee-89bd-d2602ef090cd-kube-api-access-t9fqd" (OuterVolumeSpecName: "kube-api-access-t9fqd") pod "45d54bbf-4dbe-46ee-89bd-d2602ef090cd" (UID: "45d54bbf-4dbe-46ee-89bd-d2602ef090cd"). InnerVolumeSpecName "kube-api-access-t9fqd". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:02:51 crc kubenswrapper[4732]: I0402 14:02:51.477989 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45d54bbf-4dbe-46ee-89bd-d2602ef090cd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "45d54bbf-4dbe-46ee-89bd-d2602ef090cd" (UID: "45d54bbf-4dbe-46ee-89bd-d2602ef090cd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:02:51 crc kubenswrapper[4732]: I0402 14:02:51.480720 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45d54bbf-4dbe-46ee-89bd-d2602ef090cd-config-data" (OuterVolumeSpecName: "config-data") pod "45d54bbf-4dbe-46ee-89bd-d2602ef090cd" (UID: "45d54bbf-4dbe-46ee-89bd-d2602ef090cd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:02:51 crc kubenswrapper[4732]: I0402 14:02:51.497016 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45d54bbf-4dbe-46ee-89bd-d2602ef090cd-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "45d54bbf-4dbe-46ee-89bd-d2602ef090cd" (UID: "45d54bbf-4dbe-46ee-89bd-d2602ef090cd"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:02:51 crc kubenswrapper[4732]: I0402 14:02:51.538610 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45d54bbf-4dbe-46ee-89bd-d2602ef090cd-config-data\") on node \"crc\" DevicePath \"\"" Apr 02 14:02:51 crc kubenswrapper[4732]: I0402 14:02:51.538671 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45d54bbf-4dbe-46ee-89bd-d2602ef090cd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 02 14:02:51 crc kubenswrapper[4732]: I0402 14:02:51.538692 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9fqd\" (UniqueName: \"kubernetes.io/projected/45d54bbf-4dbe-46ee-89bd-d2602ef090cd-kube-api-access-t9fqd\") on node \"crc\" DevicePath \"\"" Apr 02 14:02:51 crc kubenswrapper[4732]: I0402 14:02:51.538705 4732 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/45d54bbf-4dbe-46ee-89bd-d2602ef090cd-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.139980 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.254090 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fca45d1b-97a6-44fe-a519-29ed5ab029da-combined-ca-bundle\") pod \"fca45d1b-97a6-44fe-a519-29ed5ab029da\" (UID: \"fca45d1b-97a6-44fe-a519-29ed5ab029da\") " Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.254150 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhhps\" (UniqueName: \"kubernetes.io/projected/fca45d1b-97a6-44fe-a519-29ed5ab029da-kube-api-access-nhhps\") pod \"fca45d1b-97a6-44fe-a519-29ed5ab029da\" (UID: \"fca45d1b-97a6-44fe-a519-29ed5ab029da\") " Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.254212 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fca45d1b-97a6-44fe-a519-29ed5ab029da-run-httpd\") pod \"fca45d1b-97a6-44fe-a519-29ed5ab029da\" (UID: \"fca45d1b-97a6-44fe-a519-29ed5ab029da\") " Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.254271 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fca45d1b-97a6-44fe-a519-29ed5ab029da-log-httpd\") pod \"fca45d1b-97a6-44fe-a519-29ed5ab029da\" (UID: \"fca45d1b-97a6-44fe-a519-29ed5ab029da\") " Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.254755 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fca45d1b-97a6-44fe-a519-29ed5ab029da-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "fca45d1b-97a6-44fe-a519-29ed5ab029da" 
(UID: "fca45d1b-97a6-44fe-a519-29ed5ab029da"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.254842 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fca45d1b-97a6-44fe-a519-29ed5ab029da-config-data\") pod \"fca45d1b-97a6-44fe-a519-29ed5ab029da\" (UID: \"fca45d1b-97a6-44fe-a519-29ed5ab029da\") " Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.255087 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fca45d1b-97a6-44fe-a519-29ed5ab029da-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "fca45d1b-97a6-44fe-a519-29ed5ab029da" (UID: "fca45d1b-97a6-44fe-a519-29ed5ab029da"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.255212 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fca45d1b-97a6-44fe-a519-29ed5ab029da-scripts\") pod \"fca45d1b-97a6-44fe-a519-29ed5ab029da\" (UID: \"fca45d1b-97a6-44fe-a519-29ed5ab029da\") " Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.255278 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fca45d1b-97a6-44fe-a519-29ed5ab029da-sg-core-conf-yaml\") pod \"fca45d1b-97a6-44fe-a519-29ed5ab029da\" (UID: \"fca45d1b-97a6-44fe-a519-29ed5ab029da\") " Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.255825 4732 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fca45d1b-97a6-44fe-a519-29ed5ab029da-run-httpd\") on node \"crc\" DevicePath \"\"" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.255848 4732 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/fca45d1b-97a6-44fe-a519-29ed5ab029da-log-httpd\") on node \"crc\" DevicePath \"\"" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.268005 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fca45d1b-97a6-44fe-a519-29ed5ab029da-kube-api-access-nhhps" (OuterVolumeSpecName: "kube-api-access-nhhps") pod "fca45d1b-97a6-44fe-a519-29ed5ab029da" (UID: "fca45d1b-97a6-44fe-a519-29ed5ab029da"). InnerVolumeSpecName "kube-api-access-nhhps". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.268188 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fca45d1b-97a6-44fe-a519-29ed5ab029da-scripts" (OuterVolumeSpecName: "scripts") pod "fca45d1b-97a6-44fe-a519-29ed5ab029da" (UID: "fca45d1b-97a6-44fe-a519-29ed5ab029da"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.285883 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fca45d1b-97a6-44fe-a519-29ed5ab029da-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "fca45d1b-97a6-44fe-a519-29ed5ab029da" (UID: "fca45d1b-97a6-44fe-a519-29ed5ab029da"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.330677 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fca45d1b-97a6-44fe-a519-29ed5ab029da-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fca45d1b-97a6-44fe-a519-29ed5ab029da" (UID: "fca45d1b-97a6-44fe-a519-29ed5ab029da"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.349303 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fca45d1b-97a6-44fe-a519-29ed5ab029da-config-data" (OuterVolumeSpecName: "config-data") pod "fca45d1b-97a6-44fe-a519-29ed5ab029da" (UID: "fca45d1b-97a6-44fe-a519-29ed5ab029da"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.358249 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fca45d1b-97a6-44fe-a519-29ed5ab029da-config-data\") on node \"crc\" DevicePath \"\"" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.358757 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fca45d1b-97a6-44fe-a519-29ed5ab029da-scripts\") on node \"crc\" DevicePath \"\"" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.358878 4732 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fca45d1b-97a6-44fe-a519-29ed5ab029da-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.358955 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fca45d1b-97a6-44fe-a519-29ed5ab029da-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.359030 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhhps\" (UniqueName: \"kubernetes.io/projected/fca45d1b-97a6-44fe-a519-29ed5ab029da-kube-api-access-nhhps\") on node \"crc\" DevicePath \"\"" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.369053 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" 
event={"ID":"43834d16-35ed-4baa-8292-26a762220c9a","Type":"ContainerStarted","Data":"f661df659997f01af0b0aad55048463b26f7df6a03de3a9c956c71182d8d6ad5"} Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.369191 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.373608 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.373641 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fca45d1b-97a6-44fe-a519-29ed5ab029da","Type":"ContainerDied","Data":"86b15c0a6e6bd7171b13d682658a8d0df48f366861a13932771f854240085848"} Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.373696 4732 scope.go:117] "RemoveContainer" containerID="344ad458c644e8d4d3dba5022a0d33beae96bde7071f592d8c4991568275cf8b" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.373516 4732 generic.go:334] "Generic (PLEG): container finished" podID="fca45d1b-97a6-44fe-a519-29ed5ab029da" containerID="86b15c0a6e6bd7171b13d682658a8d0df48f366861a13932771f854240085848" exitCode=0 Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.374137 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fca45d1b-97a6-44fe-a519-29ed5ab029da","Type":"ContainerDied","Data":"6f7087a937493a10bdd9322df7d42a5eece85137858f4b0393017c50cccef1f8"} Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.375463 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.385631 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.385595163 podStartE2EDuration="2.385595163s" podCreationTimestamp="2026-04-02 14:02:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 14:02:52.385072309 +0000 UTC m=+1529.289479882" watchObservedRunningTime="2026-04-02 14:02:52.385595163 +0000 UTC m=+1529.290002726" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.405884 4732 scope.go:117] "RemoveContainer" containerID="5efa4fba6157a758990fc184bbe56eb1978c503a2bea7de2970f2c392a51e001" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.434849 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.437858 4732 scope.go:117] "RemoveContainer" containerID="86b15c0a6e6bd7171b13d682658a8d0df48f366861a13932771f854240085848" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.455339 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.465724 4732 scope.go:117] "RemoveContainer" containerID="7aff1bfaa21127765fbf36fc46affb20b40da03b4c2fc027f0287fff9b4e86a0" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.473966 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.503595 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.514978 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Apr 02 14:02:52 crc kubenswrapper[4732]: E0402 14:02:52.515366 4732 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="45d54bbf-4dbe-46ee-89bd-d2602ef090cd" containerName="nova-metadata-metadata" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.515378 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="45d54bbf-4dbe-46ee-89bd-d2602ef090cd" containerName="nova-metadata-metadata" Apr 02 14:02:52 crc kubenswrapper[4732]: E0402 14:02:52.515393 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fca45d1b-97a6-44fe-a519-29ed5ab029da" containerName="ceilometer-central-agent" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.515399 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="fca45d1b-97a6-44fe-a519-29ed5ab029da" containerName="ceilometer-central-agent" Apr 02 14:02:52 crc kubenswrapper[4732]: E0402 14:02:52.515413 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fca45d1b-97a6-44fe-a519-29ed5ab029da" containerName="sg-core" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.515419 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="fca45d1b-97a6-44fe-a519-29ed5ab029da" containerName="sg-core" Apr 02 14:02:52 crc kubenswrapper[4732]: E0402 14:02:52.515439 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45d54bbf-4dbe-46ee-89bd-d2602ef090cd" containerName="nova-metadata-log" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.515445 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="45d54bbf-4dbe-46ee-89bd-d2602ef090cd" containerName="nova-metadata-log" Apr 02 14:02:52 crc kubenswrapper[4732]: E0402 14:02:52.515458 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fca45d1b-97a6-44fe-a519-29ed5ab029da" containerName="proxy-httpd" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.515463 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="fca45d1b-97a6-44fe-a519-29ed5ab029da" containerName="proxy-httpd" Apr 02 14:02:52 crc kubenswrapper[4732]: E0402 14:02:52.515478 4732 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="fca45d1b-97a6-44fe-a519-29ed5ab029da" containerName="ceilometer-notification-agent" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.515483 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="fca45d1b-97a6-44fe-a519-29ed5ab029da" containerName="ceilometer-notification-agent" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.515647 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="fca45d1b-97a6-44fe-a519-29ed5ab029da" containerName="ceilometer-notification-agent" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.515658 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="fca45d1b-97a6-44fe-a519-29ed5ab029da" containerName="sg-core" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.515666 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="45d54bbf-4dbe-46ee-89bd-d2602ef090cd" containerName="nova-metadata-metadata" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.515678 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="fca45d1b-97a6-44fe-a519-29ed5ab029da" containerName="ceilometer-central-agent" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.515687 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="fca45d1b-97a6-44fe-a519-29ed5ab029da" containerName="proxy-httpd" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.515707 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="45d54bbf-4dbe-46ee-89bd-d2602ef090cd" containerName="nova-metadata-log" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.517223 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.517890 4732 scope.go:117] "RemoveContainer" containerID="344ad458c644e8d4d3dba5022a0d33beae96bde7071f592d8c4991568275cf8b" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.519003 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.521222 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.521376 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Apr 02 14:02:52 crc kubenswrapper[4732]: E0402 14:02:52.522261 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"344ad458c644e8d4d3dba5022a0d33beae96bde7071f592d8c4991568275cf8b\": container with ID starting with 344ad458c644e8d4d3dba5022a0d33beae96bde7071f592d8c4991568275cf8b not found: ID does not exist" containerID="344ad458c644e8d4d3dba5022a0d33beae96bde7071f592d8c4991568275cf8b" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.522287 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"344ad458c644e8d4d3dba5022a0d33beae96bde7071f592d8c4991568275cf8b"} err="failed to get container status \"344ad458c644e8d4d3dba5022a0d33beae96bde7071f592d8c4991568275cf8b\": rpc error: code = NotFound desc = could not find container \"344ad458c644e8d4d3dba5022a0d33beae96bde7071f592d8c4991568275cf8b\": container with ID starting with 344ad458c644e8d4d3dba5022a0d33beae96bde7071f592d8c4991568275cf8b not found: ID does not exist" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.522309 4732 scope.go:117] "RemoveContainer" containerID="5efa4fba6157a758990fc184bbe56eb1978c503a2bea7de2970f2c392a51e001" Apr 02 14:02:52 crc 
kubenswrapper[4732]: E0402 14:02:52.523767 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5efa4fba6157a758990fc184bbe56eb1978c503a2bea7de2970f2c392a51e001\": container with ID starting with 5efa4fba6157a758990fc184bbe56eb1978c503a2bea7de2970f2c392a51e001 not found: ID does not exist" containerID="5efa4fba6157a758990fc184bbe56eb1978c503a2bea7de2970f2c392a51e001" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.523797 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5efa4fba6157a758990fc184bbe56eb1978c503a2bea7de2970f2c392a51e001"} err="failed to get container status \"5efa4fba6157a758990fc184bbe56eb1978c503a2bea7de2970f2c392a51e001\": rpc error: code = NotFound desc = could not find container \"5efa4fba6157a758990fc184bbe56eb1978c503a2bea7de2970f2c392a51e001\": container with ID starting with 5efa4fba6157a758990fc184bbe56eb1978c503a2bea7de2970f2c392a51e001 not found: ID does not exist" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.523815 4732 scope.go:117] "RemoveContainer" containerID="86b15c0a6e6bd7171b13d682658a8d0df48f366861a13932771f854240085848" Apr 02 14:02:52 crc kubenswrapper[4732]: E0402 14:02:52.524207 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86b15c0a6e6bd7171b13d682658a8d0df48f366861a13932771f854240085848\": container with ID starting with 86b15c0a6e6bd7171b13d682658a8d0df48f366861a13932771f854240085848 not found: ID does not exist" containerID="86b15c0a6e6bd7171b13d682658a8d0df48f366861a13932771f854240085848" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.524235 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86b15c0a6e6bd7171b13d682658a8d0df48f366861a13932771f854240085848"} err="failed to get container status 
\"86b15c0a6e6bd7171b13d682658a8d0df48f366861a13932771f854240085848\": rpc error: code = NotFound desc = could not find container \"86b15c0a6e6bd7171b13d682658a8d0df48f366861a13932771f854240085848\": container with ID starting with 86b15c0a6e6bd7171b13d682658a8d0df48f366861a13932771f854240085848 not found: ID does not exist" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.524251 4732 scope.go:117] "RemoveContainer" containerID="7aff1bfaa21127765fbf36fc46affb20b40da03b4c2fc027f0287fff9b4e86a0" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.526895 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Apr 02 14:02:52 crc kubenswrapper[4732]: E0402 14:02:52.528518 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7aff1bfaa21127765fbf36fc46affb20b40da03b4c2fc027f0287fff9b4e86a0\": container with ID starting with 7aff1bfaa21127765fbf36fc46affb20b40da03b4c2fc027f0287fff9b4e86a0 not found: ID does not exist" containerID="7aff1bfaa21127765fbf36fc46affb20b40da03b4c2fc027f0287fff9b4e86a0" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.528546 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7aff1bfaa21127765fbf36fc46affb20b40da03b4c2fc027f0287fff9b4e86a0"} err="failed to get container status \"7aff1bfaa21127765fbf36fc46affb20b40da03b4c2fc027f0287fff9b4e86a0\": rpc error: code = NotFound desc = could not find container \"7aff1bfaa21127765fbf36fc46affb20b40da03b4c2fc027f0287fff9b4e86a0\": container with ID starting with 7aff1bfaa21127765fbf36fc46affb20b40da03b4c2fc027f0287fff9b4e86a0 not found: ID does not exist" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.544690 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.546678 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.552892 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.554682 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.554782 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.663929 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbe56802-47d3-4977-ad5c-1f4223d984e4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bbe56802-47d3-4977-ad5c-1f4223d984e4\") " pod="openstack/ceilometer-0" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.663994 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbe56802-47d3-4977-ad5c-1f4223d984e4-run-httpd\") pod \"ceilometer-0\" (UID: \"bbe56802-47d3-4977-ad5c-1f4223d984e4\") " pod="openstack/ceilometer-0" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.664030 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbe56802-47d3-4977-ad5c-1f4223d984e4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"bbe56802-47d3-4977-ad5c-1f4223d984e4\") " pod="openstack/ceilometer-0" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.664152 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4hnk\" (UniqueName: \"kubernetes.io/projected/bbe56802-47d3-4977-ad5c-1f4223d984e4-kube-api-access-v4hnk\") pod \"ceilometer-0\" 
(UID: \"bbe56802-47d3-4977-ad5c-1f4223d984e4\") " pod="openstack/ceilometer-0" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.664267 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bbe56802-47d3-4977-ad5c-1f4223d984e4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bbe56802-47d3-4977-ad5c-1f4223d984e4\") " pod="openstack/ceilometer-0" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.664324 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zn727\" (UniqueName: \"kubernetes.io/projected/737abec3-8d91-4234-83fc-ebbdbd59812e-kube-api-access-zn727\") pod \"nova-metadata-0\" (UID: \"737abec3-8d91-4234-83fc-ebbdbd59812e\") " pod="openstack/nova-metadata-0" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.664422 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbe56802-47d3-4977-ad5c-1f4223d984e4-log-httpd\") pod \"ceilometer-0\" (UID: \"bbe56802-47d3-4977-ad5c-1f4223d984e4\") " pod="openstack/ceilometer-0" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.664457 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/737abec3-8d91-4234-83fc-ebbdbd59812e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"737abec3-8d91-4234-83fc-ebbdbd59812e\") " pod="openstack/nova-metadata-0" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.664485 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbe56802-47d3-4977-ad5c-1f4223d984e4-config-data\") pod \"ceilometer-0\" (UID: \"bbe56802-47d3-4977-ad5c-1f4223d984e4\") " pod="openstack/ceilometer-0" Apr 02 14:02:52 
crc kubenswrapper[4732]: I0402 14:02:52.664538 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/737abec3-8d91-4234-83fc-ebbdbd59812e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"737abec3-8d91-4234-83fc-ebbdbd59812e\") " pod="openstack/nova-metadata-0" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.664651 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/737abec3-8d91-4234-83fc-ebbdbd59812e-logs\") pod \"nova-metadata-0\" (UID: \"737abec3-8d91-4234-83fc-ebbdbd59812e\") " pod="openstack/nova-metadata-0" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.664679 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbe56802-47d3-4977-ad5c-1f4223d984e4-scripts\") pod \"ceilometer-0\" (UID: \"bbe56802-47d3-4977-ad5c-1f4223d984e4\") " pod="openstack/ceilometer-0" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.664728 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/737abec3-8d91-4234-83fc-ebbdbd59812e-config-data\") pod \"nova-metadata-0\" (UID: \"737abec3-8d91-4234-83fc-ebbdbd59812e\") " pod="openstack/nova-metadata-0" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.696451 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45d54bbf-4dbe-46ee-89bd-d2602ef090cd" path="/var/lib/kubelet/pods/45d54bbf-4dbe-46ee-89bd-d2602ef090cd/volumes" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.697164 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fca45d1b-97a6-44fe-a519-29ed5ab029da" path="/var/lib/kubelet/pods/fca45d1b-97a6-44fe-a519-29ed5ab029da/volumes" Apr 02 14:02:52 crc 
kubenswrapper[4732]: I0402 14:02:52.767417 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbe56802-47d3-4977-ad5c-1f4223d984e4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bbe56802-47d3-4977-ad5c-1f4223d984e4\") " pod="openstack/ceilometer-0" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.768056 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbe56802-47d3-4977-ad5c-1f4223d984e4-run-httpd\") pod \"ceilometer-0\" (UID: \"bbe56802-47d3-4977-ad5c-1f4223d984e4\") " pod="openstack/ceilometer-0" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.768518 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbe56802-47d3-4977-ad5c-1f4223d984e4-run-httpd\") pod \"ceilometer-0\" (UID: \"bbe56802-47d3-4977-ad5c-1f4223d984e4\") " pod="openstack/ceilometer-0" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.768092 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbe56802-47d3-4977-ad5c-1f4223d984e4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"bbe56802-47d3-4977-ad5c-1f4223d984e4\") " pod="openstack/ceilometer-0" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.768944 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4hnk\" (UniqueName: \"kubernetes.io/projected/bbe56802-47d3-4977-ad5c-1f4223d984e4-kube-api-access-v4hnk\") pod \"ceilometer-0\" (UID: \"bbe56802-47d3-4977-ad5c-1f4223d984e4\") " pod="openstack/ceilometer-0" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.769005 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/bbe56802-47d3-4977-ad5c-1f4223d984e4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bbe56802-47d3-4977-ad5c-1f4223d984e4\") " pod="openstack/ceilometer-0" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.769876 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zn727\" (UniqueName: \"kubernetes.io/projected/737abec3-8d91-4234-83fc-ebbdbd59812e-kube-api-access-zn727\") pod \"nova-metadata-0\" (UID: \"737abec3-8d91-4234-83fc-ebbdbd59812e\") " pod="openstack/nova-metadata-0" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.770087 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbe56802-47d3-4977-ad5c-1f4223d984e4-log-httpd\") pod \"ceilometer-0\" (UID: \"bbe56802-47d3-4977-ad5c-1f4223d984e4\") " pod="openstack/ceilometer-0" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.770135 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/737abec3-8d91-4234-83fc-ebbdbd59812e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"737abec3-8d91-4234-83fc-ebbdbd59812e\") " pod="openstack/nova-metadata-0" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.770183 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbe56802-47d3-4977-ad5c-1f4223d984e4-config-data\") pod \"ceilometer-0\" (UID: \"bbe56802-47d3-4977-ad5c-1f4223d984e4\") " pod="openstack/ceilometer-0" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.770533 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbe56802-47d3-4977-ad5c-1f4223d984e4-log-httpd\") pod \"ceilometer-0\" (UID: \"bbe56802-47d3-4977-ad5c-1f4223d984e4\") " pod="openstack/ceilometer-0" Apr 02 14:02:52 crc 
kubenswrapper[4732]: I0402 14:02:52.770704 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/737abec3-8d91-4234-83fc-ebbdbd59812e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"737abec3-8d91-4234-83fc-ebbdbd59812e\") " pod="openstack/nova-metadata-0" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.770863 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/737abec3-8d91-4234-83fc-ebbdbd59812e-logs\") pod \"nova-metadata-0\" (UID: \"737abec3-8d91-4234-83fc-ebbdbd59812e\") " pod="openstack/nova-metadata-0" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.770901 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbe56802-47d3-4977-ad5c-1f4223d984e4-scripts\") pod \"ceilometer-0\" (UID: \"bbe56802-47d3-4977-ad5c-1f4223d984e4\") " pod="openstack/ceilometer-0" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.770971 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/737abec3-8d91-4234-83fc-ebbdbd59812e-config-data\") pod \"nova-metadata-0\" (UID: \"737abec3-8d91-4234-83fc-ebbdbd59812e\") " pod="openstack/nova-metadata-0" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.772106 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/737abec3-8d91-4234-83fc-ebbdbd59812e-logs\") pod \"nova-metadata-0\" (UID: \"737abec3-8d91-4234-83fc-ebbdbd59812e\") " pod="openstack/nova-metadata-0" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.772128 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbe56802-47d3-4977-ad5c-1f4223d984e4-ceilometer-tls-certs\") pod \"ceilometer-0\" 
(UID: \"bbe56802-47d3-4977-ad5c-1f4223d984e4\") " pod="openstack/ceilometer-0" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.773976 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bbe56802-47d3-4977-ad5c-1f4223d984e4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bbe56802-47d3-4977-ad5c-1f4223d984e4\") " pod="openstack/ceilometer-0" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.774003 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbe56802-47d3-4977-ad5c-1f4223d984e4-scripts\") pod \"ceilometer-0\" (UID: \"bbe56802-47d3-4977-ad5c-1f4223d984e4\") " pod="openstack/ceilometer-0" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.776225 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/737abec3-8d91-4234-83fc-ebbdbd59812e-config-data\") pod \"nova-metadata-0\" (UID: \"737abec3-8d91-4234-83fc-ebbdbd59812e\") " pod="openstack/nova-metadata-0" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.786713 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbe56802-47d3-4977-ad5c-1f4223d984e4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bbe56802-47d3-4977-ad5c-1f4223d984e4\") " pod="openstack/ceilometer-0" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.786989 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zn727\" (UniqueName: \"kubernetes.io/projected/737abec3-8d91-4234-83fc-ebbdbd59812e-kube-api-access-zn727\") pod \"nova-metadata-0\" (UID: \"737abec3-8d91-4234-83fc-ebbdbd59812e\") " pod="openstack/nova-metadata-0" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.789568 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4hnk\" 
(UniqueName: \"kubernetes.io/projected/bbe56802-47d3-4977-ad5c-1f4223d984e4-kube-api-access-v4hnk\") pod \"ceilometer-0\" (UID: \"bbe56802-47d3-4977-ad5c-1f4223d984e4\") " pod="openstack/ceilometer-0" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.792146 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbe56802-47d3-4977-ad5c-1f4223d984e4-config-data\") pod \"ceilometer-0\" (UID: \"bbe56802-47d3-4977-ad5c-1f4223d984e4\") " pod="openstack/ceilometer-0" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.793502 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/737abec3-8d91-4234-83fc-ebbdbd59812e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"737abec3-8d91-4234-83fc-ebbdbd59812e\") " pod="openstack/nova-metadata-0" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.794709 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/737abec3-8d91-4234-83fc-ebbdbd59812e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"737abec3-8d91-4234-83fc-ebbdbd59812e\") " pod="openstack/nova-metadata-0" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.847538 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Apr 02 14:02:52 crc kubenswrapper[4732]: I0402 14:02:52.866977 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Apr 02 14:02:53 crc kubenswrapper[4732]: I0402 14:02:53.330462 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Apr 02 14:02:53 crc kubenswrapper[4732]: I0402 14:02:53.395925 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bbe56802-47d3-4977-ad5c-1f4223d984e4","Type":"ContainerStarted","Data":"603f6595a4b14d769d2d6180978748cd6df477c752db5a8cae36dfb826f741fd"} Apr 02 14:02:53 crc kubenswrapper[4732]: I0402 14:02:53.407030 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Apr 02 14:02:53 crc kubenswrapper[4732]: W0402 14:02:53.408051 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod737abec3_8d91_4234_83fc_ebbdbd59812e.slice/crio-01bcc02a9242e9f31701771b32216137c04538dbd5f487ce1739fb397445d13c WatchSource:0}: Error finding container 01bcc02a9242e9f31701771b32216137c04538dbd5f487ce1739fb397445d13c: Status 404 returned error can't find the container with id 01bcc02a9242e9f31701771b32216137c04538dbd5f487ce1739fb397445d13c Apr 02 14:02:53 crc kubenswrapper[4732]: E0402 14:02:53.644495 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5ffe28f8aefa671a7ac13f382f3d71e9b4eea196e320ebc47f2a1aa238b09e7d" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Apr 02 14:02:53 crc kubenswrapper[4732]: E0402 14:02:53.645656 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5ffe28f8aefa671a7ac13f382f3d71e9b4eea196e320ebc47f2a1aa238b09e7d" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Apr 02 14:02:53 crc 
kubenswrapper[4732]: E0402 14:02:53.647483 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5ffe28f8aefa671a7ac13f382f3d71e9b4eea196e320ebc47f2a1aa238b09e7d" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Apr 02 14:02:53 crc kubenswrapper[4732]: E0402 14:02:53.647585 4732 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="47153a66-ef48-4bb2-bd19-2a5d5d506d0d" containerName="nova-scheduler-scheduler" Apr 02 14:02:54 crc kubenswrapper[4732]: I0402 14:02:54.410212 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bbe56802-47d3-4977-ad5c-1f4223d984e4","Type":"ContainerStarted","Data":"8470ac4168306100ac49bef7fa6845da4b17056e0ad926f76c2f9c0ad0645754"} Apr 02 14:02:54 crc kubenswrapper[4732]: I0402 14:02:54.411644 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"737abec3-8d91-4234-83fc-ebbdbd59812e","Type":"ContainerStarted","Data":"3da6bf392b2c2b310dca2b6ee59b744d6cddff583caaf6fc88bfecfccdf461b1"} Apr 02 14:02:54 crc kubenswrapper[4732]: I0402 14:02:54.411668 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"737abec3-8d91-4234-83fc-ebbdbd59812e","Type":"ContainerStarted","Data":"d436a18c3276554f7ff3affa860edb60ae090d140dfd34f92ab6855bd15d983c"} Apr 02 14:02:54 crc kubenswrapper[4732]: I0402 14:02:54.411681 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"737abec3-8d91-4234-83fc-ebbdbd59812e","Type":"ContainerStarted","Data":"01bcc02a9242e9f31701771b32216137c04538dbd5f487ce1739fb397445d13c"} Apr 02 14:02:54 crc kubenswrapper[4732]: 
I0402 14:02:54.437296 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.437281165 podStartE2EDuration="2.437281165s" podCreationTimestamp="2026-04-02 14:02:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 14:02:54.435754994 +0000 UTC m=+1531.340162547" watchObservedRunningTime="2026-04-02 14:02:54.437281165 +0000 UTC m=+1531.341688718" Apr 02 14:02:55 crc kubenswrapper[4732]: I0402 14:02:55.157540 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Apr 02 14:02:55 crc kubenswrapper[4732]: I0402 14:02:55.325728 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47153a66-ef48-4bb2-bd19-2a5d5d506d0d-config-data\") pod \"47153a66-ef48-4bb2-bd19-2a5d5d506d0d\" (UID: \"47153a66-ef48-4bb2-bd19-2a5d5d506d0d\") " Apr 02 14:02:55 crc kubenswrapper[4732]: I0402 14:02:55.325874 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ss849\" (UniqueName: \"kubernetes.io/projected/47153a66-ef48-4bb2-bd19-2a5d5d506d0d-kube-api-access-ss849\") pod \"47153a66-ef48-4bb2-bd19-2a5d5d506d0d\" (UID: \"47153a66-ef48-4bb2-bd19-2a5d5d506d0d\") " Apr 02 14:02:55 crc kubenswrapper[4732]: I0402 14:02:55.326027 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47153a66-ef48-4bb2-bd19-2a5d5d506d0d-combined-ca-bundle\") pod \"47153a66-ef48-4bb2-bd19-2a5d5d506d0d\" (UID: \"47153a66-ef48-4bb2-bd19-2a5d5d506d0d\") " Apr 02 14:02:55 crc kubenswrapper[4732]: I0402 14:02:55.332106 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47153a66-ef48-4bb2-bd19-2a5d5d506d0d-kube-api-access-ss849" 
(OuterVolumeSpecName: "kube-api-access-ss849") pod "47153a66-ef48-4bb2-bd19-2a5d5d506d0d" (UID: "47153a66-ef48-4bb2-bd19-2a5d5d506d0d"). InnerVolumeSpecName "kube-api-access-ss849". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:02:55 crc kubenswrapper[4732]: I0402 14:02:55.382101 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47153a66-ef48-4bb2-bd19-2a5d5d506d0d-config-data" (OuterVolumeSpecName: "config-data") pod "47153a66-ef48-4bb2-bd19-2a5d5d506d0d" (UID: "47153a66-ef48-4bb2-bd19-2a5d5d506d0d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:02:55 crc kubenswrapper[4732]: I0402 14:02:55.385705 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47153a66-ef48-4bb2-bd19-2a5d5d506d0d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "47153a66-ef48-4bb2-bd19-2a5d5d506d0d" (UID: "47153a66-ef48-4bb2-bd19-2a5d5d506d0d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:02:55 crc kubenswrapper[4732]: I0402 14:02:55.422035 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bbe56802-47d3-4977-ad5c-1f4223d984e4","Type":"ContainerStarted","Data":"3e2ebc42fe287a20a712e93cc316d5757dbdee4f1664600996be458a1d5393b5"} Apr 02 14:02:55 crc kubenswrapper[4732]: I0402 14:02:55.422073 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bbe56802-47d3-4977-ad5c-1f4223d984e4","Type":"ContainerStarted","Data":"40598c5244c261a553473444f4fdef43a20371ebd9ec11c02d9850f44828de4d"} Apr 02 14:02:55 crc kubenswrapper[4732]: I0402 14:02:55.425072 4732 generic.go:334] "Generic (PLEG): container finished" podID="83dc4cc3-ce54-4685-9485-16526fec666a" containerID="de458a620751559cc0ae0bdde07ff19ef2327c161a36efa23aae8c84591fb109" exitCode=0 Apr 02 14:02:55 crc kubenswrapper[4732]: I0402 14:02:55.425140 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"83dc4cc3-ce54-4685-9485-16526fec666a","Type":"ContainerDied","Data":"de458a620751559cc0ae0bdde07ff19ef2327c161a36efa23aae8c84591fb109"} Apr 02 14:02:55 crc kubenswrapper[4732]: I0402 14:02:55.425167 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"83dc4cc3-ce54-4685-9485-16526fec666a","Type":"ContainerDied","Data":"b22ff01a4115ec8d827d4983552cabe5598c5257ef0680386b4351fd026739e6"} Apr 02 14:02:55 crc kubenswrapper[4732]: I0402 14:02:55.425179 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b22ff01a4115ec8d827d4983552cabe5598c5257ef0680386b4351fd026739e6" Apr 02 14:02:55 crc kubenswrapper[4732]: I0402 14:02:55.426831 4732 generic.go:334] "Generic (PLEG): container finished" podID="47153a66-ef48-4bb2-bd19-2a5d5d506d0d" containerID="5ffe28f8aefa671a7ac13f382f3d71e9b4eea196e320ebc47f2a1aa238b09e7d" exitCode=0 Apr 02 14:02:55 crc 
kubenswrapper[4732]: I0402 14:02:55.427834 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Apr 02 14:02:55 crc kubenswrapper[4732]: I0402 14:02:55.427868 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"47153a66-ef48-4bb2-bd19-2a5d5d506d0d","Type":"ContainerDied","Data":"5ffe28f8aefa671a7ac13f382f3d71e9b4eea196e320ebc47f2a1aa238b09e7d"} Apr 02 14:02:55 crc kubenswrapper[4732]: I0402 14:02:55.427899 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"47153a66-ef48-4bb2-bd19-2a5d5d506d0d","Type":"ContainerDied","Data":"d7dcd11dc4f24241bd222304325f19af954ec9664501d8e6e7970c611cdc89f8"} Apr 02 14:02:55 crc kubenswrapper[4732]: I0402 14:02:55.427933 4732 scope.go:117] "RemoveContainer" containerID="5ffe28f8aefa671a7ac13f382f3d71e9b4eea196e320ebc47f2a1aa238b09e7d" Apr 02 14:02:55 crc kubenswrapper[4732]: I0402 14:02:55.429023 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47153a66-ef48-4bb2-bd19-2a5d5d506d0d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 02 14:02:55 crc kubenswrapper[4732]: I0402 14:02:55.429046 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47153a66-ef48-4bb2-bd19-2a5d5d506d0d-config-data\") on node \"crc\" DevicePath \"\"" Apr 02 14:02:55 crc kubenswrapper[4732]: I0402 14:02:55.429057 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ss849\" (UniqueName: \"kubernetes.io/projected/47153a66-ef48-4bb2-bd19-2a5d5d506d0d-kube-api-access-ss849\") on node \"crc\" DevicePath \"\"" Apr 02 14:02:55 crc kubenswrapper[4732]: I0402 14:02:55.450078 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Apr 02 14:02:55 crc kubenswrapper[4732]: I0402 14:02:55.455778 4732 scope.go:117] "RemoveContainer" containerID="5ffe28f8aefa671a7ac13f382f3d71e9b4eea196e320ebc47f2a1aa238b09e7d" Apr 02 14:02:55 crc kubenswrapper[4732]: E0402 14:02:55.458896 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ffe28f8aefa671a7ac13f382f3d71e9b4eea196e320ebc47f2a1aa238b09e7d\": container with ID starting with 5ffe28f8aefa671a7ac13f382f3d71e9b4eea196e320ebc47f2a1aa238b09e7d not found: ID does not exist" containerID="5ffe28f8aefa671a7ac13f382f3d71e9b4eea196e320ebc47f2a1aa238b09e7d" Apr 02 14:02:55 crc kubenswrapper[4732]: I0402 14:02:55.458952 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ffe28f8aefa671a7ac13f382f3d71e9b4eea196e320ebc47f2a1aa238b09e7d"} err="failed to get container status \"5ffe28f8aefa671a7ac13f382f3d71e9b4eea196e320ebc47f2a1aa238b09e7d\": rpc error: code = NotFound desc = could not find container \"5ffe28f8aefa671a7ac13f382f3d71e9b4eea196e320ebc47f2a1aa238b09e7d\": container with ID starting with 5ffe28f8aefa671a7ac13f382f3d71e9b4eea196e320ebc47f2a1aa238b09e7d not found: ID does not exist" Apr 02 14:02:55 crc kubenswrapper[4732]: I0402 14:02:55.467587 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Apr 02 14:02:55 crc kubenswrapper[4732]: I0402 14:02:55.481204 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Apr 02 14:02:55 crc kubenswrapper[4732]: I0402 14:02:55.491426 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Apr 02 14:02:55 crc kubenswrapper[4732]: E0402 14:02:55.491872 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83dc4cc3-ce54-4685-9485-16526fec666a" containerName="nova-api-api" Apr 02 14:02:55 crc kubenswrapper[4732]: I0402 
14:02:55.491890 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="83dc4cc3-ce54-4685-9485-16526fec666a" containerName="nova-api-api" Apr 02 14:02:55 crc kubenswrapper[4732]: E0402 14:02:55.491906 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47153a66-ef48-4bb2-bd19-2a5d5d506d0d" containerName="nova-scheduler-scheduler" Apr 02 14:02:55 crc kubenswrapper[4732]: I0402 14:02:55.491913 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="47153a66-ef48-4bb2-bd19-2a5d5d506d0d" containerName="nova-scheduler-scheduler" Apr 02 14:02:55 crc kubenswrapper[4732]: E0402 14:02:55.491928 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83dc4cc3-ce54-4685-9485-16526fec666a" containerName="nova-api-log" Apr 02 14:02:55 crc kubenswrapper[4732]: I0402 14:02:55.491934 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="83dc4cc3-ce54-4685-9485-16526fec666a" containerName="nova-api-log" Apr 02 14:02:55 crc kubenswrapper[4732]: I0402 14:02:55.492097 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="83dc4cc3-ce54-4685-9485-16526fec666a" containerName="nova-api-log" Apr 02 14:02:55 crc kubenswrapper[4732]: I0402 14:02:55.492119 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="83dc4cc3-ce54-4685-9485-16526fec666a" containerName="nova-api-api" Apr 02 14:02:55 crc kubenswrapper[4732]: I0402 14:02:55.492127 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="47153a66-ef48-4bb2-bd19-2a5d5d506d0d" containerName="nova-scheduler-scheduler" Apr 02 14:02:55 crc kubenswrapper[4732]: I0402 14:02:55.492851 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Apr 02 14:02:55 crc kubenswrapper[4732]: I0402 14:02:55.494587 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Apr 02 14:02:55 crc kubenswrapper[4732]: I0402 14:02:55.509603 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Apr 02 14:02:55 crc kubenswrapper[4732]: I0402 14:02:55.529675 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83dc4cc3-ce54-4685-9485-16526fec666a-logs\") pod \"83dc4cc3-ce54-4685-9485-16526fec666a\" (UID: \"83dc4cc3-ce54-4685-9485-16526fec666a\") " Apr 02 14:02:55 crc kubenswrapper[4732]: I0402 14:02:55.529755 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdvkg\" (UniqueName: \"kubernetes.io/projected/83dc4cc3-ce54-4685-9485-16526fec666a-kube-api-access-kdvkg\") pod \"83dc4cc3-ce54-4685-9485-16526fec666a\" (UID: \"83dc4cc3-ce54-4685-9485-16526fec666a\") " Apr 02 14:02:55 crc kubenswrapper[4732]: I0402 14:02:55.529780 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83dc4cc3-ce54-4685-9485-16526fec666a-combined-ca-bundle\") pod \"83dc4cc3-ce54-4685-9485-16526fec666a\" (UID: \"83dc4cc3-ce54-4685-9485-16526fec666a\") " Apr 02 14:02:55 crc kubenswrapper[4732]: I0402 14:02:55.529868 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83dc4cc3-ce54-4685-9485-16526fec666a-config-data\") pod \"83dc4cc3-ce54-4685-9485-16526fec666a\" (UID: \"83dc4cc3-ce54-4685-9485-16526fec666a\") " Apr 02 14:02:55 crc kubenswrapper[4732]: I0402 14:02:55.530308 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83dc4cc3-ce54-4685-9485-16526fec666a-logs" 
(OuterVolumeSpecName: "logs") pod "83dc4cc3-ce54-4685-9485-16526fec666a" (UID: "83dc4cc3-ce54-4685-9485-16526fec666a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 14:02:55 crc kubenswrapper[4732]: I0402 14:02:55.530895 4732 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83dc4cc3-ce54-4685-9485-16526fec666a-logs\") on node \"crc\" DevicePath \"\"" Apr 02 14:02:55 crc kubenswrapper[4732]: I0402 14:02:55.534335 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83dc4cc3-ce54-4685-9485-16526fec666a-kube-api-access-kdvkg" (OuterVolumeSpecName: "kube-api-access-kdvkg") pod "83dc4cc3-ce54-4685-9485-16526fec666a" (UID: "83dc4cc3-ce54-4685-9485-16526fec666a"). InnerVolumeSpecName "kube-api-access-kdvkg". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:02:55 crc kubenswrapper[4732]: I0402 14:02:55.556318 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83dc4cc3-ce54-4685-9485-16526fec666a-config-data" (OuterVolumeSpecName: "config-data") pod "83dc4cc3-ce54-4685-9485-16526fec666a" (UID: "83dc4cc3-ce54-4685-9485-16526fec666a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:02:55 crc kubenswrapper[4732]: I0402 14:02:55.562673 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83dc4cc3-ce54-4685-9485-16526fec666a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "83dc4cc3-ce54-4685-9485-16526fec666a" (UID: "83dc4cc3-ce54-4685-9485-16526fec666a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:02:55 crc kubenswrapper[4732]: I0402 14:02:55.632751 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7d18cc6-1dc3-4ed6-a8a4-822678468b70-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f7d18cc6-1dc3-4ed6-a8a4-822678468b70\") " pod="openstack/nova-scheduler-0" Apr 02 14:02:55 crc kubenswrapper[4732]: I0402 14:02:55.633158 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgzxp\" (UniqueName: \"kubernetes.io/projected/f7d18cc6-1dc3-4ed6-a8a4-822678468b70-kube-api-access-kgzxp\") pod \"nova-scheduler-0\" (UID: \"f7d18cc6-1dc3-4ed6-a8a4-822678468b70\") " pod="openstack/nova-scheduler-0" Apr 02 14:02:55 crc kubenswrapper[4732]: I0402 14:02:55.633231 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7d18cc6-1dc3-4ed6-a8a4-822678468b70-config-data\") pod \"nova-scheduler-0\" (UID: \"f7d18cc6-1dc3-4ed6-a8a4-822678468b70\") " pod="openstack/nova-scheduler-0" Apr 02 14:02:55 crc kubenswrapper[4732]: I0402 14:02:55.633313 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdvkg\" (UniqueName: \"kubernetes.io/projected/83dc4cc3-ce54-4685-9485-16526fec666a-kube-api-access-kdvkg\") on node \"crc\" DevicePath \"\"" Apr 02 14:02:55 crc kubenswrapper[4732]: I0402 14:02:55.633329 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83dc4cc3-ce54-4685-9485-16526fec666a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 02 14:02:55 crc kubenswrapper[4732]: I0402 14:02:55.633339 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83dc4cc3-ce54-4685-9485-16526fec666a-config-data\") on 
node \"crc\" DevicePath \"\"" Apr 02 14:02:55 crc kubenswrapper[4732]: I0402 14:02:55.735211 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7d18cc6-1dc3-4ed6-a8a4-822678468b70-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f7d18cc6-1dc3-4ed6-a8a4-822678468b70\") " pod="openstack/nova-scheduler-0" Apr 02 14:02:55 crc kubenswrapper[4732]: I0402 14:02:55.735380 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgzxp\" (UniqueName: \"kubernetes.io/projected/f7d18cc6-1dc3-4ed6-a8a4-822678468b70-kube-api-access-kgzxp\") pod \"nova-scheduler-0\" (UID: \"f7d18cc6-1dc3-4ed6-a8a4-822678468b70\") " pod="openstack/nova-scheduler-0" Apr 02 14:02:55 crc kubenswrapper[4732]: I0402 14:02:55.735417 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7d18cc6-1dc3-4ed6-a8a4-822678468b70-config-data\") pod \"nova-scheduler-0\" (UID: \"f7d18cc6-1dc3-4ed6-a8a4-822678468b70\") " pod="openstack/nova-scheduler-0" Apr 02 14:02:55 crc kubenswrapper[4732]: I0402 14:02:55.738674 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7d18cc6-1dc3-4ed6-a8a4-822678468b70-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f7d18cc6-1dc3-4ed6-a8a4-822678468b70\") " pod="openstack/nova-scheduler-0" Apr 02 14:02:55 crc kubenswrapper[4732]: I0402 14:02:55.739655 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7d18cc6-1dc3-4ed6-a8a4-822678468b70-config-data\") pod \"nova-scheduler-0\" (UID: \"f7d18cc6-1dc3-4ed6-a8a4-822678468b70\") " pod="openstack/nova-scheduler-0" Apr 02 14:02:55 crc kubenswrapper[4732]: I0402 14:02:55.755086 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-kgzxp\" (UniqueName: \"kubernetes.io/projected/f7d18cc6-1dc3-4ed6-a8a4-822678468b70-kube-api-access-kgzxp\") pod \"nova-scheduler-0\" (UID: \"f7d18cc6-1dc3-4ed6-a8a4-822678468b70\") " pod="openstack/nova-scheduler-0" Apr 02 14:02:55 crc kubenswrapper[4732]: I0402 14:02:55.817772 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Apr 02 14:02:56 crc kubenswrapper[4732]: I0402 14:02:56.258318 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Apr 02 14:02:56 crc kubenswrapper[4732]: I0402 14:02:56.441951 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Apr 02 14:02:56 crc kubenswrapper[4732]: I0402 14:02:56.443700 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f7d18cc6-1dc3-4ed6-a8a4-822678468b70","Type":"ContainerStarted","Data":"22a2ce890f802e32c8d3f892ec3e0c107dc57d74066f650b3aaf601fe736453a"} Apr 02 14:02:56 crc kubenswrapper[4732]: I0402 14:02:56.488096 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Apr 02 14:02:56 crc kubenswrapper[4732]: I0402 14:02:56.510145 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Apr 02 14:02:56 crc kubenswrapper[4732]: I0402 14:02:56.522661 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Apr 02 14:02:56 crc kubenswrapper[4732]: I0402 14:02:56.524652 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Apr 02 14:02:56 crc kubenswrapper[4732]: I0402 14:02:56.527397 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Apr 02 14:02:56 crc kubenswrapper[4732]: I0402 14:02:56.536086 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Apr 02 14:02:56 crc kubenswrapper[4732]: I0402 14:02:56.652123 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/528e73de-0003-424b-9f26-7dd54d1942f2-config-data\") pod \"nova-api-0\" (UID: \"528e73de-0003-424b-9f26-7dd54d1942f2\") " pod="openstack/nova-api-0" Apr 02 14:02:56 crc kubenswrapper[4732]: I0402 14:02:56.652169 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/528e73de-0003-424b-9f26-7dd54d1942f2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"528e73de-0003-424b-9f26-7dd54d1942f2\") " pod="openstack/nova-api-0" Apr 02 14:02:56 crc kubenswrapper[4732]: I0402 14:02:56.652210 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pfld\" (UniqueName: \"kubernetes.io/projected/528e73de-0003-424b-9f26-7dd54d1942f2-kube-api-access-6pfld\") pod \"nova-api-0\" (UID: \"528e73de-0003-424b-9f26-7dd54d1942f2\") " pod="openstack/nova-api-0" Apr 02 14:02:56 crc kubenswrapper[4732]: I0402 14:02:56.652277 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/528e73de-0003-424b-9f26-7dd54d1942f2-logs\") pod \"nova-api-0\" (UID: \"528e73de-0003-424b-9f26-7dd54d1942f2\") " pod="openstack/nova-api-0" Apr 02 14:02:56 crc kubenswrapper[4732]: I0402 14:02:56.695569 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="47153a66-ef48-4bb2-bd19-2a5d5d506d0d" path="/var/lib/kubelet/pods/47153a66-ef48-4bb2-bd19-2a5d5d506d0d/volumes" Apr 02 14:02:56 crc kubenswrapper[4732]: I0402 14:02:56.696978 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83dc4cc3-ce54-4685-9485-16526fec666a" path="/var/lib/kubelet/pods/83dc4cc3-ce54-4685-9485-16526fec666a/volumes" Apr 02 14:02:56 crc kubenswrapper[4732]: I0402 14:02:56.753756 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/528e73de-0003-424b-9f26-7dd54d1942f2-config-data\") pod \"nova-api-0\" (UID: \"528e73de-0003-424b-9f26-7dd54d1942f2\") " pod="openstack/nova-api-0" Apr 02 14:02:56 crc kubenswrapper[4732]: I0402 14:02:56.753828 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/528e73de-0003-424b-9f26-7dd54d1942f2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"528e73de-0003-424b-9f26-7dd54d1942f2\") " pod="openstack/nova-api-0" Apr 02 14:02:56 crc kubenswrapper[4732]: I0402 14:02:56.753877 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pfld\" (UniqueName: \"kubernetes.io/projected/528e73de-0003-424b-9f26-7dd54d1942f2-kube-api-access-6pfld\") pod \"nova-api-0\" (UID: \"528e73de-0003-424b-9f26-7dd54d1942f2\") " pod="openstack/nova-api-0" Apr 02 14:02:56 crc kubenswrapper[4732]: I0402 14:02:56.753917 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/528e73de-0003-424b-9f26-7dd54d1942f2-logs\") pod \"nova-api-0\" (UID: \"528e73de-0003-424b-9f26-7dd54d1942f2\") " pod="openstack/nova-api-0" Apr 02 14:02:56 crc kubenswrapper[4732]: I0402 14:02:56.754431 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/528e73de-0003-424b-9f26-7dd54d1942f2-logs\") pod \"nova-api-0\" (UID: \"528e73de-0003-424b-9f26-7dd54d1942f2\") " pod="openstack/nova-api-0" Apr 02 14:02:56 crc kubenswrapper[4732]: I0402 14:02:56.759154 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/528e73de-0003-424b-9f26-7dd54d1942f2-config-data\") pod \"nova-api-0\" (UID: \"528e73de-0003-424b-9f26-7dd54d1942f2\") " pod="openstack/nova-api-0" Apr 02 14:02:56 crc kubenswrapper[4732]: I0402 14:02:56.761540 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/528e73de-0003-424b-9f26-7dd54d1942f2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"528e73de-0003-424b-9f26-7dd54d1942f2\") " pod="openstack/nova-api-0" Apr 02 14:02:56 crc kubenswrapper[4732]: I0402 14:02:56.771602 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pfld\" (UniqueName: \"kubernetes.io/projected/528e73de-0003-424b-9f26-7dd54d1942f2-kube-api-access-6pfld\") pod \"nova-api-0\" (UID: \"528e73de-0003-424b-9f26-7dd54d1942f2\") " pod="openstack/nova-api-0" Apr 02 14:02:56 crc kubenswrapper[4732]: I0402 14:02:56.846517 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Apr 02 14:02:57 crc kubenswrapper[4732]: I0402 14:02:57.334779 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Apr 02 14:02:57 crc kubenswrapper[4732]: W0402 14:02:57.335227 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod528e73de_0003_424b_9f26_7dd54d1942f2.slice/crio-2197897d2d4af884722456de6d28e3d53266723c638dcc4cbfb36f587abfc4f3 WatchSource:0}: Error finding container 2197897d2d4af884722456de6d28e3d53266723c638dcc4cbfb36f587abfc4f3: Status 404 returned error can't find the container with id 2197897d2d4af884722456de6d28e3d53266723c638dcc4cbfb36f587abfc4f3 Apr 02 14:02:57 crc kubenswrapper[4732]: I0402 14:02:57.462011 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f7d18cc6-1dc3-4ed6-a8a4-822678468b70","Type":"ContainerStarted","Data":"553271ae9fa33a76a3a65beeb9372a3380897000327ade77ae956c1a1ba7cffd"} Apr 02 14:02:57 crc kubenswrapper[4732]: I0402 14:02:57.466639 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"528e73de-0003-424b-9f26-7dd54d1942f2","Type":"ContainerStarted","Data":"2197897d2d4af884722456de6d28e3d53266723c638dcc4cbfb36f587abfc4f3"} Apr 02 14:02:57 crc kubenswrapper[4732]: I0402 14:02:57.478726 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.478707089 podStartE2EDuration="2.478707089s" podCreationTimestamp="2026-04-02 14:02:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 14:02:57.47584029 +0000 UTC m=+1534.380247843" watchObservedRunningTime="2026-04-02 14:02:57.478707089 +0000 UTC m=+1534.383114642" Apr 02 14:02:58 crc kubenswrapper[4732]: I0402 14:02:58.480846 4732 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/ceilometer-0" event={"ID":"bbe56802-47d3-4977-ad5c-1f4223d984e4","Type":"ContainerStarted","Data":"a8bb55a1ac6a17d89b80d83ecab6d30c708be261b92e66adb7bad3020c7208af"} Apr 02 14:02:58 crc kubenswrapper[4732]: I0402 14:02:58.481842 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Apr 02 14:02:58 crc kubenswrapper[4732]: I0402 14:02:58.484672 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"528e73de-0003-424b-9f26-7dd54d1942f2","Type":"ContainerStarted","Data":"5819edcdf83ec3c7ea34879074ac0ababcaeb29f71ee732cc169fcd2c92c4358"} Apr 02 14:02:58 crc kubenswrapper[4732]: I0402 14:02:58.484719 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"528e73de-0003-424b-9f26-7dd54d1942f2","Type":"ContainerStarted","Data":"701851997acc62fba000c6ed45cd23fa4b96f1745fb951c822c9e909dfb07771"} Apr 02 14:02:58 crc kubenswrapper[4732]: I0402 14:02:58.520349 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.6260303780000003 podStartE2EDuration="6.520326364s" podCreationTimestamp="2026-04-02 14:02:52 +0000 UTC" firstStartedPulling="2026-04-02 14:02:53.332479977 +0000 UTC m=+1530.236887530" lastFinishedPulling="2026-04-02 14:02:57.226775963 +0000 UTC m=+1534.131183516" observedRunningTime="2026-04-02 14:02:58.49963159 +0000 UTC m=+1535.404039163" watchObservedRunningTime="2026-04-02 14:02:58.520326364 +0000 UTC m=+1535.424733927" Apr 02 14:02:58 crc kubenswrapper[4732]: I0402 14:02:58.545402 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.545380066 podStartE2EDuration="2.545380066s" podCreationTimestamp="2026-04-02 14:02:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 
14:02:58.525000331 +0000 UTC m=+1535.429407884" watchObservedRunningTime="2026-04-02 14:02:58.545380066 +0000 UTC m=+1535.449787629" Apr 02 14:02:58 crc kubenswrapper[4732]: I0402 14:02:58.708304 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Apr 02 14:03:00 crc kubenswrapper[4732]: I0402 14:03:00.740955 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Apr 02 14:03:00 crc kubenswrapper[4732]: I0402 14:03:00.818358 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Apr 02 14:03:01 crc kubenswrapper[4732]: I0402 14:03:01.924723 4732 patch_prober.go:28] interesting pod/machine-config-daemon-6vtmw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 02 14:03:01 crc kubenswrapper[4732]: I0402 14:03:01.925118 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 02 14:03:02 crc kubenswrapper[4732]: I0402 14:03:02.867332 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Apr 02 14:03:02 crc kubenswrapper[4732]: I0402 14:03:02.867474 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Apr 02 14:03:03 crc kubenswrapper[4732]: I0402 14:03:03.881919 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="737abec3-8d91-4234-83fc-ebbdbd59812e" containerName="nova-metadata-log" probeResult="failure" 
output="Get \"https://10.217.0.202:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Apr 02 14:03:03 crc kubenswrapper[4732]: I0402 14:03:03.881999 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="737abec3-8d91-4234-83fc-ebbdbd59812e" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Apr 02 14:03:05 crc kubenswrapper[4732]: I0402 14:03:05.818366 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Apr 02 14:03:05 crc kubenswrapper[4732]: I0402 14:03:05.845073 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Apr 02 14:03:06 crc kubenswrapper[4732]: I0402 14:03:06.593954 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Apr 02 14:03:06 crc kubenswrapper[4732]: I0402 14:03:06.846840 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Apr 02 14:03:06 crc kubenswrapper[4732]: I0402 14:03:06.846894 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Apr 02 14:03:07 crc kubenswrapper[4732]: I0402 14:03:07.929859 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="528e73de-0003-424b-9f26-7dd54d1942f2" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.204:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Apr 02 14:03:07 crc kubenswrapper[4732]: I0402 14:03:07.930237 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="528e73de-0003-424b-9f26-7dd54d1942f2" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.204:8774/\": context 
deadline exceeded (Client.Timeout exceeded while awaiting headers)" Apr 02 14:03:10 crc kubenswrapper[4732]: I0402 14:03:10.868015 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Apr 02 14:03:10 crc kubenswrapper[4732]: I0402 14:03:10.868592 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Apr 02 14:03:12 crc kubenswrapper[4732]: I0402 14:03:12.873168 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Apr 02 14:03:12 crc kubenswrapper[4732]: I0402 14:03:12.873982 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Apr 02 14:03:12 crc kubenswrapper[4732]: I0402 14:03:12.879159 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Apr 02 14:03:13 crc kubenswrapper[4732]: I0402 14:03:13.552642 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Apr 02 14:03:13 crc kubenswrapper[4732]: I0402 14:03:13.641672 4732 generic.go:334] "Generic (PLEG): container finished" podID="6ef800e5-24ee-4bc3-9248-d49d0c4d5abf" containerID="d6cfb4c0e731123f2293e13519cfa7bf6a97b2a7e3c4356605cb7ac754ecbaea" exitCode=137 Apr 02 14:03:13 crc kubenswrapper[4732]: I0402 14:03:13.643230 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Apr 02 14:03:13 crc kubenswrapper[4732]: I0402 14:03:13.643478 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6ef800e5-24ee-4bc3-9248-d49d0c4d5abf","Type":"ContainerDied","Data":"d6cfb4c0e731123f2293e13519cfa7bf6a97b2a7e3c4356605cb7ac754ecbaea"} Apr 02 14:03:13 crc kubenswrapper[4732]: I0402 14:03:13.643515 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6ef800e5-24ee-4bc3-9248-d49d0c4d5abf","Type":"ContainerDied","Data":"4b7130d03936e105938902e0b6ac210b937df3795a189e842cf5a56230e3e5a4"} Apr 02 14:03:13 crc kubenswrapper[4732]: I0402 14:03:13.643536 4732 scope.go:117] "RemoveContainer" containerID="d6cfb4c0e731123f2293e13519cfa7bf6a97b2a7e3c4356605cb7ac754ecbaea" Apr 02 14:03:13 crc kubenswrapper[4732]: I0402 14:03:13.648378 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Apr 02 14:03:13 crc kubenswrapper[4732]: I0402 14:03:13.673850 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4khv2\" (UniqueName: \"kubernetes.io/projected/6ef800e5-24ee-4bc3-9248-d49d0c4d5abf-kube-api-access-4khv2\") pod \"6ef800e5-24ee-4bc3-9248-d49d0c4d5abf\" (UID: \"6ef800e5-24ee-4bc3-9248-d49d0c4d5abf\") " Apr 02 14:03:13 crc kubenswrapper[4732]: I0402 14:03:13.674029 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ef800e5-24ee-4bc3-9248-d49d0c4d5abf-combined-ca-bundle\") pod \"6ef800e5-24ee-4bc3-9248-d49d0c4d5abf\" (UID: \"6ef800e5-24ee-4bc3-9248-d49d0c4d5abf\") " Apr 02 14:03:13 crc kubenswrapper[4732]: I0402 14:03:13.674531 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6ef800e5-24ee-4bc3-9248-d49d0c4d5abf-config-data\") pod \"6ef800e5-24ee-4bc3-9248-d49d0c4d5abf\" (UID: \"6ef800e5-24ee-4bc3-9248-d49d0c4d5abf\") " Apr 02 14:03:13 crc kubenswrapper[4732]: I0402 14:03:13.688861 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ef800e5-24ee-4bc3-9248-d49d0c4d5abf-kube-api-access-4khv2" (OuterVolumeSpecName: "kube-api-access-4khv2") pod "6ef800e5-24ee-4bc3-9248-d49d0c4d5abf" (UID: "6ef800e5-24ee-4bc3-9248-d49d0c4d5abf"). InnerVolumeSpecName "kube-api-access-4khv2". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:03:13 crc kubenswrapper[4732]: I0402 14:03:13.711741 4732 scope.go:117] "RemoveContainer" containerID="d6cfb4c0e731123f2293e13519cfa7bf6a97b2a7e3c4356605cb7ac754ecbaea" Apr 02 14:03:13 crc kubenswrapper[4732]: E0402 14:03:13.712258 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6cfb4c0e731123f2293e13519cfa7bf6a97b2a7e3c4356605cb7ac754ecbaea\": container with ID starting with d6cfb4c0e731123f2293e13519cfa7bf6a97b2a7e3c4356605cb7ac754ecbaea not found: ID does not exist" containerID="d6cfb4c0e731123f2293e13519cfa7bf6a97b2a7e3c4356605cb7ac754ecbaea" Apr 02 14:03:13 crc kubenswrapper[4732]: I0402 14:03:13.712303 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6cfb4c0e731123f2293e13519cfa7bf6a97b2a7e3c4356605cb7ac754ecbaea"} err="failed to get container status \"d6cfb4c0e731123f2293e13519cfa7bf6a97b2a7e3c4356605cb7ac754ecbaea\": rpc error: code = NotFound desc = could not find container \"d6cfb4c0e731123f2293e13519cfa7bf6a97b2a7e3c4356605cb7ac754ecbaea\": container with ID starting with d6cfb4c0e731123f2293e13519cfa7bf6a97b2a7e3c4356605cb7ac754ecbaea not found: ID does not exist" Apr 02 14:03:13 crc kubenswrapper[4732]: I0402 14:03:13.717289 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/secret/6ef800e5-24ee-4bc3-9248-d49d0c4d5abf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6ef800e5-24ee-4bc3-9248-d49d0c4d5abf" (UID: "6ef800e5-24ee-4bc3-9248-d49d0c4d5abf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:03:13 crc kubenswrapper[4732]: I0402 14:03:13.732496 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ef800e5-24ee-4bc3-9248-d49d0c4d5abf-config-data" (OuterVolumeSpecName: "config-data") pod "6ef800e5-24ee-4bc3-9248-d49d0c4d5abf" (UID: "6ef800e5-24ee-4bc3-9248-d49d0c4d5abf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:03:13 crc kubenswrapper[4732]: I0402 14:03:13.776737 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ef800e5-24ee-4bc3-9248-d49d0c4d5abf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 02 14:03:13 crc kubenswrapper[4732]: I0402 14:03:13.776775 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ef800e5-24ee-4bc3-9248-d49d0c4d5abf-config-data\") on node \"crc\" DevicePath \"\"" Apr 02 14:03:13 crc kubenswrapper[4732]: I0402 14:03:13.776786 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4khv2\" (UniqueName: \"kubernetes.io/projected/6ef800e5-24ee-4bc3-9248-d49d0c4d5abf-kube-api-access-4khv2\") on node \"crc\" DevicePath \"\"" Apr 02 14:03:13 crc kubenswrapper[4732]: I0402 14:03:13.973401 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Apr 02 14:03:13 crc kubenswrapper[4732]: I0402 14:03:13.983483 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Apr 02 14:03:14 crc kubenswrapper[4732]: I0402 14:03:14.001219 4732 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-cell1-novncproxy-0"] Apr 02 14:03:14 crc kubenswrapper[4732]: E0402 14:03:14.006906 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ef800e5-24ee-4bc3-9248-d49d0c4d5abf" containerName="nova-cell1-novncproxy-novncproxy" Apr 02 14:03:14 crc kubenswrapper[4732]: I0402 14:03:14.006940 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ef800e5-24ee-4bc3-9248-d49d0c4d5abf" containerName="nova-cell1-novncproxy-novncproxy" Apr 02 14:03:14 crc kubenswrapper[4732]: I0402 14:03:14.007234 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ef800e5-24ee-4bc3-9248-d49d0c4d5abf" containerName="nova-cell1-novncproxy-novncproxy" Apr 02 14:03:14 crc kubenswrapper[4732]: I0402 14:03:14.007977 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Apr 02 14:03:14 crc kubenswrapper[4732]: I0402 14:03:14.010529 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Apr 02 14:03:14 crc kubenswrapper[4732]: I0402 14:03:14.010538 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Apr 02 14:03:14 crc kubenswrapper[4732]: I0402 14:03:14.010941 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Apr 02 14:03:14 crc kubenswrapper[4732]: I0402 14:03:14.013331 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Apr 02 14:03:14 crc kubenswrapper[4732]: I0402 14:03:14.083033 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35b61eb9-52fd-4d29-8942-c1c18b2f4aff-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"35b61eb9-52fd-4d29-8942-c1c18b2f4aff\") " pod="openstack/nova-cell1-novncproxy-0" Apr 02 14:03:14 crc 
kubenswrapper[4732]: I0402 14:03:14.083163 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/35b61eb9-52fd-4d29-8942-c1c18b2f4aff-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"35b61eb9-52fd-4d29-8942-c1c18b2f4aff\") " pod="openstack/nova-cell1-novncproxy-0" Apr 02 14:03:14 crc kubenswrapper[4732]: I0402 14:03:14.083190 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35b61eb9-52fd-4d29-8942-c1c18b2f4aff-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"35b61eb9-52fd-4d29-8942-c1c18b2f4aff\") " pod="openstack/nova-cell1-novncproxy-0" Apr 02 14:03:14 crc kubenswrapper[4732]: I0402 14:03:14.083244 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mpdr\" (UniqueName: \"kubernetes.io/projected/35b61eb9-52fd-4d29-8942-c1c18b2f4aff-kube-api-access-4mpdr\") pod \"nova-cell1-novncproxy-0\" (UID: \"35b61eb9-52fd-4d29-8942-c1c18b2f4aff\") " pod="openstack/nova-cell1-novncproxy-0" Apr 02 14:03:14 crc kubenswrapper[4732]: I0402 14:03:14.083351 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/35b61eb9-52fd-4d29-8942-c1c18b2f4aff-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"35b61eb9-52fd-4d29-8942-c1c18b2f4aff\") " pod="openstack/nova-cell1-novncproxy-0" Apr 02 14:03:14 crc kubenswrapper[4732]: I0402 14:03:14.185209 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/35b61eb9-52fd-4d29-8942-c1c18b2f4aff-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"35b61eb9-52fd-4d29-8942-c1c18b2f4aff\") " pod="openstack/nova-cell1-novncproxy-0" 
Apr 02 14:03:14 crc kubenswrapper[4732]: I0402 14:03:14.185307 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35b61eb9-52fd-4d29-8942-c1c18b2f4aff-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"35b61eb9-52fd-4d29-8942-c1c18b2f4aff\") " pod="openstack/nova-cell1-novncproxy-0" Apr 02 14:03:14 crc kubenswrapper[4732]: I0402 14:03:14.185380 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/35b61eb9-52fd-4d29-8942-c1c18b2f4aff-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"35b61eb9-52fd-4d29-8942-c1c18b2f4aff\") " pod="openstack/nova-cell1-novncproxy-0" Apr 02 14:03:14 crc kubenswrapper[4732]: I0402 14:03:14.185412 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35b61eb9-52fd-4d29-8942-c1c18b2f4aff-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"35b61eb9-52fd-4d29-8942-c1c18b2f4aff\") " pod="openstack/nova-cell1-novncproxy-0" Apr 02 14:03:14 crc kubenswrapper[4732]: I0402 14:03:14.185471 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mpdr\" (UniqueName: \"kubernetes.io/projected/35b61eb9-52fd-4d29-8942-c1c18b2f4aff-kube-api-access-4mpdr\") pod \"nova-cell1-novncproxy-0\" (UID: \"35b61eb9-52fd-4d29-8942-c1c18b2f4aff\") " pod="openstack/nova-cell1-novncproxy-0" Apr 02 14:03:14 crc kubenswrapper[4732]: I0402 14:03:14.188864 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/35b61eb9-52fd-4d29-8942-c1c18b2f4aff-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"35b61eb9-52fd-4d29-8942-c1c18b2f4aff\") " pod="openstack/nova-cell1-novncproxy-0" Apr 02 14:03:14 crc kubenswrapper[4732]: I0402 14:03:14.188906 4732 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35b61eb9-52fd-4d29-8942-c1c18b2f4aff-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"35b61eb9-52fd-4d29-8942-c1c18b2f4aff\") " pod="openstack/nova-cell1-novncproxy-0" Apr 02 14:03:14 crc kubenswrapper[4732]: I0402 14:03:14.189020 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/35b61eb9-52fd-4d29-8942-c1c18b2f4aff-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"35b61eb9-52fd-4d29-8942-c1c18b2f4aff\") " pod="openstack/nova-cell1-novncproxy-0" Apr 02 14:03:14 crc kubenswrapper[4732]: I0402 14:03:14.189022 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35b61eb9-52fd-4d29-8942-c1c18b2f4aff-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"35b61eb9-52fd-4d29-8942-c1c18b2f4aff\") " pod="openstack/nova-cell1-novncproxy-0" Apr 02 14:03:14 crc kubenswrapper[4732]: I0402 14:03:14.201136 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mpdr\" (UniqueName: \"kubernetes.io/projected/35b61eb9-52fd-4d29-8942-c1c18b2f4aff-kube-api-access-4mpdr\") pod \"nova-cell1-novncproxy-0\" (UID: \"35b61eb9-52fd-4d29-8942-c1c18b2f4aff\") " pod="openstack/nova-cell1-novncproxy-0" Apr 02 14:03:14 crc kubenswrapper[4732]: I0402 14:03:14.326752 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Apr 02 14:03:14 crc kubenswrapper[4732]: I0402 14:03:14.691302 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ef800e5-24ee-4bc3-9248-d49d0c4d5abf" path="/var/lib/kubelet/pods/6ef800e5-24ee-4bc3-9248-d49d0c4d5abf/volumes" Apr 02 14:03:14 crc kubenswrapper[4732]: I0402 14:03:14.815512 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Apr 02 14:03:14 crc kubenswrapper[4732]: I0402 14:03:14.846931 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Apr 02 14:03:14 crc kubenswrapper[4732]: I0402 14:03:14.847199 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Apr 02 14:03:15 crc kubenswrapper[4732]: I0402 14:03:15.665134 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"35b61eb9-52fd-4d29-8942-c1c18b2f4aff","Type":"ContainerStarted","Data":"551d7c5396d76f06852e66b63ea73502c29d058b440400f6dc865ff130de4007"} Apr 02 14:03:15 crc kubenswrapper[4732]: I0402 14:03:15.665198 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"35b61eb9-52fd-4d29-8942-c1c18b2f4aff","Type":"ContainerStarted","Data":"41afeca94919545da70c47b035adc1dd7d2acae536f1fcdaec51625d7a22388b"} Apr 02 14:03:15 crc kubenswrapper[4732]: I0402 14:03:15.692905 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.6928888029999998 podStartE2EDuration="2.692888803s" podCreationTimestamp="2026-04-02 14:03:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 14:03:15.684779872 +0000 UTC m=+1552.589187425" watchObservedRunningTime="2026-04-02 14:03:15.692888803 +0000 UTC m=+1552.597296356" Apr 02 14:03:16 
crc kubenswrapper[4732]: I0402 14:03:16.851853 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Apr 02 14:03:16 crc kubenswrapper[4732]: I0402 14:03:16.852242 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Apr 02 14:03:16 crc kubenswrapper[4732]: I0402 14:03:16.856897 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Apr 02 14:03:16 crc kubenswrapper[4732]: I0402 14:03:16.859168 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Apr 02 14:03:17 crc kubenswrapper[4732]: I0402 14:03:17.054109 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-d4dtv"] Apr 02 14:03:17 crc kubenswrapper[4732]: I0402 14:03:17.055691 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-d4dtv" Apr 02 14:03:17 crc kubenswrapper[4732]: I0402 14:03:17.068901 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-d4dtv"] Apr 02 14:03:17 crc kubenswrapper[4732]: I0402 14:03:17.136122 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2b032b72-f7bb-4aa9-9519-53f05329a833-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6c5df9-d4dtv\" (UID: \"2b032b72-f7bb-4aa9-9519-53f05329a833\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-d4dtv" Apr 02 14:03:17 crc kubenswrapper[4732]: I0402 14:03:17.136224 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x64wb\" (UniqueName: \"kubernetes.io/projected/2b032b72-f7bb-4aa9-9519-53f05329a833-kube-api-access-x64wb\") pod \"dnsmasq-dns-5c7b6c5df9-d4dtv\" (UID: \"2b032b72-f7bb-4aa9-9519-53f05329a833\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-d4dtv" Apr 02 14:03:17 crc 
kubenswrapper[4732]: I0402 14:03:17.136305 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b032b72-f7bb-4aa9-9519-53f05329a833-dns-svc\") pod \"dnsmasq-dns-5c7b6c5df9-d4dtv\" (UID: \"2b032b72-f7bb-4aa9-9519-53f05329a833\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-d4dtv" Apr 02 14:03:17 crc kubenswrapper[4732]: I0402 14:03:17.136470 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2b032b72-f7bb-4aa9-9519-53f05329a833-dns-swift-storage-0\") pod \"dnsmasq-dns-5c7b6c5df9-d4dtv\" (UID: \"2b032b72-f7bb-4aa9-9519-53f05329a833\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-d4dtv" Apr 02 14:03:17 crc kubenswrapper[4732]: I0402 14:03:17.136518 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2b032b72-f7bb-4aa9-9519-53f05329a833-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6c5df9-d4dtv\" (UID: \"2b032b72-f7bb-4aa9-9519-53f05329a833\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-d4dtv" Apr 02 14:03:17 crc kubenswrapper[4732]: I0402 14:03:17.136555 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b032b72-f7bb-4aa9-9519-53f05329a833-config\") pod \"dnsmasq-dns-5c7b6c5df9-d4dtv\" (UID: \"2b032b72-f7bb-4aa9-9519-53f05329a833\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-d4dtv" Apr 02 14:03:17 crc kubenswrapper[4732]: I0402 14:03:17.238704 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2b032b72-f7bb-4aa9-9519-53f05329a833-dns-swift-storage-0\") pod \"dnsmasq-dns-5c7b6c5df9-d4dtv\" (UID: \"2b032b72-f7bb-4aa9-9519-53f05329a833\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-d4dtv" Apr 02 
14:03:17 crc kubenswrapper[4732]: I0402 14:03:17.238759 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2b032b72-f7bb-4aa9-9519-53f05329a833-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6c5df9-d4dtv\" (UID: \"2b032b72-f7bb-4aa9-9519-53f05329a833\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-d4dtv" Apr 02 14:03:17 crc kubenswrapper[4732]: I0402 14:03:17.238788 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b032b72-f7bb-4aa9-9519-53f05329a833-config\") pod \"dnsmasq-dns-5c7b6c5df9-d4dtv\" (UID: \"2b032b72-f7bb-4aa9-9519-53f05329a833\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-d4dtv" Apr 02 14:03:17 crc kubenswrapper[4732]: I0402 14:03:17.238827 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2b032b72-f7bb-4aa9-9519-53f05329a833-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6c5df9-d4dtv\" (UID: \"2b032b72-f7bb-4aa9-9519-53f05329a833\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-d4dtv" Apr 02 14:03:17 crc kubenswrapper[4732]: I0402 14:03:17.238853 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x64wb\" (UniqueName: \"kubernetes.io/projected/2b032b72-f7bb-4aa9-9519-53f05329a833-kube-api-access-x64wb\") pod \"dnsmasq-dns-5c7b6c5df9-d4dtv\" (UID: \"2b032b72-f7bb-4aa9-9519-53f05329a833\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-d4dtv" Apr 02 14:03:17 crc kubenswrapper[4732]: I0402 14:03:17.238906 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b032b72-f7bb-4aa9-9519-53f05329a833-dns-svc\") pod \"dnsmasq-dns-5c7b6c5df9-d4dtv\" (UID: \"2b032b72-f7bb-4aa9-9519-53f05329a833\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-d4dtv" Apr 02 14:03:17 crc kubenswrapper[4732]: I0402 14:03:17.239909 4732 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b032b72-f7bb-4aa9-9519-53f05329a833-dns-svc\") pod \"dnsmasq-dns-5c7b6c5df9-d4dtv\" (UID: \"2b032b72-f7bb-4aa9-9519-53f05329a833\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-d4dtv" Apr 02 14:03:17 crc kubenswrapper[4732]: I0402 14:03:17.240474 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2b032b72-f7bb-4aa9-9519-53f05329a833-dns-swift-storage-0\") pod \"dnsmasq-dns-5c7b6c5df9-d4dtv\" (UID: \"2b032b72-f7bb-4aa9-9519-53f05329a833\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-d4dtv" Apr 02 14:03:17 crc kubenswrapper[4732]: I0402 14:03:17.241030 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2b032b72-f7bb-4aa9-9519-53f05329a833-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6c5df9-d4dtv\" (UID: \"2b032b72-f7bb-4aa9-9519-53f05329a833\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-d4dtv" Apr 02 14:03:17 crc kubenswrapper[4732]: I0402 14:03:17.241911 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2b032b72-f7bb-4aa9-9519-53f05329a833-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6c5df9-d4dtv\" (UID: \"2b032b72-f7bb-4aa9-9519-53f05329a833\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-d4dtv" Apr 02 14:03:17 crc kubenswrapper[4732]: I0402 14:03:17.242161 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b032b72-f7bb-4aa9-9519-53f05329a833-config\") pod \"dnsmasq-dns-5c7b6c5df9-d4dtv\" (UID: \"2b032b72-f7bb-4aa9-9519-53f05329a833\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-d4dtv" Apr 02 14:03:17 crc kubenswrapper[4732]: I0402 14:03:17.260761 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x64wb\" (UniqueName: 
\"kubernetes.io/projected/2b032b72-f7bb-4aa9-9519-53f05329a833-kube-api-access-x64wb\") pod \"dnsmasq-dns-5c7b6c5df9-d4dtv\" (UID: \"2b032b72-f7bb-4aa9-9519-53f05329a833\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-d4dtv" Apr 02 14:03:17 crc kubenswrapper[4732]: I0402 14:03:17.392416 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-d4dtv" Apr 02 14:03:17 crc kubenswrapper[4732]: I0402 14:03:17.900029 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-d4dtv"] Apr 02 14:03:18 crc kubenswrapper[4732]: I0402 14:03:18.696656 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-d4dtv" event={"ID":"2b032b72-f7bb-4aa9-9519-53f05329a833","Type":"ContainerDied","Data":"06342f502fa93f7175ea06d4316f631c61c05a407fde251397c55031e384ad5c"} Apr 02 14:03:18 crc kubenswrapper[4732]: I0402 14:03:18.696990 4732 generic.go:334] "Generic (PLEG): container finished" podID="2b032b72-f7bb-4aa9-9519-53f05329a833" containerID="06342f502fa93f7175ea06d4316f631c61c05a407fde251397c55031e384ad5c" exitCode=0 Apr 02 14:03:18 crc kubenswrapper[4732]: I0402 14:03:18.697240 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-d4dtv" event={"ID":"2b032b72-f7bb-4aa9-9519-53f05329a833","Type":"ContainerStarted","Data":"974819f0f1198e96f406a56dbdd66e616e9cc4d23df99ca023a59ac4bf52b9a5"} Apr 02 14:03:18 crc kubenswrapper[4732]: I0402 14:03:18.876601 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Apr 02 14:03:18 crc kubenswrapper[4732]: I0402 14:03:18.877224 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bbe56802-47d3-4977-ad5c-1f4223d984e4" containerName="proxy-httpd" containerID="cri-o://a8bb55a1ac6a17d89b80d83ecab6d30c708be261b92e66adb7bad3020c7208af" gracePeriod=30 Apr 02 14:03:18 crc kubenswrapper[4732]: I0402 
14:03:18.877599 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bbe56802-47d3-4977-ad5c-1f4223d984e4" containerName="sg-core" containerID="cri-o://3e2ebc42fe287a20a712e93cc316d5757dbdee4f1664600996be458a1d5393b5" gracePeriod=30 Apr 02 14:03:18 crc kubenswrapper[4732]: I0402 14:03:18.877691 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bbe56802-47d3-4977-ad5c-1f4223d984e4" containerName="ceilometer-notification-agent" containerID="cri-o://40598c5244c261a553473444f4fdef43a20371ebd9ec11c02d9850f44828de4d" gracePeriod=30 Apr 02 14:03:18 crc kubenswrapper[4732]: I0402 14:03:18.877741 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bbe56802-47d3-4977-ad5c-1f4223d984e4" containerName="ceilometer-central-agent" containerID="cri-o://8470ac4168306100ac49bef7fa6845da4b17056e0ad926f76c2f9c0ad0645754" gracePeriod=30 Apr 02 14:03:18 crc kubenswrapper[4732]: I0402 14:03:18.886199 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="bbe56802-47d3-4977-ad5c-1f4223d984e4" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.201:3000/\": EOF" Apr 02 14:03:19 crc kubenswrapper[4732]: I0402 14:03:19.327774 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Apr 02 14:03:19 crc kubenswrapper[4732]: I0402 14:03:19.369461 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Apr 02 14:03:19 crc kubenswrapper[4732]: I0402 14:03:19.709149 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-d4dtv" event={"ID":"2b032b72-f7bb-4aa9-9519-53f05329a833","Type":"ContainerStarted","Data":"f96c0913b24199548d07f7ef68e2f2127ac60fa7555ee2a1e923924d659ddf1f"} Apr 02 14:03:19 crc kubenswrapper[4732]: I0402 14:03:19.709645 
4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c7b6c5df9-d4dtv" Apr 02 14:03:19 crc kubenswrapper[4732]: I0402 14:03:19.713951 4732 generic.go:334] "Generic (PLEG): container finished" podID="bbe56802-47d3-4977-ad5c-1f4223d984e4" containerID="a8bb55a1ac6a17d89b80d83ecab6d30c708be261b92e66adb7bad3020c7208af" exitCode=0 Apr 02 14:03:19 crc kubenswrapper[4732]: I0402 14:03:19.713985 4732 generic.go:334] "Generic (PLEG): container finished" podID="bbe56802-47d3-4977-ad5c-1f4223d984e4" containerID="3e2ebc42fe287a20a712e93cc316d5757dbdee4f1664600996be458a1d5393b5" exitCode=2 Apr 02 14:03:19 crc kubenswrapper[4732]: I0402 14:03:19.713994 4732 generic.go:334] "Generic (PLEG): container finished" podID="bbe56802-47d3-4977-ad5c-1f4223d984e4" containerID="40598c5244c261a553473444f4fdef43a20371ebd9ec11c02d9850f44828de4d" exitCode=0 Apr 02 14:03:19 crc kubenswrapper[4732]: I0402 14:03:19.714003 4732 generic.go:334] "Generic (PLEG): container finished" podID="bbe56802-47d3-4977-ad5c-1f4223d984e4" containerID="8470ac4168306100ac49bef7fa6845da4b17056e0ad926f76c2f9c0ad0645754" exitCode=0 Apr 02 14:03:19 crc kubenswrapper[4732]: I0402 14:03:19.714201 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="528e73de-0003-424b-9f26-7dd54d1942f2" containerName="nova-api-log" containerID="cri-o://701851997acc62fba000c6ed45cd23fa4b96f1745fb951c822c9e909dfb07771" gracePeriod=30 Apr 02 14:03:19 crc kubenswrapper[4732]: I0402 14:03:19.714450 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bbe56802-47d3-4977-ad5c-1f4223d984e4","Type":"ContainerDied","Data":"a8bb55a1ac6a17d89b80d83ecab6d30c708be261b92e66adb7bad3020c7208af"} Apr 02 14:03:19 crc kubenswrapper[4732]: I0402 14:03:19.714486 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"bbe56802-47d3-4977-ad5c-1f4223d984e4","Type":"ContainerDied","Data":"3e2ebc42fe287a20a712e93cc316d5757dbdee4f1664600996be458a1d5393b5"} Apr 02 14:03:19 crc kubenswrapper[4732]: I0402 14:03:19.714501 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bbe56802-47d3-4977-ad5c-1f4223d984e4","Type":"ContainerDied","Data":"40598c5244c261a553473444f4fdef43a20371ebd9ec11c02d9850f44828de4d"} Apr 02 14:03:19 crc kubenswrapper[4732]: I0402 14:03:19.714512 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bbe56802-47d3-4977-ad5c-1f4223d984e4","Type":"ContainerDied","Data":"8470ac4168306100ac49bef7fa6845da4b17056e0ad926f76c2f9c0ad0645754"} Apr 02 14:03:19 crc kubenswrapper[4732]: I0402 14:03:19.714567 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="528e73de-0003-424b-9f26-7dd54d1942f2" containerName="nova-api-api" containerID="cri-o://5819edcdf83ec3c7ea34879074ac0ababcaeb29f71ee732cc169fcd2c92c4358" gracePeriod=30 Apr 02 14:03:19 crc kubenswrapper[4732]: I0402 14:03:19.926114 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Apr 02 14:03:19 crc kubenswrapper[4732]: I0402 14:03:19.958033 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c7b6c5df9-d4dtv" podStartSLOduration=2.958008484 podStartE2EDuration="2.958008484s" podCreationTimestamp="2026-04-02 14:03:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 14:03:19.737971958 +0000 UTC m=+1556.642379511" watchObservedRunningTime="2026-04-02 14:03:19.958008484 +0000 UTC m=+1556.862416027" Apr 02 14:03:19 crc kubenswrapper[4732]: I0402 14:03:19.997817 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbe56802-47d3-4977-ad5c-1f4223d984e4-log-httpd\") pod \"bbe56802-47d3-4977-ad5c-1f4223d984e4\" (UID: \"bbe56802-47d3-4977-ad5c-1f4223d984e4\") " Apr 02 14:03:19 crc kubenswrapper[4732]: I0402 14:03:19.997934 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4hnk\" (UniqueName: \"kubernetes.io/projected/bbe56802-47d3-4977-ad5c-1f4223d984e4-kube-api-access-v4hnk\") pod \"bbe56802-47d3-4977-ad5c-1f4223d984e4\" (UID: \"bbe56802-47d3-4977-ad5c-1f4223d984e4\") " Apr 02 14:03:19 crc kubenswrapper[4732]: I0402 14:03:19.997972 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbe56802-47d3-4977-ad5c-1f4223d984e4-config-data\") pod \"bbe56802-47d3-4977-ad5c-1f4223d984e4\" (UID: \"bbe56802-47d3-4977-ad5c-1f4223d984e4\") " Apr 02 14:03:19 crc kubenswrapper[4732]: I0402 14:03:19.997999 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbe56802-47d3-4977-ad5c-1f4223d984e4-scripts\") pod \"bbe56802-47d3-4977-ad5c-1f4223d984e4\" (UID: 
\"bbe56802-47d3-4977-ad5c-1f4223d984e4\") " Apr 02 14:03:19 crc kubenswrapper[4732]: I0402 14:03:19.998040 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbe56802-47d3-4977-ad5c-1f4223d984e4-run-httpd\") pod \"bbe56802-47d3-4977-ad5c-1f4223d984e4\" (UID: \"bbe56802-47d3-4977-ad5c-1f4223d984e4\") " Apr 02 14:03:19 crc kubenswrapper[4732]: I0402 14:03:19.998070 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbe56802-47d3-4977-ad5c-1f4223d984e4-ceilometer-tls-certs\") pod \"bbe56802-47d3-4977-ad5c-1f4223d984e4\" (UID: \"bbe56802-47d3-4977-ad5c-1f4223d984e4\") " Apr 02 14:03:19 crc kubenswrapper[4732]: I0402 14:03:19.998145 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbe56802-47d3-4977-ad5c-1f4223d984e4-combined-ca-bundle\") pod \"bbe56802-47d3-4977-ad5c-1f4223d984e4\" (UID: \"bbe56802-47d3-4977-ad5c-1f4223d984e4\") " Apr 02 14:03:19 crc kubenswrapper[4732]: I0402 14:03:19.998197 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bbe56802-47d3-4977-ad5c-1f4223d984e4-sg-core-conf-yaml\") pod \"bbe56802-47d3-4977-ad5c-1f4223d984e4\" (UID: \"bbe56802-47d3-4977-ad5c-1f4223d984e4\") " Apr 02 14:03:19 crc kubenswrapper[4732]: I0402 14:03:19.998301 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbe56802-47d3-4977-ad5c-1f4223d984e4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "bbe56802-47d3-4977-ad5c-1f4223d984e4" (UID: "bbe56802-47d3-4977-ad5c-1f4223d984e4"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 14:03:19 crc kubenswrapper[4732]: I0402 14:03:19.998492 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbe56802-47d3-4977-ad5c-1f4223d984e4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "bbe56802-47d3-4977-ad5c-1f4223d984e4" (UID: "bbe56802-47d3-4977-ad5c-1f4223d984e4"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 14:03:19 crc kubenswrapper[4732]: I0402 14:03:19.998651 4732 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbe56802-47d3-4977-ad5c-1f4223d984e4-log-httpd\") on node \"crc\" DevicePath \"\"" Apr 02 14:03:19 crc kubenswrapper[4732]: I0402 14:03:19.998669 4732 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbe56802-47d3-4977-ad5c-1f4223d984e4-run-httpd\") on node \"crc\" DevicePath \"\"" Apr 02 14:03:20 crc kubenswrapper[4732]: I0402 14:03:20.003819 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbe56802-47d3-4977-ad5c-1f4223d984e4-scripts" (OuterVolumeSpecName: "scripts") pod "bbe56802-47d3-4977-ad5c-1f4223d984e4" (UID: "bbe56802-47d3-4977-ad5c-1f4223d984e4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:03:20 crc kubenswrapper[4732]: I0402 14:03:20.007551 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbe56802-47d3-4977-ad5c-1f4223d984e4-kube-api-access-v4hnk" (OuterVolumeSpecName: "kube-api-access-v4hnk") pod "bbe56802-47d3-4977-ad5c-1f4223d984e4" (UID: "bbe56802-47d3-4977-ad5c-1f4223d984e4"). InnerVolumeSpecName "kube-api-access-v4hnk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:03:20 crc kubenswrapper[4732]: I0402 14:03:20.033657 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbe56802-47d3-4977-ad5c-1f4223d984e4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "bbe56802-47d3-4977-ad5c-1f4223d984e4" (UID: "bbe56802-47d3-4977-ad5c-1f4223d984e4"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:03:20 crc kubenswrapper[4732]: I0402 14:03:20.061967 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbe56802-47d3-4977-ad5c-1f4223d984e4-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "bbe56802-47d3-4977-ad5c-1f4223d984e4" (UID: "bbe56802-47d3-4977-ad5c-1f4223d984e4"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:03:20 crc kubenswrapper[4732]: I0402 14:03:20.086987 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbe56802-47d3-4977-ad5c-1f4223d984e4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bbe56802-47d3-4977-ad5c-1f4223d984e4" (UID: "bbe56802-47d3-4977-ad5c-1f4223d984e4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:03:20 crc kubenswrapper[4732]: I0402 14:03:20.100533 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4hnk\" (UniqueName: \"kubernetes.io/projected/bbe56802-47d3-4977-ad5c-1f4223d984e4-kube-api-access-v4hnk\") on node \"crc\" DevicePath \"\"" Apr 02 14:03:20 crc kubenswrapper[4732]: I0402 14:03:20.100566 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbe56802-47d3-4977-ad5c-1f4223d984e4-scripts\") on node \"crc\" DevicePath \"\"" Apr 02 14:03:20 crc kubenswrapper[4732]: I0402 14:03:20.100576 4732 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbe56802-47d3-4977-ad5c-1f4223d984e4-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Apr 02 14:03:20 crc kubenswrapper[4732]: I0402 14:03:20.100586 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbe56802-47d3-4977-ad5c-1f4223d984e4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 02 14:03:20 crc kubenswrapper[4732]: I0402 14:03:20.100594 4732 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bbe56802-47d3-4977-ad5c-1f4223d984e4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Apr 02 14:03:20 crc kubenswrapper[4732]: I0402 14:03:20.108873 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbe56802-47d3-4977-ad5c-1f4223d984e4-config-data" (OuterVolumeSpecName: "config-data") pod "bbe56802-47d3-4977-ad5c-1f4223d984e4" (UID: "bbe56802-47d3-4977-ad5c-1f4223d984e4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:03:20 crc kubenswrapper[4732]: I0402 14:03:20.202626 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbe56802-47d3-4977-ad5c-1f4223d984e4-config-data\") on node \"crc\" DevicePath \"\"" Apr 02 14:03:20 crc kubenswrapper[4732]: I0402 14:03:20.733474 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bbe56802-47d3-4977-ad5c-1f4223d984e4","Type":"ContainerDied","Data":"603f6595a4b14d769d2d6180978748cd6df477c752db5a8cae36dfb826f741fd"} Apr 02 14:03:20 crc kubenswrapper[4732]: I0402 14:03:20.733523 4732 scope.go:117] "RemoveContainer" containerID="a8bb55a1ac6a17d89b80d83ecab6d30c708be261b92e66adb7bad3020c7208af" Apr 02 14:03:20 crc kubenswrapper[4732]: I0402 14:03:20.733689 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Apr 02 14:03:20 crc kubenswrapper[4732]: I0402 14:03:20.739184 4732 generic.go:334] "Generic (PLEG): container finished" podID="528e73de-0003-424b-9f26-7dd54d1942f2" containerID="701851997acc62fba000c6ed45cd23fa4b96f1745fb951c822c9e909dfb07771" exitCode=143 Apr 02 14:03:20 crc kubenswrapper[4732]: I0402 14:03:20.739236 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"528e73de-0003-424b-9f26-7dd54d1942f2","Type":"ContainerDied","Data":"701851997acc62fba000c6ed45cd23fa4b96f1745fb951c822c9e909dfb07771"} Apr 02 14:03:20 crc kubenswrapper[4732]: I0402 14:03:20.780597 4732 scope.go:117] "RemoveContainer" containerID="3e2ebc42fe287a20a712e93cc316d5757dbdee4f1664600996be458a1d5393b5" Apr 02 14:03:20 crc kubenswrapper[4732]: I0402 14:03:20.785741 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Apr 02 14:03:20 crc kubenswrapper[4732]: I0402 14:03:20.797779 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Apr 02 
14:03:20 crc kubenswrapper[4732]: I0402 14:03:20.810036 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Apr 02 14:03:20 crc kubenswrapper[4732]: E0402 14:03:20.810605 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbe56802-47d3-4977-ad5c-1f4223d984e4" containerName="ceilometer-central-agent" Apr 02 14:03:20 crc kubenswrapper[4732]: I0402 14:03:20.810629 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbe56802-47d3-4977-ad5c-1f4223d984e4" containerName="ceilometer-central-agent" Apr 02 14:03:20 crc kubenswrapper[4732]: E0402 14:03:20.810643 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbe56802-47d3-4977-ad5c-1f4223d984e4" containerName="ceilometer-notification-agent" Apr 02 14:03:20 crc kubenswrapper[4732]: I0402 14:03:20.810649 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbe56802-47d3-4977-ad5c-1f4223d984e4" containerName="ceilometer-notification-agent" Apr 02 14:03:20 crc kubenswrapper[4732]: E0402 14:03:20.810675 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbe56802-47d3-4977-ad5c-1f4223d984e4" containerName="proxy-httpd" Apr 02 14:03:20 crc kubenswrapper[4732]: I0402 14:03:20.810681 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbe56802-47d3-4977-ad5c-1f4223d984e4" containerName="proxy-httpd" Apr 02 14:03:20 crc kubenswrapper[4732]: E0402 14:03:20.810699 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbe56802-47d3-4977-ad5c-1f4223d984e4" containerName="sg-core" Apr 02 14:03:20 crc kubenswrapper[4732]: I0402 14:03:20.810704 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbe56802-47d3-4977-ad5c-1f4223d984e4" containerName="sg-core" Apr 02 14:03:20 crc kubenswrapper[4732]: I0402 14:03:20.810872 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbe56802-47d3-4977-ad5c-1f4223d984e4" containerName="ceilometer-central-agent" Apr 02 14:03:20 crc 
kubenswrapper[4732]: I0402 14:03:20.810885 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbe56802-47d3-4977-ad5c-1f4223d984e4" containerName="ceilometer-notification-agent" Apr 02 14:03:20 crc kubenswrapper[4732]: I0402 14:03:20.810898 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbe56802-47d3-4977-ad5c-1f4223d984e4" containerName="sg-core" Apr 02 14:03:20 crc kubenswrapper[4732]: I0402 14:03:20.810917 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbe56802-47d3-4977-ad5c-1f4223d984e4" containerName="proxy-httpd" Apr 02 14:03:20 crc kubenswrapper[4732]: I0402 14:03:20.813937 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Apr 02 14:03:20 crc kubenswrapper[4732]: I0402 14:03:20.818716 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Apr 02 14:03:20 crc kubenswrapper[4732]: I0402 14:03:20.820780 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Apr 02 14:03:20 crc kubenswrapper[4732]: I0402 14:03:20.821021 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Apr 02 14:03:20 crc kubenswrapper[4732]: I0402 14:03:20.823794 4732 scope.go:117] "RemoveContainer" containerID="40598c5244c261a553473444f4fdef43a20371ebd9ec11c02d9850f44828de4d" Apr 02 14:03:20 crc kubenswrapper[4732]: I0402 14:03:20.823890 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Apr 02 14:03:20 crc kubenswrapper[4732]: I0402 14:03:20.857720 4732 scope.go:117] "RemoveContainer" containerID="8470ac4168306100ac49bef7fa6845da4b17056e0ad926f76c2f9c0ad0645754" Apr 02 14:03:20 crc kubenswrapper[4732]: I0402 14:03:20.913883 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/ad6cbd5d-4434-4885-bf56-8ee47171b897-log-httpd\") pod \"ceilometer-0\" (UID: \"ad6cbd5d-4434-4885-bf56-8ee47171b897\") " pod="openstack/ceilometer-0" Apr 02 14:03:20 crc kubenswrapper[4732]: I0402 14:03:20.913952 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad6cbd5d-4434-4885-bf56-8ee47171b897-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ad6cbd5d-4434-4885-bf56-8ee47171b897\") " pod="openstack/ceilometer-0" Apr 02 14:03:20 crc kubenswrapper[4732]: I0402 14:03:20.914291 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad6cbd5d-4434-4885-bf56-8ee47171b897-scripts\") pod \"ceilometer-0\" (UID: \"ad6cbd5d-4434-4885-bf56-8ee47171b897\") " pod="openstack/ceilometer-0" Apr 02 14:03:20 crc kubenswrapper[4732]: I0402 14:03:20.914356 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ad6cbd5d-4434-4885-bf56-8ee47171b897-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ad6cbd5d-4434-4885-bf56-8ee47171b897\") " pod="openstack/ceilometer-0" Apr 02 14:03:20 crc kubenswrapper[4732]: I0402 14:03:20.914422 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xnk8\" (UniqueName: \"kubernetes.io/projected/ad6cbd5d-4434-4885-bf56-8ee47171b897-kube-api-access-2xnk8\") pod \"ceilometer-0\" (UID: \"ad6cbd5d-4434-4885-bf56-8ee47171b897\") " pod="openstack/ceilometer-0" Apr 02 14:03:20 crc kubenswrapper[4732]: I0402 14:03:20.914569 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad6cbd5d-4434-4885-bf56-8ee47171b897-config-data\") pod \"ceilometer-0\" (UID: 
\"ad6cbd5d-4434-4885-bf56-8ee47171b897\") " pod="openstack/ceilometer-0" Apr 02 14:03:20 crc kubenswrapper[4732]: I0402 14:03:20.914653 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad6cbd5d-4434-4885-bf56-8ee47171b897-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ad6cbd5d-4434-4885-bf56-8ee47171b897\") " pod="openstack/ceilometer-0" Apr 02 14:03:20 crc kubenswrapper[4732]: I0402 14:03:20.914759 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad6cbd5d-4434-4885-bf56-8ee47171b897-run-httpd\") pod \"ceilometer-0\" (UID: \"ad6cbd5d-4434-4885-bf56-8ee47171b897\") " pod="openstack/ceilometer-0" Apr 02 14:03:21 crc kubenswrapper[4732]: I0402 14:03:21.016159 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad6cbd5d-4434-4885-bf56-8ee47171b897-scripts\") pod \"ceilometer-0\" (UID: \"ad6cbd5d-4434-4885-bf56-8ee47171b897\") " pod="openstack/ceilometer-0" Apr 02 14:03:21 crc kubenswrapper[4732]: I0402 14:03:21.016205 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ad6cbd5d-4434-4885-bf56-8ee47171b897-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ad6cbd5d-4434-4885-bf56-8ee47171b897\") " pod="openstack/ceilometer-0" Apr 02 14:03:21 crc kubenswrapper[4732]: I0402 14:03:21.016241 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xnk8\" (UniqueName: \"kubernetes.io/projected/ad6cbd5d-4434-4885-bf56-8ee47171b897-kube-api-access-2xnk8\") pod \"ceilometer-0\" (UID: \"ad6cbd5d-4434-4885-bf56-8ee47171b897\") " pod="openstack/ceilometer-0" Apr 02 14:03:21 crc kubenswrapper[4732]: I0402 14:03:21.016296 4732 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad6cbd5d-4434-4885-bf56-8ee47171b897-config-data\") pod \"ceilometer-0\" (UID: \"ad6cbd5d-4434-4885-bf56-8ee47171b897\") " pod="openstack/ceilometer-0" Apr 02 14:03:21 crc kubenswrapper[4732]: I0402 14:03:21.016327 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad6cbd5d-4434-4885-bf56-8ee47171b897-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ad6cbd5d-4434-4885-bf56-8ee47171b897\") " pod="openstack/ceilometer-0" Apr 02 14:03:21 crc kubenswrapper[4732]: I0402 14:03:21.016365 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad6cbd5d-4434-4885-bf56-8ee47171b897-run-httpd\") pod \"ceilometer-0\" (UID: \"ad6cbd5d-4434-4885-bf56-8ee47171b897\") " pod="openstack/ceilometer-0" Apr 02 14:03:21 crc kubenswrapper[4732]: I0402 14:03:21.016421 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad6cbd5d-4434-4885-bf56-8ee47171b897-log-httpd\") pod \"ceilometer-0\" (UID: \"ad6cbd5d-4434-4885-bf56-8ee47171b897\") " pod="openstack/ceilometer-0" Apr 02 14:03:21 crc kubenswrapper[4732]: I0402 14:03:21.016462 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad6cbd5d-4434-4885-bf56-8ee47171b897-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ad6cbd5d-4434-4885-bf56-8ee47171b897\") " pod="openstack/ceilometer-0" Apr 02 14:03:21 crc kubenswrapper[4732]: I0402 14:03:21.017714 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad6cbd5d-4434-4885-bf56-8ee47171b897-run-httpd\") pod \"ceilometer-0\" (UID: \"ad6cbd5d-4434-4885-bf56-8ee47171b897\") " 
pod="openstack/ceilometer-0" Apr 02 14:03:21 crc kubenswrapper[4732]: I0402 14:03:21.019088 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad6cbd5d-4434-4885-bf56-8ee47171b897-log-httpd\") pod \"ceilometer-0\" (UID: \"ad6cbd5d-4434-4885-bf56-8ee47171b897\") " pod="openstack/ceilometer-0" Apr 02 14:03:21 crc kubenswrapper[4732]: I0402 14:03:21.023338 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad6cbd5d-4434-4885-bf56-8ee47171b897-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ad6cbd5d-4434-4885-bf56-8ee47171b897\") " pod="openstack/ceilometer-0" Apr 02 14:03:21 crc kubenswrapper[4732]: I0402 14:03:21.023434 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad6cbd5d-4434-4885-bf56-8ee47171b897-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ad6cbd5d-4434-4885-bf56-8ee47171b897\") " pod="openstack/ceilometer-0" Apr 02 14:03:21 crc kubenswrapper[4732]: I0402 14:03:21.023933 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad6cbd5d-4434-4885-bf56-8ee47171b897-scripts\") pod \"ceilometer-0\" (UID: \"ad6cbd5d-4434-4885-bf56-8ee47171b897\") " pod="openstack/ceilometer-0" Apr 02 14:03:21 crc kubenswrapper[4732]: I0402 14:03:21.025758 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ad6cbd5d-4434-4885-bf56-8ee47171b897-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ad6cbd5d-4434-4885-bf56-8ee47171b897\") " pod="openstack/ceilometer-0" Apr 02 14:03:21 crc kubenswrapper[4732]: I0402 14:03:21.030574 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ad6cbd5d-4434-4885-bf56-8ee47171b897-config-data\") pod \"ceilometer-0\" (UID: \"ad6cbd5d-4434-4885-bf56-8ee47171b897\") " pod="openstack/ceilometer-0" Apr 02 14:03:21 crc kubenswrapper[4732]: I0402 14:03:21.040011 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xnk8\" (UniqueName: \"kubernetes.io/projected/ad6cbd5d-4434-4885-bf56-8ee47171b897-kube-api-access-2xnk8\") pod \"ceilometer-0\" (UID: \"ad6cbd5d-4434-4885-bf56-8ee47171b897\") " pod="openstack/ceilometer-0" Apr 02 14:03:21 crc kubenswrapper[4732]: I0402 14:03:21.148703 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Apr 02 14:03:21 crc kubenswrapper[4732]: I0402 14:03:21.623230 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Apr 02 14:03:21 crc kubenswrapper[4732]: W0402 14:03:21.623880 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad6cbd5d_4434_4885_bf56_8ee47171b897.slice/crio-5e611e5c0268d2c23b8dc2e1145c31eeedcf8b3f3f32b401ff1410f55959547a WatchSource:0}: Error finding container 5e611e5c0268d2c23b8dc2e1145c31eeedcf8b3f3f32b401ff1410f55959547a: Status 404 returned error can't find the container with id 5e611e5c0268d2c23b8dc2e1145c31eeedcf8b3f3f32b401ff1410f55959547a Apr 02 14:03:21 crc kubenswrapper[4732]: I0402 14:03:21.749402 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad6cbd5d-4434-4885-bf56-8ee47171b897","Type":"ContainerStarted","Data":"5e611e5c0268d2c23b8dc2e1145c31eeedcf8b3f3f32b401ff1410f55959547a"} Apr 02 14:03:22 crc kubenswrapper[4732]: I0402 14:03:22.692214 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbe56802-47d3-4977-ad5c-1f4223d984e4" path="/var/lib/kubelet/pods/bbe56802-47d3-4977-ad5c-1f4223d984e4/volumes" Apr 02 14:03:22 crc kubenswrapper[4732]: I0402 
14:03:22.778660 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad6cbd5d-4434-4885-bf56-8ee47171b897","Type":"ContainerStarted","Data":"ef7991fd6fda437f8e8a1380ce138b14eb3500148c206ba22b7dbbe0d0e7b19b"} Apr 02 14:03:23 crc kubenswrapper[4732]: I0402 14:03:23.285473 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Apr 02 14:03:23 crc kubenswrapper[4732]: I0402 14:03:23.358253 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/528e73de-0003-424b-9f26-7dd54d1942f2-combined-ca-bundle\") pod \"528e73de-0003-424b-9f26-7dd54d1942f2\" (UID: \"528e73de-0003-424b-9f26-7dd54d1942f2\") " Apr 02 14:03:23 crc kubenswrapper[4732]: I0402 14:03:23.358462 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6pfld\" (UniqueName: \"kubernetes.io/projected/528e73de-0003-424b-9f26-7dd54d1942f2-kube-api-access-6pfld\") pod \"528e73de-0003-424b-9f26-7dd54d1942f2\" (UID: \"528e73de-0003-424b-9f26-7dd54d1942f2\") " Apr 02 14:03:23 crc kubenswrapper[4732]: I0402 14:03:23.358536 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/528e73de-0003-424b-9f26-7dd54d1942f2-config-data\") pod \"528e73de-0003-424b-9f26-7dd54d1942f2\" (UID: \"528e73de-0003-424b-9f26-7dd54d1942f2\") " Apr 02 14:03:23 crc kubenswrapper[4732]: I0402 14:03:23.358557 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/528e73de-0003-424b-9f26-7dd54d1942f2-logs\") pod \"528e73de-0003-424b-9f26-7dd54d1942f2\" (UID: \"528e73de-0003-424b-9f26-7dd54d1942f2\") " Apr 02 14:03:23 crc kubenswrapper[4732]: I0402 14:03:23.359698 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/528e73de-0003-424b-9f26-7dd54d1942f2-logs" (OuterVolumeSpecName: "logs") pod "528e73de-0003-424b-9f26-7dd54d1942f2" (UID: "528e73de-0003-424b-9f26-7dd54d1942f2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 14:03:23 crc kubenswrapper[4732]: I0402 14:03:23.366589 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/528e73de-0003-424b-9f26-7dd54d1942f2-kube-api-access-6pfld" (OuterVolumeSpecName: "kube-api-access-6pfld") pod "528e73de-0003-424b-9f26-7dd54d1942f2" (UID: "528e73de-0003-424b-9f26-7dd54d1942f2"). InnerVolumeSpecName "kube-api-access-6pfld". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:03:23 crc kubenswrapper[4732]: I0402 14:03:23.397258 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/528e73de-0003-424b-9f26-7dd54d1942f2-config-data" (OuterVolumeSpecName: "config-data") pod "528e73de-0003-424b-9f26-7dd54d1942f2" (UID: "528e73de-0003-424b-9f26-7dd54d1942f2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:03:23 crc kubenswrapper[4732]: I0402 14:03:23.405703 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/528e73de-0003-424b-9f26-7dd54d1942f2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "528e73de-0003-424b-9f26-7dd54d1942f2" (UID: "528e73de-0003-424b-9f26-7dd54d1942f2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:03:23 crc kubenswrapper[4732]: I0402 14:03:23.461319 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/528e73de-0003-424b-9f26-7dd54d1942f2-config-data\") on node \"crc\" DevicePath \"\"" Apr 02 14:03:23 crc kubenswrapper[4732]: I0402 14:03:23.461755 4732 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/528e73de-0003-424b-9f26-7dd54d1942f2-logs\") on node \"crc\" DevicePath \"\"" Apr 02 14:03:23 crc kubenswrapper[4732]: I0402 14:03:23.461765 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/528e73de-0003-424b-9f26-7dd54d1942f2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 02 14:03:23 crc kubenswrapper[4732]: I0402 14:03:23.461776 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6pfld\" (UniqueName: \"kubernetes.io/projected/528e73de-0003-424b-9f26-7dd54d1942f2-kube-api-access-6pfld\") on node \"crc\" DevicePath \"\"" Apr 02 14:03:23 crc kubenswrapper[4732]: I0402 14:03:23.788843 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad6cbd5d-4434-4885-bf56-8ee47171b897","Type":"ContainerStarted","Data":"fc80cecc6f55363b7dea1afb654d7146cd395ad8e2d266fb2b2a8cba8cfbf3fa"} Apr 02 14:03:23 crc kubenswrapper[4732]: I0402 14:03:23.788884 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad6cbd5d-4434-4885-bf56-8ee47171b897","Type":"ContainerStarted","Data":"793524b5ce8f488183a894c5f46a6609b4aed9fb970f95a39d9e2b2eb479438d"} Apr 02 14:03:23 crc kubenswrapper[4732]: I0402 14:03:23.790911 4732 generic.go:334] "Generic (PLEG): container finished" podID="528e73de-0003-424b-9f26-7dd54d1942f2" containerID="5819edcdf83ec3c7ea34879074ac0ababcaeb29f71ee732cc169fcd2c92c4358" exitCode=0 Apr 02 14:03:23 crc 
kubenswrapper[4732]: I0402 14:03:23.790946 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"528e73de-0003-424b-9f26-7dd54d1942f2","Type":"ContainerDied","Data":"5819edcdf83ec3c7ea34879074ac0ababcaeb29f71ee732cc169fcd2c92c4358"} Apr 02 14:03:23 crc kubenswrapper[4732]: I0402 14:03:23.790964 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"528e73de-0003-424b-9f26-7dd54d1942f2","Type":"ContainerDied","Data":"2197897d2d4af884722456de6d28e3d53266723c638dcc4cbfb36f587abfc4f3"} Apr 02 14:03:23 crc kubenswrapper[4732]: I0402 14:03:23.790981 4732 scope.go:117] "RemoveContainer" containerID="5819edcdf83ec3c7ea34879074ac0ababcaeb29f71ee732cc169fcd2c92c4358" Apr 02 14:03:23 crc kubenswrapper[4732]: I0402 14:03:23.791097 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Apr 02 14:03:23 crc kubenswrapper[4732]: I0402 14:03:23.836631 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Apr 02 14:03:23 crc kubenswrapper[4732]: I0402 14:03:23.855236 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Apr 02 14:03:23 crc kubenswrapper[4732]: I0402 14:03:23.861935 4732 scope.go:117] "RemoveContainer" containerID="701851997acc62fba000c6ed45cd23fa4b96f1745fb951c822c9e909dfb07771" Apr 02 14:03:23 crc kubenswrapper[4732]: I0402 14:03:23.868552 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Apr 02 14:03:23 crc kubenswrapper[4732]: E0402 14:03:23.869120 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="528e73de-0003-424b-9f26-7dd54d1942f2" containerName="nova-api-api" Apr 02 14:03:23 crc kubenswrapper[4732]: I0402 14:03:23.869144 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="528e73de-0003-424b-9f26-7dd54d1942f2" containerName="nova-api-api" Apr 02 14:03:23 crc kubenswrapper[4732]: E0402 14:03:23.869173 4732 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="528e73de-0003-424b-9f26-7dd54d1942f2" containerName="nova-api-log" Apr 02 14:03:23 crc kubenswrapper[4732]: I0402 14:03:23.869183 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="528e73de-0003-424b-9f26-7dd54d1942f2" containerName="nova-api-log" Apr 02 14:03:23 crc kubenswrapper[4732]: I0402 14:03:23.869470 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="528e73de-0003-424b-9f26-7dd54d1942f2" containerName="nova-api-api" Apr 02 14:03:23 crc kubenswrapper[4732]: I0402 14:03:23.869497 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="528e73de-0003-424b-9f26-7dd54d1942f2" containerName="nova-api-log" Apr 02 14:03:23 crc kubenswrapper[4732]: I0402 14:03:23.870826 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Apr 02 14:03:23 crc kubenswrapper[4732]: I0402 14:03:23.875803 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Apr 02 14:03:23 crc kubenswrapper[4732]: I0402 14:03:23.875839 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Apr 02 14:03:23 crc kubenswrapper[4732]: I0402 14:03:23.876119 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Apr 02 14:03:23 crc kubenswrapper[4732]: I0402 14:03:23.878206 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Apr 02 14:03:23 crc kubenswrapper[4732]: I0402 14:03:23.894291 4732 scope.go:117] "RemoveContainer" containerID="5819edcdf83ec3c7ea34879074ac0ababcaeb29f71ee732cc169fcd2c92c4358" Apr 02 14:03:23 crc kubenswrapper[4732]: E0402 14:03:23.897782 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5819edcdf83ec3c7ea34879074ac0ababcaeb29f71ee732cc169fcd2c92c4358\": container with ID starting 
with 5819edcdf83ec3c7ea34879074ac0ababcaeb29f71ee732cc169fcd2c92c4358 not found: ID does not exist" containerID="5819edcdf83ec3c7ea34879074ac0ababcaeb29f71ee732cc169fcd2c92c4358" Apr 02 14:03:23 crc kubenswrapper[4732]: I0402 14:03:23.897908 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5819edcdf83ec3c7ea34879074ac0ababcaeb29f71ee732cc169fcd2c92c4358"} err="failed to get container status \"5819edcdf83ec3c7ea34879074ac0ababcaeb29f71ee732cc169fcd2c92c4358\": rpc error: code = NotFound desc = could not find container \"5819edcdf83ec3c7ea34879074ac0ababcaeb29f71ee732cc169fcd2c92c4358\": container with ID starting with 5819edcdf83ec3c7ea34879074ac0ababcaeb29f71ee732cc169fcd2c92c4358 not found: ID does not exist" Apr 02 14:03:23 crc kubenswrapper[4732]: I0402 14:03:23.897988 4732 scope.go:117] "RemoveContainer" containerID="701851997acc62fba000c6ed45cd23fa4b96f1745fb951c822c9e909dfb07771" Apr 02 14:03:23 crc kubenswrapper[4732]: E0402 14:03:23.902700 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"701851997acc62fba000c6ed45cd23fa4b96f1745fb951c822c9e909dfb07771\": container with ID starting with 701851997acc62fba000c6ed45cd23fa4b96f1745fb951c822c9e909dfb07771 not found: ID does not exist" containerID="701851997acc62fba000c6ed45cd23fa4b96f1745fb951c822c9e909dfb07771" Apr 02 14:03:23 crc kubenswrapper[4732]: I0402 14:03:23.902801 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"701851997acc62fba000c6ed45cd23fa4b96f1745fb951c822c9e909dfb07771"} err="failed to get container status \"701851997acc62fba000c6ed45cd23fa4b96f1745fb951c822c9e909dfb07771\": rpc error: code = NotFound desc = could not find container \"701851997acc62fba000c6ed45cd23fa4b96f1745fb951c822c9e909dfb07771\": container with ID starting with 701851997acc62fba000c6ed45cd23fa4b96f1745fb951c822c9e909dfb07771 not found: ID does 
not exist" Apr 02 14:03:23 crc kubenswrapper[4732]: I0402 14:03:23.971671 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6983768-ac75-4ee9-9786-4c46b45c428f-config-data\") pod \"nova-api-0\" (UID: \"f6983768-ac75-4ee9-9786-4c46b45c428f\") " pod="openstack/nova-api-0" Apr 02 14:03:23 crc kubenswrapper[4732]: I0402 14:03:23.971789 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6983768-ac75-4ee9-9786-4c46b45c428f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f6983768-ac75-4ee9-9786-4c46b45c428f\") " pod="openstack/nova-api-0" Apr 02 14:03:23 crc kubenswrapper[4732]: I0402 14:03:23.971813 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6983768-ac75-4ee9-9786-4c46b45c428f-public-tls-certs\") pod \"nova-api-0\" (UID: \"f6983768-ac75-4ee9-9786-4c46b45c428f\") " pod="openstack/nova-api-0" Apr 02 14:03:23 crc kubenswrapper[4732]: I0402 14:03:23.971859 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcnmr\" (UniqueName: \"kubernetes.io/projected/f6983768-ac75-4ee9-9786-4c46b45c428f-kube-api-access-tcnmr\") pod \"nova-api-0\" (UID: \"f6983768-ac75-4ee9-9786-4c46b45c428f\") " pod="openstack/nova-api-0" Apr 02 14:03:23 crc kubenswrapper[4732]: I0402 14:03:23.971904 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6983768-ac75-4ee9-9786-4c46b45c428f-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f6983768-ac75-4ee9-9786-4c46b45c428f\") " pod="openstack/nova-api-0" Apr 02 14:03:23 crc kubenswrapper[4732]: I0402 14:03:23.971924 4732 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6983768-ac75-4ee9-9786-4c46b45c428f-logs\") pod \"nova-api-0\" (UID: \"f6983768-ac75-4ee9-9786-4c46b45c428f\") " pod="openstack/nova-api-0" Apr 02 14:03:24 crc kubenswrapper[4732]: I0402 14:03:24.073934 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6983768-ac75-4ee9-9786-4c46b45c428f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f6983768-ac75-4ee9-9786-4c46b45c428f\") " pod="openstack/nova-api-0" Apr 02 14:03:24 crc kubenswrapper[4732]: I0402 14:03:24.073975 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6983768-ac75-4ee9-9786-4c46b45c428f-public-tls-certs\") pod \"nova-api-0\" (UID: \"f6983768-ac75-4ee9-9786-4c46b45c428f\") " pod="openstack/nova-api-0" Apr 02 14:03:24 crc kubenswrapper[4732]: I0402 14:03:24.074019 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcnmr\" (UniqueName: \"kubernetes.io/projected/f6983768-ac75-4ee9-9786-4c46b45c428f-kube-api-access-tcnmr\") pod \"nova-api-0\" (UID: \"f6983768-ac75-4ee9-9786-4c46b45c428f\") " pod="openstack/nova-api-0" Apr 02 14:03:24 crc kubenswrapper[4732]: I0402 14:03:24.074059 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6983768-ac75-4ee9-9786-4c46b45c428f-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f6983768-ac75-4ee9-9786-4c46b45c428f\") " pod="openstack/nova-api-0" Apr 02 14:03:24 crc kubenswrapper[4732]: I0402 14:03:24.074078 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6983768-ac75-4ee9-9786-4c46b45c428f-logs\") pod \"nova-api-0\" (UID: \"f6983768-ac75-4ee9-9786-4c46b45c428f\") 
" pod="openstack/nova-api-0" Apr 02 14:03:24 crc kubenswrapper[4732]: I0402 14:03:24.074100 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6983768-ac75-4ee9-9786-4c46b45c428f-config-data\") pod \"nova-api-0\" (UID: \"f6983768-ac75-4ee9-9786-4c46b45c428f\") " pod="openstack/nova-api-0" Apr 02 14:03:24 crc kubenswrapper[4732]: I0402 14:03:24.074919 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6983768-ac75-4ee9-9786-4c46b45c428f-logs\") pod \"nova-api-0\" (UID: \"f6983768-ac75-4ee9-9786-4c46b45c428f\") " pod="openstack/nova-api-0" Apr 02 14:03:24 crc kubenswrapper[4732]: I0402 14:03:24.079314 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6983768-ac75-4ee9-9786-4c46b45c428f-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f6983768-ac75-4ee9-9786-4c46b45c428f\") " pod="openstack/nova-api-0" Apr 02 14:03:24 crc kubenswrapper[4732]: I0402 14:03:24.079766 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6983768-ac75-4ee9-9786-4c46b45c428f-config-data\") pod \"nova-api-0\" (UID: \"f6983768-ac75-4ee9-9786-4c46b45c428f\") " pod="openstack/nova-api-0" Apr 02 14:03:24 crc kubenswrapper[4732]: I0402 14:03:24.100529 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcnmr\" (UniqueName: \"kubernetes.io/projected/f6983768-ac75-4ee9-9786-4c46b45c428f-kube-api-access-tcnmr\") pod \"nova-api-0\" (UID: \"f6983768-ac75-4ee9-9786-4c46b45c428f\") " pod="openstack/nova-api-0" Apr 02 14:03:24 crc kubenswrapper[4732]: I0402 14:03:24.102645 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6983768-ac75-4ee9-9786-4c46b45c428f-public-tls-certs\") pod 
\"nova-api-0\" (UID: \"f6983768-ac75-4ee9-9786-4c46b45c428f\") " pod="openstack/nova-api-0"
Apr 02 14:03:24 crc kubenswrapper[4732]: I0402 14:03:24.104415 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6983768-ac75-4ee9-9786-4c46b45c428f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f6983768-ac75-4ee9-9786-4c46b45c428f\") " pod="openstack/nova-api-0"
Apr 02 14:03:24 crc kubenswrapper[4732]: I0402 14:03:24.218224 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Apr 02 14:03:24 crc kubenswrapper[4732]: I0402 14:03:24.328322 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0"
Apr 02 14:03:24 crc kubenswrapper[4732]: I0402 14:03:24.357781 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0"
Apr 02 14:03:24 crc kubenswrapper[4732]: I0402 14:03:24.693263 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="528e73de-0003-424b-9f26-7dd54d1942f2" path="/var/lib/kubelet/pods/528e73de-0003-424b-9f26-7dd54d1942f2/volumes"
Apr 02 14:03:24 crc kubenswrapper[4732]: I0402 14:03:24.694421 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Apr 02 14:03:24 crc kubenswrapper[4732]: I0402 14:03:24.800960 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f6983768-ac75-4ee9-9786-4c46b45c428f","Type":"ContainerStarted","Data":"2074d94acba0f9080edd2c2711b97f09cc45e2742f4ed271abe8b164931f44d9"}
Apr 02 14:03:24 crc kubenswrapper[4732]: I0402 14:03:24.818632 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0"
Apr 02 14:03:25 crc kubenswrapper[4732]: I0402 14:03:25.078414 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-mzjkp"]
Apr 02 14:03:25 crc kubenswrapper[4732]: I0402 14:03:25.080317 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mzjkp"
Apr 02 14:03:25 crc kubenswrapper[4732]: I0402 14:03:25.082508 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts"
Apr 02 14:03:25 crc kubenswrapper[4732]: I0402 14:03:25.082508 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data"
Apr 02 14:03:25 crc kubenswrapper[4732]: I0402 14:03:25.088004 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-mzjkp"]
Apr 02 14:03:25 crc kubenswrapper[4732]: I0402 14:03:25.105176 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a66a025-735c-4c9a-b4cd-06046d7d3881-scripts\") pod \"nova-cell1-cell-mapping-mzjkp\" (UID: \"6a66a025-735c-4c9a-b4cd-06046d7d3881\") " pod="openstack/nova-cell1-cell-mapping-mzjkp"
Apr 02 14:03:25 crc kubenswrapper[4732]: I0402 14:03:25.105398 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a66a025-735c-4c9a-b4cd-06046d7d3881-config-data\") pod \"nova-cell1-cell-mapping-mzjkp\" (UID: \"6a66a025-735c-4c9a-b4cd-06046d7d3881\") " pod="openstack/nova-cell1-cell-mapping-mzjkp"
Apr 02 14:03:25 crc kubenswrapper[4732]: I0402 14:03:25.105710 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a66a025-735c-4c9a-b4cd-06046d7d3881-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-mzjkp\" (UID: \"6a66a025-735c-4c9a-b4cd-06046d7d3881\") " pod="openstack/nova-cell1-cell-mapping-mzjkp"
Apr 02 14:03:25 crc kubenswrapper[4732]: I0402 14:03:25.106016 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtdxm\" (UniqueName: \"kubernetes.io/projected/6a66a025-735c-4c9a-b4cd-06046d7d3881-kube-api-access-rtdxm\") pod \"nova-cell1-cell-mapping-mzjkp\" (UID: \"6a66a025-735c-4c9a-b4cd-06046d7d3881\") " pod="openstack/nova-cell1-cell-mapping-mzjkp"
Apr 02 14:03:25 crc kubenswrapper[4732]: I0402 14:03:25.207542 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a66a025-735c-4c9a-b4cd-06046d7d3881-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-mzjkp\" (UID: \"6a66a025-735c-4c9a-b4cd-06046d7d3881\") " pod="openstack/nova-cell1-cell-mapping-mzjkp"
Apr 02 14:03:25 crc kubenswrapper[4732]: I0402 14:03:25.207654 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtdxm\" (UniqueName: \"kubernetes.io/projected/6a66a025-735c-4c9a-b4cd-06046d7d3881-kube-api-access-rtdxm\") pod \"nova-cell1-cell-mapping-mzjkp\" (UID: \"6a66a025-735c-4c9a-b4cd-06046d7d3881\") " pod="openstack/nova-cell1-cell-mapping-mzjkp"
Apr 02 14:03:25 crc kubenswrapper[4732]: I0402 14:03:25.207734 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a66a025-735c-4c9a-b4cd-06046d7d3881-scripts\") pod \"nova-cell1-cell-mapping-mzjkp\" (UID: \"6a66a025-735c-4c9a-b4cd-06046d7d3881\") " pod="openstack/nova-cell1-cell-mapping-mzjkp"
Apr 02 14:03:25 crc kubenswrapper[4732]: I0402 14:03:25.207754 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a66a025-735c-4c9a-b4cd-06046d7d3881-config-data\") pod \"nova-cell1-cell-mapping-mzjkp\" (UID: \"6a66a025-735c-4c9a-b4cd-06046d7d3881\") " pod="openstack/nova-cell1-cell-mapping-mzjkp"
Apr 02 14:03:25 crc kubenswrapper[4732]: I0402 14:03:25.213397 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a66a025-735c-4c9a-b4cd-06046d7d3881-config-data\") pod \"nova-cell1-cell-mapping-mzjkp\" (UID: \"6a66a025-735c-4c9a-b4cd-06046d7d3881\") " pod="openstack/nova-cell1-cell-mapping-mzjkp"
Apr 02 14:03:25 crc kubenswrapper[4732]: I0402 14:03:25.213882 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a66a025-735c-4c9a-b4cd-06046d7d3881-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-mzjkp\" (UID: \"6a66a025-735c-4c9a-b4cd-06046d7d3881\") " pod="openstack/nova-cell1-cell-mapping-mzjkp"
Apr 02 14:03:25 crc kubenswrapper[4732]: I0402 14:03:25.213894 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a66a025-735c-4c9a-b4cd-06046d7d3881-scripts\") pod \"nova-cell1-cell-mapping-mzjkp\" (UID: \"6a66a025-735c-4c9a-b4cd-06046d7d3881\") " pod="openstack/nova-cell1-cell-mapping-mzjkp"
Apr 02 14:03:25 crc kubenswrapper[4732]: I0402 14:03:25.224878 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtdxm\" (UniqueName: \"kubernetes.io/projected/6a66a025-735c-4c9a-b4cd-06046d7d3881-kube-api-access-rtdxm\") pod \"nova-cell1-cell-mapping-mzjkp\" (UID: \"6a66a025-735c-4c9a-b4cd-06046d7d3881\") " pod="openstack/nova-cell1-cell-mapping-mzjkp"
Apr 02 14:03:25 crc kubenswrapper[4732]: I0402 14:03:25.446491 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mzjkp"
Apr 02 14:03:25 crc kubenswrapper[4732]: I0402 14:03:25.826002 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad6cbd5d-4434-4885-bf56-8ee47171b897","Type":"ContainerStarted","Data":"dff2820783563536225432398b16ec6d0c0c335ed7ef627e3fd74950977eea4f"}
Apr 02 14:03:25 crc kubenswrapper[4732]: I0402 14:03:25.826056 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Apr 02 14:03:25 crc kubenswrapper[4732]: I0402 14:03:25.829774 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f6983768-ac75-4ee9-9786-4c46b45c428f","Type":"ContainerStarted","Data":"fa6eec165173de6ab1d738e7b412a7d3f02c1033b5f787a929f005d54e5f2bbd"}
Apr 02 14:03:25 crc kubenswrapper[4732]: I0402 14:03:25.829843 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f6983768-ac75-4ee9-9786-4c46b45c428f","Type":"ContainerStarted","Data":"66602a1bf8154f811b73e0d3f5226afdfc2fc36ae1d03711e564f72fbd3d2ec6"}
Apr 02 14:03:25 crc kubenswrapper[4732]: I0402 14:03:25.877239 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.8772215599999997 podStartE2EDuration="2.87722156s" podCreationTimestamp="2026-04-02 14:03:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 14:03:25.876004017 +0000 UTC m=+1562.780411590" watchObservedRunningTime="2026-04-02 14:03:25.87722156 +0000 UTC m=+1562.781629113"
Apr 02 14:03:25 crc kubenswrapper[4732]: I0402 14:03:25.884460 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.237107542 podStartE2EDuration="5.884439929s" podCreationTimestamp="2026-04-02 14:03:20 +0000 UTC" firstStartedPulling="2026-04-02 14:03:21.626534254 +0000 UTC m=+1558.530941807" lastFinishedPulling="2026-04-02 14:03:25.273866641 +0000 UTC m=+1562.178274194" observedRunningTime="2026-04-02 14:03:25.851256147 +0000 UTC m=+1562.755663700" watchObservedRunningTime="2026-04-02 14:03:25.884439929 +0000 UTC m=+1562.788847492"
Apr 02 14:03:25 crc kubenswrapper[4732]: I0402 14:03:25.911536 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-mzjkp"]
Apr 02 14:03:25 crc kubenswrapper[4732]: W0402 14:03:25.913070 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a66a025_735c_4c9a_b4cd_06046d7d3881.slice/crio-da0b0719a46234a4a1e7c202e2afde8923cb6b34722116b512dd4291ea76cb29 WatchSource:0}: Error finding container da0b0719a46234a4a1e7c202e2afde8923cb6b34722116b512dd4291ea76cb29: Status 404 returned error can't find the container with id da0b0719a46234a4a1e7c202e2afde8923cb6b34722116b512dd4291ea76cb29
Apr 02 14:03:26 crc kubenswrapper[4732]: I0402 14:03:26.849382 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mzjkp" event={"ID":"6a66a025-735c-4c9a-b4cd-06046d7d3881","Type":"ContainerStarted","Data":"e8d9d5d06641186f39eedf48abb0fb9d40836867d71ca997e46a75ff5e9aa91a"}
Apr 02 14:03:26 crc kubenswrapper[4732]: I0402 14:03:26.849945 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mzjkp" event={"ID":"6a66a025-735c-4c9a-b4cd-06046d7d3881","Type":"ContainerStarted","Data":"da0b0719a46234a4a1e7c202e2afde8923cb6b34722116b512dd4291ea76cb29"}
Apr 02 14:03:26 crc kubenswrapper[4732]: I0402 14:03:26.868055 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-mzjkp" podStartSLOduration=1.868038124 podStartE2EDuration="1.868038124s" podCreationTimestamp="2026-04-02 14:03:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 14:03:26.866036008 +0000 UTC m=+1563.770443601" watchObservedRunningTime="2026-04-02 14:03:26.868038124 +0000 UTC m=+1563.772445677"
Apr 02 14:03:27 crc kubenswrapper[4732]: I0402 14:03:27.393751 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c7b6c5df9-d4dtv"
Apr 02 14:03:27 crc kubenswrapper[4732]: I0402 14:03:27.456770 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-tjr4k"]
Apr 02 14:03:27 crc kubenswrapper[4732]: I0402 14:03:27.457039 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-865f5d856f-tjr4k" podUID="bdbfa898-0999-49da-9194-ff7bf15c955a" containerName="dnsmasq-dns" containerID="cri-o://7683803f1b4e4c9804e6849c443a9edba0c3c79982d1204a8622cb6bb94f982e" gracePeriod=10
Apr 02 14:03:27 crc kubenswrapper[4732]: I0402 14:03:27.861699 4732 generic.go:334] "Generic (PLEG): container finished" podID="bdbfa898-0999-49da-9194-ff7bf15c955a" containerID="7683803f1b4e4c9804e6849c443a9edba0c3c79982d1204a8622cb6bb94f982e" exitCode=0
Apr 02 14:03:27 crc kubenswrapper[4732]: I0402 14:03:27.861856 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-tjr4k" event={"ID":"bdbfa898-0999-49da-9194-ff7bf15c955a","Type":"ContainerDied","Data":"7683803f1b4e4c9804e6849c443a9edba0c3c79982d1204a8622cb6bb94f982e"}
Apr 02 14:03:28 crc kubenswrapper[4732]: I0402 14:03:28.001977 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-tjr4k"
Apr 02 14:03:28 crc kubenswrapper[4732]: I0402 14:03:28.077597 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bdbfa898-0999-49da-9194-ff7bf15c955a-ovsdbserver-nb\") pod \"bdbfa898-0999-49da-9194-ff7bf15c955a\" (UID: \"bdbfa898-0999-49da-9194-ff7bf15c955a\") "
Apr 02 14:03:28 crc kubenswrapper[4732]: I0402 14:03:28.077826 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bdbfa898-0999-49da-9194-ff7bf15c955a-dns-swift-storage-0\") pod \"bdbfa898-0999-49da-9194-ff7bf15c955a\" (UID: \"bdbfa898-0999-49da-9194-ff7bf15c955a\") "
Apr 02 14:03:28 crc kubenswrapper[4732]: I0402 14:03:28.077901 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bdbfa898-0999-49da-9194-ff7bf15c955a-ovsdbserver-sb\") pod \"bdbfa898-0999-49da-9194-ff7bf15c955a\" (UID: \"bdbfa898-0999-49da-9194-ff7bf15c955a\") "
Apr 02 14:03:28 crc kubenswrapper[4732]: I0402 14:03:28.077929 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdbfa898-0999-49da-9194-ff7bf15c955a-config\") pod \"bdbfa898-0999-49da-9194-ff7bf15c955a\" (UID: \"bdbfa898-0999-49da-9194-ff7bf15c955a\") "
Apr 02 14:03:28 crc kubenswrapper[4732]: I0402 14:03:28.077957 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8996\" (UniqueName: \"kubernetes.io/projected/bdbfa898-0999-49da-9194-ff7bf15c955a-kube-api-access-l8996\") pod \"bdbfa898-0999-49da-9194-ff7bf15c955a\" (UID: \"bdbfa898-0999-49da-9194-ff7bf15c955a\") "
Apr 02 14:03:28 crc kubenswrapper[4732]: I0402 14:03:28.078059 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bdbfa898-0999-49da-9194-ff7bf15c955a-dns-svc\") pod \"bdbfa898-0999-49da-9194-ff7bf15c955a\" (UID: \"bdbfa898-0999-49da-9194-ff7bf15c955a\") "
Apr 02 14:03:28 crc kubenswrapper[4732]: I0402 14:03:28.085669 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdbfa898-0999-49da-9194-ff7bf15c955a-kube-api-access-l8996" (OuterVolumeSpecName: "kube-api-access-l8996") pod "bdbfa898-0999-49da-9194-ff7bf15c955a" (UID: "bdbfa898-0999-49da-9194-ff7bf15c955a"). InnerVolumeSpecName "kube-api-access-l8996". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 14:03:28 crc kubenswrapper[4732]: I0402 14:03:28.149236 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdbfa898-0999-49da-9194-ff7bf15c955a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "bdbfa898-0999-49da-9194-ff7bf15c955a" (UID: "bdbfa898-0999-49da-9194-ff7bf15c955a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 14:03:28 crc kubenswrapper[4732]: I0402 14:03:28.159676 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdbfa898-0999-49da-9194-ff7bf15c955a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bdbfa898-0999-49da-9194-ff7bf15c955a" (UID: "bdbfa898-0999-49da-9194-ff7bf15c955a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 14:03:28 crc kubenswrapper[4732]: I0402 14:03:28.161596 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdbfa898-0999-49da-9194-ff7bf15c955a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bdbfa898-0999-49da-9194-ff7bf15c955a" (UID: "bdbfa898-0999-49da-9194-ff7bf15c955a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 14:03:28 crc kubenswrapper[4732]: I0402 14:03:28.163172 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdbfa898-0999-49da-9194-ff7bf15c955a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bdbfa898-0999-49da-9194-ff7bf15c955a" (UID: "bdbfa898-0999-49da-9194-ff7bf15c955a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 14:03:28 crc kubenswrapper[4732]: I0402 14:03:28.169292 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdbfa898-0999-49da-9194-ff7bf15c955a-config" (OuterVolumeSpecName: "config") pod "bdbfa898-0999-49da-9194-ff7bf15c955a" (UID: "bdbfa898-0999-49da-9194-ff7bf15c955a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 14:03:28 crc kubenswrapper[4732]: I0402 14:03:28.179816 4732 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bdbfa898-0999-49da-9194-ff7bf15c955a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Apr 02 14:03:28 crc kubenswrapper[4732]: I0402 14:03:28.179938 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bdbfa898-0999-49da-9194-ff7bf15c955a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Apr 02 14:03:28 crc kubenswrapper[4732]: I0402 14:03:28.180025 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdbfa898-0999-49da-9194-ff7bf15c955a-config\") on node \"crc\" DevicePath \"\""
Apr 02 14:03:28 crc kubenswrapper[4732]: I0402 14:03:28.180080 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8996\" (UniqueName: \"kubernetes.io/projected/bdbfa898-0999-49da-9194-ff7bf15c955a-kube-api-access-l8996\") on node \"crc\" DevicePath \"\""
Apr 02 14:03:28 crc kubenswrapper[4732]: I0402 14:03:28.180133 4732 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bdbfa898-0999-49da-9194-ff7bf15c955a-dns-svc\") on node \"crc\" DevicePath \"\""
Apr 02 14:03:28 crc kubenswrapper[4732]: I0402 14:03:28.180272 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bdbfa898-0999-49da-9194-ff7bf15c955a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Apr 02 14:03:28 crc kubenswrapper[4732]: I0402 14:03:28.873745 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-tjr4k" event={"ID":"bdbfa898-0999-49da-9194-ff7bf15c955a","Type":"ContainerDied","Data":"163b2e846d09b9a03cf17ea555e6647944ddc665e524767fe6e4f21f700331ea"}
Apr 02 14:03:28 crc kubenswrapper[4732]: I0402 14:03:28.874084 4732 scope.go:117] "RemoveContainer" containerID="7683803f1b4e4c9804e6849c443a9edba0c3c79982d1204a8622cb6bb94f982e"
Apr 02 14:03:28 crc kubenswrapper[4732]: I0402 14:03:28.874237 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-tjr4k"
Apr 02 14:03:28 crc kubenswrapper[4732]: I0402 14:03:28.902794 4732 scope.go:117] "RemoveContainer" containerID="6130e8c315cc1739ad437865979ed3ee0d0c4838e16bad5d65a0e855b6764253"
Apr 02 14:03:28 crc kubenswrapper[4732]: I0402 14:03:28.903272 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-tjr4k"]
Apr 02 14:03:28 crc kubenswrapper[4732]: I0402 14:03:28.916689 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-tjr4k"]
Apr 02 14:03:30 crc kubenswrapper[4732]: I0402 14:03:30.695293 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdbfa898-0999-49da-9194-ff7bf15c955a" path="/var/lib/kubelet/pods/bdbfa898-0999-49da-9194-ff7bf15c955a/volumes"
Apr 02 14:03:30 crc kubenswrapper[4732]: I0402 14:03:30.893641 4732 generic.go:334] "Generic (PLEG): container finished" podID="6a66a025-735c-4c9a-b4cd-06046d7d3881" containerID="e8d9d5d06641186f39eedf48abb0fb9d40836867d71ca997e46a75ff5e9aa91a" exitCode=0
Apr 02 14:03:30 crc kubenswrapper[4732]: I0402 14:03:30.893687 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mzjkp" event={"ID":"6a66a025-735c-4c9a-b4cd-06046d7d3881","Type":"ContainerDied","Data":"e8d9d5d06641186f39eedf48abb0fb9d40836867d71ca997e46a75ff5e9aa91a"}
Apr 02 14:03:31 crc kubenswrapper[4732]: I0402 14:03:31.924916 4732 patch_prober.go:28] interesting pod/machine-config-daemon-6vtmw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Apr 02 14:03:31 crc kubenswrapper[4732]: I0402 14:03:31.925307 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Apr 02 14:03:32 crc kubenswrapper[4732]: I0402 14:03:32.276990 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mzjkp"
Apr 02 14:03:32 crc kubenswrapper[4732]: I0402 14:03:32.359521 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a66a025-735c-4c9a-b4cd-06046d7d3881-scripts\") pod \"6a66a025-735c-4c9a-b4cd-06046d7d3881\" (UID: \"6a66a025-735c-4c9a-b4cd-06046d7d3881\") "
Apr 02 14:03:32 crc kubenswrapper[4732]: I0402 14:03:32.359696 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a66a025-735c-4c9a-b4cd-06046d7d3881-combined-ca-bundle\") pod \"6a66a025-735c-4c9a-b4cd-06046d7d3881\" (UID: \"6a66a025-735c-4c9a-b4cd-06046d7d3881\") "
Apr 02 14:03:32 crc kubenswrapper[4732]: I0402 14:03:32.359729 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtdxm\" (UniqueName: \"kubernetes.io/projected/6a66a025-735c-4c9a-b4cd-06046d7d3881-kube-api-access-rtdxm\") pod \"6a66a025-735c-4c9a-b4cd-06046d7d3881\" (UID: \"6a66a025-735c-4c9a-b4cd-06046d7d3881\") "
Apr 02 14:03:32 crc kubenswrapper[4732]: I0402 14:03:32.359767 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a66a025-735c-4c9a-b4cd-06046d7d3881-config-data\") pod \"6a66a025-735c-4c9a-b4cd-06046d7d3881\" (UID: \"6a66a025-735c-4c9a-b4cd-06046d7d3881\") "
Apr 02 14:03:32 crc kubenswrapper[4732]: I0402 14:03:32.365786 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a66a025-735c-4c9a-b4cd-06046d7d3881-scripts" (OuterVolumeSpecName: "scripts") pod "6a66a025-735c-4c9a-b4cd-06046d7d3881" (UID: "6a66a025-735c-4c9a-b4cd-06046d7d3881"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 14:03:32 crc kubenswrapper[4732]: I0402 14:03:32.366247 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a66a025-735c-4c9a-b4cd-06046d7d3881-kube-api-access-rtdxm" (OuterVolumeSpecName: "kube-api-access-rtdxm") pod "6a66a025-735c-4c9a-b4cd-06046d7d3881" (UID: "6a66a025-735c-4c9a-b4cd-06046d7d3881"). InnerVolumeSpecName "kube-api-access-rtdxm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 14:03:32 crc kubenswrapper[4732]: I0402 14:03:32.394548 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a66a025-735c-4c9a-b4cd-06046d7d3881-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6a66a025-735c-4c9a-b4cd-06046d7d3881" (UID: "6a66a025-735c-4c9a-b4cd-06046d7d3881"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 14:03:32 crc kubenswrapper[4732]: I0402 14:03:32.397043 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a66a025-735c-4c9a-b4cd-06046d7d3881-config-data" (OuterVolumeSpecName: "config-data") pod "6a66a025-735c-4c9a-b4cd-06046d7d3881" (UID: "6a66a025-735c-4c9a-b4cd-06046d7d3881"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 14:03:32 crc kubenswrapper[4732]: I0402 14:03:32.461841 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a66a025-735c-4c9a-b4cd-06046d7d3881-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Apr 02 14:03:32 crc kubenswrapper[4732]: I0402 14:03:32.461874 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rtdxm\" (UniqueName: \"kubernetes.io/projected/6a66a025-735c-4c9a-b4cd-06046d7d3881-kube-api-access-rtdxm\") on node \"crc\" DevicePath \"\""
Apr 02 14:03:32 crc kubenswrapper[4732]: I0402 14:03:32.461886 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a66a025-735c-4c9a-b4cd-06046d7d3881-config-data\") on node \"crc\" DevicePath \"\""
Apr 02 14:03:32 crc kubenswrapper[4732]: I0402 14:03:32.461896 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a66a025-735c-4c9a-b4cd-06046d7d3881-scripts\") on node \"crc\" DevicePath \"\""
Apr 02 14:03:32 crc kubenswrapper[4732]: I0402 14:03:32.915299 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mzjkp" event={"ID":"6a66a025-735c-4c9a-b4cd-06046d7d3881","Type":"ContainerDied","Data":"da0b0719a46234a4a1e7c202e2afde8923cb6b34722116b512dd4291ea76cb29"}
Apr 02 14:03:32 crc kubenswrapper[4732]: I0402 14:03:32.915344 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da0b0719a46234a4a1e7c202e2afde8923cb6b34722116b512dd4291ea76cb29"
Apr 02 14:03:32 crc kubenswrapper[4732]: I0402 14:03:32.915395 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mzjkp"
Apr 02 14:03:33 crc kubenswrapper[4732]: I0402 14:03:33.090064 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Apr 02 14:03:33 crc kubenswrapper[4732]: I0402 14:03:33.090349 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f6983768-ac75-4ee9-9786-4c46b45c428f" containerName="nova-api-log" containerID="cri-o://66602a1bf8154f811b73e0d3f5226afdfc2fc36ae1d03711e564f72fbd3d2ec6" gracePeriod=30
Apr 02 14:03:33 crc kubenswrapper[4732]: I0402 14:03:33.090831 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f6983768-ac75-4ee9-9786-4c46b45c428f" containerName="nova-api-api" containerID="cri-o://fa6eec165173de6ab1d738e7b412a7d3f02c1033b5f787a929f005d54e5f2bbd" gracePeriod=30
Apr 02 14:03:33 crc kubenswrapper[4732]: I0402 14:03:33.104112 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Apr 02 14:03:33 crc kubenswrapper[4732]: I0402 14:03:33.104321 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="f7d18cc6-1dc3-4ed6-a8a4-822678468b70" containerName="nova-scheduler-scheduler" containerID="cri-o://553271ae9fa33a76a3a65beeb9372a3380897000327ade77ae956c1a1ba7cffd" gracePeriod=30
Apr 02 14:03:33 crc kubenswrapper[4732]: I0402 14:03:33.132887 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Apr 02 14:03:33 crc kubenswrapper[4732]: I0402 14:03:33.133331 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="737abec3-8d91-4234-83fc-ebbdbd59812e" containerName="nova-metadata-log" containerID="cri-o://d436a18c3276554f7ff3affa860edb60ae090d140dfd34f92ab6855bd15d983c" gracePeriod=30
Apr 02 14:03:33 crc kubenswrapper[4732]: I0402 14:03:33.133441 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="737abec3-8d91-4234-83fc-ebbdbd59812e" containerName="nova-metadata-metadata" containerID="cri-o://3da6bf392b2c2b310dca2b6ee59b744d6cddff583caaf6fc88bfecfccdf461b1" gracePeriod=30
Apr 02 14:03:33 crc kubenswrapper[4732]: I0402 14:03:33.632325 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Apr 02 14:03:33 crc kubenswrapper[4732]: I0402 14:03:33.683152 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tcnmr\" (UniqueName: \"kubernetes.io/projected/f6983768-ac75-4ee9-9786-4c46b45c428f-kube-api-access-tcnmr\") pod \"f6983768-ac75-4ee9-9786-4c46b45c428f\" (UID: \"f6983768-ac75-4ee9-9786-4c46b45c428f\") "
Apr 02 14:03:33 crc kubenswrapper[4732]: I0402 14:03:33.683435 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6983768-ac75-4ee9-9786-4c46b45c428f-internal-tls-certs\") pod \"f6983768-ac75-4ee9-9786-4c46b45c428f\" (UID: \"f6983768-ac75-4ee9-9786-4c46b45c428f\") "
Apr 02 14:03:33 crc kubenswrapper[4732]: I0402 14:03:33.683567 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6983768-ac75-4ee9-9786-4c46b45c428f-config-data\") pod \"f6983768-ac75-4ee9-9786-4c46b45c428f\" (UID: \"f6983768-ac75-4ee9-9786-4c46b45c428f\") "
Apr 02 14:03:33 crc kubenswrapper[4732]: I0402 14:03:33.683582 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6983768-ac75-4ee9-9786-4c46b45c428f-public-tls-certs\") pod \"f6983768-ac75-4ee9-9786-4c46b45c428f\" (UID: \"f6983768-ac75-4ee9-9786-4c46b45c428f\") "
Apr 02 14:03:33 crc kubenswrapper[4732]: I0402 14:03:33.683661 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6983768-ac75-4ee9-9786-4c46b45c428f-logs\") pod \"f6983768-ac75-4ee9-9786-4c46b45c428f\" (UID: \"f6983768-ac75-4ee9-9786-4c46b45c428f\") "
Apr 02 14:03:33 crc kubenswrapper[4732]: I0402 14:03:33.683685 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6983768-ac75-4ee9-9786-4c46b45c428f-combined-ca-bundle\") pod \"f6983768-ac75-4ee9-9786-4c46b45c428f\" (UID: \"f6983768-ac75-4ee9-9786-4c46b45c428f\") "
Apr 02 14:03:33 crc kubenswrapper[4732]: I0402 14:03:33.684168 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6983768-ac75-4ee9-9786-4c46b45c428f-logs" (OuterVolumeSpecName: "logs") pod "f6983768-ac75-4ee9-9786-4c46b45c428f" (UID: "f6983768-ac75-4ee9-9786-4c46b45c428f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 02 14:03:33 crc kubenswrapper[4732]: I0402 14:03:33.693372 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6983768-ac75-4ee9-9786-4c46b45c428f-kube-api-access-tcnmr" (OuterVolumeSpecName: "kube-api-access-tcnmr") pod "f6983768-ac75-4ee9-9786-4c46b45c428f" (UID: "f6983768-ac75-4ee9-9786-4c46b45c428f"). InnerVolumeSpecName "kube-api-access-tcnmr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 14:03:33 crc kubenswrapper[4732]: I0402 14:03:33.711767 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6983768-ac75-4ee9-9786-4c46b45c428f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f6983768-ac75-4ee9-9786-4c46b45c428f" (UID: "f6983768-ac75-4ee9-9786-4c46b45c428f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 14:03:33 crc kubenswrapper[4732]: I0402 14:03:33.715773 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6983768-ac75-4ee9-9786-4c46b45c428f-config-data" (OuterVolumeSpecName: "config-data") pod "f6983768-ac75-4ee9-9786-4c46b45c428f" (UID: "f6983768-ac75-4ee9-9786-4c46b45c428f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 14:03:33 crc kubenswrapper[4732]: I0402 14:03:33.744463 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6983768-ac75-4ee9-9786-4c46b45c428f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f6983768-ac75-4ee9-9786-4c46b45c428f" (UID: "f6983768-ac75-4ee9-9786-4c46b45c428f"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 14:03:33 crc kubenswrapper[4732]: I0402 14:03:33.747358 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6983768-ac75-4ee9-9786-4c46b45c428f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f6983768-ac75-4ee9-9786-4c46b45c428f" (UID: "f6983768-ac75-4ee9-9786-4c46b45c428f"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 14:03:33 crc kubenswrapper[4732]: I0402 14:03:33.787056 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6983768-ac75-4ee9-9786-4c46b45c428f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Apr 02 14:03:33 crc kubenswrapper[4732]: I0402 14:03:33.787091 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tcnmr\" (UniqueName: \"kubernetes.io/projected/f6983768-ac75-4ee9-9786-4c46b45c428f-kube-api-access-tcnmr\") on node \"crc\" DevicePath \"\""
Apr 02 14:03:33 crc kubenswrapper[4732]: I0402 14:03:33.787103 4732 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6983768-ac75-4ee9-9786-4c46b45c428f-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Apr 02 14:03:33 crc kubenswrapper[4732]: I0402 14:03:33.787111 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6983768-ac75-4ee9-9786-4c46b45c428f-config-data\") on node \"crc\" DevicePath \"\""
Apr 02 14:03:33 crc kubenswrapper[4732]: I0402 14:03:33.787119 4732 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6983768-ac75-4ee9-9786-4c46b45c428f-public-tls-certs\") on node \"crc\" DevicePath \"\""
Apr 02 14:03:33 crc kubenswrapper[4732]: I0402 14:03:33.787127 4732 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6983768-ac75-4ee9-9786-4c46b45c428f-logs\") on node \"crc\" DevicePath \"\""
Apr 02 14:03:33 crc kubenswrapper[4732]: I0402 14:03:33.925553 4732 generic.go:334] "Generic (PLEG): container finished" podID="737abec3-8d91-4234-83fc-ebbdbd59812e" containerID="d436a18c3276554f7ff3affa860edb60ae090d140dfd34f92ab6855bd15d983c" exitCode=143
Apr 02 14:03:33 crc kubenswrapper[4732]: I0402 14:03:33.925629 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"737abec3-8d91-4234-83fc-ebbdbd59812e","Type":"ContainerDied","Data":"d436a18c3276554f7ff3affa860edb60ae090d140dfd34f92ab6855bd15d983c"}
Apr 02 14:03:33 crc kubenswrapper[4732]: I0402 14:03:33.927394 4732 generic.go:334] "Generic (PLEG): container finished" podID="f6983768-ac75-4ee9-9786-4c46b45c428f" containerID="fa6eec165173de6ab1d738e7b412a7d3f02c1033b5f787a929f005d54e5f2bbd" exitCode=0
Apr 02 14:03:33 crc kubenswrapper[4732]: I0402 14:03:33.927435 4732 generic.go:334] "Generic (PLEG): container finished" podID="f6983768-ac75-4ee9-9786-4c46b45c428f" containerID="66602a1bf8154f811b73e0d3f5226afdfc2fc36ae1d03711e564f72fbd3d2ec6" exitCode=143
Apr 02 14:03:33 crc kubenswrapper[4732]: I0402 14:03:33.927460 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f6983768-ac75-4ee9-9786-4c46b45c428f","Type":"ContainerDied","Data":"fa6eec165173de6ab1d738e7b412a7d3f02c1033b5f787a929f005d54e5f2bbd"}
Apr 02 14:03:33 crc kubenswrapper[4732]: I0402 14:03:33.927488 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f6983768-ac75-4ee9-9786-4c46b45c428f","Type":"ContainerDied","Data":"66602a1bf8154f811b73e0d3f5226afdfc2fc36ae1d03711e564f72fbd3d2ec6"}
Apr 02 14:03:33 crc kubenswrapper[4732]: I0402 14:03:33.927500 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f6983768-ac75-4ee9-9786-4c46b45c428f","Type":"ContainerDied","Data":"2074d94acba0f9080edd2c2711b97f09cc45e2742f4ed271abe8b164931f44d9"}
Apr 02 14:03:33 crc kubenswrapper[4732]: I0402 14:03:33.927505 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Apr 02 14:03:33 crc kubenswrapper[4732]: I0402 14:03:33.927517 4732 scope.go:117] "RemoveContainer" containerID="fa6eec165173de6ab1d738e7b412a7d3f02c1033b5f787a929f005d54e5f2bbd"
Apr 02 14:03:33 crc kubenswrapper[4732]: I0402 14:03:33.950503 4732 scope.go:117] "RemoveContainer" containerID="66602a1bf8154f811b73e0d3f5226afdfc2fc36ae1d03711e564f72fbd3d2ec6"
Apr 02 14:03:33 crc kubenswrapper[4732]: I0402 14:03:33.967926 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Apr 02 14:03:33 crc kubenswrapper[4732]: I0402 14:03:33.980402 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Apr 02 14:03:33 crc kubenswrapper[4732]: I0402 14:03:33.983621 4732 scope.go:117] "RemoveContainer" containerID="fa6eec165173de6ab1d738e7b412a7d3f02c1033b5f787a929f005d54e5f2bbd"
Apr 02 14:03:33 crc kubenswrapper[4732]: E0402 14:03:33.984111 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa6eec165173de6ab1d738e7b412a7d3f02c1033b5f787a929f005d54e5f2bbd\": container with ID starting with fa6eec165173de6ab1d738e7b412a7d3f02c1033b5f787a929f005d54e5f2bbd not found: ID does not exist" containerID="fa6eec165173de6ab1d738e7b412a7d3f02c1033b5f787a929f005d54e5f2bbd"
Apr 02 14:03:33 crc kubenswrapper[4732]: I0402 14:03:33.984156 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa6eec165173de6ab1d738e7b412a7d3f02c1033b5f787a929f005d54e5f2bbd"} err="failed to get container status \"fa6eec165173de6ab1d738e7b412a7d3f02c1033b5f787a929f005d54e5f2bbd\": rpc error: code = NotFound desc = could not find container \"fa6eec165173de6ab1d738e7b412a7d3f02c1033b5f787a929f005d54e5f2bbd\": container with ID starting with fa6eec165173de6ab1d738e7b412a7d3f02c1033b5f787a929f005d54e5f2bbd not found: ID does not exist"
Apr 02 14:03:33 crc kubenswrapper[4732]: I0402
14:03:33.984185 4732 scope.go:117] "RemoveContainer" containerID="66602a1bf8154f811b73e0d3f5226afdfc2fc36ae1d03711e564f72fbd3d2ec6" Apr 02 14:03:33 crc kubenswrapper[4732]: E0402 14:03:33.984518 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66602a1bf8154f811b73e0d3f5226afdfc2fc36ae1d03711e564f72fbd3d2ec6\": container with ID starting with 66602a1bf8154f811b73e0d3f5226afdfc2fc36ae1d03711e564f72fbd3d2ec6 not found: ID does not exist" containerID="66602a1bf8154f811b73e0d3f5226afdfc2fc36ae1d03711e564f72fbd3d2ec6" Apr 02 14:03:33 crc kubenswrapper[4732]: I0402 14:03:33.984551 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66602a1bf8154f811b73e0d3f5226afdfc2fc36ae1d03711e564f72fbd3d2ec6"} err="failed to get container status \"66602a1bf8154f811b73e0d3f5226afdfc2fc36ae1d03711e564f72fbd3d2ec6\": rpc error: code = NotFound desc = could not find container \"66602a1bf8154f811b73e0d3f5226afdfc2fc36ae1d03711e564f72fbd3d2ec6\": container with ID starting with 66602a1bf8154f811b73e0d3f5226afdfc2fc36ae1d03711e564f72fbd3d2ec6 not found: ID does not exist" Apr 02 14:03:33 crc kubenswrapper[4732]: I0402 14:03:33.984569 4732 scope.go:117] "RemoveContainer" containerID="fa6eec165173de6ab1d738e7b412a7d3f02c1033b5f787a929f005d54e5f2bbd" Apr 02 14:03:33 crc kubenswrapper[4732]: I0402 14:03:33.990703 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa6eec165173de6ab1d738e7b412a7d3f02c1033b5f787a929f005d54e5f2bbd"} err="failed to get container status \"fa6eec165173de6ab1d738e7b412a7d3f02c1033b5f787a929f005d54e5f2bbd\": rpc error: code = NotFound desc = could not find container \"fa6eec165173de6ab1d738e7b412a7d3f02c1033b5f787a929f005d54e5f2bbd\": container with ID starting with fa6eec165173de6ab1d738e7b412a7d3f02c1033b5f787a929f005d54e5f2bbd not found: ID does not exist" Apr 02 14:03:33 crc 
kubenswrapper[4732]: I0402 14:03:33.990756 4732 scope.go:117] "RemoveContainer" containerID="66602a1bf8154f811b73e0d3f5226afdfc2fc36ae1d03711e564f72fbd3d2ec6" Apr 02 14:03:33 crc kubenswrapper[4732]: I0402 14:03:33.991396 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66602a1bf8154f811b73e0d3f5226afdfc2fc36ae1d03711e564f72fbd3d2ec6"} err="failed to get container status \"66602a1bf8154f811b73e0d3f5226afdfc2fc36ae1d03711e564f72fbd3d2ec6\": rpc error: code = NotFound desc = could not find container \"66602a1bf8154f811b73e0d3f5226afdfc2fc36ae1d03711e564f72fbd3d2ec6\": container with ID starting with 66602a1bf8154f811b73e0d3f5226afdfc2fc36ae1d03711e564f72fbd3d2ec6 not found: ID does not exist" Apr 02 14:03:33 crc kubenswrapper[4732]: I0402 14:03:33.994804 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Apr 02 14:03:33 crc kubenswrapper[4732]: E0402 14:03:33.995290 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a66a025-735c-4c9a-b4cd-06046d7d3881" containerName="nova-manage" Apr 02 14:03:33 crc kubenswrapper[4732]: I0402 14:03:33.995313 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a66a025-735c-4c9a-b4cd-06046d7d3881" containerName="nova-manage" Apr 02 14:03:33 crc kubenswrapper[4732]: E0402 14:03:33.995344 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6983768-ac75-4ee9-9786-4c46b45c428f" containerName="nova-api-log" Apr 02 14:03:33 crc kubenswrapper[4732]: I0402 14:03:33.995353 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6983768-ac75-4ee9-9786-4c46b45c428f" containerName="nova-api-log" Apr 02 14:03:33 crc kubenswrapper[4732]: E0402 14:03:33.995368 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6983768-ac75-4ee9-9786-4c46b45c428f" containerName="nova-api-api" Apr 02 14:03:33 crc kubenswrapper[4732]: I0402 14:03:33.995376 4732 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f6983768-ac75-4ee9-9786-4c46b45c428f" containerName="nova-api-api" Apr 02 14:03:33 crc kubenswrapper[4732]: E0402 14:03:33.995385 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdbfa898-0999-49da-9194-ff7bf15c955a" containerName="init" Apr 02 14:03:33 crc kubenswrapper[4732]: I0402 14:03:33.995393 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdbfa898-0999-49da-9194-ff7bf15c955a" containerName="init" Apr 02 14:03:33 crc kubenswrapper[4732]: E0402 14:03:33.995409 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdbfa898-0999-49da-9194-ff7bf15c955a" containerName="dnsmasq-dns" Apr 02 14:03:33 crc kubenswrapper[4732]: I0402 14:03:33.995417 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdbfa898-0999-49da-9194-ff7bf15c955a" containerName="dnsmasq-dns" Apr 02 14:03:33 crc kubenswrapper[4732]: I0402 14:03:33.995653 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a66a025-735c-4c9a-b4cd-06046d7d3881" containerName="nova-manage" Apr 02 14:03:33 crc kubenswrapper[4732]: I0402 14:03:33.995671 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6983768-ac75-4ee9-9786-4c46b45c428f" containerName="nova-api-log" Apr 02 14:03:33 crc kubenswrapper[4732]: I0402 14:03:33.995686 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdbfa898-0999-49da-9194-ff7bf15c955a" containerName="dnsmasq-dns" Apr 02 14:03:33 crc kubenswrapper[4732]: I0402 14:03:33.995706 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6983768-ac75-4ee9-9786-4c46b45c428f" containerName="nova-api-api" Apr 02 14:03:33 crc kubenswrapper[4732]: I0402 14:03:33.997261 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Apr 02 14:03:33 crc kubenswrapper[4732]: I0402 14:03:33.999129 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Apr 02 14:03:33 crc kubenswrapper[4732]: I0402 14:03:33.999152 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Apr 02 14:03:33 crc kubenswrapper[4732]: I0402 14:03:33.999234 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Apr 02 14:03:34 crc kubenswrapper[4732]: I0402 14:03:34.006794 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Apr 02 14:03:34 crc kubenswrapper[4732]: I0402 14:03:34.195448 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ab608f9-1ada-4c9c-85a1-43fc8cf5cc16-config-data\") pod \"nova-api-0\" (UID: \"3ab608f9-1ada-4c9c-85a1-43fc8cf5cc16\") " pod="openstack/nova-api-0" Apr 02 14:03:34 crc kubenswrapper[4732]: I0402 14:03:34.195512 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ab608f9-1ada-4c9c-85a1-43fc8cf5cc16-internal-tls-certs\") pod \"nova-api-0\" (UID: \"3ab608f9-1ada-4c9c-85a1-43fc8cf5cc16\") " pod="openstack/nova-api-0" Apr 02 14:03:34 crc kubenswrapper[4732]: I0402 14:03:34.195538 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tchv\" (UniqueName: \"kubernetes.io/projected/3ab608f9-1ada-4c9c-85a1-43fc8cf5cc16-kube-api-access-8tchv\") pod \"nova-api-0\" (UID: \"3ab608f9-1ada-4c9c-85a1-43fc8cf5cc16\") " pod="openstack/nova-api-0" Apr 02 14:03:34 crc kubenswrapper[4732]: I0402 14:03:34.195623 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/3ab608f9-1ada-4c9c-85a1-43fc8cf5cc16-logs\") pod \"nova-api-0\" (UID: \"3ab608f9-1ada-4c9c-85a1-43fc8cf5cc16\") " pod="openstack/nova-api-0" Apr 02 14:03:34 crc kubenswrapper[4732]: I0402 14:03:34.195684 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ab608f9-1ada-4c9c-85a1-43fc8cf5cc16-public-tls-certs\") pod \"nova-api-0\" (UID: \"3ab608f9-1ada-4c9c-85a1-43fc8cf5cc16\") " pod="openstack/nova-api-0" Apr 02 14:03:34 crc kubenswrapper[4732]: I0402 14:03:34.195717 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ab608f9-1ada-4c9c-85a1-43fc8cf5cc16-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3ab608f9-1ada-4c9c-85a1-43fc8cf5cc16\") " pod="openstack/nova-api-0" Apr 02 14:03:34 crc kubenswrapper[4732]: I0402 14:03:34.297433 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ab608f9-1ada-4c9c-85a1-43fc8cf5cc16-config-data\") pod \"nova-api-0\" (UID: \"3ab608f9-1ada-4c9c-85a1-43fc8cf5cc16\") " pod="openstack/nova-api-0" Apr 02 14:03:34 crc kubenswrapper[4732]: I0402 14:03:34.297532 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ab608f9-1ada-4c9c-85a1-43fc8cf5cc16-internal-tls-certs\") pod \"nova-api-0\" (UID: \"3ab608f9-1ada-4c9c-85a1-43fc8cf5cc16\") " pod="openstack/nova-api-0" Apr 02 14:03:34 crc kubenswrapper[4732]: I0402 14:03:34.297574 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tchv\" (UniqueName: \"kubernetes.io/projected/3ab608f9-1ada-4c9c-85a1-43fc8cf5cc16-kube-api-access-8tchv\") pod \"nova-api-0\" (UID: \"3ab608f9-1ada-4c9c-85a1-43fc8cf5cc16\") " pod="openstack/nova-api-0" 
Apr 02 14:03:34 crc kubenswrapper[4732]: I0402 14:03:34.297666 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ab608f9-1ada-4c9c-85a1-43fc8cf5cc16-logs\") pod \"nova-api-0\" (UID: \"3ab608f9-1ada-4c9c-85a1-43fc8cf5cc16\") " pod="openstack/nova-api-0" Apr 02 14:03:34 crc kubenswrapper[4732]: I0402 14:03:34.297737 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ab608f9-1ada-4c9c-85a1-43fc8cf5cc16-public-tls-certs\") pod \"nova-api-0\" (UID: \"3ab608f9-1ada-4c9c-85a1-43fc8cf5cc16\") " pod="openstack/nova-api-0" Apr 02 14:03:34 crc kubenswrapper[4732]: I0402 14:03:34.297771 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ab608f9-1ada-4c9c-85a1-43fc8cf5cc16-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3ab608f9-1ada-4c9c-85a1-43fc8cf5cc16\") " pod="openstack/nova-api-0" Apr 02 14:03:34 crc kubenswrapper[4732]: I0402 14:03:34.298193 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ab608f9-1ada-4c9c-85a1-43fc8cf5cc16-logs\") pod \"nova-api-0\" (UID: \"3ab608f9-1ada-4c9c-85a1-43fc8cf5cc16\") " pod="openstack/nova-api-0" Apr 02 14:03:34 crc kubenswrapper[4732]: I0402 14:03:34.302281 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ab608f9-1ada-4c9c-85a1-43fc8cf5cc16-internal-tls-certs\") pod \"nova-api-0\" (UID: \"3ab608f9-1ada-4c9c-85a1-43fc8cf5cc16\") " pod="openstack/nova-api-0" Apr 02 14:03:34 crc kubenswrapper[4732]: I0402 14:03:34.303752 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ab608f9-1ada-4c9c-85a1-43fc8cf5cc16-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"3ab608f9-1ada-4c9c-85a1-43fc8cf5cc16\") " pod="openstack/nova-api-0" Apr 02 14:03:34 crc kubenswrapper[4732]: I0402 14:03:34.304605 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ab608f9-1ada-4c9c-85a1-43fc8cf5cc16-public-tls-certs\") pod \"nova-api-0\" (UID: \"3ab608f9-1ada-4c9c-85a1-43fc8cf5cc16\") " pod="openstack/nova-api-0" Apr 02 14:03:34 crc kubenswrapper[4732]: I0402 14:03:34.315813 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ab608f9-1ada-4c9c-85a1-43fc8cf5cc16-config-data\") pod \"nova-api-0\" (UID: \"3ab608f9-1ada-4c9c-85a1-43fc8cf5cc16\") " pod="openstack/nova-api-0" Apr 02 14:03:34 crc kubenswrapper[4732]: I0402 14:03:34.318389 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tchv\" (UniqueName: \"kubernetes.io/projected/3ab608f9-1ada-4c9c-85a1-43fc8cf5cc16-kube-api-access-8tchv\") pod \"nova-api-0\" (UID: \"3ab608f9-1ada-4c9c-85a1-43fc8cf5cc16\") " pod="openstack/nova-api-0" Apr 02 14:03:34 crc kubenswrapper[4732]: I0402 14:03:34.344784 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Apr 02 14:03:34 crc kubenswrapper[4732]: I0402 14:03:34.692544 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6983768-ac75-4ee9-9786-4c46b45c428f" path="/var/lib/kubelet/pods/f6983768-ac75-4ee9-9786-4c46b45c428f/volumes" Apr 02 14:03:34 crc kubenswrapper[4732]: I0402 14:03:34.816206 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Apr 02 14:03:34 crc kubenswrapper[4732]: W0402 14:03:34.817541 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ab608f9_1ada_4c9c_85a1_43fc8cf5cc16.slice/crio-26a4051f0476069a34ce335c1774f245624dfe76920e3e4d2b28df3aca608e43 WatchSource:0}: Error finding container 26a4051f0476069a34ce335c1774f245624dfe76920e3e4d2b28df3aca608e43: Status 404 returned error can't find the container with id 26a4051f0476069a34ce335c1774f245624dfe76920e3e4d2b28df3aca608e43 Apr 02 14:03:34 crc kubenswrapper[4732]: I0402 14:03:34.817792 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Apr 02 14:03:34 crc kubenswrapper[4732]: I0402 14:03:34.909428 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7d18cc6-1dc3-4ed6-a8a4-822678468b70-combined-ca-bundle\") pod \"f7d18cc6-1dc3-4ed6-a8a4-822678468b70\" (UID: \"f7d18cc6-1dc3-4ed6-a8a4-822678468b70\") " Apr 02 14:03:34 crc kubenswrapper[4732]: I0402 14:03:34.909533 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7d18cc6-1dc3-4ed6-a8a4-822678468b70-config-data\") pod \"f7d18cc6-1dc3-4ed6-a8a4-822678468b70\" (UID: \"f7d18cc6-1dc3-4ed6-a8a4-822678468b70\") " Apr 02 14:03:34 crc kubenswrapper[4732]: I0402 14:03:34.909686 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgzxp\" (UniqueName: \"kubernetes.io/projected/f7d18cc6-1dc3-4ed6-a8a4-822678468b70-kube-api-access-kgzxp\") pod \"f7d18cc6-1dc3-4ed6-a8a4-822678468b70\" (UID: \"f7d18cc6-1dc3-4ed6-a8a4-822678468b70\") " Apr 02 14:03:34 crc kubenswrapper[4732]: I0402 14:03:34.914251 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7d18cc6-1dc3-4ed6-a8a4-822678468b70-kube-api-access-kgzxp" (OuterVolumeSpecName: "kube-api-access-kgzxp") pod "f7d18cc6-1dc3-4ed6-a8a4-822678468b70" (UID: "f7d18cc6-1dc3-4ed6-a8a4-822678468b70"). InnerVolumeSpecName "kube-api-access-kgzxp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:03:34 crc kubenswrapper[4732]: I0402 14:03:34.937055 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3ab608f9-1ada-4c9c-85a1-43fc8cf5cc16","Type":"ContainerStarted","Data":"26a4051f0476069a34ce335c1774f245624dfe76920e3e4d2b28df3aca608e43"} Apr 02 14:03:34 crc kubenswrapper[4732]: I0402 14:03:34.939606 4732 generic.go:334] "Generic (PLEG): container finished" podID="f7d18cc6-1dc3-4ed6-a8a4-822678468b70" containerID="553271ae9fa33a76a3a65beeb9372a3380897000327ade77ae956c1a1ba7cffd" exitCode=0 Apr 02 14:03:34 crc kubenswrapper[4732]: I0402 14:03:34.939647 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f7d18cc6-1dc3-4ed6-a8a4-822678468b70","Type":"ContainerDied","Data":"553271ae9fa33a76a3a65beeb9372a3380897000327ade77ae956c1a1ba7cffd"} Apr 02 14:03:34 crc kubenswrapper[4732]: I0402 14:03:34.939664 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f7d18cc6-1dc3-4ed6-a8a4-822678468b70","Type":"ContainerDied","Data":"22a2ce890f802e32c8d3f892ec3e0c107dc57d74066f650b3aaf601fe736453a"} Apr 02 14:03:34 crc kubenswrapper[4732]: I0402 14:03:34.939679 4732 scope.go:117] "RemoveContainer" containerID="553271ae9fa33a76a3a65beeb9372a3380897000327ade77ae956c1a1ba7cffd" Apr 02 14:03:34 crc kubenswrapper[4732]: I0402 14:03:34.939785 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Apr 02 14:03:34 crc kubenswrapper[4732]: I0402 14:03:34.973996 4732 scope.go:117] "RemoveContainer" containerID="553271ae9fa33a76a3a65beeb9372a3380897000327ade77ae956c1a1ba7cffd" Apr 02 14:03:34 crc kubenswrapper[4732]: E0402 14:03:34.975487 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"553271ae9fa33a76a3a65beeb9372a3380897000327ade77ae956c1a1ba7cffd\": container with ID starting with 553271ae9fa33a76a3a65beeb9372a3380897000327ade77ae956c1a1ba7cffd not found: ID does not exist" containerID="553271ae9fa33a76a3a65beeb9372a3380897000327ade77ae956c1a1ba7cffd" Apr 02 14:03:34 crc kubenswrapper[4732]: I0402 14:03:34.975551 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"553271ae9fa33a76a3a65beeb9372a3380897000327ade77ae956c1a1ba7cffd"} err="failed to get container status \"553271ae9fa33a76a3a65beeb9372a3380897000327ade77ae956c1a1ba7cffd\": rpc error: code = NotFound desc = could not find container \"553271ae9fa33a76a3a65beeb9372a3380897000327ade77ae956c1a1ba7cffd\": container with ID starting with 553271ae9fa33a76a3a65beeb9372a3380897000327ade77ae956c1a1ba7cffd not found: ID does not exist" Apr 02 14:03:34 crc kubenswrapper[4732]: I0402 14:03:34.975816 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7d18cc6-1dc3-4ed6-a8a4-822678468b70-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f7d18cc6-1dc3-4ed6-a8a4-822678468b70" (UID: "f7d18cc6-1dc3-4ed6-a8a4-822678468b70"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:03:34 crc kubenswrapper[4732]: I0402 14:03:34.979213 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7d18cc6-1dc3-4ed6-a8a4-822678468b70-config-data" (OuterVolumeSpecName: "config-data") pod "f7d18cc6-1dc3-4ed6-a8a4-822678468b70" (UID: "f7d18cc6-1dc3-4ed6-a8a4-822678468b70"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:03:35 crc kubenswrapper[4732]: I0402 14:03:35.012491 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7d18cc6-1dc3-4ed6-a8a4-822678468b70-config-data\") on node \"crc\" DevicePath \"\"" Apr 02 14:03:35 crc kubenswrapper[4732]: I0402 14:03:35.012531 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kgzxp\" (UniqueName: \"kubernetes.io/projected/f7d18cc6-1dc3-4ed6-a8a4-822678468b70-kube-api-access-kgzxp\") on node \"crc\" DevicePath \"\"" Apr 02 14:03:35 crc kubenswrapper[4732]: I0402 14:03:35.012544 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7d18cc6-1dc3-4ed6-a8a4-822678468b70-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 02 14:03:35 crc kubenswrapper[4732]: I0402 14:03:35.277289 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Apr 02 14:03:35 crc kubenswrapper[4732]: I0402 14:03:35.287683 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Apr 02 14:03:35 crc kubenswrapper[4732]: I0402 14:03:35.303556 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Apr 02 14:03:35 crc kubenswrapper[4732]: E0402 14:03:35.304060 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7d18cc6-1dc3-4ed6-a8a4-822678468b70" containerName="nova-scheduler-scheduler" Apr 02 14:03:35 crc 
kubenswrapper[4732]: I0402 14:03:35.304078 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7d18cc6-1dc3-4ed6-a8a4-822678468b70" containerName="nova-scheduler-scheduler" Apr 02 14:03:35 crc kubenswrapper[4732]: I0402 14:03:35.304252 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7d18cc6-1dc3-4ed6-a8a4-822678468b70" containerName="nova-scheduler-scheduler" Apr 02 14:03:35 crc kubenswrapper[4732]: I0402 14:03:35.304869 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Apr 02 14:03:35 crc kubenswrapper[4732]: I0402 14:03:35.306669 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Apr 02 14:03:35 crc kubenswrapper[4732]: I0402 14:03:35.321981 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Apr 02 14:03:35 crc kubenswrapper[4732]: I0402 14:03:35.422291 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f287z\" (UniqueName: \"kubernetes.io/projected/c6916cdf-dcaa-4e17-b33c-3fc6684abb46-kube-api-access-f287z\") pod \"nova-scheduler-0\" (UID: \"c6916cdf-dcaa-4e17-b33c-3fc6684abb46\") " pod="openstack/nova-scheduler-0" Apr 02 14:03:35 crc kubenswrapper[4732]: I0402 14:03:35.422350 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6916cdf-dcaa-4e17-b33c-3fc6684abb46-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c6916cdf-dcaa-4e17-b33c-3fc6684abb46\") " pod="openstack/nova-scheduler-0" Apr 02 14:03:35 crc kubenswrapper[4732]: I0402 14:03:35.422402 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6916cdf-dcaa-4e17-b33c-3fc6684abb46-config-data\") pod \"nova-scheduler-0\" (UID: 
\"c6916cdf-dcaa-4e17-b33c-3fc6684abb46\") " pod="openstack/nova-scheduler-0" Apr 02 14:03:35 crc kubenswrapper[4732]: I0402 14:03:35.523932 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6916cdf-dcaa-4e17-b33c-3fc6684abb46-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c6916cdf-dcaa-4e17-b33c-3fc6684abb46\") " pod="openstack/nova-scheduler-0" Apr 02 14:03:35 crc kubenswrapper[4732]: I0402 14:03:35.524058 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6916cdf-dcaa-4e17-b33c-3fc6684abb46-config-data\") pod \"nova-scheduler-0\" (UID: \"c6916cdf-dcaa-4e17-b33c-3fc6684abb46\") " pod="openstack/nova-scheduler-0" Apr 02 14:03:35 crc kubenswrapper[4732]: I0402 14:03:35.524880 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f287z\" (UniqueName: \"kubernetes.io/projected/c6916cdf-dcaa-4e17-b33c-3fc6684abb46-kube-api-access-f287z\") pod \"nova-scheduler-0\" (UID: \"c6916cdf-dcaa-4e17-b33c-3fc6684abb46\") " pod="openstack/nova-scheduler-0" Apr 02 14:03:35 crc kubenswrapper[4732]: I0402 14:03:35.528380 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6916cdf-dcaa-4e17-b33c-3fc6684abb46-config-data\") pod \"nova-scheduler-0\" (UID: \"c6916cdf-dcaa-4e17-b33c-3fc6684abb46\") " pod="openstack/nova-scheduler-0" Apr 02 14:03:35 crc kubenswrapper[4732]: I0402 14:03:35.532308 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6916cdf-dcaa-4e17-b33c-3fc6684abb46-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c6916cdf-dcaa-4e17-b33c-3fc6684abb46\") " pod="openstack/nova-scheduler-0" Apr 02 14:03:35 crc kubenswrapper[4732]: I0402 14:03:35.547229 4732 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-f287z\" (UniqueName: \"kubernetes.io/projected/c6916cdf-dcaa-4e17-b33c-3fc6684abb46-kube-api-access-f287z\") pod \"nova-scheduler-0\" (UID: \"c6916cdf-dcaa-4e17-b33c-3fc6684abb46\") " pod="openstack/nova-scheduler-0" Apr 02 14:03:35 crc kubenswrapper[4732]: I0402 14:03:35.633483 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Apr 02 14:03:35 crc kubenswrapper[4732]: I0402 14:03:35.952127 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3ab608f9-1ada-4c9c-85a1-43fc8cf5cc16","Type":"ContainerStarted","Data":"1f5d7544e4eb0bf1c120a6777fb77026894132ffd18fae534ae2f072ad73ef5d"} Apr 02 14:03:35 crc kubenswrapper[4732]: I0402 14:03:35.952425 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3ab608f9-1ada-4c9c-85a1-43fc8cf5cc16","Type":"ContainerStarted","Data":"bc0366a4102d7eb9abff9b37b8f49bcddb294427cbce480d1569a9080001cfc7"} Apr 02 14:03:35 crc kubenswrapper[4732]: I0402 14:03:35.982546 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.982526622 podStartE2EDuration="2.982526622s" podCreationTimestamp="2026-04-02 14:03:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 14:03:35.972198838 +0000 UTC m=+1572.876606411" watchObservedRunningTime="2026-04-02 14:03:35.982526622 +0000 UTC m=+1572.886934175" Apr 02 14:03:36 crc kubenswrapper[4732]: I0402 14:03:36.119378 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Apr 02 14:03:36 crc kubenswrapper[4732]: W0402 14:03:36.123035 4732 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc6916cdf_dcaa_4e17_b33c_3fc6684abb46.slice/crio-7c43f8a3f090eedfde0a6f0fc46a2a8a88e872dc9858bb5e7886c0f9f74a9bdd WatchSource:0}: Error finding container 7c43f8a3f090eedfde0a6f0fc46a2a8a88e872dc9858bb5e7886c0f9f74a9bdd: Status 404 returned error can't find the container with id 7c43f8a3f090eedfde0a6f0fc46a2a8a88e872dc9858bb5e7886c0f9f74a9bdd Apr 02 14:03:36 crc kubenswrapper[4732]: I0402 14:03:36.695342 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7d18cc6-1dc3-4ed6-a8a4-822678468b70" path="/var/lib/kubelet/pods/f7d18cc6-1dc3-4ed6-a8a4-822678468b70/volumes" Apr 02 14:03:36 crc kubenswrapper[4732]: I0402 14:03:36.734276 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Apr 02 14:03:36 crc kubenswrapper[4732]: I0402 14:03:36.755755 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/737abec3-8d91-4234-83fc-ebbdbd59812e-nova-metadata-tls-certs\") pod \"737abec3-8d91-4234-83fc-ebbdbd59812e\" (UID: \"737abec3-8d91-4234-83fc-ebbdbd59812e\") " Apr 02 14:03:36 crc kubenswrapper[4732]: I0402 14:03:36.756176 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zn727\" (UniqueName: \"kubernetes.io/projected/737abec3-8d91-4234-83fc-ebbdbd59812e-kube-api-access-zn727\") pod \"737abec3-8d91-4234-83fc-ebbdbd59812e\" (UID: \"737abec3-8d91-4234-83fc-ebbdbd59812e\") " Apr 02 14:03:36 crc kubenswrapper[4732]: I0402 14:03:36.756204 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/737abec3-8d91-4234-83fc-ebbdbd59812e-combined-ca-bundle\") pod \"737abec3-8d91-4234-83fc-ebbdbd59812e\" (UID: \"737abec3-8d91-4234-83fc-ebbdbd59812e\") " Apr 02 14:03:36 crc kubenswrapper[4732]: I0402 14:03:36.756317 
4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/737abec3-8d91-4234-83fc-ebbdbd59812e-logs\") pod \"737abec3-8d91-4234-83fc-ebbdbd59812e\" (UID: \"737abec3-8d91-4234-83fc-ebbdbd59812e\") " Apr 02 14:03:36 crc kubenswrapper[4732]: I0402 14:03:36.756890 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/737abec3-8d91-4234-83fc-ebbdbd59812e-logs" (OuterVolumeSpecName: "logs") pod "737abec3-8d91-4234-83fc-ebbdbd59812e" (UID: "737abec3-8d91-4234-83fc-ebbdbd59812e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 14:03:36 crc kubenswrapper[4732]: I0402 14:03:36.756986 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/737abec3-8d91-4234-83fc-ebbdbd59812e-config-data\") pod \"737abec3-8d91-4234-83fc-ebbdbd59812e\" (UID: \"737abec3-8d91-4234-83fc-ebbdbd59812e\") " Apr 02 14:03:36 crc kubenswrapper[4732]: I0402 14:03:36.758255 4732 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/737abec3-8d91-4234-83fc-ebbdbd59812e-logs\") on node \"crc\" DevicePath \"\"" Apr 02 14:03:36 crc kubenswrapper[4732]: I0402 14:03:36.774906 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/737abec3-8d91-4234-83fc-ebbdbd59812e-kube-api-access-zn727" (OuterVolumeSpecName: "kube-api-access-zn727") pod "737abec3-8d91-4234-83fc-ebbdbd59812e" (UID: "737abec3-8d91-4234-83fc-ebbdbd59812e"). InnerVolumeSpecName "kube-api-access-zn727". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:03:36 crc kubenswrapper[4732]: I0402 14:03:36.826827 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/737abec3-8d91-4234-83fc-ebbdbd59812e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "737abec3-8d91-4234-83fc-ebbdbd59812e" (UID: "737abec3-8d91-4234-83fc-ebbdbd59812e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:03:36 crc kubenswrapper[4732]: I0402 14:03:36.845021 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/737abec3-8d91-4234-83fc-ebbdbd59812e-config-data" (OuterVolumeSpecName: "config-data") pod "737abec3-8d91-4234-83fc-ebbdbd59812e" (UID: "737abec3-8d91-4234-83fc-ebbdbd59812e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:03:36 crc kubenswrapper[4732]: I0402 14:03:36.861069 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/737abec3-8d91-4234-83fc-ebbdbd59812e-config-data\") on node \"crc\" DevicePath \"\"" Apr 02 14:03:36 crc kubenswrapper[4732]: I0402 14:03:36.861121 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zn727\" (UniqueName: \"kubernetes.io/projected/737abec3-8d91-4234-83fc-ebbdbd59812e-kube-api-access-zn727\") on node \"crc\" DevicePath \"\"" Apr 02 14:03:36 crc kubenswrapper[4732]: I0402 14:03:36.861133 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/737abec3-8d91-4234-83fc-ebbdbd59812e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 02 14:03:36 crc kubenswrapper[4732]: I0402 14:03:36.879229 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/737abec3-8d91-4234-83fc-ebbdbd59812e-nova-metadata-tls-certs" (OuterVolumeSpecName: 
"nova-metadata-tls-certs") pod "737abec3-8d91-4234-83fc-ebbdbd59812e" (UID: "737abec3-8d91-4234-83fc-ebbdbd59812e"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:03:36 crc kubenswrapper[4732]: I0402 14:03:36.962127 4732 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/737abec3-8d91-4234-83fc-ebbdbd59812e-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Apr 02 14:03:36 crc kubenswrapper[4732]: I0402 14:03:36.970103 4732 generic.go:334] "Generic (PLEG): container finished" podID="737abec3-8d91-4234-83fc-ebbdbd59812e" containerID="3da6bf392b2c2b310dca2b6ee59b744d6cddff583caaf6fc88bfecfccdf461b1" exitCode=0 Apr 02 14:03:36 crc kubenswrapper[4732]: I0402 14:03:36.970235 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"737abec3-8d91-4234-83fc-ebbdbd59812e","Type":"ContainerDied","Data":"3da6bf392b2c2b310dca2b6ee59b744d6cddff583caaf6fc88bfecfccdf461b1"} Apr 02 14:03:36 crc kubenswrapper[4732]: I0402 14:03:36.970268 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Apr 02 14:03:36 crc kubenswrapper[4732]: I0402 14:03:36.970311 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"737abec3-8d91-4234-83fc-ebbdbd59812e","Type":"ContainerDied","Data":"01bcc02a9242e9f31701771b32216137c04538dbd5f487ce1739fb397445d13c"} Apr 02 14:03:36 crc kubenswrapper[4732]: I0402 14:03:36.970342 4732 scope.go:117] "RemoveContainer" containerID="3da6bf392b2c2b310dca2b6ee59b744d6cddff583caaf6fc88bfecfccdf461b1" Apr 02 14:03:36 crc kubenswrapper[4732]: I0402 14:03:36.972711 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c6916cdf-dcaa-4e17-b33c-3fc6684abb46","Type":"ContainerStarted","Data":"5acc0ac6744565a03a412f0b58ea8db6a80d3fdab8e1a16c992a685a71da7e9a"} Apr 02 14:03:36 crc kubenswrapper[4732]: I0402 14:03:36.972762 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c6916cdf-dcaa-4e17-b33c-3fc6684abb46","Type":"ContainerStarted","Data":"7c43f8a3f090eedfde0a6f0fc46a2a8a88e872dc9858bb5e7886c0f9f74a9bdd"} Apr 02 14:03:36 crc kubenswrapper[4732]: I0402 14:03:36.997283 4732 scope.go:117] "RemoveContainer" containerID="d436a18c3276554f7ff3affa860edb60ae090d140dfd34f92ab6855bd15d983c" Apr 02 14:03:37 crc kubenswrapper[4732]: I0402 14:03:37.009016 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.008996706 podStartE2EDuration="2.008996706s" podCreationTimestamp="2026-04-02 14:03:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 14:03:37.003733321 +0000 UTC m=+1573.908140894" watchObservedRunningTime="2026-04-02 14:03:37.008996706 +0000 UTC m=+1573.913404249" Apr 02 14:03:37 crc kubenswrapper[4732]: I0402 14:03:37.030958 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-metadata-0"] Apr 02 14:03:37 crc kubenswrapper[4732]: I0402 14:03:37.032349 4732 scope.go:117] "RemoveContainer" containerID="3da6bf392b2c2b310dca2b6ee59b744d6cddff583caaf6fc88bfecfccdf461b1" Apr 02 14:03:37 crc kubenswrapper[4732]: E0402 14:03:37.032841 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3da6bf392b2c2b310dca2b6ee59b744d6cddff583caaf6fc88bfecfccdf461b1\": container with ID starting with 3da6bf392b2c2b310dca2b6ee59b744d6cddff583caaf6fc88bfecfccdf461b1 not found: ID does not exist" containerID="3da6bf392b2c2b310dca2b6ee59b744d6cddff583caaf6fc88bfecfccdf461b1" Apr 02 14:03:37 crc kubenswrapper[4732]: I0402 14:03:37.032877 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3da6bf392b2c2b310dca2b6ee59b744d6cddff583caaf6fc88bfecfccdf461b1"} err="failed to get container status \"3da6bf392b2c2b310dca2b6ee59b744d6cddff583caaf6fc88bfecfccdf461b1\": rpc error: code = NotFound desc = could not find container \"3da6bf392b2c2b310dca2b6ee59b744d6cddff583caaf6fc88bfecfccdf461b1\": container with ID starting with 3da6bf392b2c2b310dca2b6ee59b744d6cddff583caaf6fc88bfecfccdf461b1 not found: ID does not exist" Apr 02 14:03:37 crc kubenswrapper[4732]: I0402 14:03:37.032897 4732 scope.go:117] "RemoveContainer" containerID="d436a18c3276554f7ff3affa860edb60ae090d140dfd34f92ab6855bd15d983c" Apr 02 14:03:37 crc kubenswrapper[4732]: E0402 14:03:37.033498 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d436a18c3276554f7ff3affa860edb60ae090d140dfd34f92ab6855bd15d983c\": container with ID starting with d436a18c3276554f7ff3affa860edb60ae090d140dfd34f92ab6855bd15d983c not found: ID does not exist" containerID="d436a18c3276554f7ff3affa860edb60ae090d140dfd34f92ab6855bd15d983c" Apr 02 14:03:37 crc kubenswrapper[4732]: I0402 14:03:37.033528 4732 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d436a18c3276554f7ff3affa860edb60ae090d140dfd34f92ab6855bd15d983c"} err="failed to get container status \"d436a18c3276554f7ff3affa860edb60ae090d140dfd34f92ab6855bd15d983c\": rpc error: code = NotFound desc = could not find container \"d436a18c3276554f7ff3affa860edb60ae090d140dfd34f92ab6855bd15d983c\": container with ID starting with d436a18c3276554f7ff3affa860edb60ae090d140dfd34f92ab6855bd15d983c not found: ID does not exist" Apr 02 14:03:37 crc kubenswrapper[4732]: I0402 14:03:37.042295 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Apr 02 14:03:37 crc kubenswrapper[4732]: I0402 14:03:37.061936 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Apr 02 14:03:37 crc kubenswrapper[4732]: E0402 14:03:37.062506 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="737abec3-8d91-4234-83fc-ebbdbd59812e" containerName="nova-metadata-log" Apr 02 14:03:37 crc kubenswrapper[4732]: I0402 14:03:37.062526 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="737abec3-8d91-4234-83fc-ebbdbd59812e" containerName="nova-metadata-log" Apr 02 14:03:37 crc kubenswrapper[4732]: E0402 14:03:37.062545 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="737abec3-8d91-4234-83fc-ebbdbd59812e" containerName="nova-metadata-metadata" Apr 02 14:03:37 crc kubenswrapper[4732]: I0402 14:03:37.062555 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="737abec3-8d91-4234-83fc-ebbdbd59812e" containerName="nova-metadata-metadata" Apr 02 14:03:37 crc kubenswrapper[4732]: I0402 14:03:37.062836 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="737abec3-8d91-4234-83fc-ebbdbd59812e" containerName="nova-metadata-metadata" Apr 02 14:03:37 crc kubenswrapper[4732]: I0402 14:03:37.062888 4732 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="737abec3-8d91-4234-83fc-ebbdbd59812e" containerName="nova-metadata-log" Apr 02 14:03:37 crc kubenswrapper[4732]: I0402 14:03:37.064111 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Apr 02 14:03:37 crc kubenswrapper[4732]: I0402 14:03:37.069906 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Apr 02 14:03:37 crc kubenswrapper[4732]: I0402 14:03:37.070135 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Apr 02 14:03:37 crc kubenswrapper[4732]: I0402 14:03:37.076422 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Apr 02 14:03:37 crc kubenswrapper[4732]: I0402 14:03:37.176155 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8e7f93a-58bc-4949-9fa4-7ee18fdaa7e4-logs\") pod \"nova-metadata-0\" (UID: \"c8e7f93a-58bc-4949-9fa4-7ee18fdaa7e4\") " pod="openstack/nova-metadata-0" Apr 02 14:03:37 crc kubenswrapper[4732]: I0402 14:03:37.176391 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8e7f93a-58bc-4949-9fa4-7ee18fdaa7e4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c8e7f93a-58bc-4949-9fa4-7ee18fdaa7e4\") " pod="openstack/nova-metadata-0" Apr 02 14:03:37 crc kubenswrapper[4732]: I0402 14:03:37.176449 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8e7f93a-58bc-4949-9fa4-7ee18fdaa7e4-config-data\") pod \"nova-metadata-0\" (UID: \"c8e7f93a-58bc-4949-9fa4-7ee18fdaa7e4\") " pod="openstack/nova-metadata-0" Apr 02 14:03:37 crc kubenswrapper[4732]: I0402 14:03:37.176550 4732 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8e7f93a-58bc-4949-9fa4-7ee18fdaa7e4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c8e7f93a-58bc-4949-9fa4-7ee18fdaa7e4\") " pod="openstack/nova-metadata-0" Apr 02 14:03:37 crc kubenswrapper[4732]: I0402 14:03:37.176600 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjnkk\" (UniqueName: \"kubernetes.io/projected/c8e7f93a-58bc-4949-9fa4-7ee18fdaa7e4-kube-api-access-gjnkk\") pod \"nova-metadata-0\" (UID: \"c8e7f93a-58bc-4949-9fa4-7ee18fdaa7e4\") " pod="openstack/nova-metadata-0" Apr 02 14:03:37 crc kubenswrapper[4732]: E0402 14:03:37.277376 4732 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod737abec3_8d91_4234_83fc_ebbdbd59812e.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod737abec3_8d91_4234_83fc_ebbdbd59812e.slice/crio-01bcc02a9242e9f31701771b32216137c04538dbd5f487ce1739fb397445d13c\": RecentStats: unable to find data in memory cache]" Apr 02 14:03:37 crc kubenswrapper[4732]: I0402 14:03:37.278660 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8e7f93a-58bc-4949-9fa4-7ee18fdaa7e4-logs\") pod \"nova-metadata-0\" (UID: \"c8e7f93a-58bc-4949-9fa4-7ee18fdaa7e4\") " pod="openstack/nova-metadata-0" Apr 02 14:03:37 crc kubenswrapper[4732]: I0402 14:03:37.279213 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8e7f93a-58bc-4949-9fa4-7ee18fdaa7e4-logs\") pod \"nova-metadata-0\" (UID: \"c8e7f93a-58bc-4949-9fa4-7ee18fdaa7e4\") " pod="openstack/nova-metadata-0" Apr 02 14:03:37 crc kubenswrapper[4732]: I0402 
14:03:37.279469 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8e7f93a-58bc-4949-9fa4-7ee18fdaa7e4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c8e7f93a-58bc-4949-9fa4-7ee18fdaa7e4\") " pod="openstack/nova-metadata-0" Apr 02 14:03:37 crc kubenswrapper[4732]: I0402 14:03:37.279579 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8e7f93a-58bc-4949-9fa4-7ee18fdaa7e4-config-data\") pod \"nova-metadata-0\" (UID: \"c8e7f93a-58bc-4949-9fa4-7ee18fdaa7e4\") " pod="openstack/nova-metadata-0" Apr 02 14:03:37 crc kubenswrapper[4732]: I0402 14:03:37.279710 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8e7f93a-58bc-4949-9fa4-7ee18fdaa7e4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c8e7f93a-58bc-4949-9fa4-7ee18fdaa7e4\") " pod="openstack/nova-metadata-0" Apr 02 14:03:37 crc kubenswrapper[4732]: I0402 14:03:37.279822 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjnkk\" (UniqueName: \"kubernetes.io/projected/c8e7f93a-58bc-4949-9fa4-7ee18fdaa7e4-kube-api-access-gjnkk\") pod \"nova-metadata-0\" (UID: \"c8e7f93a-58bc-4949-9fa4-7ee18fdaa7e4\") " pod="openstack/nova-metadata-0" Apr 02 14:03:37 crc kubenswrapper[4732]: I0402 14:03:37.283321 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8e7f93a-58bc-4949-9fa4-7ee18fdaa7e4-config-data\") pod \"nova-metadata-0\" (UID: \"c8e7f93a-58bc-4949-9fa4-7ee18fdaa7e4\") " pod="openstack/nova-metadata-0" Apr 02 14:03:37 crc kubenswrapper[4732]: I0402 14:03:37.283662 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c8e7f93a-58bc-4949-9fa4-7ee18fdaa7e4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c8e7f93a-58bc-4949-9fa4-7ee18fdaa7e4\") " pod="openstack/nova-metadata-0" Apr 02 14:03:37 crc kubenswrapper[4732]: I0402 14:03:37.284590 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8e7f93a-58bc-4949-9fa4-7ee18fdaa7e4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c8e7f93a-58bc-4949-9fa4-7ee18fdaa7e4\") " pod="openstack/nova-metadata-0" Apr 02 14:03:37 crc kubenswrapper[4732]: I0402 14:03:37.298923 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjnkk\" (UniqueName: \"kubernetes.io/projected/c8e7f93a-58bc-4949-9fa4-7ee18fdaa7e4-kube-api-access-gjnkk\") pod \"nova-metadata-0\" (UID: \"c8e7f93a-58bc-4949-9fa4-7ee18fdaa7e4\") " pod="openstack/nova-metadata-0" Apr 02 14:03:37 crc kubenswrapper[4732]: I0402 14:03:37.410185 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Apr 02 14:03:37 crc kubenswrapper[4732]: I0402 14:03:37.852515 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Apr 02 14:03:37 crc kubenswrapper[4732]: W0402 14:03:37.876571 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8e7f93a_58bc_4949_9fa4_7ee18fdaa7e4.slice/crio-5fa5d88e772dcc9d2db58faa6db613e299077929fc0daf7dcb9a4155603d8832 WatchSource:0}: Error finding container 5fa5d88e772dcc9d2db58faa6db613e299077929fc0daf7dcb9a4155603d8832: Status 404 returned error can't find the container with id 5fa5d88e772dcc9d2db58faa6db613e299077929fc0daf7dcb9a4155603d8832 Apr 02 14:03:38 crc kubenswrapper[4732]: I0402 14:03:38.019938 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c8e7f93a-58bc-4949-9fa4-7ee18fdaa7e4","Type":"ContainerStarted","Data":"5fa5d88e772dcc9d2db58faa6db613e299077929fc0daf7dcb9a4155603d8832"} Apr 02 14:03:38 crc kubenswrapper[4732]: I0402 14:03:38.692279 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="737abec3-8d91-4234-83fc-ebbdbd59812e" path="/var/lib/kubelet/pods/737abec3-8d91-4234-83fc-ebbdbd59812e/volumes" Apr 02 14:03:39 crc kubenswrapper[4732]: I0402 14:03:39.035240 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c8e7f93a-58bc-4949-9fa4-7ee18fdaa7e4","Type":"ContainerStarted","Data":"4a08d9cf93ac3d00fd7aadd4310e361fc02f2cfe67085d88dc007cd9e1c57d67"} Apr 02 14:03:39 crc kubenswrapper[4732]: I0402 14:03:39.035294 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c8e7f93a-58bc-4949-9fa4-7ee18fdaa7e4","Type":"ContainerStarted","Data":"b0f666e0c098699edf0736778fe19bdfa52d3fd51e6e71cf0ed34717d1eb0bdb"} Apr 02 14:03:40 crc kubenswrapper[4732]: I0402 14:03:40.633775 4732 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Apr 02 14:03:44 crc kubenswrapper[4732]: I0402 14:03:44.345656 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Apr 02 14:03:44 crc kubenswrapper[4732]: I0402 14:03:44.346069 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Apr 02 14:03:45 crc kubenswrapper[4732]: I0402 14:03:45.360766 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3ab608f9-1ada-4c9c-85a1-43fc8cf5cc16" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.210:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Apr 02 14:03:45 crc kubenswrapper[4732]: I0402 14:03:45.360912 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3ab608f9-1ada-4c9c-85a1-43fc8cf5cc16" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.210:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Apr 02 14:03:45 crc kubenswrapper[4732]: I0402 14:03:45.633652 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Apr 02 14:03:45 crc kubenswrapper[4732]: I0402 14:03:45.663266 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Apr 02 14:03:45 crc kubenswrapper[4732]: I0402 14:03:45.682991 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=8.682977463 podStartE2EDuration="8.682977463s" podCreationTimestamp="2026-04-02 14:03:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 14:03:39.052981097 +0000 UTC m=+1575.957388670" watchObservedRunningTime="2026-04-02 14:03:45.682977463 
+0000 UTC m=+1582.587385016" Apr 02 14:03:46 crc kubenswrapper[4732]: I0402 14:03:46.132132 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Apr 02 14:03:47 crc kubenswrapper[4732]: I0402 14:03:47.410359 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Apr 02 14:03:47 crc kubenswrapper[4732]: I0402 14:03:47.410719 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Apr 02 14:03:48 crc kubenswrapper[4732]: I0402 14:03:48.420904 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="c8e7f93a-58bc-4949-9fa4-7ee18fdaa7e4" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.212:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Apr 02 14:03:48 crc kubenswrapper[4732]: I0402 14:03:48.420905 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="c8e7f93a-58bc-4949-9fa4-7ee18fdaa7e4" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.212:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Apr 02 14:03:51 crc kubenswrapper[4732]: I0402 14:03:51.156904 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Apr 02 14:03:52 crc kubenswrapper[4732]: I0402 14:03:52.345251 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Apr 02 14:03:52 crc kubenswrapper[4732]: I0402 14:03:52.345327 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Apr 02 14:03:54 crc kubenswrapper[4732]: I0402 14:03:54.351250 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Apr 02 14:03:54 crc kubenswrapper[4732]: I0402 
14:03:54.357526 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Apr 02 14:03:54 crc kubenswrapper[4732]: I0402 14:03:54.360199 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Apr 02 14:03:55 crc kubenswrapper[4732]: I0402 14:03:55.184750 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Apr 02 14:03:55 crc kubenswrapper[4732]: I0402 14:03:55.411182 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Apr 02 14:03:55 crc kubenswrapper[4732]: I0402 14:03:55.411233 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Apr 02 14:03:57 crc kubenswrapper[4732]: I0402 14:03:57.415963 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Apr 02 14:03:57 crc kubenswrapper[4732]: I0402 14:03:57.417387 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Apr 02 14:03:57 crc kubenswrapper[4732]: I0402 14:03:57.423040 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Apr 02 14:03:57 crc kubenswrapper[4732]: I0402 14:03:57.423283 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Apr 02 14:04:00 crc kubenswrapper[4732]: I0402 14:04:00.139070 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29585644-vgwpc"] Apr 02 14:04:00 crc kubenswrapper[4732]: I0402 14:04:00.140626 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29585644-vgwpc" Apr 02 14:04:00 crc kubenswrapper[4732]: I0402 14:04:00.143632 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 02 14:04:00 crc kubenswrapper[4732]: I0402 14:04:00.143657 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 02 14:04:00 crc kubenswrapper[4732]: I0402 14:04:00.144997 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-69g42" Apr 02 14:04:00 crc kubenswrapper[4732]: I0402 14:04:00.158013 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585644-vgwpc"] Apr 02 14:04:00 crc kubenswrapper[4732]: I0402 14:04:00.223642 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqp69\" (UniqueName: \"kubernetes.io/projected/891746ea-0d72-4bb9-aba3-a0f140bb31b4-kube-api-access-lqp69\") pod \"auto-csr-approver-29585644-vgwpc\" (UID: \"891746ea-0d72-4bb9-aba3-a0f140bb31b4\") " pod="openshift-infra/auto-csr-approver-29585644-vgwpc" Apr 02 14:04:00 crc kubenswrapper[4732]: I0402 14:04:00.326313 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqp69\" (UniqueName: \"kubernetes.io/projected/891746ea-0d72-4bb9-aba3-a0f140bb31b4-kube-api-access-lqp69\") pod \"auto-csr-approver-29585644-vgwpc\" (UID: \"891746ea-0d72-4bb9-aba3-a0f140bb31b4\") " pod="openshift-infra/auto-csr-approver-29585644-vgwpc" Apr 02 14:04:00 crc kubenswrapper[4732]: I0402 14:04:00.352438 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqp69\" (UniqueName: \"kubernetes.io/projected/891746ea-0d72-4bb9-aba3-a0f140bb31b4-kube-api-access-lqp69\") pod \"auto-csr-approver-29585644-vgwpc\" (UID: \"891746ea-0d72-4bb9-aba3-a0f140bb31b4\") " 
pod="openshift-infra/auto-csr-approver-29585644-vgwpc" Apr 02 14:04:00 crc kubenswrapper[4732]: I0402 14:04:00.462130 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585644-vgwpc" Apr 02 14:04:00 crc kubenswrapper[4732]: I0402 14:04:00.902464 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585644-vgwpc"] Apr 02 14:04:01 crc kubenswrapper[4732]: I0402 14:04:01.234369 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585644-vgwpc" event={"ID":"891746ea-0d72-4bb9-aba3-a0f140bb31b4","Type":"ContainerStarted","Data":"a6447ef0dec7f3d5a3b632b3749498556ee222b21209a70e49f2d920d2f35c1a"} Apr 02 14:04:01 crc kubenswrapper[4732]: I0402 14:04:01.925291 4732 patch_prober.go:28] interesting pod/machine-config-daemon-6vtmw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 02 14:04:01 crc kubenswrapper[4732]: I0402 14:04:01.925719 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 02 14:04:01 crc kubenswrapper[4732]: I0402 14:04:01.925790 4732 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" Apr 02 14:04:01 crc kubenswrapper[4732]: I0402 14:04:01.926598 4732 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"de6153f9349b412a56e88983b18d3d8fdd63881d0461412cebd345d437c6871b"} 
pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Apr 02 14:04:01 crc kubenswrapper[4732]: I0402 14:04:01.926693 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" containerName="machine-config-daemon" containerID="cri-o://de6153f9349b412a56e88983b18d3d8fdd63881d0461412cebd345d437c6871b" gracePeriod=600 Apr 02 14:04:02 crc kubenswrapper[4732]: I0402 14:04:02.246791 4732 generic.go:334] "Generic (PLEG): container finished" podID="38409e5e-4545-49da-8f6c-4bfb30582878" containerID="de6153f9349b412a56e88983b18d3d8fdd63881d0461412cebd345d437c6871b" exitCode=0 Apr 02 14:04:02 crc kubenswrapper[4732]: I0402 14:04:02.246853 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" event={"ID":"38409e5e-4545-49da-8f6c-4bfb30582878","Type":"ContainerDied","Data":"de6153f9349b412a56e88983b18d3d8fdd63881d0461412cebd345d437c6871b"} Apr 02 14:04:02 crc kubenswrapper[4732]: I0402 14:04:02.246894 4732 scope.go:117] "RemoveContainer" containerID="7fb2687018e193fb92c41619c313936d4cbab14821cf21277c10428a796150c1" Apr 02 14:04:03 crc kubenswrapper[4732]: I0402 14:04:03.262141 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" event={"ID":"38409e5e-4545-49da-8f6c-4bfb30582878","Type":"ContainerStarted","Data":"0f81e9ff2a9337c334b84f07419534908b39d0251d1366b4d7838bb8bbcae383"} Apr 02 14:04:03 crc kubenswrapper[4732]: I0402 14:04:03.267076 4732 generic.go:334] "Generic (PLEG): container finished" podID="891746ea-0d72-4bb9-aba3-a0f140bb31b4" containerID="d653dc1b373bc973247840f58437e041e1d7e3215ec38a57c61328d8bdf9a166" exitCode=0 Apr 02 14:04:03 crc kubenswrapper[4732]: I0402 14:04:03.267122 4732 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585644-vgwpc" event={"ID":"891746ea-0d72-4bb9-aba3-a0f140bb31b4","Type":"ContainerDied","Data":"d653dc1b373bc973247840f58437e041e1d7e3215ec38a57c61328d8bdf9a166"} Apr 02 14:04:04 crc kubenswrapper[4732]: I0402 14:04:04.615328 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585644-vgwpc" Apr 02 14:04:04 crc kubenswrapper[4732]: I0402 14:04:04.721540 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqp69\" (UniqueName: \"kubernetes.io/projected/891746ea-0d72-4bb9-aba3-a0f140bb31b4-kube-api-access-lqp69\") pod \"891746ea-0d72-4bb9-aba3-a0f140bb31b4\" (UID: \"891746ea-0d72-4bb9-aba3-a0f140bb31b4\") " Apr 02 14:04:05 crc kubenswrapper[4732]: I0402 14:04:05.239999 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/891746ea-0d72-4bb9-aba3-a0f140bb31b4-kube-api-access-lqp69" (OuterVolumeSpecName: "kube-api-access-lqp69") pod "891746ea-0d72-4bb9-aba3-a0f140bb31b4" (UID: "891746ea-0d72-4bb9-aba3-a0f140bb31b4"). InnerVolumeSpecName "kube-api-access-lqp69". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:04:05 crc kubenswrapper[4732]: I0402 14:04:05.242484 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqp69\" (UniqueName: \"kubernetes.io/projected/891746ea-0d72-4bb9-aba3-a0f140bb31b4-kube-api-access-lqp69\") on node \"crc\" DevicePath \"\"" Apr 02 14:04:05 crc kubenswrapper[4732]: I0402 14:04:05.296887 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585644-vgwpc" event={"ID":"891746ea-0d72-4bb9-aba3-a0f140bb31b4","Type":"ContainerDied","Data":"a6447ef0dec7f3d5a3b632b3749498556ee222b21209a70e49f2d920d2f35c1a"} Apr 02 14:04:05 crc kubenswrapper[4732]: I0402 14:04:05.297202 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a6447ef0dec7f3d5a3b632b3749498556ee222b21209a70e49f2d920d2f35c1a" Apr 02 14:04:05 crc kubenswrapper[4732]: I0402 14:04:05.297131 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585644-vgwpc" Apr 02 14:04:05 crc kubenswrapper[4732]: I0402 14:04:05.691179 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29585638-glv6j"] Apr 02 14:04:05 crc kubenswrapper[4732]: I0402 14:04:05.699761 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29585638-glv6j"] Apr 02 14:04:06 crc kubenswrapper[4732]: I0402 14:04:06.271491 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Apr 02 14:04:06 crc kubenswrapper[4732]: I0402 14:04:06.409016 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dqcfr"] Apr 02 14:04:06 crc kubenswrapper[4732]: E0402 14:04:06.409644 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="891746ea-0d72-4bb9-aba3-a0f140bb31b4" containerName="oc" Apr 02 14:04:06 crc kubenswrapper[4732]: I0402 
14:04:06.409659 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="891746ea-0d72-4bb9-aba3-a0f140bb31b4" containerName="oc" Apr 02 14:04:06 crc kubenswrapper[4732]: I0402 14:04:06.409846 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="891746ea-0d72-4bb9-aba3-a0f140bb31b4" containerName="oc" Apr 02 14:04:06 crc kubenswrapper[4732]: I0402 14:04:06.411121 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dqcfr" Apr 02 14:04:06 crc kubenswrapper[4732]: I0402 14:04:06.436832 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dqcfr"] Apr 02 14:04:06 crc kubenswrapper[4732]: I0402 14:04:06.565639 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12d89da9-6d93-473e-846c-5fc96a22a5de-catalog-content\") pod \"redhat-operators-dqcfr\" (UID: \"12d89da9-6d93-473e-846c-5fc96a22a5de\") " pod="openshift-marketplace/redhat-operators-dqcfr" Apr 02 14:04:06 crc kubenswrapper[4732]: I0402 14:04:06.565698 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12d89da9-6d93-473e-846c-5fc96a22a5de-utilities\") pod \"redhat-operators-dqcfr\" (UID: \"12d89da9-6d93-473e-846c-5fc96a22a5de\") " pod="openshift-marketplace/redhat-operators-dqcfr" Apr 02 14:04:06 crc kubenswrapper[4732]: I0402 14:04:06.565741 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwxdq\" (UniqueName: \"kubernetes.io/projected/12d89da9-6d93-473e-846c-5fc96a22a5de-kube-api-access-rwxdq\") pod \"redhat-operators-dqcfr\" (UID: \"12d89da9-6d93-473e-846c-5fc96a22a5de\") " pod="openshift-marketplace/redhat-operators-dqcfr" Apr 02 14:04:06 crc kubenswrapper[4732]: I0402 14:04:06.667799 4732 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12d89da9-6d93-473e-846c-5fc96a22a5de-catalog-content\") pod \"redhat-operators-dqcfr\" (UID: \"12d89da9-6d93-473e-846c-5fc96a22a5de\") " pod="openshift-marketplace/redhat-operators-dqcfr" Apr 02 14:04:06 crc kubenswrapper[4732]: I0402 14:04:06.667854 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12d89da9-6d93-473e-846c-5fc96a22a5de-utilities\") pod \"redhat-operators-dqcfr\" (UID: \"12d89da9-6d93-473e-846c-5fc96a22a5de\") " pod="openshift-marketplace/redhat-operators-dqcfr" Apr 02 14:04:06 crc kubenswrapper[4732]: I0402 14:04:06.667879 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwxdq\" (UniqueName: \"kubernetes.io/projected/12d89da9-6d93-473e-846c-5fc96a22a5de-kube-api-access-rwxdq\") pod \"redhat-operators-dqcfr\" (UID: \"12d89da9-6d93-473e-846c-5fc96a22a5de\") " pod="openshift-marketplace/redhat-operators-dqcfr" Apr 02 14:04:06 crc kubenswrapper[4732]: I0402 14:04:06.668755 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12d89da9-6d93-473e-846c-5fc96a22a5de-catalog-content\") pod \"redhat-operators-dqcfr\" (UID: \"12d89da9-6d93-473e-846c-5fc96a22a5de\") " pod="openshift-marketplace/redhat-operators-dqcfr" Apr 02 14:04:06 crc kubenswrapper[4732]: I0402 14:04:06.668998 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12d89da9-6d93-473e-846c-5fc96a22a5de-utilities\") pod \"redhat-operators-dqcfr\" (UID: \"12d89da9-6d93-473e-846c-5fc96a22a5de\") " pod="openshift-marketplace/redhat-operators-dqcfr" Apr 02 14:04:06 crc kubenswrapper[4732]: I0402 14:04:06.688843 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-rwxdq\" (UniqueName: \"kubernetes.io/projected/12d89da9-6d93-473e-846c-5fc96a22a5de-kube-api-access-rwxdq\") pod \"redhat-operators-dqcfr\" (UID: \"12d89da9-6d93-473e-846c-5fc96a22a5de\") " pod="openshift-marketplace/redhat-operators-dqcfr" Apr 02 14:04:06 crc kubenswrapper[4732]: I0402 14:04:06.709367 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e874a7dc-8608-47fc-bf14-3eefa3fe6e6f" path="/var/lib/kubelet/pods/e874a7dc-8608-47fc-bf14-3eefa3fe6e6f/volumes" Apr 02 14:04:06 crc kubenswrapper[4732]: I0402 14:04:06.729165 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dqcfr" Apr 02 14:04:07 crc kubenswrapper[4732]: I0402 14:04:07.173323 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Apr 02 14:04:07 crc kubenswrapper[4732]: I0402 14:04:07.459134 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dqcfr"] Apr 02 14:04:07 crc kubenswrapper[4732]: W0402 14:04:07.461704 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12d89da9_6d93_473e_846c_5fc96a22a5de.slice/crio-1c962dc618a824d734e47e2461ba770067d0bbca61e17c1c8e5fcf7c4cd35af0 WatchSource:0}: Error finding container 1c962dc618a824d734e47e2461ba770067d0bbca61e17c1c8e5fcf7c4cd35af0: Status 404 returned error can't find the container with id 1c962dc618a824d734e47e2461ba770067d0bbca61e17c1c8e5fcf7c4cd35af0 Apr 02 14:04:08 crc kubenswrapper[4732]: I0402 14:04:08.328143 4732 generic.go:334] "Generic (PLEG): container finished" podID="12d89da9-6d93-473e-846c-5fc96a22a5de" containerID="939e0082a5f47c151714b4bc0ecb06e185fd81056a1143cb4d40fcd57cb826b2" exitCode=0 Apr 02 14:04:08 crc kubenswrapper[4732]: I0402 14:04:08.328317 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dqcfr" 
event={"ID":"12d89da9-6d93-473e-846c-5fc96a22a5de","Type":"ContainerDied","Data":"939e0082a5f47c151714b4bc0ecb06e185fd81056a1143cb4d40fcd57cb826b2"} Apr 02 14:04:08 crc kubenswrapper[4732]: I0402 14:04:08.328496 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dqcfr" event={"ID":"12d89da9-6d93-473e-846c-5fc96a22a5de","Type":"ContainerStarted","Data":"1c962dc618a824d734e47e2461ba770067d0bbca61e17c1c8e5fcf7c4cd35af0"} Apr 02 14:04:09 crc kubenswrapper[4732]: I0402 14:04:09.803694 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="56762f05-a513-4f47-8cf7-5d19bb58c5bd" containerName="rabbitmq" containerID="cri-o://c59797a594162dd89aca027cf2a8334e52ae126e765ce2f5b5f5aca4eab7131a" gracePeriod=57 Apr 02 14:04:10 crc kubenswrapper[4732]: I0402 14:04:10.168931 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="b0bb93d2-9da7-4667-9079-b403332d31e0" containerName="rabbitmq" containerID="cri-o://9717b2925f56b229efa61c94052925ad34c23f5599b515eb928ba046da74a28f" gracePeriod=58 Apr 02 14:04:10 crc kubenswrapper[4732]: I0402 14:04:10.349285 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dqcfr" event={"ID":"12d89da9-6d93-473e-846c-5fc96a22a5de","Type":"ContainerStarted","Data":"20f3cf0ff8ce5c8959c574dfc25e305dfe5237896b24e9aeffec71d67d2987ad"} Apr 02 14:04:12 crc kubenswrapper[4732]: I0402 14:04:12.369068 4732 generic.go:334] "Generic (PLEG): container finished" podID="b0bb93d2-9da7-4667-9079-b403332d31e0" containerID="9717b2925f56b229efa61c94052925ad34c23f5599b515eb928ba046da74a28f" exitCode=0 Apr 02 14:04:12 crc kubenswrapper[4732]: I0402 14:04:12.369131 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"b0bb93d2-9da7-4667-9079-b403332d31e0","Type":"ContainerDied","Data":"9717b2925f56b229efa61c94052925ad34c23f5599b515eb928ba046da74a28f"} Apr 02 14:04:12 crc kubenswrapper[4732]: I0402 14:04:12.372502 4732 generic.go:334] "Generic (PLEG): container finished" podID="12d89da9-6d93-473e-846c-5fc96a22a5de" containerID="20f3cf0ff8ce5c8959c574dfc25e305dfe5237896b24e9aeffec71d67d2987ad" exitCode=0 Apr 02 14:04:12 crc kubenswrapper[4732]: I0402 14:04:12.372552 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dqcfr" event={"ID":"12d89da9-6d93-473e-846c-5fc96a22a5de","Type":"ContainerDied","Data":"20f3cf0ff8ce5c8959c574dfc25e305dfe5237896b24e9aeffec71d67d2987ad"} Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.028725 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.113401 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b0bb93d2-9da7-4667-9079-b403332d31e0-rabbitmq-plugins\") pod \"b0bb93d2-9da7-4667-9079-b403332d31e0\" (UID: \"b0bb93d2-9da7-4667-9079-b403332d31e0\") " Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.113587 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b0bb93d2-9da7-4667-9079-b403332d31e0-plugins-conf\") pod \"b0bb93d2-9da7-4667-9079-b403332d31e0\" (UID: \"b0bb93d2-9da7-4667-9079-b403332d31e0\") " Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.113728 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b0bb93d2-9da7-4667-9079-b403332d31e0-server-conf\") pod \"b0bb93d2-9da7-4667-9079-b403332d31e0\" (UID: \"b0bb93d2-9da7-4667-9079-b403332d31e0\") " Apr 02 14:04:13 crc 
kubenswrapper[4732]: I0402 14:04:13.113754 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b0bb93d2-9da7-4667-9079-b403332d31e0-erlang-cookie-secret\") pod \"b0bb93d2-9da7-4667-9079-b403332d31e0\" (UID: \"b0bb93d2-9da7-4667-9079-b403332d31e0\") " Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.113783 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b0bb93d2-9da7-4667-9079-b403332d31e0-rabbitmq-tls\") pod \"b0bb93d2-9da7-4667-9079-b403332d31e0\" (UID: \"b0bb93d2-9da7-4667-9079-b403332d31e0\") " Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.113813 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mb6bz\" (UniqueName: \"kubernetes.io/projected/b0bb93d2-9da7-4667-9079-b403332d31e0-kube-api-access-mb6bz\") pod \"b0bb93d2-9da7-4667-9079-b403332d31e0\" (UID: \"b0bb93d2-9da7-4667-9079-b403332d31e0\") " Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.113855 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b0bb93d2-9da7-4667-9079-b403332d31e0-config-data\") pod \"b0bb93d2-9da7-4667-9079-b403332d31e0\" (UID: \"b0bb93d2-9da7-4667-9079-b403332d31e0\") " Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.113900 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b0bb93d2-9da7-4667-9079-b403332d31e0-pod-info\") pod \"b0bb93d2-9da7-4667-9079-b403332d31e0\" (UID: \"b0bb93d2-9da7-4667-9079-b403332d31e0\") " Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.113890 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0bb93d2-9da7-4667-9079-b403332d31e0-rabbitmq-plugins" (OuterVolumeSpecName: 
"rabbitmq-plugins") pod "b0bb93d2-9da7-4667-9079-b403332d31e0" (UID: "b0bb93d2-9da7-4667-9079-b403332d31e0"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.113927 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b0bb93d2-9da7-4667-9079-b403332d31e0-rabbitmq-erlang-cookie\") pod \"b0bb93d2-9da7-4667-9079-b403332d31e0\" (UID: \"b0bb93d2-9da7-4667-9079-b403332d31e0\") " Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.113957 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b0bb93d2-9da7-4667-9079-b403332d31e0-rabbitmq-confd\") pod \"b0bb93d2-9da7-4667-9079-b403332d31e0\" (UID: \"b0bb93d2-9da7-4667-9079-b403332d31e0\") " Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.113994 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"b0bb93d2-9da7-4667-9079-b403332d31e0\" (UID: \"b0bb93d2-9da7-4667-9079-b403332d31e0\") " Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.114228 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0bb93d2-9da7-4667-9079-b403332d31e0-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "b0bb93d2-9da7-4667-9079-b403332d31e0" (UID: "b0bb93d2-9da7-4667-9079-b403332d31e0"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.114391 4732 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b0bb93d2-9da7-4667-9079-b403332d31e0-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.114404 4732 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b0bb93d2-9da7-4667-9079-b403332d31e0-plugins-conf\") on node \"crc\" DevicePath \"\"" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.115450 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0bb93d2-9da7-4667-9079-b403332d31e0-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "b0bb93d2-9da7-4667-9079-b403332d31e0" (UID: "b0bb93d2-9da7-4667-9079-b403332d31e0"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.125807 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/b0bb93d2-9da7-4667-9079-b403332d31e0-pod-info" (OuterVolumeSpecName: "pod-info") pod "b0bb93d2-9da7-4667-9079-b403332d31e0" (UID: "b0bb93d2-9da7-4667-9079-b403332d31e0"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.125837 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "persistence") pod "b0bb93d2-9da7-4667-9079-b403332d31e0" (UID: "b0bb93d2-9da7-4667-9079-b403332d31e0"). InnerVolumeSpecName "local-storage11-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.134188 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0bb93d2-9da7-4667-9079-b403332d31e0-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "b0bb93d2-9da7-4667-9079-b403332d31e0" (UID: "b0bb93d2-9da7-4667-9079-b403332d31e0"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.146247 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0bb93d2-9da7-4667-9079-b403332d31e0-kube-api-access-mb6bz" (OuterVolumeSpecName: "kube-api-access-mb6bz") pod "b0bb93d2-9da7-4667-9079-b403332d31e0" (UID: "b0bb93d2-9da7-4667-9079-b403332d31e0"). InnerVolumeSpecName "kube-api-access-mb6bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.163514 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0bb93d2-9da7-4667-9079-b403332d31e0-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "b0bb93d2-9da7-4667-9079-b403332d31e0" (UID: "b0bb93d2-9da7-4667-9079-b403332d31e0"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.169363 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0bb93d2-9da7-4667-9079-b403332d31e0-config-data" (OuterVolumeSpecName: "config-data") pod "b0bb93d2-9da7-4667-9079-b403332d31e0" (UID: "b0bb93d2-9da7-4667-9079-b403332d31e0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.170110 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0bb93d2-9da7-4667-9079-b403332d31e0-server-conf" (OuterVolumeSpecName: "server-conf") pod "b0bb93d2-9da7-4667-9079-b403332d31e0" (UID: "b0bb93d2-9da7-4667-9079-b403332d31e0"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.220397 4732 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b0bb93d2-9da7-4667-9079-b403332d31e0-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.220650 4732 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b0bb93d2-9da7-4667-9079-b403332d31e0-server-conf\") on node \"crc\" DevicePath \"\"" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.220668 4732 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b0bb93d2-9da7-4667-9079-b403332d31e0-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.220892 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mb6bz\" (UniqueName: \"kubernetes.io/projected/b0bb93d2-9da7-4667-9079-b403332d31e0-kube-api-access-mb6bz\") on node \"crc\" DevicePath \"\"" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.220917 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b0bb93d2-9da7-4667-9079-b403332d31e0-config-data\") on node \"crc\" DevicePath \"\"" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.220928 4732 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/b0bb93d2-9da7-4667-9079-b403332d31e0-pod-info\") on node \"crc\" DevicePath \"\"" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.220941 4732 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b0bb93d2-9da7-4667-9079-b403332d31e0-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.220973 4732 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.246101 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0bb93d2-9da7-4667-9079-b403332d31e0-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "b0bb93d2-9da7-4667-9079-b403332d31e0" (UID: "b0bb93d2-9da7-4667-9079-b403332d31e0"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.259805 4732 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.322531 4732 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b0bb93d2-9da7-4667-9079-b403332d31e0-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.322569 4732 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.385879 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.386802 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b0bb93d2-9da7-4667-9079-b403332d31e0","Type":"ContainerDied","Data":"7c3447dab6d87e81460146e62ebcd33bb9652039825e72e010c68e5b1702fec7"} Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.386848 4732 scope.go:117] "RemoveContainer" containerID="9717b2925f56b229efa61c94052925ad34c23f5599b515eb928ba046da74a28f" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.400521 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dqcfr" event={"ID":"12d89da9-6d93-473e-846c-5fc96a22a5de","Type":"ContainerStarted","Data":"e8be6356ef95a9257b89881f365b30b9aa3dc71f774e30d118ec0acf5c96b65e"} Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.411672 4732 scope.go:117] "RemoveContainer" containerID="991fee5892b181cbf9eaa8f0e526c1dca54ed5e2932b158ac3a0bf0139afeaf4" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.432347 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dqcfr" podStartSLOduration=2.89453178 podStartE2EDuration="7.432330852s" podCreationTimestamp="2026-04-02 14:04:06 +0000 UTC" firstStartedPulling="2026-04-02 14:04:08.330673617 +0000 UTC m=+1605.235081170" lastFinishedPulling="2026-04-02 14:04:12.868472689 +0000 UTC m=+1609.772880242" observedRunningTime="2026-04-02 14:04:13.427172571 +0000 UTC m=+1610.331580134" watchObservedRunningTime="2026-04-02 14:04:13.432330852 +0000 UTC m=+1610.336738405" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.449387 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.463016 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] 
Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.485057 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Apr 02 14:04:13 crc kubenswrapper[4732]: E0402 14:04:13.485494 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0bb93d2-9da7-4667-9079-b403332d31e0" containerName="setup-container" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.485509 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0bb93d2-9da7-4667-9079-b403332d31e0" containerName="setup-container" Apr 02 14:04:13 crc kubenswrapper[4732]: E0402 14:04:13.485540 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0bb93d2-9da7-4667-9079-b403332d31e0" containerName="rabbitmq" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.485546 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0bb93d2-9da7-4667-9079-b403332d31e0" containerName="rabbitmq" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.485844 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0bb93d2-9da7-4667-9079-b403332d31e0" containerName="rabbitmq" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.486929 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.489998 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.490078 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.490106 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.490133 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.490248 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.490311 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-d2wpm" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.490366 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.501417 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.526083 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/29e95846-a0bc-4d8b-ad4d-457766418564-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"29e95846-a0bc-4d8b-ad4d-457766418564\") " pod="openstack/rabbitmq-cell1-server-0" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.526216 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"29e95846-a0bc-4d8b-ad4d-457766418564\") " pod="openstack/rabbitmq-cell1-server-0" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.526245 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68srx\" (UniqueName: \"kubernetes.io/projected/29e95846-a0bc-4d8b-ad4d-457766418564-kube-api-access-68srx\") pod \"rabbitmq-cell1-server-0\" (UID: \"29e95846-a0bc-4d8b-ad4d-457766418564\") " pod="openstack/rabbitmq-cell1-server-0" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.526271 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/29e95846-a0bc-4d8b-ad4d-457766418564-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"29e95846-a0bc-4d8b-ad4d-457766418564\") " pod="openstack/rabbitmq-cell1-server-0" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.526307 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/29e95846-a0bc-4d8b-ad4d-457766418564-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"29e95846-a0bc-4d8b-ad4d-457766418564\") " pod="openstack/rabbitmq-cell1-server-0" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.526450 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/29e95846-a0bc-4d8b-ad4d-457766418564-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"29e95846-a0bc-4d8b-ad4d-457766418564\") " pod="openstack/rabbitmq-cell1-server-0" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.526494 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/29e95846-a0bc-4d8b-ad4d-457766418564-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"29e95846-a0bc-4d8b-ad4d-457766418564\") " pod="openstack/rabbitmq-cell1-server-0" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.526549 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/29e95846-a0bc-4d8b-ad4d-457766418564-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"29e95846-a0bc-4d8b-ad4d-457766418564\") " pod="openstack/rabbitmq-cell1-server-0" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.526591 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/29e95846-a0bc-4d8b-ad4d-457766418564-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"29e95846-a0bc-4d8b-ad4d-457766418564\") " pod="openstack/rabbitmq-cell1-server-0" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.526606 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/29e95846-a0bc-4d8b-ad4d-457766418564-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"29e95846-a0bc-4d8b-ad4d-457766418564\") " pod="openstack/rabbitmq-cell1-server-0" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.526651 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/29e95846-a0bc-4d8b-ad4d-457766418564-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"29e95846-a0bc-4d8b-ad4d-457766418564\") " pod="openstack/rabbitmq-cell1-server-0" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.628370 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" 
(UniqueName: \"kubernetes.io/empty-dir/29e95846-a0bc-4d8b-ad4d-457766418564-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"29e95846-a0bc-4d8b-ad4d-457766418564\") " pod="openstack/rabbitmq-cell1-server-0" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.628434 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/29e95846-a0bc-4d8b-ad4d-457766418564-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"29e95846-a0bc-4d8b-ad4d-457766418564\") " pod="openstack/rabbitmq-cell1-server-0" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.628455 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/29e95846-a0bc-4d8b-ad4d-457766418564-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"29e95846-a0bc-4d8b-ad4d-457766418564\") " pod="openstack/rabbitmq-cell1-server-0" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.628470 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/29e95846-a0bc-4d8b-ad4d-457766418564-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"29e95846-a0bc-4d8b-ad4d-457766418564\") " pod="openstack/rabbitmq-cell1-server-0" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.628499 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/29e95846-a0bc-4d8b-ad4d-457766418564-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"29e95846-a0bc-4d8b-ad4d-457766418564\") " pod="openstack/rabbitmq-cell1-server-0" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.628558 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"29e95846-a0bc-4d8b-ad4d-457766418564\") " pod="openstack/rabbitmq-cell1-server-0" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.628580 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68srx\" (UniqueName: \"kubernetes.io/projected/29e95846-a0bc-4d8b-ad4d-457766418564-kube-api-access-68srx\") pod \"rabbitmq-cell1-server-0\" (UID: \"29e95846-a0bc-4d8b-ad4d-457766418564\") " pod="openstack/rabbitmq-cell1-server-0" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.628607 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/29e95846-a0bc-4d8b-ad4d-457766418564-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"29e95846-a0bc-4d8b-ad4d-457766418564\") " pod="openstack/rabbitmq-cell1-server-0" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.628711 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/29e95846-a0bc-4d8b-ad4d-457766418564-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"29e95846-a0bc-4d8b-ad4d-457766418564\") " pod="openstack/rabbitmq-cell1-server-0" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.628747 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/29e95846-a0bc-4d8b-ad4d-457766418564-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"29e95846-a0bc-4d8b-ad4d-457766418564\") " pod="openstack/rabbitmq-cell1-server-0" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.628784 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/29e95846-a0bc-4d8b-ad4d-457766418564-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"29e95846-a0bc-4d8b-ad4d-457766418564\") " pod="openstack/rabbitmq-cell1-server-0" Apr 
02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.629304 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/29e95846-a0bc-4d8b-ad4d-457766418564-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"29e95846-a0bc-4d8b-ad4d-457766418564\") " pod="openstack/rabbitmq-cell1-server-0" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.629364 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/29e95846-a0bc-4d8b-ad4d-457766418564-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"29e95846-a0bc-4d8b-ad4d-457766418564\") " pod="openstack/rabbitmq-cell1-server-0" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.630024 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/29e95846-a0bc-4d8b-ad4d-457766418564-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"29e95846-a0bc-4d8b-ad4d-457766418564\") " pod="openstack/rabbitmq-cell1-server-0" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.630195 4732 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"29e95846-a0bc-4d8b-ad4d-457766418564\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/rabbitmq-cell1-server-0" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.630273 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/29e95846-a0bc-4d8b-ad4d-457766418564-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"29e95846-a0bc-4d8b-ad4d-457766418564\") " pod="openstack/rabbitmq-cell1-server-0" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.635717 4732 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/29e95846-a0bc-4d8b-ad4d-457766418564-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"29e95846-a0bc-4d8b-ad4d-457766418564\") " pod="openstack/rabbitmq-cell1-server-0" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.636592 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/29e95846-a0bc-4d8b-ad4d-457766418564-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"29e95846-a0bc-4d8b-ad4d-457766418564\") " pod="openstack/rabbitmq-cell1-server-0" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.643632 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/29e95846-a0bc-4d8b-ad4d-457766418564-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"29e95846-a0bc-4d8b-ad4d-457766418564\") " pod="openstack/rabbitmq-cell1-server-0" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.643848 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/29e95846-a0bc-4d8b-ad4d-457766418564-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"29e95846-a0bc-4d8b-ad4d-457766418564\") " pod="openstack/rabbitmq-cell1-server-0" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.643942 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/29e95846-a0bc-4d8b-ad4d-457766418564-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"29e95846-a0bc-4d8b-ad4d-457766418564\") " pod="openstack/rabbitmq-cell1-server-0" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.650444 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68srx\" (UniqueName: \"kubernetes.io/projected/29e95846-a0bc-4d8b-ad4d-457766418564-kube-api-access-68srx\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"29e95846-a0bc-4d8b-ad4d-457766418564\") " pod="openstack/rabbitmq-cell1-server-0" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.673084 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"29e95846-a0bc-4d8b-ad4d-457766418564\") " pod="openstack/rabbitmq-cell1-server-0" Apr 02 14:04:13 crc kubenswrapper[4732]: I0402 14:04:13.808025 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Apr 02 14:04:14 crc kubenswrapper[4732]: I0402 14:04:14.245859 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Apr 02 14:04:14 crc kubenswrapper[4732]: I0402 14:04:14.411585 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"29e95846-a0bc-4d8b-ad4d-457766418564","Type":"ContainerStarted","Data":"fda09d327a775ae135681b5aec8c670a3346a0a80dc7952373331bdc1ce95ef8"} Apr 02 14:04:14 crc kubenswrapper[4732]: I0402 14:04:14.691551 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0bb93d2-9da7-4667-9079-b403332d31e0" path="/var/lib/kubelet/pods/b0bb93d2-9da7-4667-9079-b403332d31e0/volumes" Apr 02 14:04:14 crc kubenswrapper[4732]: I0402 14:04:14.766503 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="56762f05-a513-4f47-8cf7-5d19bb58c5bd" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.95:5671: connect: connection refused" Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.393683 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.438321 4732 generic.go:334] "Generic (PLEG): container finished" podID="56762f05-a513-4f47-8cf7-5d19bb58c5bd" containerID="c59797a594162dd89aca027cf2a8334e52ae126e765ce2f5b5f5aca4eab7131a" exitCode=0 Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.438383 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"56762f05-a513-4f47-8cf7-5d19bb58c5bd","Type":"ContainerDied","Data":"c59797a594162dd89aca027cf2a8334e52ae126e765ce2f5b5f5aca4eab7131a"} Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.438410 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"56762f05-a513-4f47-8cf7-5d19bb58c5bd","Type":"ContainerDied","Data":"078cba21939a3f946508dd32dd3c808e4ad7cc6a198efcc2e69a66c6c08a1410"} Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.438428 4732 scope.go:117] "RemoveContainer" containerID="c59797a594162dd89aca027cf2a8334e52ae126e765ce2f5b5f5aca4eab7131a" Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.438553 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.444047 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"29e95846-a0bc-4d8b-ad4d-457766418564","Type":"ContainerStarted","Data":"507ce84edd1a878bb3190544118ad341f01b0fd432881674befdfdf970a83ba9"} Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.474144 4732 scope.go:117] "RemoveContainer" containerID="a107fd73961c43c85df6b57282cf11a8acd3acae427dc6edddcade0f4ab33f9d" Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.488472 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/56762f05-a513-4f47-8cf7-5d19bb58c5bd-rabbitmq-tls\") pod \"56762f05-a513-4f47-8cf7-5d19bb58c5bd\" (UID: \"56762f05-a513-4f47-8cf7-5d19bb58c5bd\") " Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.488635 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"56762f05-a513-4f47-8cf7-5d19bb58c5bd\" (UID: \"56762f05-a513-4f47-8cf7-5d19bb58c5bd\") " Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.488738 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56762f05-a513-4f47-8cf7-5d19bb58c5bd-config-data\") pod \"56762f05-a513-4f47-8cf7-5d19bb58c5bd\" (UID: \"56762f05-a513-4f47-8cf7-5d19bb58c5bd\") " Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.488779 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnhpf\" (UniqueName: \"kubernetes.io/projected/56762f05-a513-4f47-8cf7-5d19bb58c5bd-kube-api-access-dnhpf\") pod \"56762f05-a513-4f47-8cf7-5d19bb58c5bd\" (UID: \"56762f05-a513-4f47-8cf7-5d19bb58c5bd\") " Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.488874 
4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/56762f05-a513-4f47-8cf7-5d19bb58c5bd-plugins-conf\") pod \"56762f05-a513-4f47-8cf7-5d19bb58c5bd\" (UID: \"56762f05-a513-4f47-8cf7-5d19bb58c5bd\") " Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.488910 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/56762f05-a513-4f47-8cf7-5d19bb58c5bd-server-conf\") pod \"56762f05-a513-4f47-8cf7-5d19bb58c5bd\" (UID: \"56762f05-a513-4f47-8cf7-5d19bb58c5bd\") " Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.489056 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/56762f05-a513-4f47-8cf7-5d19bb58c5bd-pod-info\") pod \"56762f05-a513-4f47-8cf7-5d19bb58c5bd\" (UID: \"56762f05-a513-4f47-8cf7-5d19bb58c5bd\") " Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.489091 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/56762f05-a513-4f47-8cf7-5d19bb58c5bd-rabbitmq-plugins\") pod \"56762f05-a513-4f47-8cf7-5d19bb58c5bd\" (UID: \"56762f05-a513-4f47-8cf7-5d19bb58c5bd\") " Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.489127 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/56762f05-a513-4f47-8cf7-5d19bb58c5bd-erlang-cookie-secret\") pod \"56762f05-a513-4f47-8cf7-5d19bb58c5bd\" (UID: \"56762f05-a513-4f47-8cf7-5d19bb58c5bd\") " Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.489160 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/56762f05-a513-4f47-8cf7-5d19bb58c5bd-rabbitmq-erlang-cookie\") pod 
\"56762f05-a513-4f47-8cf7-5d19bb58c5bd\" (UID: \"56762f05-a513-4f47-8cf7-5d19bb58c5bd\") " Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.489217 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/56762f05-a513-4f47-8cf7-5d19bb58c5bd-rabbitmq-confd\") pod \"56762f05-a513-4f47-8cf7-5d19bb58c5bd\" (UID: \"56762f05-a513-4f47-8cf7-5d19bb58c5bd\") " Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.492086 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56762f05-a513-4f47-8cf7-5d19bb58c5bd-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "56762f05-a513-4f47-8cf7-5d19bb58c5bd" (UID: "56762f05-a513-4f47-8cf7-5d19bb58c5bd"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.494864 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56762f05-a513-4f47-8cf7-5d19bb58c5bd-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "56762f05-a513-4f47-8cf7-5d19bb58c5bd" (UID: "56762f05-a513-4f47-8cf7-5d19bb58c5bd"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.496122 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/56762f05-a513-4f47-8cf7-5d19bb58c5bd-pod-info" (OuterVolumeSpecName: "pod-info") pod "56762f05-a513-4f47-8cf7-5d19bb58c5bd" (UID: "56762f05-a513-4f47-8cf7-5d19bb58c5bd"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.496295 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56762f05-a513-4f47-8cf7-5d19bb58c5bd-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "56762f05-a513-4f47-8cf7-5d19bb58c5bd" (UID: "56762f05-a513-4f47-8cf7-5d19bb58c5bd"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.497057 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56762f05-a513-4f47-8cf7-5d19bb58c5bd-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "56762f05-a513-4f47-8cf7-5d19bb58c5bd" (UID: "56762f05-a513-4f47-8cf7-5d19bb58c5bd"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.502859 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56762f05-a513-4f47-8cf7-5d19bb58c5bd-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "56762f05-a513-4f47-8cf7-5d19bb58c5bd" (UID: "56762f05-a513-4f47-8cf7-5d19bb58c5bd"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.504987 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56762f05-a513-4f47-8cf7-5d19bb58c5bd-kube-api-access-dnhpf" (OuterVolumeSpecName: "kube-api-access-dnhpf") pod "56762f05-a513-4f47-8cf7-5d19bb58c5bd" (UID: "56762f05-a513-4f47-8cf7-5d19bb58c5bd"). InnerVolumeSpecName "kube-api-access-dnhpf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.505579 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "56762f05-a513-4f47-8cf7-5d19bb58c5bd" (UID: "56762f05-a513-4f47-8cf7-5d19bb58c5bd"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.534434 4732 scope.go:117] "RemoveContainer" containerID="c59797a594162dd89aca027cf2a8334e52ae126e765ce2f5b5f5aca4eab7131a" Apr 02 14:04:16 crc kubenswrapper[4732]: E0402 14:04:16.542723 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c59797a594162dd89aca027cf2a8334e52ae126e765ce2f5b5f5aca4eab7131a\": container with ID starting with c59797a594162dd89aca027cf2a8334e52ae126e765ce2f5b5f5aca4eab7131a not found: ID does not exist" containerID="c59797a594162dd89aca027cf2a8334e52ae126e765ce2f5b5f5aca4eab7131a" Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.542778 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c59797a594162dd89aca027cf2a8334e52ae126e765ce2f5b5f5aca4eab7131a"} err="failed to get container status \"c59797a594162dd89aca027cf2a8334e52ae126e765ce2f5b5f5aca4eab7131a\": rpc error: code = NotFound desc = could not find container \"c59797a594162dd89aca027cf2a8334e52ae126e765ce2f5b5f5aca4eab7131a\": container with ID starting with c59797a594162dd89aca027cf2a8334e52ae126e765ce2f5b5f5aca4eab7131a not found: ID does not exist" Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.542805 4732 scope.go:117] "RemoveContainer" containerID="a107fd73961c43c85df6b57282cf11a8acd3acae427dc6edddcade0f4ab33f9d" Apr 02 14:04:16 crc kubenswrapper[4732]: E0402 14:04:16.544996 4732 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"a107fd73961c43c85df6b57282cf11a8acd3acae427dc6edddcade0f4ab33f9d\": container with ID starting with a107fd73961c43c85df6b57282cf11a8acd3acae427dc6edddcade0f4ab33f9d not found: ID does not exist" containerID="a107fd73961c43c85df6b57282cf11a8acd3acae427dc6edddcade0f4ab33f9d" Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.545145 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a107fd73961c43c85df6b57282cf11a8acd3acae427dc6edddcade0f4ab33f9d"} err="failed to get container status \"a107fd73961c43c85df6b57282cf11a8acd3acae427dc6edddcade0f4ab33f9d\": rpc error: code = NotFound desc = could not find container \"a107fd73961c43c85df6b57282cf11a8acd3acae427dc6edddcade0f4ab33f9d\": container with ID starting with a107fd73961c43c85df6b57282cf11a8acd3acae427dc6edddcade0f4ab33f9d not found: ID does not exist" Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.571412 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56762f05-a513-4f47-8cf7-5d19bb58c5bd-server-conf" (OuterVolumeSpecName: "server-conf") pod "56762f05-a513-4f47-8cf7-5d19bb58c5bd" (UID: "56762f05-a513-4f47-8cf7-5d19bb58c5bd"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.591798 4732 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/56762f05-a513-4f47-8cf7-5d19bb58c5bd-plugins-conf\") on node \"crc\" DevicePath \"\"" Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.592112 4732 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/56762f05-a513-4f47-8cf7-5d19bb58c5bd-server-conf\") on node \"crc\" DevicePath \"\"" Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.592452 4732 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/56762f05-a513-4f47-8cf7-5d19bb58c5bd-pod-info\") on node \"crc\" DevicePath \"\"" Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.592519 4732 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/56762f05-a513-4f47-8cf7-5d19bb58c5bd-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.592587 4732 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/56762f05-a513-4f47-8cf7-5d19bb58c5bd-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.592663 4732 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/56762f05-a513-4f47-8cf7-5d19bb58c5bd-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.592718 4732 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/56762f05-a513-4f47-8cf7-5d19bb58c5bd-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 
14:04:16.592787 4732 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.592850 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnhpf\" (UniqueName: \"kubernetes.io/projected/56762f05-a513-4f47-8cf7-5d19bb58c5bd-kube-api-access-dnhpf\") on node \"crc\" DevicePath \"\"" Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.601289 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56762f05-a513-4f47-8cf7-5d19bb58c5bd-config-data" (OuterVolumeSpecName: "config-data") pod "56762f05-a513-4f47-8cf7-5d19bb58c5bd" (UID: "56762f05-a513-4f47-8cf7-5d19bb58c5bd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.640463 4732 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.694282 4732 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.694329 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56762f05-a513-4f47-8cf7-5d19bb58c5bd-config-data\") on node \"crc\" DevicePath \"\"" Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.705248 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56762f05-a513-4f47-8cf7-5d19bb58c5bd-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "56762f05-a513-4f47-8cf7-5d19bb58c5bd" (UID: 
"56762f05-a513-4f47-8cf7-5d19bb58c5bd"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.729921 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dqcfr" Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.729965 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dqcfr" Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.777343 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.796245 4732 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/56762f05-a513-4f47-8cf7-5d19bb58c5bd-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.799748 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.812880 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Apr 02 14:04:16 crc kubenswrapper[4732]: E0402 14:04:16.813455 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56762f05-a513-4f47-8cf7-5d19bb58c5bd" containerName="setup-container" Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.813481 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="56762f05-a513-4f47-8cf7-5d19bb58c5bd" containerName="setup-container" Apr 02 14:04:16 crc kubenswrapper[4732]: E0402 14:04:16.813504 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56762f05-a513-4f47-8cf7-5d19bb58c5bd" containerName="rabbitmq" Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.813514 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="56762f05-a513-4f47-8cf7-5d19bb58c5bd" 
containerName="rabbitmq" Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.813763 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="56762f05-a513-4f47-8cf7-5d19bb58c5bd" containerName="rabbitmq" Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.815040 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.817996 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.818027 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.818305 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.818524 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.818703 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-smt97" Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.820408 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.821199 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.824490 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.898680 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c9e1cd50-72d3-4ccc-9f49-c4c1619252fc-rabbitmq-tls\") pod 
\"rabbitmq-server-0\" (UID: \"c9e1cd50-72d3-4ccc-9f49-c4c1619252fc\") " pod="openstack/rabbitmq-server-0" Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.898733 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c9e1cd50-72d3-4ccc-9f49-c4c1619252fc-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c9e1cd50-72d3-4ccc-9f49-c4c1619252fc\") " pod="openstack/rabbitmq-server-0" Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.898774 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c9e1cd50-72d3-4ccc-9f49-c4c1619252fc-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c9e1cd50-72d3-4ccc-9f49-c4c1619252fc\") " pod="openstack/rabbitmq-server-0" Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.898807 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c9e1cd50-72d3-4ccc-9f49-c4c1619252fc-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c9e1cd50-72d3-4ccc-9f49-c4c1619252fc\") " pod="openstack/rabbitmq-server-0" Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.898829 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mksvj\" (UniqueName: \"kubernetes.io/projected/c9e1cd50-72d3-4ccc-9f49-c4c1619252fc-kube-api-access-mksvj\") pod \"rabbitmq-server-0\" (UID: \"c9e1cd50-72d3-4ccc-9f49-c4c1619252fc\") " pod="openstack/rabbitmq-server-0" Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.898892 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c9e1cd50-72d3-4ccc-9f49-c4c1619252fc-config-data\") pod \"rabbitmq-server-0\" (UID: 
\"c9e1cd50-72d3-4ccc-9f49-c4c1619252fc\") " pod="openstack/rabbitmq-server-0" Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.898924 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c9e1cd50-72d3-4ccc-9f49-c4c1619252fc-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c9e1cd50-72d3-4ccc-9f49-c4c1619252fc\") " pod="openstack/rabbitmq-server-0" Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.898963 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c9e1cd50-72d3-4ccc-9f49-c4c1619252fc-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c9e1cd50-72d3-4ccc-9f49-c4c1619252fc\") " pod="openstack/rabbitmq-server-0" Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.899002 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c9e1cd50-72d3-4ccc-9f49-c4c1619252fc-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c9e1cd50-72d3-4ccc-9f49-c4c1619252fc\") " pod="openstack/rabbitmq-server-0" Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.899025 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"c9e1cd50-72d3-4ccc-9f49-c4c1619252fc\") " pod="openstack/rabbitmq-server-0" Apr 02 14:04:16 crc kubenswrapper[4732]: I0402 14:04:16.899048 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c9e1cd50-72d3-4ccc-9f49-c4c1619252fc-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c9e1cd50-72d3-4ccc-9f49-c4c1619252fc\") " pod="openstack/rabbitmq-server-0" Apr 02 14:04:17 
crc kubenswrapper[4732]: I0402 14:04:17.000473 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c9e1cd50-72d3-4ccc-9f49-c4c1619252fc-config-data\") pod \"rabbitmq-server-0\" (UID: \"c9e1cd50-72d3-4ccc-9f49-c4c1619252fc\") " pod="openstack/rabbitmq-server-0" Apr 02 14:04:17 crc kubenswrapper[4732]: I0402 14:04:17.000547 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c9e1cd50-72d3-4ccc-9f49-c4c1619252fc-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c9e1cd50-72d3-4ccc-9f49-c4c1619252fc\") " pod="openstack/rabbitmq-server-0" Apr 02 14:04:17 crc kubenswrapper[4732]: I0402 14:04:17.000599 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c9e1cd50-72d3-4ccc-9f49-c4c1619252fc-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c9e1cd50-72d3-4ccc-9f49-c4c1619252fc\") " pod="openstack/rabbitmq-server-0" Apr 02 14:04:17 crc kubenswrapper[4732]: I0402 14:04:17.000665 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c9e1cd50-72d3-4ccc-9f49-c4c1619252fc-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c9e1cd50-72d3-4ccc-9f49-c4c1619252fc\") " pod="openstack/rabbitmq-server-0" Apr 02 14:04:17 crc kubenswrapper[4732]: I0402 14:04:17.000694 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"c9e1cd50-72d3-4ccc-9f49-c4c1619252fc\") " pod="openstack/rabbitmq-server-0" Apr 02 14:04:17 crc kubenswrapper[4732]: I0402 14:04:17.000712 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/c9e1cd50-72d3-4ccc-9f49-c4c1619252fc-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c9e1cd50-72d3-4ccc-9f49-c4c1619252fc\") " pod="openstack/rabbitmq-server-0" Apr 02 14:04:17 crc kubenswrapper[4732]: I0402 14:04:17.000787 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c9e1cd50-72d3-4ccc-9f49-c4c1619252fc-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c9e1cd50-72d3-4ccc-9f49-c4c1619252fc\") " pod="openstack/rabbitmq-server-0" Apr 02 14:04:17 crc kubenswrapper[4732]: I0402 14:04:17.000807 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c9e1cd50-72d3-4ccc-9f49-c4c1619252fc-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c9e1cd50-72d3-4ccc-9f49-c4c1619252fc\") " pod="openstack/rabbitmq-server-0" Apr 02 14:04:17 crc kubenswrapper[4732]: I0402 14:04:17.000832 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c9e1cd50-72d3-4ccc-9f49-c4c1619252fc-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c9e1cd50-72d3-4ccc-9f49-c4c1619252fc\") " pod="openstack/rabbitmq-server-0" Apr 02 14:04:17 crc kubenswrapper[4732]: I0402 14:04:17.000859 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c9e1cd50-72d3-4ccc-9f49-c4c1619252fc-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c9e1cd50-72d3-4ccc-9f49-c4c1619252fc\") " pod="openstack/rabbitmq-server-0" Apr 02 14:04:17 crc kubenswrapper[4732]: I0402 14:04:17.000881 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mksvj\" (UniqueName: \"kubernetes.io/projected/c9e1cd50-72d3-4ccc-9f49-c4c1619252fc-kube-api-access-mksvj\") pod \"rabbitmq-server-0\" (UID: 
\"c9e1cd50-72d3-4ccc-9f49-c4c1619252fc\") " pod="openstack/rabbitmq-server-0" Apr 02 14:04:17 crc kubenswrapper[4732]: I0402 14:04:17.001769 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c9e1cd50-72d3-4ccc-9f49-c4c1619252fc-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c9e1cd50-72d3-4ccc-9f49-c4c1619252fc\") " pod="openstack/rabbitmq-server-0" Apr 02 14:04:17 crc kubenswrapper[4732]: I0402 14:04:17.001819 4732 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"c9e1cd50-72d3-4ccc-9f49-c4c1619252fc\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Apr 02 14:04:17 crc kubenswrapper[4732]: I0402 14:04:17.001793 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c9e1cd50-72d3-4ccc-9f49-c4c1619252fc-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c9e1cd50-72d3-4ccc-9f49-c4c1619252fc\") " pod="openstack/rabbitmq-server-0" Apr 02 14:04:17 crc kubenswrapper[4732]: I0402 14:04:17.001776 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c9e1cd50-72d3-4ccc-9f49-c4c1619252fc-config-data\") pod \"rabbitmq-server-0\" (UID: \"c9e1cd50-72d3-4ccc-9f49-c4c1619252fc\") " pod="openstack/rabbitmq-server-0" Apr 02 14:04:17 crc kubenswrapper[4732]: I0402 14:04:17.001963 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c9e1cd50-72d3-4ccc-9f49-c4c1619252fc-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c9e1cd50-72d3-4ccc-9f49-c4c1619252fc\") " pod="openstack/rabbitmq-server-0" Apr 02 14:04:17 crc kubenswrapper[4732]: I0402 14:04:17.002577 4732 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c9e1cd50-72d3-4ccc-9f49-c4c1619252fc-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c9e1cd50-72d3-4ccc-9f49-c4c1619252fc\") " pod="openstack/rabbitmq-server-0" Apr 02 14:04:17 crc kubenswrapper[4732]: I0402 14:04:17.005782 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c9e1cd50-72d3-4ccc-9f49-c4c1619252fc-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c9e1cd50-72d3-4ccc-9f49-c4c1619252fc\") " pod="openstack/rabbitmq-server-0" Apr 02 14:04:17 crc kubenswrapper[4732]: I0402 14:04:17.006350 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c9e1cd50-72d3-4ccc-9f49-c4c1619252fc-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c9e1cd50-72d3-4ccc-9f49-c4c1619252fc\") " pod="openstack/rabbitmq-server-0" Apr 02 14:04:17 crc kubenswrapper[4732]: I0402 14:04:17.006855 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c9e1cd50-72d3-4ccc-9f49-c4c1619252fc-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c9e1cd50-72d3-4ccc-9f49-c4c1619252fc\") " pod="openstack/rabbitmq-server-0" Apr 02 14:04:17 crc kubenswrapper[4732]: I0402 14:04:17.012330 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c9e1cd50-72d3-4ccc-9f49-c4c1619252fc-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c9e1cd50-72d3-4ccc-9f49-c4c1619252fc\") " pod="openstack/rabbitmq-server-0" Apr 02 14:04:17 crc kubenswrapper[4732]: I0402 14:04:17.029240 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mksvj\" (UniqueName: \"kubernetes.io/projected/c9e1cd50-72d3-4ccc-9f49-c4c1619252fc-kube-api-access-mksvj\") pod \"rabbitmq-server-0\" (UID: 
\"c9e1cd50-72d3-4ccc-9f49-c4c1619252fc\") " pod="openstack/rabbitmq-server-0" Apr 02 14:04:17 crc kubenswrapper[4732]: I0402 14:04:17.038262 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"c9e1cd50-72d3-4ccc-9f49-c4c1619252fc\") " pod="openstack/rabbitmq-server-0" Apr 02 14:04:17 crc kubenswrapper[4732]: I0402 14:04:17.141236 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Apr 02 14:04:17 crc kubenswrapper[4732]: I0402 14:04:17.641479 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Apr 02 14:04:17 crc kubenswrapper[4732]: W0402 14:04:17.643305 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9e1cd50_72d3_4ccc_9f49_c4c1619252fc.slice/crio-089d928c92e992e1eaeb76df2114790ef3f811d07afbc39d85ead2a11245ccdd WatchSource:0}: Error finding container 089d928c92e992e1eaeb76df2114790ef3f811d07afbc39d85ead2a11245ccdd: Status 404 returned error can't find the container with id 089d928c92e992e1eaeb76df2114790ef3f811d07afbc39d85ead2a11245ccdd Apr 02 14:04:17 crc kubenswrapper[4732]: I0402 14:04:17.785190 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dqcfr" podUID="12d89da9-6d93-473e-846c-5fc96a22a5de" containerName="registry-server" probeResult="failure" output=< Apr 02 14:04:17 crc kubenswrapper[4732]: timeout: failed to connect service ":50051" within 1s Apr 02 14:04:17 crc kubenswrapper[4732]: > Apr 02 14:04:18 crc kubenswrapper[4732]: I0402 14:04:18.471221 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c9e1cd50-72d3-4ccc-9f49-c4c1619252fc","Type":"ContainerStarted","Data":"089d928c92e992e1eaeb76df2114790ef3f811d07afbc39d85ead2a11245ccdd"} 
Apr 02 14:04:18 crc kubenswrapper[4732]: I0402 14:04:18.691953 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56762f05-a513-4f47-8cf7-5d19bb58c5bd" path="/var/lib/kubelet/pods/56762f05-a513-4f47-8cf7-5d19bb58c5bd/volumes" Apr 02 14:04:19 crc kubenswrapper[4732]: I0402 14:04:19.480559 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c9e1cd50-72d3-4ccc-9f49-c4c1619252fc","Type":"ContainerStarted","Data":"ba581d578d0457d94560ad24f90a1b0f5578db3a15b7b1b99c0a70c511b484aa"} Apr 02 14:04:20 crc kubenswrapper[4732]: I0402 14:04:20.985281 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-qwlb2"] Apr 02 14:04:20 crc kubenswrapper[4732]: I0402 14:04:20.987842 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-qwlb2" Apr 02 14:04:20 crc kubenswrapper[4732]: I0402 14:04:20.991942 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Apr 02 14:04:20 crc kubenswrapper[4732]: I0402 14:04:20.998702 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-qwlb2"] Apr 02 14:04:21 crc kubenswrapper[4732]: I0402 14:04:21.125550 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/82d82db9-6003-4fa9-84ae-ef1b612462bc-ovsdbserver-sb\") pod \"dnsmasq-dns-5576978c7c-qwlb2\" (UID: \"82d82db9-6003-4fa9-84ae-ef1b612462bc\") " pod="openstack/dnsmasq-dns-5576978c7c-qwlb2" Apr 02 14:04:21 crc kubenswrapper[4732]: I0402 14:04:21.125598 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82tjp\" (UniqueName: \"kubernetes.io/projected/82d82db9-6003-4fa9-84ae-ef1b612462bc-kube-api-access-82tjp\") pod \"dnsmasq-dns-5576978c7c-qwlb2\" (UID: 
\"82d82db9-6003-4fa9-84ae-ef1b612462bc\") " pod="openstack/dnsmasq-dns-5576978c7c-qwlb2" Apr 02 14:04:21 crc kubenswrapper[4732]: I0402 14:04:21.125647 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/82d82db9-6003-4fa9-84ae-ef1b612462bc-dns-swift-storage-0\") pod \"dnsmasq-dns-5576978c7c-qwlb2\" (UID: \"82d82db9-6003-4fa9-84ae-ef1b612462bc\") " pod="openstack/dnsmasq-dns-5576978c7c-qwlb2" Apr 02 14:04:21 crc kubenswrapper[4732]: I0402 14:04:21.125754 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82d82db9-6003-4fa9-84ae-ef1b612462bc-config\") pod \"dnsmasq-dns-5576978c7c-qwlb2\" (UID: \"82d82db9-6003-4fa9-84ae-ef1b612462bc\") " pod="openstack/dnsmasq-dns-5576978c7c-qwlb2" Apr 02 14:04:21 crc kubenswrapper[4732]: I0402 14:04:21.125783 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82d82db9-6003-4fa9-84ae-ef1b612462bc-dns-svc\") pod \"dnsmasq-dns-5576978c7c-qwlb2\" (UID: \"82d82db9-6003-4fa9-84ae-ef1b612462bc\") " pod="openstack/dnsmasq-dns-5576978c7c-qwlb2" Apr 02 14:04:21 crc kubenswrapper[4732]: I0402 14:04:21.125823 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/82d82db9-6003-4fa9-84ae-ef1b612462bc-openstack-edpm-ipam\") pod \"dnsmasq-dns-5576978c7c-qwlb2\" (UID: \"82d82db9-6003-4fa9-84ae-ef1b612462bc\") " pod="openstack/dnsmasq-dns-5576978c7c-qwlb2" Apr 02 14:04:21 crc kubenswrapper[4732]: I0402 14:04:21.125874 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/82d82db9-6003-4fa9-84ae-ef1b612462bc-ovsdbserver-nb\") pod 
\"dnsmasq-dns-5576978c7c-qwlb2\" (UID: \"82d82db9-6003-4fa9-84ae-ef1b612462bc\") " pod="openstack/dnsmasq-dns-5576978c7c-qwlb2" Apr 02 14:04:21 crc kubenswrapper[4732]: I0402 14:04:21.227815 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/82d82db9-6003-4fa9-84ae-ef1b612462bc-openstack-edpm-ipam\") pod \"dnsmasq-dns-5576978c7c-qwlb2\" (UID: \"82d82db9-6003-4fa9-84ae-ef1b612462bc\") " pod="openstack/dnsmasq-dns-5576978c7c-qwlb2" Apr 02 14:04:21 crc kubenswrapper[4732]: I0402 14:04:21.227930 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/82d82db9-6003-4fa9-84ae-ef1b612462bc-ovsdbserver-nb\") pod \"dnsmasq-dns-5576978c7c-qwlb2\" (UID: \"82d82db9-6003-4fa9-84ae-ef1b612462bc\") " pod="openstack/dnsmasq-dns-5576978c7c-qwlb2" Apr 02 14:04:21 crc kubenswrapper[4732]: I0402 14:04:21.227992 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/82d82db9-6003-4fa9-84ae-ef1b612462bc-ovsdbserver-sb\") pod \"dnsmasq-dns-5576978c7c-qwlb2\" (UID: \"82d82db9-6003-4fa9-84ae-ef1b612462bc\") " pod="openstack/dnsmasq-dns-5576978c7c-qwlb2" Apr 02 14:04:21 crc kubenswrapper[4732]: I0402 14:04:21.228011 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82tjp\" (UniqueName: \"kubernetes.io/projected/82d82db9-6003-4fa9-84ae-ef1b612462bc-kube-api-access-82tjp\") pod \"dnsmasq-dns-5576978c7c-qwlb2\" (UID: \"82d82db9-6003-4fa9-84ae-ef1b612462bc\") " pod="openstack/dnsmasq-dns-5576978c7c-qwlb2" Apr 02 14:04:21 crc kubenswrapper[4732]: I0402 14:04:21.228039 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/82d82db9-6003-4fa9-84ae-ef1b612462bc-dns-swift-storage-0\") pod 
\"dnsmasq-dns-5576978c7c-qwlb2\" (UID: \"82d82db9-6003-4fa9-84ae-ef1b612462bc\") " pod="openstack/dnsmasq-dns-5576978c7c-qwlb2" Apr 02 14:04:21 crc kubenswrapper[4732]: I0402 14:04:21.228105 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82d82db9-6003-4fa9-84ae-ef1b612462bc-config\") pod \"dnsmasq-dns-5576978c7c-qwlb2\" (UID: \"82d82db9-6003-4fa9-84ae-ef1b612462bc\") " pod="openstack/dnsmasq-dns-5576978c7c-qwlb2" Apr 02 14:04:21 crc kubenswrapper[4732]: I0402 14:04:21.228124 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82d82db9-6003-4fa9-84ae-ef1b612462bc-dns-svc\") pod \"dnsmasq-dns-5576978c7c-qwlb2\" (UID: \"82d82db9-6003-4fa9-84ae-ef1b612462bc\") " pod="openstack/dnsmasq-dns-5576978c7c-qwlb2" Apr 02 14:04:21 crc kubenswrapper[4732]: I0402 14:04:21.229385 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82d82db9-6003-4fa9-84ae-ef1b612462bc-dns-svc\") pod \"dnsmasq-dns-5576978c7c-qwlb2\" (UID: \"82d82db9-6003-4fa9-84ae-ef1b612462bc\") " pod="openstack/dnsmasq-dns-5576978c7c-qwlb2" Apr 02 14:04:21 crc kubenswrapper[4732]: I0402 14:04:21.230052 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/82d82db9-6003-4fa9-84ae-ef1b612462bc-openstack-edpm-ipam\") pod \"dnsmasq-dns-5576978c7c-qwlb2\" (UID: \"82d82db9-6003-4fa9-84ae-ef1b612462bc\") " pod="openstack/dnsmasq-dns-5576978c7c-qwlb2" Apr 02 14:04:21 crc kubenswrapper[4732]: I0402 14:04:21.230671 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/82d82db9-6003-4fa9-84ae-ef1b612462bc-ovsdbserver-nb\") pod \"dnsmasq-dns-5576978c7c-qwlb2\" (UID: \"82d82db9-6003-4fa9-84ae-ef1b612462bc\") " 
pod="openstack/dnsmasq-dns-5576978c7c-qwlb2" Apr 02 14:04:21 crc kubenswrapper[4732]: I0402 14:04:21.231215 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/82d82db9-6003-4fa9-84ae-ef1b612462bc-ovsdbserver-sb\") pod \"dnsmasq-dns-5576978c7c-qwlb2\" (UID: \"82d82db9-6003-4fa9-84ae-ef1b612462bc\") " pod="openstack/dnsmasq-dns-5576978c7c-qwlb2" Apr 02 14:04:21 crc kubenswrapper[4732]: I0402 14:04:21.232572 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82d82db9-6003-4fa9-84ae-ef1b612462bc-config\") pod \"dnsmasq-dns-5576978c7c-qwlb2\" (UID: \"82d82db9-6003-4fa9-84ae-ef1b612462bc\") " pod="openstack/dnsmasq-dns-5576978c7c-qwlb2" Apr 02 14:04:21 crc kubenswrapper[4732]: I0402 14:04:21.232979 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/82d82db9-6003-4fa9-84ae-ef1b612462bc-dns-swift-storage-0\") pod \"dnsmasq-dns-5576978c7c-qwlb2\" (UID: \"82d82db9-6003-4fa9-84ae-ef1b612462bc\") " pod="openstack/dnsmasq-dns-5576978c7c-qwlb2" Apr 02 14:04:21 crc kubenswrapper[4732]: I0402 14:04:21.249476 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82tjp\" (UniqueName: \"kubernetes.io/projected/82d82db9-6003-4fa9-84ae-ef1b612462bc-kube-api-access-82tjp\") pod \"dnsmasq-dns-5576978c7c-qwlb2\" (UID: \"82d82db9-6003-4fa9-84ae-ef1b612462bc\") " pod="openstack/dnsmasq-dns-5576978c7c-qwlb2" Apr 02 14:04:21 crc kubenswrapper[4732]: I0402 14:04:21.309559 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-qwlb2" Apr 02 14:04:21 crc kubenswrapper[4732]: I0402 14:04:21.758702 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-qwlb2"] Apr 02 14:04:22 crc kubenswrapper[4732]: I0402 14:04:22.511914 4732 generic.go:334] "Generic (PLEG): container finished" podID="82d82db9-6003-4fa9-84ae-ef1b612462bc" containerID="af331e6809a269f93fcc8a8c747230e30a0562c868224261869aee82df482c1e" exitCode=0 Apr 02 14:04:22 crc kubenswrapper[4732]: I0402 14:04:22.512015 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-qwlb2" event={"ID":"82d82db9-6003-4fa9-84ae-ef1b612462bc","Type":"ContainerDied","Data":"af331e6809a269f93fcc8a8c747230e30a0562c868224261869aee82df482c1e"} Apr 02 14:04:22 crc kubenswrapper[4732]: I0402 14:04:22.512481 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-qwlb2" event={"ID":"82d82db9-6003-4fa9-84ae-ef1b612462bc","Type":"ContainerStarted","Data":"be35e3a0d1daaf744b6e172b7f20df76ec288e4b983246ad15dc7af2ea733832"} Apr 02 14:04:23 crc kubenswrapper[4732]: I0402 14:04:23.523629 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-qwlb2" event={"ID":"82d82db9-6003-4fa9-84ae-ef1b612462bc","Type":"ContainerStarted","Data":"d03a6617fa08ecb7f6645b8d71fca7283a739e2aad5f9d7556edb053eb9d53ee"} Apr 02 14:04:23 crc kubenswrapper[4732]: I0402 14:04:23.525292 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5576978c7c-qwlb2" Apr 02 14:04:23 crc kubenswrapper[4732]: I0402 14:04:23.556189 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5576978c7c-qwlb2" podStartSLOduration=3.5561634250000003 podStartE2EDuration="3.556163425s" podCreationTimestamp="2026-04-02 14:04:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 14:04:23.54687982 +0000 UTC m=+1620.451287393" watchObservedRunningTime="2026-04-02 14:04:23.556163425 +0000 UTC m=+1620.460570998" Apr 02 14:04:26 crc kubenswrapper[4732]: I0402 14:04:26.785080 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dqcfr" Apr 02 14:04:26 crc kubenswrapper[4732]: I0402 14:04:26.833417 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dqcfr" Apr 02 14:04:27 crc kubenswrapper[4732]: I0402 14:04:27.020651 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dqcfr"] Apr 02 14:04:28 crc kubenswrapper[4732]: I0402 14:04:28.573932 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dqcfr" podUID="12d89da9-6d93-473e-846c-5fc96a22a5de" containerName="registry-server" containerID="cri-o://e8be6356ef95a9257b89881f365b30b9aa3dc71f774e30d118ec0acf5c96b65e" gracePeriod=2 Apr 02 14:04:29 crc kubenswrapper[4732]: I0402 14:04:29.020584 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dqcfr" Apr 02 14:04:29 crc kubenswrapper[4732]: I0402 14:04:29.080008 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12d89da9-6d93-473e-846c-5fc96a22a5de-utilities\") pod \"12d89da9-6d93-473e-846c-5fc96a22a5de\" (UID: \"12d89da9-6d93-473e-846c-5fc96a22a5de\") " Apr 02 14:04:29 crc kubenswrapper[4732]: I0402 14:04:29.080181 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwxdq\" (UniqueName: \"kubernetes.io/projected/12d89da9-6d93-473e-846c-5fc96a22a5de-kube-api-access-rwxdq\") pod \"12d89da9-6d93-473e-846c-5fc96a22a5de\" (UID: \"12d89da9-6d93-473e-846c-5fc96a22a5de\") " Apr 02 14:04:29 crc kubenswrapper[4732]: I0402 14:04:29.080334 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12d89da9-6d93-473e-846c-5fc96a22a5de-catalog-content\") pod \"12d89da9-6d93-473e-846c-5fc96a22a5de\" (UID: \"12d89da9-6d93-473e-846c-5fc96a22a5de\") " Apr 02 14:04:29 crc kubenswrapper[4732]: I0402 14:04:29.080734 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12d89da9-6d93-473e-846c-5fc96a22a5de-utilities" (OuterVolumeSpecName: "utilities") pod "12d89da9-6d93-473e-846c-5fc96a22a5de" (UID: "12d89da9-6d93-473e-846c-5fc96a22a5de"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 14:04:29 crc kubenswrapper[4732]: I0402 14:04:29.085732 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12d89da9-6d93-473e-846c-5fc96a22a5de-kube-api-access-rwxdq" (OuterVolumeSpecName: "kube-api-access-rwxdq") pod "12d89da9-6d93-473e-846c-5fc96a22a5de" (UID: "12d89da9-6d93-473e-846c-5fc96a22a5de"). InnerVolumeSpecName "kube-api-access-rwxdq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:04:29 crc kubenswrapper[4732]: I0402 14:04:29.182402 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwxdq\" (UniqueName: \"kubernetes.io/projected/12d89da9-6d93-473e-846c-5fc96a22a5de-kube-api-access-rwxdq\") on node \"crc\" DevicePath \"\"" Apr 02 14:04:29 crc kubenswrapper[4732]: I0402 14:04:29.182437 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12d89da9-6d93-473e-846c-5fc96a22a5de-utilities\") on node \"crc\" DevicePath \"\"" Apr 02 14:04:29 crc kubenswrapper[4732]: I0402 14:04:29.210086 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12d89da9-6d93-473e-846c-5fc96a22a5de-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "12d89da9-6d93-473e-846c-5fc96a22a5de" (UID: "12d89da9-6d93-473e-846c-5fc96a22a5de"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 14:04:29 crc kubenswrapper[4732]: I0402 14:04:29.284187 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12d89da9-6d93-473e-846c-5fc96a22a5de-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 02 14:04:29 crc kubenswrapper[4732]: I0402 14:04:29.587028 4732 generic.go:334] "Generic (PLEG): container finished" podID="12d89da9-6d93-473e-846c-5fc96a22a5de" containerID="e8be6356ef95a9257b89881f365b30b9aa3dc71f774e30d118ec0acf5c96b65e" exitCode=0 Apr 02 14:04:29 crc kubenswrapper[4732]: I0402 14:04:29.587127 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dqcfr" event={"ID":"12d89da9-6d93-473e-846c-5fc96a22a5de","Type":"ContainerDied","Data":"e8be6356ef95a9257b89881f365b30b9aa3dc71f774e30d118ec0acf5c96b65e"} Apr 02 14:04:29 crc kubenswrapper[4732]: I0402 14:04:29.587409 4732 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-dqcfr" event={"ID":"12d89da9-6d93-473e-846c-5fc96a22a5de","Type":"ContainerDied","Data":"1c962dc618a824d734e47e2461ba770067d0bbca61e17c1c8e5fcf7c4cd35af0"} Apr 02 14:04:29 crc kubenswrapper[4732]: I0402 14:04:29.587429 4732 scope.go:117] "RemoveContainer" containerID="e8be6356ef95a9257b89881f365b30b9aa3dc71f774e30d118ec0acf5c96b65e" Apr 02 14:04:29 crc kubenswrapper[4732]: I0402 14:04:29.587209 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dqcfr" Apr 02 14:04:29 crc kubenswrapper[4732]: I0402 14:04:29.614176 4732 scope.go:117] "RemoveContainer" containerID="20f3cf0ff8ce5c8959c574dfc25e305dfe5237896b24e9aeffec71d67d2987ad" Apr 02 14:04:29 crc kubenswrapper[4732]: I0402 14:04:29.637484 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dqcfr"] Apr 02 14:04:29 crc kubenswrapper[4732]: I0402 14:04:29.648471 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dqcfr"] Apr 02 14:04:29 crc kubenswrapper[4732]: I0402 14:04:29.660119 4732 scope.go:117] "RemoveContainer" containerID="939e0082a5f47c151714b4bc0ecb06e185fd81056a1143cb4d40fcd57cb826b2" Apr 02 14:04:29 crc kubenswrapper[4732]: I0402 14:04:29.695687 4732 scope.go:117] "RemoveContainer" containerID="e8be6356ef95a9257b89881f365b30b9aa3dc71f774e30d118ec0acf5c96b65e" Apr 02 14:04:29 crc kubenswrapper[4732]: E0402 14:04:29.696126 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8be6356ef95a9257b89881f365b30b9aa3dc71f774e30d118ec0acf5c96b65e\": container with ID starting with e8be6356ef95a9257b89881f365b30b9aa3dc71f774e30d118ec0acf5c96b65e not found: ID does not exist" containerID="e8be6356ef95a9257b89881f365b30b9aa3dc71f774e30d118ec0acf5c96b65e" Apr 02 14:04:29 crc kubenswrapper[4732]: I0402 14:04:29.696167 4732 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8be6356ef95a9257b89881f365b30b9aa3dc71f774e30d118ec0acf5c96b65e"} err="failed to get container status \"e8be6356ef95a9257b89881f365b30b9aa3dc71f774e30d118ec0acf5c96b65e\": rpc error: code = NotFound desc = could not find container \"e8be6356ef95a9257b89881f365b30b9aa3dc71f774e30d118ec0acf5c96b65e\": container with ID starting with e8be6356ef95a9257b89881f365b30b9aa3dc71f774e30d118ec0acf5c96b65e not found: ID does not exist"
Apr 02 14:04:29 crc kubenswrapper[4732]: I0402 14:04:29.696192 4732 scope.go:117] "RemoveContainer" containerID="20f3cf0ff8ce5c8959c574dfc25e305dfe5237896b24e9aeffec71d67d2987ad"
Apr 02 14:04:29 crc kubenswrapper[4732]: E0402 14:04:29.696448 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20f3cf0ff8ce5c8959c574dfc25e305dfe5237896b24e9aeffec71d67d2987ad\": container with ID starting with 20f3cf0ff8ce5c8959c574dfc25e305dfe5237896b24e9aeffec71d67d2987ad not found: ID does not exist" containerID="20f3cf0ff8ce5c8959c574dfc25e305dfe5237896b24e9aeffec71d67d2987ad"
Apr 02 14:04:29 crc kubenswrapper[4732]: I0402 14:04:29.696496 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20f3cf0ff8ce5c8959c574dfc25e305dfe5237896b24e9aeffec71d67d2987ad"} err="failed to get container status \"20f3cf0ff8ce5c8959c574dfc25e305dfe5237896b24e9aeffec71d67d2987ad\": rpc error: code = NotFound desc = could not find container \"20f3cf0ff8ce5c8959c574dfc25e305dfe5237896b24e9aeffec71d67d2987ad\": container with ID starting with 20f3cf0ff8ce5c8959c574dfc25e305dfe5237896b24e9aeffec71d67d2987ad not found: ID does not exist"
Apr 02 14:04:29 crc kubenswrapper[4732]: I0402 14:04:29.696513 4732 scope.go:117] "RemoveContainer" containerID="939e0082a5f47c151714b4bc0ecb06e185fd81056a1143cb4d40fcd57cb826b2"
Apr 02 14:04:29 crc kubenswrapper[4732]: E0402 14:04:29.696750 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"939e0082a5f47c151714b4bc0ecb06e185fd81056a1143cb4d40fcd57cb826b2\": container with ID starting with 939e0082a5f47c151714b4bc0ecb06e185fd81056a1143cb4d40fcd57cb826b2 not found: ID does not exist" containerID="939e0082a5f47c151714b4bc0ecb06e185fd81056a1143cb4d40fcd57cb826b2"
Apr 02 14:04:29 crc kubenswrapper[4732]: I0402 14:04:29.696778 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"939e0082a5f47c151714b4bc0ecb06e185fd81056a1143cb4d40fcd57cb826b2"} err="failed to get container status \"939e0082a5f47c151714b4bc0ecb06e185fd81056a1143cb4d40fcd57cb826b2\": rpc error: code = NotFound desc = could not find container \"939e0082a5f47c151714b4bc0ecb06e185fd81056a1143cb4d40fcd57cb826b2\": container with ID starting with 939e0082a5f47c151714b4bc0ecb06e185fd81056a1143cb4d40fcd57cb826b2 not found: ID does not exist"
Apr 02 14:04:30 crc kubenswrapper[4732]: I0402 14:04:30.691350 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12d89da9-6d93-473e-846c-5fc96a22a5de" path="/var/lib/kubelet/pods/12d89da9-6d93-473e-846c-5fc96a22a5de/volumes"
Apr 02 14:04:31 crc kubenswrapper[4732]: I0402 14:04:31.310780 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5576978c7c-qwlb2"
Apr 02 14:04:31 crc kubenswrapper[4732]: I0402 14:04:31.379289 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-d4dtv"]
Apr 02 14:04:31 crc kubenswrapper[4732]: I0402 14:04:31.379664 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c7b6c5df9-d4dtv" podUID="2b032b72-f7bb-4aa9-9519-53f05329a833" containerName="dnsmasq-dns" containerID="cri-o://f96c0913b24199548d07f7ef68e2f2127ac60fa7555ee2a1e923924d659ddf1f" gracePeriod=10
Apr 02 14:04:31 crc kubenswrapper[4732]: I0402 14:04:31.527060 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8c6f6df99-2wr76"]
Apr 02 14:04:31 crc kubenswrapper[4732]: E0402 14:04:31.527629 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12d89da9-6d93-473e-846c-5fc96a22a5de" containerName="extract-content"
Apr 02 14:04:31 crc kubenswrapper[4732]: I0402 14:04:31.527650 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="12d89da9-6d93-473e-846c-5fc96a22a5de" containerName="extract-content"
Apr 02 14:04:31 crc kubenswrapper[4732]: E0402 14:04:31.527694 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12d89da9-6d93-473e-846c-5fc96a22a5de" containerName="extract-utilities"
Apr 02 14:04:31 crc kubenswrapper[4732]: I0402 14:04:31.527703 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="12d89da9-6d93-473e-846c-5fc96a22a5de" containerName="extract-utilities"
Apr 02 14:04:31 crc kubenswrapper[4732]: E0402 14:04:31.527716 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12d89da9-6d93-473e-846c-5fc96a22a5de" containerName="registry-server"
Apr 02 14:04:31 crc kubenswrapper[4732]: I0402 14:04:31.527725 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="12d89da9-6d93-473e-846c-5fc96a22a5de" containerName="registry-server"
Apr 02 14:04:31 crc kubenswrapper[4732]: I0402 14:04:31.528002 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="12d89da9-6d93-473e-846c-5fc96a22a5de" containerName="registry-server"
Apr 02 14:04:31 crc kubenswrapper[4732]: I0402 14:04:31.532309 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8c6f6df99-2wr76"
Apr 02 14:04:31 crc kubenswrapper[4732]: I0402 14:04:31.551484 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8c6f6df99-2wr76"]
Apr 02 14:04:31 crc kubenswrapper[4732]: I0402 14:04:31.619054 4732 generic.go:334] "Generic (PLEG): container finished" podID="2b032b72-f7bb-4aa9-9519-53f05329a833" containerID="f96c0913b24199548d07f7ef68e2f2127ac60fa7555ee2a1e923924d659ddf1f" exitCode=0
Apr 02 14:04:31 crc kubenswrapper[4732]: I0402 14:04:31.619104 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-d4dtv" event={"ID":"2b032b72-f7bb-4aa9-9519-53f05329a833","Type":"ContainerDied","Data":"f96c0913b24199548d07f7ef68e2f2127ac60fa7555ee2a1e923924d659ddf1f"}
Apr 02 14:04:31 crc kubenswrapper[4732]: I0402 14:04:31.628974 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b22602a0-7545-4c2d-8b16-2233288ab360-ovsdbserver-sb\") pod \"dnsmasq-dns-8c6f6df99-2wr76\" (UID: \"b22602a0-7545-4c2d-8b16-2233288ab360\") " pod="openstack/dnsmasq-dns-8c6f6df99-2wr76"
Apr 02 14:04:31 crc kubenswrapper[4732]: I0402 14:04:31.629019 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b22602a0-7545-4c2d-8b16-2233288ab360-openstack-edpm-ipam\") pod \"dnsmasq-dns-8c6f6df99-2wr76\" (UID: \"b22602a0-7545-4c2d-8b16-2233288ab360\") " pod="openstack/dnsmasq-dns-8c6f6df99-2wr76"
Apr 02 14:04:31 crc kubenswrapper[4732]: I0402 14:04:31.629052 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b22602a0-7545-4c2d-8b16-2233288ab360-config\") pod \"dnsmasq-dns-8c6f6df99-2wr76\" (UID: \"b22602a0-7545-4c2d-8b16-2233288ab360\") " pod="openstack/dnsmasq-dns-8c6f6df99-2wr76"
Apr 02 14:04:31 crc kubenswrapper[4732]: I0402 14:04:31.629093 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9kkg\" (UniqueName: \"kubernetes.io/projected/b22602a0-7545-4c2d-8b16-2233288ab360-kube-api-access-h9kkg\") pod \"dnsmasq-dns-8c6f6df99-2wr76\" (UID: \"b22602a0-7545-4c2d-8b16-2233288ab360\") " pod="openstack/dnsmasq-dns-8c6f6df99-2wr76"
Apr 02 14:04:31 crc kubenswrapper[4732]: I0402 14:04:31.629113 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b22602a0-7545-4c2d-8b16-2233288ab360-ovsdbserver-nb\") pod \"dnsmasq-dns-8c6f6df99-2wr76\" (UID: \"b22602a0-7545-4c2d-8b16-2233288ab360\") " pod="openstack/dnsmasq-dns-8c6f6df99-2wr76"
Apr 02 14:04:31 crc kubenswrapper[4732]: I0402 14:04:31.629130 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b22602a0-7545-4c2d-8b16-2233288ab360-dns-svc\") pod \"dnsmasq-dns-8c6f6df99-2wr76\" (UID: \"b22602a0-7545-4c2d-8b16-2233288ab360\") " pod="openstack/dnsmasq-dns-8c6f6df99-2wr76"
Apr 02 14:04:31 crc kubenswrapper[4732]: I0402 14:04:31.629158 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b22602a0-7545-4c2d-8b16-2233288ab360-dns-swift-storage-0\") pod \"dnsmasq-dns-8c6f6df99-2wr76\" (UID: \"b22602a0-7545-4c2d-8b16-2233288ab360\") " pod="openstack/dnsmasq-dns-8c6f6df99-2wr76"
Apr 02 14:04:31 crc kubenswrapper[4732]: I0402 14:04:31.730528 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b22602a0-7545-4c2d-8b16-2233288ab360-config\") pod \"dnsmasq-dns-8c6f6df99-2wr76\" (UID: \"b22602a0-7545-4c2d-8b16-2233288ab360\") " pod="openstack/dnsmasq-dns-8c6f6df99-2wr76"
Apr 02 14:04:31 crc kubenswrapper[4732]: I0402 14:04:31.730626 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9kkg\" (UniqueName: \"kubernetes.io/projected/b22602a0-7545-4c2d-8b16-2233288ab360-kube-api-access-h9kkg\") pod \"dnsmasq-dns-8c6f6df99-2wr76\" (UID: \"b22602a0-7545-4c2d-8b16-2233288ab360\") " pod="openstack/dnsmasq-dns-8c6f6df99-2wr76"
Apr 02 14:04:31 crc kubenswrapper[4732]: I0402 14:04:31.730651 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b22602a0-7545-4c2d-8b16-2233288ab360-ovsdbserver-nb\") pod \"dnsmasq-dns-8c6f6df99-2wr76\" (UID: \"b22602a0-7545-4c2d-8b16-2233288ab360\") " pod="openstack/dnsmasq-dns-8c6f6df99-2wr76"
Apr 02 14:04:31 crc kubenswrapper[4732]: I0402 14:04:31.730680 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b22602a0-7545-4c2d-8b16-2233288ab360-dns-svc\") pod \"dnsmasq-dns-8c6f6df99-2wr76\" (UID: \"b22602a0-7545-4c2d-8b16-2233288ab360\") " pod="openstack/dnsmasq-dns-8c6f6df99-2wr76"
Apr 02 14:04:31 crc kubenswrapper[4732]: I0402 14:04:31.730710 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b22602a0-7545-4c2d-8b16-2233288ab360-dns-swift-storage-0\") pod \"dnsmasq-dns-8c6f6df99-2wr76\" (UID: \"b22602a0-7545-4c2d-8b16-2233288ab360\") " pod="openstack/dnsmasq-dns-8c6f6df99-2wr76"
Apr 02 14:04:31 crc kubenswrapper[4732]: I0402 14:04:31.730828 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b22602a0-7545-4c2d-8b16-2233288ab360-ovsdbserver-sb\") pod \"dnsmasq-dns-8c6f6df99-2wr76\" (UID: \"b22602a0-7545-4c2d-8b16-2233288ab360\") " pod="openstack/dnsmasq-dns-8c6f6df99-2wr76"
Apr 02 14:04:31 crc kubenswrapper[4732]: I0402 14:04:31.730853 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b22602a0-7545-4c2d-8b16-2233288ab360-openstack-edpm-ipam\") pod \"dnsmasq-dns-8c6f6df99-2wr76\" (UID: \"b22602a0-7545-4c2d-8b16-2233288ab360\") " pod="openstack/dnsmasq-dns-8c6f6df99-2wr76"
Apr 02 14:04:31 crc kubenswrapper[4732]: I0402 14:04:31.732742 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b22602a0-7545-4c2d-8b16-2233288ab360-openstack-edpm-ipam\") pod \"dnsmasq-dns-8c6f6df99-2wr76\" (UID: \"b22602a0-7545-4c2d-8b16-2233288ab360\") " pod="openstack/dnsmasq-dns-8c6f6df99-2wr76"
Apr 02 14:04:31 crc kubenswrapper[4732]: I0402 14:04:31.732792 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b22602a0-7545-4c2d-8b16-2233288ab360-dns-swift-storage-0\") pod \"dnsmasq-dns-8c6f6df99-2wr76\" (UID: \"b22602a0-7545-4c2d-8b16-2233288ab360\") " pod="openstack/dnsmasq-dns-8c6f6df99-2wr76"
Apr 02 14:04:31 crc kubenswrapper[4732]: I0402 14:04:31.732879 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b22602a0-7545-4c2d-8b16-2233288ab360-dns-svc\") pod \"dnsmasq-dns-8c6f6df99-2wr76\" (UID: \"b22602a0-7545-4c2d-8b16-2233288ab360\") " pod="openstack/dnsmasq-dns-8c6f6df99-2wr76"
Apr 02 14:04:31 crc kubenswrapper[4732]: I0402 14:04:31.733407 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b22602a0-7545-4c2d-8b16-2233288ab360-ovsdbserver-nb\") pod \"dnsmasq-dns-8c6f6df99-2wr76\" (UID: \"b22602a0-7545-4c2d-8b16-2233288ab360\") " pod="openstack/dnsmasq-dns-8c6f6df99-2wr76"
Apr 02 14:04:31 crc kubenswrapper[4732]: I0402 14:04:31.735838 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b22602a0-7545-4c2d-8b16-2233288ab360-ovsdbserver-sb\") pod \"dnsmasq-dns-8c6f6df99-2wr76\" (UID: \"b22602a0-7545-4c2d-8b16-2233288ab360\") " pod="openstack/dnsmasq-dns-8c6f6df99-2wr76"
Apr 02 14:04:31 crc kubenswrapper[4732]: I0402 14:04:31.738588 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b22602a0-7545-4c2d-8b16-2233288ab360-config\") pod \"dnsmasq-dns-8c6f6df99-2wr76\" (UID: \"b22602a0-7545-4c2d-8b16-2233288ab360\") " pod="openstack/dnsmasq-dns-8c6f6df99-2wr76"
Apr 02 14:04:31 crc kubenswrapper[4732]: I0402 14:04:31.772859 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9kkg\" (UniqueName: \"kubernetes.io/projected/b22602a0-7545-4c2d-8b16-2233288ab360-kube-api-access-h9kkg\") pod \"dnsmasq-dns-8c6f6df99-2wr76\" (UID: \"b22602a0-7545-4c2d-8b16-2233288ab360\") " pod="openstack/dnsmasq-dns-8c6f6df99-2wr76"
Apr 02 14:04:31 crc kubenswrapper[4732]: I0402 14:04:31.857272 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-d4dtv"
Apr 02 14:04:31 crc kubenswrapper[4732]: I0402 14:04:31.883359 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8c6f6df99-2wr76"
Apr 02 14:04:31 crc kubenswrapper[4732]: I0402 14:04:31.934135 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2b032b72-f7bb-4aa9-9519-53f05329a833-dns-swift-storage-0\") pod \"2b032b72-f7bb-4aa9-9519-53f05329a833\" (UID: \"2b032b72-f7bb-4aa9-9519-53f05329a833\") "
Apr 02 14:04:31 crc kubenswrapper[4732]: I0402 14:04:31.934188 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b032b72-f7bb-4aa9-9519-53f05329a833-dns-svc\") pod \"2b032b72-f7bb-4aa9-9519-53f05329a833\" (UID: \"2b032b72-f7bb-4aa9-9519-53f05329a833\") "
Apr 02 14:04:31 crc kubenswrapper[4732]: I0402 14:04:31.934271 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b032b72-f7bb-4aa9-9519-53f05329a833-config\") pod \"2b032b72-f7bb-4aa9-9519-53f05329a833\" (UID: \"2b032b72-f7bb-4aa9-9519-53f05329a833\") "
Apr 02 14:04:31 crc kubenswrapper[4732]: I0402 14:04:31.934329 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2b032b72-f7bb-4aa9-9519-53f05329a833-ovsdbserver-nb\") pod \"2b032b72-f7bb-4aa9-9519-53f05329a833\" (UID: \"2b032b72-f7bb-4aa9-9519-53f05329a833\") "
Apr 02 14:04:31 crc kubenswrapper[4732]: I0402 14:04:31.934391 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x64wb\" (UniqueName: \"kubernetes.io/projected/2b032b72-f7bb-4aa9-9519-53f05329a833-kube-api-access-x64wb\") pod \"2b032b72-f7bb-4aa9-9519-53f05329a833\" (UID: \"2b032b72-f7bb-4aa9-9519-53f05329a833\") "
Apr 02 14:04:31 crc kubenswrapper[4732]: I0402 14:04:31.934427 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2b032b72-f7bb-4aa9-9519-53f05329a833-ovsdbserver-sb\") pod \"2b032b72-f7bb-4aa9-9519-53f05329a833\" (UID: \"2b032b72-f7bb-4aa9-9519-53f05329a833\") "
Apr 02 14:04:31 crc kubenswrapper[4732]: I0402 14:04:31.940224 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b032b72-f7bb-4aa9-9519-53f05329a833-kube-api-access-x64wb" (OuterVolumeSpecName: "kube-api-access-x64wb") pod "2b032b72-f7bb-4aa9-9519-53f05329a833" (UID: "2b032b72-f7bb-4aa9-9519-53f05329a833"). InnerVolumeSpecName "kube-api-access-x64wb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 14:04:32 crc kubenswrapper[4732]: I0402 14:04:31.999711 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b032b72-f7bb-4aa9-9519-53f05329a833-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2b032b72-f7bb-4aa9-9519-53f05329a833" (UID: "2b032b72-f7bb-4aa9-9519-53f05329a833"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 14:04:32 crc kubenswrapper[4732]: I0402 14:04:32.014026 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b032b72-f7bb-4aa9-9519-53f05329a833-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2b032b72-f7bb-4aa9-9519-53f05329a833" (UID: "2b032b72-f7bb-4aa9-9519-53f05329a833"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 14:04:32 crc kubenswrapper[4732]: I0402 14:04:32.015202 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b032b72-f7bb-4aa9-9519-53f05329a833-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2b032b72-f7bb-4aa9-9519-53f05329a833" (UID: "2b032b72-f7bb-4aa9-9519-53f05329a833"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 14:04:32 crc kubenswrapper[4732]: I0402 14:04:32.017973 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b032b72-f7bb-4aa9-9519-53f05329a833-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2b032b72-f7bb-4aa9-9519-53f05329a833" (UID: "2b032b72-f7bb-4aa9-9519-53f05329a833"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 14:04:32 crc kubenswrapper[4732]: I0402 14:04:32.038702 4732 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2b032b72-f7bb-4aa9-9519-53f05329a833-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Apr 02 14:04:32 crc kubenswrapper[4732]: I0402 14:04:32.038750 4732 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b032b72-f7bb-4aa9-9519-53f05329a833-dns-svc\") on node \"crc\" DevicePath \"\""
Apr 02 14:04:32 crc kubenswrapper[4732]: I0402 14:04:32.038766 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2b032b72-f7bb-4aa9-9519-53f05329a833-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Apr 02 14:04:32 crc kubenswrapper[4732]: I0402 14:04:32.038779 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x64wb\" (UniqueName: \"kubernetes.io/projected/2b032b72-f7bb-4aa9-9519-53f05329a833-kube-api-access-x64wb\") on node \"crc\" DevicePath \"\""
Apr 02 14:04:32 crc kubenswrapper[4732]: I0402 14:04:32.038793 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2b032b72-f7bb-4aa9-9519-53f05329a833-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Apr 02 14:04:32 crc kubenswrapper[4732]: I0402 14:04:32.039449 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b032b72-f7bb-4aa9-9519-53f05329a833-config" (OuterVolumeSpecName: "config") pod "2b032b72-f7bb-4aa9-9519-53f05329a833" (UID: "2b032b72-f7bb-4aa9-9519-53f05329a833"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 14:04:32 crc kubenswrapper[4732]: I0402 14:04:32.141647 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b032b72-f7bb-4aa9-9519-53f05329a833-config\") on node \"crc\" DevicePath \"\""
Apr 02 14:04:32 crc kubenswrapper[4732]: I0402 14:04:32.370267 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8c6f6df99-2wr76"]
Apr 02 14:04:32 crc kubenswrapper[4732]: W0402 14:04:32.373507 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb22602a0_7545_4c2d_8b16_2233288ab360.slice/crio-802ce42c8c03e121e39a84c170cb45b3e50e8d3cc5059864bae315c3c8448519 WatchSource:0}: Error finding container 802ce42c8c03e121e39a84c170cb45b3e50e8d3cc5059864bae315c3c8448519: Status 404 returned error can't find the container with id 802ce42c8c03e121e39a84c170cb45b3e50e8d3cc5059864bae315c3c8448519
Apr 02 14:04:32 crc kubenswrapper[4732]: I0402 14:04:32.631115 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-d4dtv" event={"ID":"2b032b72-f7bb-4aa9-9519-53f05329a833","Type":"ContainerDied","Data":"974819f0f1198e96f406a56dbdd66e616e9cc4d23df99ca023a59ac4bf52b9a5"}
Apr 02 14:04:32 crc kubenswrapper[4732]: I0402 14:04:32.631145 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-d4dtv"
Apr 02 14:04:32 crc kubenswrapper[4732]: I0402 14:04:32.631421 4732 scope.go:117] "RemoveContainer" containerID="f96c0913b24199548d07f7ef68e2f2127ac60fa7555ee2a1e923924d659ddf1f"
Apr 02 14:04:32 crc kubenswrapper[4732]: I0402 14:04:32.632972 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8c6f6df99-2wr76" event={"ID":"b22602a0-7545-4c2d-8b16-2233288ab360","Type":"ContainerStarted","Data":"802ce42c8c03e121e39a84c170cb45b3e50e8d3cc5059864bae315c3c8448519"}
Apr 02 14:04:32 crc kubenswrapper[4732]: I0402 14:04:32.669537 4732 scope.go:117] "RemoveContainer" containerID="06342f502fa93f7175ea06d4316f631c61c05a407fde251397c55031e384ad5c"
Apr 02 14:04:32 crc kubenswrapper[4732]: I0402 14:04:32.678665 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-d4dtv"]
Apr 02 14:04:32 crc kubenswrapper[4732]: I0402 14:04:32.701752 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-d4dtv"]
Apr 02 14:04:33 crc kubenswrapper[4732]: I0402 14:04:33.646040 4732 generic.go:334] "Generic (PLEG): container finished" podID="b22602a0-7545-4c2d-8b16-2233288ab360" containerID="a11decdd651e56c4dca3ae70d702164e73c4e9dbe854433ecccbe0fbdb1b19f8" exitCode=0
Apr 02 14:04:33 crc kubenswrapper[4732]: I0402 14:04:33.646400 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8c6f6df99-2wr76" event={"ID":"b22602a0-7545-4c2d-8b16-2233288ab360","Type":"ContainerDied","Data":"a11decdd651e56c4dca3ae70d702164e73c4e9dbe854433ecccbe0fbdb1b19f8"}
Apr 02 14:04:34 crc kubenswrapper[4732]: I0402 14:04:34.662211 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8c6f6df99-2wr76" event={"ID":"b22602a0-7545-4c2d-8b16-2233288ab360","Type":"ContainerStarted","Data":"9f00f51ca06f00ccba37d13fc22223c1176541d398ff7adfd118c871910ae5cf"}
Apr 02 14:04:34 crc kubenswrapper[4732]: I0402 14:04:34.663357 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8c6f6df99-2wr76"
Apr 02 14:04:34 crc kubenswrapper[4732]: I0402 14:04:34.692268 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8c6f6df99-2wr76" podStartSLOduration=3.69225124 podStartE2EDuration="3.69225124s" podCreationTimestamp="2026-04-02 14:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 14:04:34.683725815 +0000 UTC m=+1631.588133388" watchObservedRunningTime="2026-04-02 14:04:34.69225124 +0000 UTC m=+1631.596658783"
Apr 02 14:04:34 crc kubenswrapper[4732]: I0402 14:04:34.696845 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b032b72-f7bb-4aa9-9519-53f05329a833" path="/var/lib/kubelet/pods/2b032b72-f7bb-4aa9-9519-53f05329a833/volumes"
Apr 02 14:04:41 crc kubenswrapper[4732]: I0402 14:04:41.884763 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8c6f6df99-2wr76"
Apr 02 14:04:41 crc kubenswrapper[4732]: I0402 14:04:41.964680 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-qwlb2"]
Apr 02 14:04:41 crc kubenswrapper[4732]: I0402 14:04:41.965230 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5576978c7c-qwlb2" podUID="82d82db9-6003-4fa9-84ae-ef1b612462bc" containerName="dnsmasq-dns" containerID="cri-o://d03a6617fa08ecb7f6645b8d71fca7283a739e2aad5f9d7556edb053eb9d53ee" gracePeriod=10
Apr 02 14:04:42 crc kubenswrapper[4732]: I0402 14:04:42.564447 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-qwlb2"
Apr 02 14:04:42 crc kubenswrapper[4732]: I0402 14:04:42.664954 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82d82db9-6003-4fa9-84ae-ef1b612462bc-dns-svc\") pod \"82d82db9-6003-4fa9-84ae-ef1b612462bc\" (UID: \"82d82db9-6003-4fa9-84ae-ef1b612462bc\") "
Apr 02 14:04:42 crc kubenswrapper[4732]: I0402 14:04:42.665006 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82d82db9-6003-4fa9-84ae-ef1b612462bc-config\") pod \"82d82db9-6003-4fa9-84ae-ef1b612462bc\" (UID: \"82d82db9-6003-4fa9-84ae-ef1b612462bc\") "
Apr 02 14:04:42 crc kubenswrapper[4732]: I0402 14:04:42.665144 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/82d82db9-6003-4fa9-84ae-ef1b612462bc-ovsdbserver-nb\") pod \"82d82db9-6003-4fa9-84ae-ef1b612462bc\" (UID: \"82d82db9-6003-4fa9-84ae-ef1b612462bc\") "
Apr 02 14:04:42 crc kubenswrapper[4732]: I0402 14:04:42.665237 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82tjp\" (UniqueName: \"kubernetes.io/projected/82d82db9-6003-4fa9-84ae-ef1b612462bc-kube-api-access-82tjp\") pod \"82d82db9-6003-4fa9-84ae-ef1b612462bc\" (UID: \"82d82db9-6003-4fa9-84ae-ef1b612462bc\") "
Apr 02 14:04:42 crc kubenswrapper[4732]: I0402 14:04:42.665270 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/82d82db9-6003-4fa9-84ae-ef1b612462bc-openstack-edpm-ipam\") pod \"82d82db9-6003-4fa9-84ae-ef1b612462bc\" (UID: \"82d82db9-6003-4fa9-84ae-ef1b612462bc\") "
Apr 02 14:04:42 crc kubenswrapper[4732]: I0402 14:04:42.665289 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/82d82db9-6003-4fa9-84ae-ef1b612462bc-dns-swift-storage-0\") pod \"82d82db9-6003-4fa9-84ae-ef1b612462bc\" (UID: \"82d82db9-6003-4fa9-84ae-ef1b612462bc\") "
Apr 02 14:04:42 crc kubenswrapper[4732]: I0402 14:04:42.665342 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/82d82db9-6003-4fa9-84ae-ef1b612462bc-ovsdbserver-sb\") pod \"82d82db9-6003-4fa9-84ae-ef1b612462bc\" (UID: \"82d82db9-6003-4fa9-84ae-ef1b612462bc\") "
Apr 02 14:04:42 crc kubenswrapper[4732]: I0402 14:04:42.688998 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82d82db9-6003-4fa9-84ae-ef1b612462bc-kube-api-access-82tjp" (OuterVolumeSpecName: "kube-api-access-82tjp") pod "82d82db9-6003-4fa9-84ae-ef1b612462bc" (UID: "82d82db9-6003-4fa9-84ae-ef1b612462bc"). InnerVolumeSpecName "kube-api-access-82tjp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 14:04:42 crc kubenswrapper[4732]: I0402 14:04:42.723642 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82d82db9-6003-4fa9-84ae-ef1b612462bc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "82d82db9-6003-4fa9-84ae-ef1b612462bc" (UID: "82d82db9-6003-4fa9-84ae-ef1b612462bc"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 14:04:42 crc kubenswrapper[4732]: I0402 14:04:42.730834 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82d82db9-6003-4fa9-84ae-ef1b612462bc-config" (OuterVolumeSpecName: "config") pod "82d82db9-6003-4fa9-84ae-ef1b612462bc" (UID: "82d82db9-6003-4fa9-84ae-ef1b612462bc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 14:04:42 crc kubenswrapper[4732]: I0402 14:04:42.736181 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82d82db9-6003-4fa9-84ae-ef1b612462bc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "82d82db9-6003-4fa9-84ae-ef1b612462bc" (UID: "82d82db9-6003-4fa9-84ae-ef1b612462bc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 14:04:42 crc kubenswrapper[4732]: I0402 14:04:42.738510 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82d82db9-6003-4fa9-84ae-ef1b612462bc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "82d82db9-6003-4fa9-84ae-ef1b612462bc" (UID: "82d82db9-6003-4fa9-84ae-ef1b612462bc"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 14:04:42 crc kubenswrapper[4732]: I0402 14:04:42.742944 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82d82db9-6003-4fa9-84ae-ef1b612462bc-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "82d82db9-6003-4fa9-84ae-ef1b612462bc" (UID: "82d82db9-6003-4fa9-84ae-ef1b612462bc"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 14:04:42 crc kubenswrapper[4732]: I0402 14:04:42.748029 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82d82db9-6003-4fa9-84ae-ef1b612462bc-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "82d82db9-6003-4fa9-84ae-ef1b612462bc" (UID: "82d82db9-6003-4fa9-84ae-ef1b612462bc"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 14:04:42 crc kubenswrapper[4732]: I0402 14:04:42.750150 4732 generic.go:334] "Generic (PLEG): container finished" podID="82d82db9-6003-4fa9-84ae-ef1b612462bc" containerID="d03a6617fa08ecb7f6645b8d71fca7283a739e2aad5f9d7556edb053eb9d53ee" exitCode=0
Apr 02 14:04:42 crc kubenswrapper[4732]: I0402 14:04:42.750250 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-qwlb2"
Apr 02 14:04:42 crc kubenswrapper[4732]: I0402 14:04:42.769208 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-qwlb2" event={"ID":"82d82db9-6003-4fa9-84ae-ef1b612462bc","Type":"ContainerDied","Data":"d03a6617fa08ecb7f6645b8d71fca7283a739e2aad5f9d7556edb053eb9d53ee"}
Apr 02 14:04:42 crc kubenswrapper[4732]: I0402 14:04:42.769257 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-qwlb2" event={"ID":"82d82db9-6003-4fa9-84ae-ef1b612462bc","Type":"ContainerDied","Data":"be35e3a0d1daaf744b6e172b7f20df76ec288e4b983246ad15dc7af2ea733832"}
Apr 02 14:04:42 crc kubenswrapper[4732]: I0402 14:04:42.769277 4732 scope.go:117] "RemoveContainer" containerID="d03a6617fa08ecb7f6645b8d71fca7283a739e2aad5f9d7556edb053eb9d53ee"
Apr 02 14:04:42 crc kubenswrapper[4732]: I0402 14:04:42.770531 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82tjp\" (UniqueName: \"kubernetes.io/projected/82d82db9-6003-4fa9-84ae-ef1b612462bc-kube-api-access-82tjp\") on node \"crc\" DevicePath \"\""
Apr 02 14:04:42 crc kubenswrapper[4732]: I0402 14:04:42.770559 4732 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/82d82db9-6003-4fa9-84ae-ef1b612462bc-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Apr 02 14:04:42 crc kubenswrapper[4732]: I0402 14:04:42.770570 4732 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/82d82db9-6003-4fa9-84ae-ef1b612462bc-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Apr 02 14:04:42 crc kubenswrapper[4732]: I0402 14:04:42.770581 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/82d82db9-6003-4fa9-84ae-ef1b612462bc-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Apr 02 14:04:42 crc kubenswrapper[4732]: I0402 14:04:42.770592 4732 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82d82db9-6003-4fa9-84ae-ef1b612462bc-dns-svc\") on node \"crc\" DevicePath \"\""
Apr 02 14:04:42 crc kubenswrapper[4732]: I0402 14:04:42.770637 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82d82db9-6003-4fa9-84ae-ef1b612462bc-config\") on node \"crc\" DevicePath \"\""
Apr 02 14:04:42 crc kubenswrapper[4732]: I0402 14:04:42.770653 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/82d82db9-6003-4fa9-84ae-ef1b612462bc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Apr 02 14:04:42 crc kubenswrapper[4732]: I0402 14:04:42.796542 4732 scope.go:117] "RemoveContainer" containerID="af331e6809a269f93fcc8a8c747230e30a0562c868224261869aee82df482c1e"
Apr 02 14:04:42 crc kubenswrapper[4732]: I0402 14:04:42.797444 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-qwlb2"]
Apr 02 14:04:42 crc kubenswrapper[4732]: I0402 14:04:42.811074 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-qwlb2"]
Apr 02 14:04:42 crc kubenswrapper[4732]: I0402 14:04:42.816095 4732 scope.go:117] "RemoveContainer" containerID="d03a6617fa08ecb7f6645b8d71fca7283a739e2aad5f9d7556edb053eb9d53ee"
Apr 02 14:04:42 crc kubenswrapper[4732]: E0402 14:04:42.816501 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d03a6617fa08ecb7f6645b8d71fca7283a739e2aad5f9d7556edb053eb9d53ee\": container with ID starting with d03a6617fa08ecb7f6645b8d71fca7283a739e2aad5f9d7556edb053eb9d53ee not found: ID does not exist" containerID="d03a6617fa08ecb7f6645b8d71fca7283a739e2aad5f9d7556edb053eb9d53ee"
Apr 02 14:04:42 crc kubenswrapper[4732]: I0402 14:04:42.816539 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d03a6617fa08ecb7f6645b8d71fca7283a739e2aad5f9d7556edb053eb9d53ee"} err="failed to get container status \"d03a6617fa08ecb7f6645b8d71fca7283a739e2aad5f9d7556edb053eb9d53ee\": rpc error: code = NotFound desc = could not find container \"d03a6617fa08ecb7f6645b8d71fca7283a739e2aad5f9d7556edb053eb9d53ee\": container with ID starting with d03a6617fa08ecb7f6645b8d71fca7283a739e2aad5f9d7556edb053eb9d53ee not found: ID does not exist"
Apr 02 14:04:42 crc kubenswrapper[4732]: I0402 14:04:42.816558 4732 scope.go:117] "RemoveContainer" containerID="af331e6809a269f93fcc8a8c747230e30a0562c868224261869aee82df482c1e"
Apr 02 14:04:42 crc kubenswrapper[4732]: E0402 14:04:42.816920 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af331e6809a269f93fcc8a8c747230e30a0562c868224261869aee82df482c1e\": container with ID starting with af331e6809a269f93fcc8a8c747230e30a0562c868224261869aee82df482c1e not found: ID does not exist" containerID="af331e6809a269f93fcc8a8c747230e30a0562c868224261869aee82df482c1e"
Apr 02 14:04:42 crc kubenswrapper[4732]: I0402 14:04:42.816953 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af331e6809a269f93fcc8a8c747230e30a0562c868224261869aee82df482c1e"} err="failed to get container status \"af331e6809a269f93fcc8a8c747230e30a0562c868224261869aee82df482c1e\": rpc error: code = NotFound desc = could not find container \"af331e6809a269f93fcc8a8c747230e30a0562c868224261869aee82df482c1e\": container with ID starting with af331e6809a269f93fcc8a8c747230e30a0562c868224261869aee82df482c1e not found: ID does not exist"
Apr 02 14:04:44 crc kubenswrapper[4732]: I0402 14:04:44.690990 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82d82db9-6003-4fa9-84ae-ef1b612462bc" path="/var/lib/kubelet/pods/82d82db9-6003-4fa9-84ae-ef1b612462bc/volumes"
Apr 02 14:04:47 crc kubenswrapper[4732]: I0402 14:04:47.799876 4732 generic.go:334] "Generic (PLEG): container finished" podID="29e95846-a0bc-4d8b-ad4d-457766418564" containerID="507ce84edd1a878bb3190544118ad341f01b0fd432881674befdfdf970a83ba9" exitCode=0
Apr 02 14:04:47 crc kubenswrapper[4732]: I0402 14:04:47.800005 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"29e95846-a0bc-4d8b-ad4d-457766418564","Type":"ContainerDied","Data":"507ce84edd1a878bb3190544118ad341f01b0fd432881674befdfdf970a83ba9"}
Apr 02 14:04:48 crc kubenswrapper[4732]: I0402 14:04:48.633311 4732 scope.go:117] "RemoveContainer" containerID="f4dab2ddf2e82a3432454f31237a43e274f0990c5151a6264a973b31c88c0f1f"
Apr 02 14:04:48 crc kubenswrapper[4732]: I0402 14:04:48.809865 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"29e95846-a0bc-4d8b-ad4d-457766418564","Type":"ContainerStarted","Data":"4164686c668bfca2dea7b410cabc63ab0bcf4a7d1a7c2f3a1a663bcf80fc4c62"}
Apr 02 14:04:48 crc kubenswrapper[4732]: I0402 14:04:48.810827 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Apr 02 14:04:48 crc kubenswrapper[4732]: I0402 14:04:48.834163 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=35.83414385 podStartE2EDuration="35.83414385s" podCreationTimestamp="2026-04-02 14:04:13 +0000 UTC" firstStartedPulling="0001-01-01
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 14:04:48.828895036 +0000 UTC m=+1645.733302609" watchObservedRunningTime="2026-04-02 14:04:48.83414385 +0000 UTC m=+1645.738551403" Apr 02 14:04:50 crc kubenswrapper[4732]: I0402 14:04:50.698840 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kjz5w"] Apr 02 14:04:50 crc kubenswrapper[4732]: E0402 14:04:50.699958 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b032b72-f7bb-4aa9-9519-53f05329a833" containerName="init" Apr 02 14:04:50 crc kubenswrapper[4732]: I0402 14:04:50.699998 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b032b72-f7bb-4aa9-9519-53f05329a833" containerName="init" Apr 02 14:04:50 crc kubenswrapper[4732]: E0402 14:04:50.700056 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b032b72-f7bb-4aa9-9519-53f05329a833" containerName="dnsmasq-dns" Apr 02 14:04:50 crc kubenswrapper[4732]: I0402 14:04:50.700069 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b032b72-f7bb-4aa9-9519-53f05329a833" containerName="dnsmasq-dns" Apr 02 14:04:50 crc kubenswrapper[4732]: E0402 14:04:50.700087 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82d82db9-6003-4fa9-84ae-ef1b612462bc" containerName="init" Apr 02 14:04:50 crc kubenswrapper[4732]: I0402 14:04:50.700097 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="82d82db9-6003-4fa9-84ae-ef1b612462bc" containerName="init" Apr 02 14:04:50 crc kubenswrapper[4732]: E0402 14:04:50.700130 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82d82db9-6003-4fa9-84ae-ef1b612462bc" containerName="dnsmasq-dns" Apr 02 14:04:50 crc kubenswrapper[4732]: I0402 14:04:50.700140 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="82d82db9-6003-4fa9-84ae-ef1b612462bc" containerName="dnsmasq-dns" Apr 02 14:04:50 crc kubenswrapper[4732]: I0402 
14:04:50.700430 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="82d82db9-6003-4fa9-84ae-ef1b612462bc" containerName="dnsmasq-dns" Apr 02 14:04:50 crc kubenswrapper[4732]: I0402 14:04:50.700457 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b032b72-f7bb-4aa9-9519-53f05329a833" containerName="dnsmasq-dns" Apr 02 14:04:50 crc kubenswrapper[4732]: I0402 14:04:50.701472 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kjz5w" Apr 02 14:04:50 crc kubenswrapper[4732]: I0402 14:04:50.704124 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Apr 02 14:04:50 crc kubenswrapper[4732]: I0402 14:04:50.704935 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wdhd4" Apr 02 14:04:50 crc kubenswrapper[4732]: I0402 14:04:50.704964 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Apr 02 14:04:50 crc kubenswrapper[4732]: I0402 14:04:50.705149 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Apr 02 14:04:50 crc kubenswrapper[4732]: I0402 14:04:50.705779 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kjz5w"] Apr 02 14:04:50 crc kubenswrapper[4732]: I0402 14:04:50.727936 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28b7c53a-39ed-4eea-8697-50dc3eb09818-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-kjz5w\" (UID: \"28b7c53a-39ed-4eea-8697-50dc3eb09818\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kjz5w" Apr 02 14:04:50 crc kubenswrapper[4732]: I0402 
14:04:50.728023 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mf6l\" (UniqueName: \"kubernetes.io/projected/28b7c53a-39ed-4eea-8697-50dc3eb09818-kube-api-access-9mf6l\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-kjz5w\" (UID: \"28b7c53a-39ed-4eea-8697-50dc3eb09818\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kjz5w" Apr 02 14:04:50 crc kubenswrapper[4732]: I0402 14:04:50.728139 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/28b7c53a-39ed-4eea-8697-50dc3eb09818-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-kjz5w\" (UID: \"28b7c53a-39ed-4eea-8697-50dc3eb09818\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kjz5w" Apr 02 14:04:50 crc kubenswrapper[4732]: I0402 14:04:50.728266 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/28b7c53a-39ed-4eea-8697-50dc3eb09818-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-kjz5w\" (UID: \"28b7c53a-39ed-4eea-8697-50dc3eb09818\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kjz5w" Apr 02 14:04:50 crc kubenswrapper[4732]: I0402 14:04:50.826378 4732 generic.go:334] "Generic (PLEG): container finished" podID="c9e1cd50-72d3-4ccc-9f49-c4c1619252fc" containerID="ba581d578d0457d94560ad24f90a1b0f5578db3a15b7b1b99c0a70c511b484aa" exitCode=0 Apr 02 14:04:50 crc kubenswrapper[4732]: I0402 14:04:50.826427 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c9e1cd50-72d3-4ccc-9f49-c4c1619252fc","Type":"ContainerDied","Data":"ba581d578d0457d94560ad24f90a1b0f5578db3a15b7b1b99c0a70c511b484aa"} Apr 02 14:04:50 crc kubenswrapper[4732]: I0402 14:04:50.829464 4732 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/28b7c53a-39ed-4eea-8697-50dc3eb09818-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-kjz5w\" (UID: \"28b7c53a-39ed-4eea-8697-50dc3eb09818\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kjz5w" Apr 02 14:04:50 crc kubenswrapper[4732]: I0402 14:04:50.829592 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/28b7c53a-39ed-4eea-8697-50dc3eb09818-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-kjz5w\" (UID: \"28b7c53a-39ed-4eea-8697-50dc3eb09818\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kjz5w" Apr 02 14:04:50 crc kubenswrapper[4732]: I0402 14:04:50.829676 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28b7c53a-39ed-4eea-8697-50dc3eb09818-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-kjz5w\" (UID: \"28b7c53a-39ed-4eea-8697-50dc3eb09818\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kjz5w" Apr 02 14:04:50 crc kubenswrapper[4732]: I0402 14:04:50.829719 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mf6l\" (UniqueName: \"kubernetes.io/projected/28b7c53a-39ed-4eea-8697-50dc3eb09818-kube-api-access-9mf6l\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-kjz5w\" (UID: \"28b7c53a-39ed-4eea-8697-50dc3eb09818\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kjz5w" Apr 02 14:04:50 crc kubenswrapper[4732]: I0402 14:04:50.836179 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/28b7c53a-39ed-4eea-8697-50dc3eb09818-inventory\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-kjz5w\" (UID: \"28b7c53a-39ed-4eea-8697-50dc3eb09818\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kjz5w" Apr 02 14:04:50 crc kubenswrapper[4732]: I0402 14:04:50.838150 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/28b7c53a-39ed-4eea-8697-50dc3eb09818-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-kjz5w\" (UID: \"28b7c53a-39ed-4eea-8697-50dc3eb09818\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kjz5w" Apr 02 14:04:50 crc kubenswrapper[4732]: I0402 14:04:50.838343 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28b7c53a-39ed-4eea-8697-50dc3eb09818-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-kjz5w\" (UID: \"28b7c53a-39ed-4eea-8697-50dc3eb09818\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kjz5w" Apr 02 14:04:50 crc kubenswrapper[4732]: I0402 14:04:50.850256 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mf6l\" (UniqueName: \"kubernetes.io/projected/28b7c53a-39ed-4eea-8697-50dc3eb09818-kube-api-access-9mf6l\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-kjz5w\" (UID: \"28b7c53a-39ed-4eea-8697-50dc3eb09818\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kjz5w" Apr 02 14:04:51 crc kubenswrapper[4732]: I0402 14:04:51.036804 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kjz5w" Apr 02 14:04:51 crc kubenswrapper[4732]: I0402 14:04:51.564901 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kjz5w"] Apr 02 14:04:51 crc kubenswrapper[4732]: I0402 14:04:51.848024 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c9e1cd50-72d3-4ccc-9f49-c4c1619252fc","Type":"ContainerStarted","Data":"5967402d837ddfa8d50a5dd00c39c2f3d4ac3d2cecb8a6f52896774ff37a82a8"} Apr 02 14:04:51 crc kubenswrapper[4732]: I0402 14:04:51.848257 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Apr 02 14:04:51 crc kubenswrapper[4732]: I0402 14:04:51.849390 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kjz5w" event={"ID":"28b7c53a-39ed-4eea-8697-50dc3eb09818","Type":"ContainerStarted","Data":"e17cc6a8012964269a3ddd2945344e59b04259d3032de08902979937e6b0318a"} Apr 02 14:04:51 crc kubenswrapper[4732]: I0402 14:04:51.880882 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=35.880859442 podStartE2EDuration="35.880859442s" podCreationTimestamp="2026-04-02 14:04:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 14:04:51.873326095 +0000 UTC m=+1648.777733678" watchObservedRunningTime="2026-04-02 14:04:51.880859442 +0000 UTC m=+1648.785266995" Apr 02 14:05:00 crc kubenswrapper[4732]: I0402 14:05:00.938604 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kjz5w" event={"ID":"28b7c53a-39ed-4eea-8697-50dc3eb09818","Type":"ContainerStarted","Data":"b00e99301eec7fee8f198f17f97634c2d5f391af61a30b0dd6f4e50f849ecce8"} Apr 02 
14:05:00 crc kubenswrapper[4732]: I0402 14:05:00.954877 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kjz5w" podStartSLOduration=2.807423458 podStartE2EDuration="10.954858668s" podCreationTimestamp="2026-04-02 14:04:50 +0000 UTC" firstStartedPulling="2026-04-02 14:04:51.570966901 +0000 UTC m=+1648.475374454" lastFinishedPulling="2026-04-02 14:04:59.718402111 +0000 UTC m=+1656.622809664" observedRunningTime="2026-04-02 14:05:00.952180094 +0000 UTC m=+1657.856587657" watchObservedRunningTime="2026-04-02 14:05:00.954858668 +0000 UTC m=+1657.859266221" Apr 02 14:05:03 crc kubenswrapper[4732]: I0402 14:05:03.811871 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Apr 02 14:05:07 crc kubenswrapper[4732]: I0402 14:05:07.143843 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Apr 02 14:05:11 crc kubenswrapper[4732]: I0402 14:05:11.047749 4732 generic.go:334] "Generic (PLEG): container finished" podID="28b7c53a-39ed-4eea-8697-50dc3eb09818" containerID="b00e99301eec7fee8f198f17f97634c2d5f391af61a30b0dd6f4e50f849ecce8" exitCode=0 Apr 02 14:05:11 crc kubenswrapper[4732]: I0402 14:05:11.047818 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kjz5w" event={"ID":"28b7c53a-39ed-4eea-8697-50dc3eb09818","Type":"ContainerDied","Data":"b00e99301eec7fee8f198f17f97634c2d5f391af61a30b0dd6f4e50f849ecce8"} Apr 02 14:05:12 crc kubenswrapper[4732]: I0402 14:05:12.431068 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kjz5w" Apr 02 14:05:12 crc kubenswrapper[4732]: I0402 14:05:12.545178 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/28b7c53a-39ed-4eea-8697-50dc3eb09818-ssh-key-openstack-edpm-ipam\") pod \"28b7c53a-39ed-4eea-8697-50dc3eb09818\" (UID: \"28b7c53a-39ed-4eea-8697-50dc3eb09818\") " Apr 02 14:05:12 crc kubenswrapper[4732]: I0402 14:05:12.545271 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/28b7c53a-39ed-4eea-8697-50dc3eb09818-inventory\") pod \"28b7c53a-39ed-4eea-8697-50dc3eb09818\" (UID: \"28b7c53a-39ed-4eea-8697-50dc3eb09818\") " Apr 02 14:05:12 crc kubenswrapper[4732]: I0402 14:05:12.545292 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mf6l\" (UniqueName: \"kubernetes.io/projected/28b7c53a-39ed-4eea-8697-50dc3eb09818-kube-api-access-9mf6l\") pod \"28b7c53a-39ed-4eea-8697-50dc3eb09818\" (UID: \"28b7c53a-39ed-4eea-8697-50dc3eb09818\") " Apr 02 14:05:12 crc kubenswrapper[4732]: I0402 14:05:12.545311 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28b7c53a-39ed-4eea-8697-50dc3eb09818-repo-setup-combined-ca-bundle\") pod \"28b7c53a-39ed-4eea-8697-50dc3eb09818\" (UID: \"28b7c53a-39ed-4eea-8697-50dc3eb09818\") " Apr 02 14:05:12 crc kubenswrapper[4732]: I0402 14:05:12.550697 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28b7c53a-39ed-4eea-8697-50dc3eb09818-kube-api-access-9mf6l" (OuterVolumeSpecName: "kube-api-access-9mf6l") pod "28b7c53a-39ed-4eea-8697-50dc3eb09818" (UID: "28b7c53a-39ed-4eea-8697-50dc3eb09818"). InnerVolumeSpecName "kube-api-access-9mf6l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:05:12 crc kubenswrapper[4732]: I0402 14:05:12.551954 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28b7c53a-39ed-4eea-8697-50dc3eb09818-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "28b7c53a-39ed-4eea-8697-50dc3eb09818" (UID: "28b7c53a-39ed-4eea-8697-50dc3eb09818"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:05:12 crc kubenswrapper[4732]: I0402 14:05:12.581102 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28b7c53a-39ed-4eea-8697-50dc3eb09818-inventory" (OuterVolumeSpecName: "inventory") pod "28b7c53a-39ed-4eea-8697-50dc3eb09818" (UID: "28b7c53a-39ed-4eea-8697-50dc3eb09818"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:05:12 crc kubenswrapper[4732]: I0402 14:05:12.584252 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28b7c53a-39ed-4eea-8697-50dc3eb09818-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "28b7c53a-39ed-4eea-8697-50dc3eb09818" (UID: "28b7c53a-39ed-4eea-8697-50dc3eb09818"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:05:12 crc kubenswrapper[4732]: I0402 14:05:12.647263 4732 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/28b7c53a-39ed-4eea-8697-50dc3eb09818-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Apr 02 14:05:12 crc kubenswrapper[4732]: I0402 14:05:12.647293 4732 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/28b7c53a-39ed-4eea-8697-50dc3eb09818-inventory\") on node \"crc\" DevicePath \"\"" Apr 02 14:05:12 crc kubenswrapper[4732]: I0402 14:05:12.647303 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mf6l\" (UniqueName: \"kubernetes.io/projected/28b7c53a-39ed-4eea-8697-50dc3eb09818-kube-api-access-9mf6l\") on node \"crc\" DevicePath \"\"" Apr 02 14:05:12 crc kubenswrapper[4732]: I0402 14:05:12.647313 4732 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28b7c53a-39ed-4eea-8697-50dc3eb09818-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 02 14:05:13 crc kubenswrapper[4732]: I0402 14:05:13.068581 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kjz5w" event={"ID":"28b7c53a-39ed-4eea-8697-50dc3eb09818","Type":"ContainerDied","Data":"e17cc6a8012964269a3ddd2945344e59b04259d3032de08902979937e6b0318a"} Apr 02 14:05:13 crc kubenswrapper[4732]: I0402 14:05:13.068642 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kjz5w" Apr 02 14:05:13 crc kubenswrapper[4732]: I0402 14:05:13.068651 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e17cc6a8012964269a3ddd2945344e59b04259d3032de08902979937e6b0318a" Apr 02 14:05:13 crc kubenswrapper[4732]: I0402 14:05:13.147594 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-szt5p"] Apr 02 14:05:13 crc kubenswrapper[4732]: E0402 14:05:13.148570 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28b7c53a-39ed-4eea-8697-50dc3eb09818" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Apr 02 14:05:13 crc kubenswrapper[4732]: I0402 14:05:13.148650 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="28b7c53a-39ed-4eea-8697-50dc3eb09818" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Apr 02 14:05:13 crc kubenswrapper[4732]: I0402 14:05:13.149074 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="28b7c53a-39ed-4eea-8697-50dc3eb09818" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Apr 02 14:05:13 crc kubenswrapper[4732]: I0402 14:05:13.150223 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-szt5p" Apr 02 14:05:13 crc kubenswrapper[4732]: I0402 14:05:13.152627 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Apr 02 14:05:13 crc kubenswrapper[4732]: I0402 14:05:13.152821 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Apr 02 14:05:13 crc kubenswrapper[4732]: I0402 14:05:13.152953 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wdhd4" Apr 02 14:05:13 crc kubenswrapper[4732]: I0402 14:05:13.153126 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Apr 02 14:05:13 crc kubenswrapper[4732]: I0402 14:05:13.159537 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-szt5p"] Apr 02 14:05:13 crc kubenswrapper[4732]: I0402 14:05:13.257126 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d2c0401e-94b6-46b0-84f5-59ffac42c2f7-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-szt5p\" (UID: \"d2c0401e-94b6-46b0-84f5-59ffac42c2f7\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-szt5p" Apr 02 14:05:13 crc kubenswrapper[4732]: I0402 14:05:13.257316 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbfbp\" (UniqueName: \"kubernetes.io/projected/d2c0401e-94b6-46b0-84f5-59ffac42c2f7-kube-api-access-kbfbp\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-szt5p\" (UID: \"d2c0401e-94b6-46b0-84f5-59ffac42c2f7\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-szt5p" Apr 02 14:05:13 crc kubenswrapper[4732]: I0402 14:05:13.257385 4732 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d2c0401e-94b6-46b0-84f5-59ffac42c2f7-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-szt5p\" (UID: \"d2c0401e-94b6-46b0-84f5-59ffac42c2f7\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-szt5p" Apr 02 14:05:13 crc kubenswrapper[4732]: I0402 14:05:13.359704 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbfbp\" (UniqueName: \"kubernetes.io/projected/d2c0401e-94b6-46b0-84f5-59ffac42c2f7-kube-api-access-kbfbp\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-szt5p\" (UID: \"d2c0401e-94b6-46b0-84f5-59ffac42c2f7\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-szt5p" Apr 02 14:05:13 crc kubenswrapper[4732]: I0402 14:05:13.359820 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d2c0401e-94b6-46b0-84f5-59ffac42c2f7-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-szt5p\" (UID: \"d2c0401e-94b6-46b0-84f5-59ffac42c2f7\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-szt5p" Apr 02 14:05:13 crc kubenswrapper[4732]: I0402 14:05:13.359889 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d2c0401e-94b6-46b0-84f5-59ffac42c2f7-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-szt5p\" (UID: \"d2c0401e-94b6-46b0-84f5-59ffac42c2f7\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-szt5p" Apr 02 14:05:13 crc kubenswrapper[4732]: I0402 14:05:13.366726 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d2c0401e-94b6-46b0-84f5-59ffac42c2f7-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-szt5p\" (UID: 
\"d2c0401e-94b6-46b0-84f5-59ffac42c2f7\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-szt5p" Apr 02 14:05:13 crc kubenswrapper[4732]: I0402 14:05:13.367894 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d2c0401e-94b6-46b0-84f5-59ffac42c2f7-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-szt5p\" (UID: \"d2c0401e-94b6-46b0-84f5-59ffac42c2f7\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-szt5p" Apr 02 14:05:13 crc kubenswrapper[4732]: I0402 14:05:13.390575 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbfbp\" (UniqueName: \"kubernetes.io/projected/d2c0401e-94b6-46b0-84f5-59ffac42c2f7-kube-api-access-kbfbp\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-szt5p\" (UID: \"d2c0401e-94b6-46b0-84f5-59ffac42c2f7\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-szt5p" Apr 02 14:05:13 crc kubenswrapper[4732]: I0402 14:05:13.476998 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-szt5p" Apr 02 14:05:13 crc kubenswrapper[4732]: I0402 14:05:13.987259 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-szt5p"] Apr 02 14:05:14 crc kubenswrapper[4732]: I0402 14:05:14.078056 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-szt5p" event={"ID":"d2c0401e-94b6-46b0-84f5-59ffac42c2f7","Type":"ContainerStarted","Data":"fbcfeb570d0dc1e6a389aa0fc64471aa321aae09f957254b7705f7b3a4e18a19"} Apr 02 14:05:15 crc kubenswrapper[4732]: I0402 14:05:15.088464 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-szt5p" event={"ID":"d2c0401e-94b6-46b0-84f5-59ffac42c2f7","Type":"ContainerStarted","Data":"8a5336aaed4acf1c5b9817fb8e3f6104d307e4c88005c33249771be5cf2d930a"} Apr 02 14:05:15 crc kubenswrapper[4732]: I0402 14:05:15.112251 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-szt5p" podStartSLOduration=1.705150094 podStartE2EDuration="2.112235656s" podCreationTimestamp="2026-04-02 14:05:13 +0000 UTC" firstStartedPulling="2026-04-02 14:05:13.985426054 +0000 UTC m=+1670.889833607" lastFinishedPulling="2026-04-02 14:05:14.392511616 +0000 UTC m=+1671.296919169" observedRunningTime="2026-04-02 14:05:15.106340924 +0000 UTC m=+1672.010748497" watchObservedRunningTime="2026-04-02 14:05:15.112235656 +0000 UTC m=+1672.016643199" Apr 02 14:05:17 crc kubenswrapper[4732]: I0402 14:05:17.108334 4732 generic.go:334] "Generic (PLEG): container finished" podID="d2c0401e-94b6-46b0-84f5-59ffac42c2f7" containerID="8a5336aaed4acf1c5b9817fb8e3f6104d307e4c88005c33249771be5cf2d930a" exitCode=0 Apr 02 14:05:17 crc kubenswrapper[4732]: I0402 14:05:17.108431 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-szt5p" event={"ID":"d2c0401e-94b6-46b0-84f5-59ffac42c2f7","Type":"ContainerDied","Data":"8a5336aaed4acf1c5b9817fb8e3f6104d307e4c88005c33249771be5cf2d930a"} Apr 02 14:05:18 crc kubenswrapper[4732]: I0402 14:05:18.524098 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-szt5p" Apr 02 14:05:18 crc kubenswrapper[4732]: I0402 14:05:18.550306 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d2c0401e-94b6-46b0-84f5-59ffac42c2f7-ssh-key-openstack-edpm-ipam\") pod \"d2c0401e-94b6-46b0-84f5-59ffac42c2f7\" (UID: \"d2c0401e-94b6-46b0-84f5-59ffac42c2f7\") " Apr 02 14:05:18 crc kubenswrapper[4732]: I0402 14:05:18.550538 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d2c0401e-94b6-46b0-84f5-59ffac42c2f7-inventory\") pod \"d2c0401e-94b6-46b0-84f5-59ffac42c2f7\" (UID: \"d2c0401e-94b6-46b0-84f5-59ffac42c2f7\") " Apr 02 14:05:18 crc kubenswrapper[4732]: I0402 14:05:18.550750 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbfbp\" (UniqueName: \"kubernetes.io/projected/d2c0401e-94b6-46b0-84f5-59ffac42c2f7-kube-api-access-kbfbp\") pod \"d2c0401e-94b6-46b0-84f5-59ffac42c2f7\" (UID: \"d2c0401e-94b6-46b0-84f5-59ffac42c2f7\") " Apr 02 14:05:18 crc kubenswrapper[4732]: I0402 14:05:18.557533 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2c0401e-94b6-46b0-84f5-59ffac42c2f7-kube-api-access-kbfbp" (OuterVolumeSpecName: "kube-api-access-kbfbp") pod "d2c0401e-94b6-46b0-84f5-59ffac42c2f7" (UID: "d2c0401e-94b6-46b0-84f5-59ffac42c2f7"). InnerVolumeSpecName "kube-api-access-kbfbp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:05:18 crc kubenswrapper[4732]: I0402 14:05:18.589811 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2c0401e-94b6-46b0-84f5-59ffac42c2f7-inventory" (OuterVolumeSpecName: "inventory") pod "d2c0401e-94b6-46b0-84f5-59ffac42c2f7" (UID: "d2c0401e-94b6-46b0-84f5-59ffac42c2f7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:05:18 crc kubenswrapper[4732]: I0402 14:05:18.597819 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2c0401e-94b6-46b0-84f5-59ffac42c2f7-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d2c0401e-94b6-46b0-84f5-59ffac42c2f7" (UID: "d2c0401e-94b6-46b0-84f5-59ffac42c2f7"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:05:18 crc kubenswrapper[4732]: I0402 14:05:18.652901 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbfbp\" (UniqueName: \"kubernetes.io/projected/d2c0401e-94b6-46b0-84f5-59ffac42c2f7-kube-api-access-kbfbp\") on node \"crc\" DevicePath \"\"" Apr 02 14:05:18 crc kubenswrapper[4732]: I0402 14:05:18.652941 4732 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d2c0401e-94b6-46b0-84f5-59ffac42c2f7-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Apr 02 14:05:18 crc kubenswrapper[4732]: I0402 14:05:18.652953 4732 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d2c0401e-94b6-46b0-84f5-59ffac42c2f7-inventory\") on node \"crc\" DevicePath \"\"" Apr 02 14:05:19 crc kubenswrapper[4732]: I0402 14:05:19.129851 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-szt5p" 
event={"ID":"d2c0401e-94b6-46b0-84f5-59ffac42c2f7","Type":"ContainerDied","Data":"fbcfeb570d0dc1e6a389aa0fc64471aa321aae09f957254b7705f7b3a4e18a19"} Apr 02 14:05:19 crc kubenswrapper[4732]: I0402 14:05:19.129889 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fbcfeb570d0dc1e6a389aa0fc64471aa321aae09f957254b7705f7b3a4e18a19" Apr 02 14:05:19 crc kubenswrapper[4732]: I0402 14:05:19.129936 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-szt5p" Apr 02 14:05:19 crc kubenswrapper[4732]: I0402 14:05:19.191139 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8g4nr"] Apr 02 14:05:19 crc kubenswrapper[4732]: E0402 14:05:19.191592 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2c0401e-94b6-46b0-84f5-59ffac42c2f7" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Apr 02 14:05:19 crc kubenswrapper[4732]: I0402 14:05:19.191622 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2c0401e-94b6-46b0-84f5-59ffac42c2f7" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Apr 02 14:05:19 crc kubenswrapper[4732]: I0402 14:05:19.191832 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2c0401e-94b6-46b0-84f5-59ffac42c2f7" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Apr 02 14:05:19 crc kubenswrapper[4732]: I0402 14:05:19.192518 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8g4nr" Apr 02 14:05:19 crc kubenswrapper[4732]: I0402 14:05:19.195169 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Apr 02 14:05:19 crc kubenswrapper[4732]: I0402 14:05:19.195482 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wdhd4" Apr 02 14:05:19 crc kubenswrapper[4732]: I0402 14:05:19.196736 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Apr 02 14:05:19 crc kubenswrapper[4732]: I0402 14:05:19.196917 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Apr 02 14:05:19 crc kubenswrapper[4732]: I0402 14:05:19.200779 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8g4nr"] Apr 02 14:05:19 crc kubenswrapper[4732]: I0402 14:05:19.262588 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a67d60f0-3912-4fc4-96b7-f96831ff23d3-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8g4nr\" (UID: \"a67d60f0-3912-4fc4-96b7-f96831ff23d3\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8g4nr" Apr 02 14:05:19 crc kubenswrapper[4732]: I0402 14:05:19.262722 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a67d60f0-3912-4fc4-96b7-f96831ff23d3-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8g4nr\" (UID: \"a67d60f0-3912-4fc4-96b7-f96831ff23d3\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8g4nr" Apr 02 14:05:19 crc kubenswrapper[4732]: I0402 14:05:19.262810 4732 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a67d60f0-3912-4fc4-96b7-f96831ff23d3-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8g4nr\" (UID: \"a67d60f0-3912-4fc4-96b7-f96831ff23d3\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8g4nr" Apr 02 14:05:19 crc kubenswrapper[4732]: I0402 14:05:19.262858 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbzv2\" (UniqueName: \"kubernetes.io/projected/a67d60f0-3912-4fc4-96b7-f96831ff23d3-kube-api-access-dbzv2\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8g4nr\" (UID: \"a67d60f0-3912-4fc4-96b7-f96831ff23d3\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8g4nr" Apr 02 14:05:19 crc kubenswrapper[4732]: I0402 14:05:19.364869 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a67d60f0-3912-4fc4-96b7-f96831ff23d3-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8g4nr\" (UID: \"a67d60f0-3912-4fc4-96b7-f96831ff23d3\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8g4nr" Apr 02 14:05:19 crc kubenswrapper[4732]: I0402 14:05:19.364980 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a67d60f0-3912-4fc4-96b7-f96831ff23d3-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8g4nr\" (UID: \"a67d60f0-3912-4fc4-96b7-f96831ff23d3\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8g4nr" Apr 02 14:05:19 crc kubenswrapper[4732]: I0402 14:05:19.365040 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a67d60f0-3912-4fc4-96b7-f96831ff23d3-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8g4nr\" (UID: \"a67d60f0-3912-4fc4-96b7-f96831ff23d3\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8g4nr" Apr 02 14:05:19 crc kubenswrapper[4732]: I0402 14:05:19.365059 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbzv2\" (UniqueName: \"kubernetes.io/projected/a67d60f0-3912-4fc4-96b7-f96831ff23d3-kube-api-access-dbzv2\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8g4nr\" (UID: \"a67d60f0-3912-4fc4-96b7-f96831ff23d3\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8g4nr" Apr 02 14:05:19 crc kubenswrapper[4732]: I0402 14:05:19.369738 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a67d60f0-3912-4fc4-96b7-f96831ff23d3-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8g4nr\" (UID: \"a67d60f0-3912-4fc4-96b7-f96831ff23d3\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8g4nr" Apr 02 14:05:19 crc kubenswrapper[4732]: I0402 14:05:19.370070 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a67d60f0-3912-4fc4-96b7-f96831ff23d3-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8g4nr\" (UID: \"a67d60f0-3912-4fc4-96b7-f96831ff23d3\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8g4nr" Apr 02 14:05:19 crc kubenswrapper[4732]: I0402 14:05:19.370479 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a67d60f0-3912-4fc4-96b7-f96831ff23d3-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8g4nr\" (UID: \"a67d60f0-3912-4fc4-96b7-f96831ff23d3\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8g4nr" Apr 02 14:05:19 crc kubenswrapper[4732]: I0402 14:05:19.383582 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbzv2\" (UniqueName: \"kubernetes.io/projected/a67d60f0-3912-4fc4-96b7-f96831ff23d3-kube-api-access-dbzv2\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8g4nr\" (UID: \"a67d60f0-3912-4fc4-96b7-f96831ff23d3\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8g4nr" Apr 02 14:05:19 crc kubenswrapper[4732]: I0402 14:05:19.507637 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8g4nr" Apr 02 14:05:20 crc kubenswrapper[4732]: I0402 14:05:20.000256 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8g4nr"] Apr 02 14:05:20 crc kubenswrapper[4732]: I0402 14:05:20.139973 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8g4nr" event={"ID":"a67d60f0-3912-4fc4-96b7-f96831ff23d3","Type":"ContainerStarted","Data":"0d082c0183e43eb12e00b62651ce5ae68da6024a1b583eea904c8d4daacc0f56"} Apr 02 14:05:21 crc kubenswrapper[4732]: I0402 14:05:21.152177 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8g4nr" event={"ID":"a67d60f0-3912-4fc4-96b7-f96831ff23d3","Type":"ContainerStarted","Data":"64b1cbbf5c8dc33f37abc0fce039f671184f9c49425ba27d84d2d2122aedd2c3"} Apr 02 14:05:48 crc kubenswrapper[4732]: I0402 14:05:48.795107 4732 scope.go:117] "RemoveContainer" containerID="5bdcee330725e133a8edcdc53f64fb04d483a11824993f03fbddd3ec3984ced2" Apr 02 14:05:48 crc kubenswrapper[4732]: I0402 14:05:48.833179 4732 scope.go:117] "RemoveContainer" containerID="1f06a6a01b70864db610c9b5a52f02e68dca501f9a004dbc96b89f1f376d64ba" Apr 02 14:06:00 crc kubenswrapper[4732]: I0402 14:06:00.137531 
4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8g4nr" podStartSLOduration=40.739369737 podStartE2EDuration="41.137515714s" podCreationTimestamp="2026-04-02 14:05:19 +0000 UTC" firstStartedPulling="2026-04-02 14:05:20.006908136 +0000 UTC m=+1676.911315689" lastFinishedPulling="2026-04-02 14:05:20.405054113 +0000 UTC m=+1677.309461666" observedRunningTime="2026-04-02 14:05:21.177258415 +0000 UTC m=+1678.081665978" watchObservedRunningTime="2026-04-02 14:06:00.137515714 +0000 UTC m=+1717.041923267" Apr 02 14:06:00 crc kubenswrapper[4732]: I0402 14:06:00.146032 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29585646-ndcmk"] Apr 02 14:06:00 crc kubenswrapper[4732]: I0402 14:06:00.147446 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585646-ndcmk" Apr 02 14:06:00 crc kubenswrapper[4732]: I0402 14:06:00.153102 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 02 14:06:00 crc kubenswrapper[4732]: I0402 14:06:00.153411 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-69g42" Apr 02 14:06:00 crc kubenswrapper[4732]: I0402 14:06:00.153952 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 02 14:06:00 crc kubenswrapper[4732]: I0402 14:06:00.162037 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585646-ndcmk"] Apr 02 14:06:00 crc kubenswrapper[4732]: I0402 14:06:00.268008 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74hdz\" (UniqueName: \"kubernetes.io/projected/5484215b-23cc-462c-bfce-d0ad533381b7-kube-api-access-74hdz\") pod \"auto-csr-approver-29585646-ndcmk\" (UID: 
\"5484215b-23cc-462c-bfce-d0ad533381b7\") " pod="openshift-infra/auto-csr-approver-29585646-ndcmk" Apr 02 14:06:00 crc kubenswrapper[4732]: I0402 14:06:00.369325 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74hdz\" (UniqueName: \"kubernetes.io/projected/5484215b-23cc-462c-bfce-d0ad533381b7-kube-api-access-74hdz\") pod \"auto-csr-approver-29585646-ndcmk\" (UID: \"5484215b-23cc-462c-bfce-d0ad533381b7\") " pod="openshift-infra/auto-csr-approver-29585646-ndcmk" Apr 02 14:06:00 crc kubenswrapper[4732]: I0402 14:06:00.389889 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74hdz\" (UniqueName: \"kubernetes.io/projected/5484215b-23cc-462c-bfce-d0ad533381b7-kube-api-access-74hdz\") pod \"auto-csr-approver-29585646-ndcmk\" (UID: \"5484215b-23cc-462c-bfce-d0ad533381b7\") " pod="openshift-infra/auto-csr-approver-29585646-ndcmk" Apr 02 14:06:00 crc kubenswrapper[4732]: I0402 14:06:00.464852 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29585646-ndcmk" Apr 02 14:06:00 crc kubenswrapper[4732]: I0402 14:06:00.923376 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585646-ndcmk"] Apr 02 14:06:01 crc kubenswrapper[4732]: I0402 14:06:01.553999 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585646-ndcmk" event={"ID":"5484215b-23cc-462c-bfce-d0ad533381b7","Type":"ContainerStarted","Data":"48ea15ee6fe3616a310bc93dfe6f8006f18cb7b3978197fbfd769d75e6dcab0c"} Apr 02 14:06:02 crc kubenswrapper[4732]: I0402 14:06:02.564597 4732 generic.go:334] "Generic (PLEG): container finished" podID="5484215b-23cc-462c-bfce-d0ad533381b7" containerID="c25956aa6d98725a91263ba38314314c23dbb1b2d706189468ac93300a22d052" exitCode=0 Apr 02 14:06:02 crc kubenswrapper[4732]: I0402 14:06:02.564727 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585646-ndcmk" event={"ID":"5484215b-23cc-462c-bfce-d0ad533381b7","Type":"ContainerDied","Data":"c25956aa6d98725a91263ba38314314c23dbb1b2d706189468ac93300a22d052"} Apr 02 14:06:03 crc kubenswrapper[4732]: I0402 14:06:03.957499 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29585646-ndcmk" Apr 02 14:06:04 crc kubenswrapper[4732]: I0402 14:06:04.071495 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74hdz\" (UniqueName: \"kubernetes.io/projected/5484215b-23cc-462c-bfce-d0ad533381b7-kube-api-access-74hdz\") pod \"5484215b-23cc-462c-bfce-d0ad533381b7\" (UID: \"5484215b-23cc-462c-bfce-d0ad533381b7\") " Apr 02 14:06:04 crc kubenswrapper[4732]: I0402 14:06:04.078343 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5484215b-23cc-462c-bfce-d0ad533381b7-kube-api-access-74hdz" (OuterVolumeSpecName: "kube-api-access-74hdz") pod "5484215b-23cc-462c-bfce-d0ad533381b7" (UID: "5484215b-23cc-462c-bfce-d0ad533381b7"). InnerVolumeSpecName "kube-api-access-74hdz". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:06:04 crc kubenswrapper[4732]: I0402 14:06:04.174266 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74hdz\" (UniqueName: \"kubernetes.io/projected/5484215b-23cc-462c-bfce-d0ad533381b7-kube-api-access-74hdz\") on node \"crc\" DevicePath \"\"" Apr 02 14:06:04 crc kubenswrapper[4732]: I0402 14:06:04.582785 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585646-ndcmk" event={"ID":"5484215b-23cc-462c-bfce-d0ad533381b7","Type":"ContainerDied","Data":"48ea15ee6fe3616a310bc93dfe6f8006f18cb7b3978197fbfd769d75e6dcab0c"} Apr 02 14:06:04 crc kubenswrapper[4732]: I0402 14:06:04.582826 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48ea15ee6fe3616a310bc93dfe6f8006f18cb7b3978197fbfd769d75e6dcab0c" Apr 02 14:06:04 crc kubenswrapper[4732]: I0402 14:06:04.583120 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29585646-ndcmk" Apr 02 14:06:05 crc kubenswrapper[4732]: I0402 14:06:05.046477 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29585640-289ph"] Apr 02 14:06:05 crc kubenswrapper[4732]: I0402 14:06:05.060333 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29585640-289ph"] Apr 02 14:06:06 crc kubenswrapper[4732]: I0402 14:06:06.690878 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99deaef2-ca21-4254-9c26-8200edbbd497" path="/var/lib/kubelet/pods/99deaef2-ca21-4254-9c26-8200edbbd497/volumes" Apr 02 14:06:31 crc kubenswrapper[4732]: I0402 14:06:31.925244 4732 patch_prober.go:28] interesting pod/machine-config-daemon-6vtmw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 02 14:06:31 crc kubenswrapper[4732]: I0402 14:06:31.925879 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 02 14:06:48 crc kubenswrapper[4732]: I0402 14:06:48.923744 4732 scope.go:117] "RemoveContainer" containerID="99dfb7ae772e0081507ee489bd4b99216c5fdab104692fd5f576570e100b64f4" Apr 02 14:06:48 crc kubenswrapper[4732]: I0402 14:06:48.976488 4732 scope.go:117] "RemoveContainer" containerID="6391b06adfefe9e1c2f919abfdf05ce2375ff6ce7ab5fd3ec5d64ca7a1a910b6" Apr 02 14:06:49 crc kubenswrapper[4732]: I0402 14:06:49.011432 4732 scope.go:117] "RemoveContainer" containerID="af6f7141a9c8da30c13a335a189e8641b675cd71fc8da314fc45e4b603f98b2b" Apr 02 14:06:49 crc 
kubenswrapper[4732]: I0402 14:06:49.048509 4732 scope.go:117] "RemoveContainer" containerID="2e7264b3c1df24ae1ba0205d9b07561f3b4ba9436cfa3bfe65c7212df43dc004" Apr 02 14:06:55 crc kubenswrapper[4732]: I0402 14:06:55.305234 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hj6wc"] Apr 02 14:06:55 crc kubenswrapper[4732]: E0402 14:06:55.306169 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5484215b-23cc-462c-bfce-d0ad533381b7" containerName="oc" Apr 02 14:06:55 crc kubenswrapper[4732]: I0402 14:06:55.306186 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="5484215b-23cc-462c-bfce-d0ad533381b7" containerName="oc" Apr 02 14:06:55 crc kubenswrapper[4732]: I0402 14:06:55.306374 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="5484215b-23cc-462c-bfce-d0ad533381b7" containerName="oc" Apr 02 14:06:55 crc kubenswrapper[4732]: I0402 14:06:55.307707 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hj6wc" Apr 02 14:06:55 crc kubenswrapper[4732]: I0402 14:06:55.317485 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hj6wc"] Apr 02 14:06:55 crc kubenswrapper[4732]: I0402 14:06:55.355514 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e720223-247b-4d57-a833-bd386df20f26-catalog-content\") pod \"community-operators-hj6wc\" (UID: \"0e720223-247b-4d57-a833-bd386df20f26\") " pod="openshift-marketplace/community-operators-hj6wc" Apr 02 14:06:55 crc kubenswrapper[4732]: I0402 14:06:55.355824 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9vjd\" (UniqueName: \"kubernetes.io/projected/0e720223-247b-4d57-a833-bd386df20f26-kube-api-access-m9vjd\") pod \"community-operators-hj6wc\" (UID: \"0e720223-247b-4d57-a833-bd386df20f26\") " pod="openshift-marketplace/community-operators-hj6wc" Apr 02 14:06:55 crc kubenswrapper[4732]: I0402 14:06:55.356005 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e720223-247b-4d57-a833-bd386df20f26-utilities\") pod \"community-operators-hj6wc\" (UID: \"0e720223-247b-4d57-a833-bd386df20f26\") " pod="openshift-marketplace/community-operators-hj6wc" Apr 02 14:06:55 crc kubenswrapper[4732]: I0402 14:06:55.458333 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e720223-247b-4d57-a833-bd386df20f26-utilities\") pod \"community-operators-hj6wc\" (UID: \"0e720223-247b-4d57-a833-bd386df20f26\") " pod="openshift-marketplace/community-operators-hj6wc" Apr 02 14:06:55 crc kubenswrapper[4732]: I0402 14:06:55.458418 4732 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e720223-247b-4d57-a833-bd386df20f26-catalog-content\") pod \"community-operators-hj6wc\" (UID: \"0e720223-247b-4d57-a833-bd386df20f26\") " pod="openshift-marketplace/community-operators-hj6wc" Apr 02 14:06:55 crc kubenswrapper[4732]: I0402 14:06:55.458506 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9vjd\" (UniqueName: \"kubernetes.io/projected/0e720223-247b-4d57-a833-bd386df20f26-kube-api-access-m9vjd\") pod \"community-operators-hj6wc\" (UID: \"0e720223-247b-4d57-a833-bd386df20f26\") " pod="openshift-marketplace/community-operators-hj6wc" Apr 02 14:06:55 crc kubenswrapper[4732]: I0402 14:06:55.458913 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e720223-247b-4d57-a833-bd386df20f26-utilities\") pod \"community-operators-hj6wc\" (UID: \"0e720223-247b-4d57-a833-bd386df20f26\") " pod="openshift-marketplace/community-operators-hj6wc" Apr 02 14:06:55 crc kubenswrapper[4732]: I0402 14:06:55.459080 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e720223-247b-4d57-a833-bd386df20f26-catalog-content\") pod \"community-operators-hj6wc\" (UID: \"0e720223-247b-4d57-a833-bd386df20f26\") " pod="openshift-marketplace/community-operators-hj6wc" Apr 02 14:06:55 crc kubenswrapper[4732]: I0402 14:06:55.479414 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9vjd\" (UniqueName: \"kubernetes.io/projected/0e720223-247b-4d57-a833-bd386df20f26-kube-api-access-m9vjd\") pod \"community-operators-hj6wc\" (UID: \"0e720223-247b-4d57-a833-bd386df20f26\") " pod="openshift-marketplace/community-operators-hj6wc" Apr 02 14:06:55 crc kubenswrapper[4732]: I0402 14:06:55.639742 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hj6wc" Apr 02 14:06:56 crc kubenswrapper[4732]: I0402 14:06:56.182312 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hj6wc"] Apr 02 14:06:57 crc kubenswrapper[4732]: I0402 14:06:57.140073 4732 generic.go:334] "Generic (PLEG): container finished" podID="0e720223-247b-4d57-a833-bd386df20f26" containerID="b316189728a4417ad59b830327674fe6f1a1c5c558297e3c66721a708f2c7d6a" exitCode=0 Apr 02 14:06:57 crc kubenswrapper[4732]: I0402 14:06:57.140156 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hj6wc" event={"ID":"0e720223-247b-4d57-a833-bd386df20f26","Type":"ContainerDied","Data":"b316189728a4417ad59b830327674fe6f1a1c5c558297e3c66721a708f2c7d6a"} Apr 02 14:06:57 crc kubenswrapper[4732]: I0402 14:06:57.141092 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hj6wc" event={"ID":"0e720223-247b-4d57-a833-bd386df20f26","Type":"ContainerStarted","Data":"fa022bfbf003c45f902b1f15683240c55c8f28658271e7a771123b6672de7727"} Apr 02 14:06:58 crc kubenswrapper[4732]: I0402 14:06:58.154489 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hj6wc" event={"ID":"0e720223-247b-4d57-a833-bd386df20f26","Type":"ContainerStarted","Data":"439ff6ddbe0ac2e03c63361a7b8920abf3bbaa67b423fad2a7db87db016d3331"} Apr 02 14:06:59 crc kubenswrapper[4732]: I0402 14:06:59.166801 4732 generic.go:334] "Generic (PLEG): container finished" podID="0e720223-247b-4d57-a833-bd386df20f26" containerID="439ff6ddbe0ac2e03c63361a7b8920abf3bbaa67b423fad2a7db87db016d3331" exitCode=0 Apr 02 14:06:59 crc kubenswrapper[4732]: I0402 14:06:59.166895 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hj6wc" 
event={"ID":"0e720223-247b-4d57-a833-bd386df20f26","Type":"ContainerDied","Data":"439ff6ddbe0ac2e03c63361a7b8920abf3bbaa67b423fad2a7db87db016d3331"} Apr 02 14:07:00 crc kubenswrapper[4732]: I0402 14:07:00.178512 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hj6wc" event={"ID":"0e720223-247b-4d57-a833-bd386df20f26","Type":"ContainerStarted","Data":"7911c18dc357c28d78ee0851736a1637872d3ebf1f022e76129416cc2b725471"} Apr 02 14:07:00 crc kubenswrapper[4732]: I0402 14:07:00.202161 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hj6wc" podStartSLOduration=2.724466144 podStartE2EDuration="5.202145627s" podCreationTimestamp="2026-04-02 14:06:55 +0000 UTC" firstStartedPulling="2026-04-02 14:06:57.142814729 +0000 UTC m=+1774.047222332" lastFinishedPulling="2026-04-02 14:06:59.620494262 +0000 UTC m=+1776.524901815" observedRunningTime="2026-04-02 14:07:00.193441252 +0000 UTC m=+1777.097848815" watchObservedRunningTime="2026-04-02 14:07:00.202145627 +0000 UTC m=+1777.106553180" Apr 02 14:07:01 crc kubenswrapper[4732]: I0402 14:07:01.924407 4732 patch_prober.go:28] interesting pod/machine-config-daemon-6vtmw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 02 14:07:01 crc kubenswrapper[4732]: I0402 14:07:01.924713 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 02 14:07:05 crc kubenswrapper[4732]: I0402 14:07:05.640048 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-hj6wc" Apr 02 14:07:05 crc kubenswrapper[4732]: I0402 14:07:05.641737 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hj6wc" Apr 02 14:07:05 crc kubenswrapper[4732]: I0402 14:07:05.699678 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hj6wc" Apr 02 14:07:06 crc kubenswrapper[4732]: I0402 14:07:06.285698 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hj6wc" Apr 02 14:07:08 crc kubenswrapper[4732]: I0402 14:07:08.488324 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hj6wc"] Apr 02 14:07:09 crc kubenswrapper[4732]: I0402 14:07:09.274433 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hj6wc" podUID="0e720223-247b-4d57-a833-bd386df20f26" containerName="registry-server" containerID="cri-o://7911c18dc357c28d78ee0851736a1637872d3ebf1f022e76129416cc2b725471" gracePeriod=2 Apr 02 14:07:09 crc kubenswrapper[4732]: I0402 14:07:09.705167 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hj6wc" Apr 02 14:07:09 crc kubenswrapper[4732]: I0402 14:07:09.749325 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9vjd\" (UniqueName: \"kubernetes.io/projected/0e720223-247b-4d57-a833-bd386df20f26-kube-api-access-m9vjd\") pod \"0e720223-247b-4d57-a833-bd386df20f26\" (UID: \"0e720223-247b-4d57-a833-bd386df20f26\") " Apr 02 14:07:09 crc kubenswrapper[4732]: I0402 14:07:09.749516 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e720223-247b-4d57-a833-bd386df20f26-catalog-content\") pod \"0e720223-247b-4d57-a833-bd386df20f26\" (UID: \"0e720223-247b-4d57-a833-bd386df20f26\") " Apr 02 14:07:09 crc kubenswrapper[4732]: I0402 14:07:09.749542 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e720223-247b-4d57-a833-bd386df20f26-utilities\") pod \"0e720223-247b-4d57-a833-bd386df20f26\" (UID: \"0e720223-247b-4d57-a833-bd386df20f26\") " Apr 02 14:07:09 crc kubenswrapper[4732]: I0402 14:07:09.751672 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e720223-247b-4d57-a833-bd386df20f26-utilities" (OuterVolumeSpecName: "utilities") pod "0e720223-247b-4d57-a833-bd386df20f26" (UID: "0e720223-247b-4d57-a833-bd386df20f26"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 14:07:09 crc kubenswrapper[4732]: I0402 14:07:09.755259 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e720223-247b-4d57-a833-bd386df20f26-kube-api-access-m9vjd" (OuterVolumeSpecName: "kube-api-access-m9vjd") pod "0e720223-247b-4d57-a833-bd386df20f26" (UID: "0e720223-247b-4d57-a833-bd386df20f26"). InnerVolumeSpecName "kube-api-access-m9vjd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:07:09 crc kubenswrapper[4732]: I0402 14:07:09.805567 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e720223-247b-4d57-a833-bd386df20f26-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0e720223-247b-4d57-a833-bd386df20f26" (UID: "0e720223-247b-4d57-a833-bd386df20f26"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 14:07:09 crc kubenswrapper[4732]: I0402 14:07:09.852431 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9vjd\" (UniqueName: \"kubernetes.io/projected/0e720223-247b-4d57-a833-bd386df20f26-kube-api-access-m9vjd\") on node \"crc\" DevicePath \"\"" Apr 02 14:07:09 crc kubenswrapper[4732]: I0402 14:07:09.852465 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e720223-247b-4d57-a833-bd386df20f26-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 02 14:07:09 crc kubenswrapper[4732]: I0402 14:07:09.852475 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e720223-247b-4d57-a833-bd386df20f26-utilities\") on node \"crc\" DevicePath \"\"" Apr 02 14:07:10 crc kubenswrapper[4732]: I0402 14:07:10.291320 4732 generic.go:334] "Generic (PLEG): container finished" podID="0e720223-247b-4d57-a833-bd386df20f26" containerID="7911c18dc357c28d78ee0851736a1637872d3ebf1f022e76129416cc2b725471" exitCode=0 Apr 02 14:07:10 crc kubenswrapper[4732]: I0402 14:07:10.291375 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hj6wc" event={"ID":"0e720223-247b-4d57-a833-bd386df20f26","Type":"ContainerDied","Data":"7911c18dc357c28d78ee0851736a1637872d3ebf1f022e76129416cc2b725471"} Apr 02 14:07:10 crc kubenswrapper[4732]: I0402 14:07:10.291406 4732 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-hj6wc" event={"ID":"0e720223-247b-4d57-a833-bd386df20f26","Type":"ContainerDied","Data":"fa022bfbf003c45f902b1f15683240c55c8f28658271e7a771123b6672de7727"} Apr 02 14:07:10 crc kubenswrapper[4732]: I0402 14:07:10.291431 4732 scope.go:117] "RemoveContainer" containerID="7911c18dc357c28d78ee0851736a1637872d3ebf1f022e76129416cc2b725471" Apr 02 14:07:10 crc kubenswrapper[4732]: I0402 14:07:10.291514 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hj6wc" Apr 02 14:07:10 crc kubenswrapper[4732]: I0402 14:07:10.328546 4732 scope.go:117] "RemoveContainer" containerID="439ff6ddbe0ac2e03c63361a7b8920abf3bbaa67b423fad2a7db87db016d3331" Apr 02 14:07:10 crc kubenswrapper[4732]: I0402 14:07:10.333240 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hj6wc"] Apr 02 14:07:10 crc kubenswrapper[4732]: I0402 14:07:10.343413 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hj6wc"] Apr 02 14:07:10 crc kubenswrapper[4732]: I0402 14:07:10.371229 4732 scope.go:117] "RemoveContainer" containerID="b316189728a4417ad59b830327674fe6f1a1c5c558297e3c66721a708f2c7d6a" Apr 02 14:07:10 crc kubenswrapper[4732]: I0402 14:07:10.394866 4732 scope.go:117] "RemoveContainer" containerID="7911c18dc357c28d78ee0851736a1637872d3ebf1f022e76129416cc2b725471" Apr 02 14:07:10 crc kubenswrapper[4732]: E0402 14:07:10.395525 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7911c18dc357c28d78ee0851736a1637872d3ebf1f022e76129416cc2b725471\": container with ID starting with 7911c18dc357c28d78ee0851736a1637872d3ebf1f022e76129416cc2b725471 not found: ID does not exist" containerID="7911c18dc357c28d78ee0851736a1637872d3ebf1f022e76129416cc2b725471" Apr 02 14:07:10 crc kubenswrapper[4732]: I0402 
14:07:10.395571 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7911c18dc357c28d78ee0851736a1637872d3ebf1f022e76129416cc2b725471"} err="failed to get container status \"7911c18dc357c28d78ee0851736a1637872d3ebf1f022e76129416cc2b725471\": rpc error: code = NotFound desc = could not find container \"7911c18dc357c28d78ee0851736a1637872d3ebf1f022e76129416cc2b725471\": container with ID starting with 7911c18dc357c28d78ee0851736a1637872d3ebf1f022e76129416cc2b725471 not found: ID does not exist" Apr 02 14:07:10 crc kubenswrapper[4732]: I0402 14:07:10.395627 4732 scope.go:117] "RemoveContainer" containerID="439ff6ddbe0ac2e03c63361a7b8920abf3bbaa67b423fad2a7db87db016d3331" Apr 02 14:07:10 crc kubenswrapper[4732]: E0402 14:07:10.396062 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"439ff6ddbe0ac2e03c63361a7b8920abf3bbaa67b423fad2a7db87db016d3331\": container with ID starting with 439ff6ddbe0ac2e03c63361a7b8920abf3bbaa67b423fad2a7db87db016d3331 not found: ID does not exist" containerID="439ff6ddbe0ac2e03c63361a7b8920abf3bbaa67b423fad2a7db87db016d3331" Apr 02 14:07:10 crc kubenswrapper[4732]: I0402 14:07:10.396085 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"439ff6ddbe0ac2e03c63361a7b8920abf3bbaa67b423fad2a7db87db016d3331"} err="failed to get container status \"439ff6ddbe0ac2e03c63361a7b8920abf3bbaa67b423fad2a7db87db016d3331\": rpc error: code = NotFound desc = could not find container \"439ff6ddbe0ac2e03c63361a7b8920abf3bbaa67b423fad2a7db87db016d3331\": container with ID starting with 439ff6ddbe0ac2e03c63361a7b8920abf3bbaa67b423fad2a7db87db016d3331 not found: ID does not exist" Apr 02 14:07:10 crc kubenswrapper[4732]: I0402 14:07:10.396119 4732 scope.go:117] "RemoveContainer" containerID="b316189728a4417ad59b830327674fe6f1a1c5c558297e3c66721a708f2c7d6a" Apr 02 14:07:10 crc 
kubenswrapper[4732]: E0402 14:07:10.396381 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b316189728a4417ad59b830327674fe6f1a1c5c558297e3c66721a708f2c7d6a\": container with ID starting with b316189728a4417ad59b830327674fe6f1a1c5c558297e3c66721a708f2c7d6a not found: ID does not exist" containerID="b316189728a4417ad59b830327674fe6f1a1c5c558297e3c66721a708f2c7d6a" Apr 02 14:07:10 crc kubenswrapper[4732]: I0402 14:07:10.396417 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b316189728a4417ad59b830327674fe6f1a1c5c558297e3c66721a708f2c7d6a"} err="failed to get container status \"b316189728a4417ad59b830327674fe6f1a1c5c558297e3c66721a708f2c7d6a\": rpc error: code = NotFound desc = could not find container \"b316189728a4417ad59b830327674fe6f1a1c5c558297e3c66721a708f2c7d6a\": container with ID starting with b316189728a4417ad59b830327674fe6f1a1c5c558297e3c66721a708f2c7d6a not found: ID does not exist" Apr 02 14:07:10 crc kubenswrapper[4732]: I0402 14:07:10.702164 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e720223-247b-4d57-a833-bd386df20f26" path="/var/lib/kubelet/pods/0e720223-247b-4d57-a833-bd386df20f26/volumes" Apr 02 14:07:19 crc kubenswrapper[4732]: I0402 14:07:19.366123 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pb9g5"] Apr 02 14:07:19 crc kubenswrapper[4732]: E0402 14:07:19.367196 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e720223-247b-4d57-a833-bd386df20f26" containerName="extract-content" Apr 02 14:07:19 crc kubenswrapper[4732]: I0402 14:07:19.367216 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e720223-247b-4d57-a833-bd386df20f26" containerName="extract-content" Apr 02 14:07:19 crc kubenswrapper[4732]: E0402 14:07:19.367240 4732 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0e720223-247b-4d57-a833-bd386df20f26" containerName="extract-utilities" Apr 02 14:07:19 crc kubenswrapper[4732]: I0402 14:07:19.367249 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e720223-247b-4d57-a833-bd386df20f26" containerName="extract-utilities" Apr 02 14:07:19 crc kubenswrapper[4732]: E0402 14:07:19.367264 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e720223-247b-4d57-a833-bd386df20f26" containerName="registry-server" Apr 02 14:07:19 crc kubenswrapper[4732]: I0402 14:07:19.367273 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e720223-247b-4d57-a833-bd386df20f26" containerName="registry-server" Apr 02 14:07:19 crc kubenswrapper[4732]: I0402 14:07:19.367517 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e720223-247b-4d57-a833-bd386df20f26" containerName="registry-server" Apr 02 14:07:19 crc kubenswrapper[4732]: I0402 14:07:19.383544 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pb9g5" Apr 02 14:07:19 crc kubenswrapper[4732]: I0402 14:07:19.398789 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pb9g5"] Apr 02 14:07:19 crc kubenswrapper[4732]: I0402 14:07:19.440455 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9967e637-9980-49ea-a5d3-c37dcb9faef1-catalog-content\") pod \"redhat-marketplace-pb9g5\" (UID: \"9967e637-9980-49ea-a5d3-c37dcb9faef1\") " pod="openshift-marketplace/redhat-marketplace-pb9g5" Apr 02 14:07:19 crc kubenswrapper[4732]: I0402 14:07:19.440603 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xqrm\" (UniqueName: \"kubernetes.io/projected/9967e637-9980-49ea-a5d3-c37dcb9faef1-kube-api-access-4xqrm\") pod \"redhat-marketplace-pb9g5\" (UID: 
\"9967e637-9980-49ea-a5d3-c37dcb9faef1\") " pod="openshift-marketplace/redhat-marketplace-pb9g5" Apr 02 14:07:19 crc kubenswrapper[4732]: I0402 14:07:19.440685 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9967e637-9980-49ea-a5d3-c37dcb9faef1-utilities\") pod \"redhat-marketplace-pb9g5\" (UID: \"9967e637-9980-49ea-a5d3-c37dcb9faef1\") " pod="openshift-marketplace/redhat-marketplace-pb9g5" Apr 02 14:07:19 crc kubenswrapper[4732]: I0402 14:07:19.542181 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9967e637-9980-49ea-a5d3-c37dcb9faef1-catalog-content\") pod \"redhat-marketplace-pb9g5\" (UID: \"9967e637-9980-49ea-a5d3-c37dcb9faef1\") " pod="openshift-marketplace/redhat-marketplace-pb9g5" Apr 02 14:07:19 crc kubenswrapper[4732]: I0402 14:07:19.542259 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xqrm\" (UniqueName: \"kubernetes.io/projected/9967e637-9980-49ea-a5d3-c37dcb9faef1-kube-api-access-4xqrm\") pod \"redhat-marketplace-pb9g5\" (UID: \"9967e637-9980-49ea-a5d3-c37dcb9faef1\") " pod="openshift-marketplace/redhat-marketplace-pb9g5" Apr 02 14:07:19 crc kubenswrapper[4732]: I0402 14:07:19.542282 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9967e637-9980-49ea-a5d3-c37dcb9faef1-utilities\") pod \"redhat-marketplace-pb9g5\" (UID: \"9967e637-9980-49ea-a5d3-c37dcb9faef1\") " pod="openshift-marketplace/redhat-marketplace-pb9g5" Apr 02 14:07:19 crc kubenswrapper[4732]: I0402 14:07:19.542778 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9967e637-9980-49ea-a5d3-c37dcb9faef1-utilities\") pod \"redhat-marketplace-pb9g5\" (UID: 
\"9967e637-9980-49ea-a5d3-c37dcb9faef1\") " pod="openshift-marketplace/redhat-marketplace-pb9g5" Apr 02 14:07:19 crc kubenswrapper[4732]: I0402 14:07:19.542857 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9967e637-9980-49ea-a5d3-c37dcb9faef1-catalog-content\") pod \"redhat-marketplace-pb9g5\" (UID: \"9967e637-9980-49ea-a5d3-c37dcb9faef1\") " pod="openshift-marketplace/redhat-marketplace-pb9g5" Apr 02 14:07:19 crc kubenswrapper[4732]: I0402 14:07:19.565318 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xqrm\" (UniqueName: \"kubernetes.io/projected/9967e637-9980-49ea-a5d3-c37dcb9faef1-kube-api-access-4xqrm\") pod \"redhat-marketplace-pb9g5\" (UID: \"9967e637-9980-49ea-a5d3-c37dcb9faef1\") " pod="openshift-marketplace/redhat-marketplace-pb9g5" Apr 02 14:07:19 crc kubenswrapper[4732]: I0402 14:07:19.733026 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pb9g5" Apr 02 14:07:20 crc kubenswrapper[4732]: I0402 14:07:20.194538 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pb9g5"] Apr 02 14:07:20 crc kubenswrapper[4732]: I0402 14:07:20.425914 4732 generic.go:334] "Generic (PLEG): container finished" podID="9967e637-9980-49ea-a5d3-c37dcb9faef1" containerID="4d98a6eb0eee82ec4dc4405533d4d8e917f6fdcd5ec70b669308b9df60a3a1f9" exitCode=0 Apr 02 14:07:20 crc kubenswrapper[4732]: I0402 14:07:20.426032 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pb9g5" event={"ID":"9967e637-9980-49ea-a5d3-c37dcb9faef1","Type":"ContainerDied","Data":"4d98a6eb0eee82ec4dc4405533d4d8e917f6fdcd5ec70b669308b9df60a3a1f9"} Apr 02 14:07:20 crc kubenswrapper[4732]: I0402 14:07:20.426236 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pb9g5" 
event={"ID":"9967e637-9980-49ea-a5d3-c37dcb9faef1","Type":"ContainerStarted","Data":"040e93d26e9e8a4a751e8465f26575ad2d3c5fbd57fbc1585982304d3f298f3d"} Apr 02 14:07:20 crc kubenswrapper[4732]: I0402 14:07:20.427681 4732 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 02 14:07:22 crc kubenswrapper[4732]: I0402 14:07:22.448199 4732 generic.go:334] "Generic (PLEG): container finished" podID="9967e637-9980-49ea-a5d3-c37dcb9faef1" containerID="6b4d4c619951f8991927ee0d58c41f7909c16f1cc12a5832546513f41c96fb48" exitCode=0 Apr 02 14:07:22 crc kubenswrapper[4732]: I0402 14:07:22.448447 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pb9g5" event={"ID":"9967e637-9980-49ea-a5d3-c37dcb9faef1","Type":"ContainerDied","Data":"6b4d4c619951f8991927ee0d58c41f7909c16f1cc12a5832546513f41c96fb48"} Apr 02 14:07:26 crc kubenswrapper[4732]: I0402 14:07:26.496526 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pb9g5" event={"ID":"9967e637-9980-49ea-a5d3-c37dcb9faef1","Type":"ContainerStarted","Data":"405c12a3349a6f927c03c0c25b419f90bfd9a016b0f57a84bdeaf47cc54b7e18"} Apr 02 14:07:26 crc kubenswrapper[4732]: I0402 14:07:26.514062 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pb9g5" podStartSLOduration=2.14050123 podStartE2EDuration="7.514045503s" podCreationTimestamp="2026-04-02 14:07:19 +0000 UTC" firstStartedPulling="2026-04-02 14:07:20.427358848 +0000 UTC m=+1797.331766401" lastFinishedPulling="2026-04-02 14:07:25.800903131 +0000 UTC m=+1802.705310674" observedRunningTime="2026-04-02 14:07:26.511301789 +0000 UTC m=+1803.415709352" watchObservedRunningTime="2026-04-02 14:07:26.514045503 +0000 UTC m=+1803.418453056" Apr 02 14:07:26 crc kubenswrapper[4732]: I0402 14:07:26.964753 4732 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-77xk5"] Apr 02 14:07:26 crc kubenswrapper[4732]: I0402 14:07:26.970906 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-77xk5" Apr 02 14:07:26 crc kubenswrapper[4732]: I0402 14:07:26.982763 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-77xk5"] Apr 02 14:07:27 crc kubenswrapper[4732]: I0402 14:07:27.089843 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4677ae72-c6f4-4e3f-992f-77fafeb23ae7-catalog-content\") pod \"certified-operators-77xk5\" (UID: \"4677ae72-c6f4-4e3f-992f-77fafeb23ae7\") " pod="openshift-marketplace/certified-operators-77xk5" Apr 02 14:07:27 crc kubenswrapper[4732]: I0402 14:07:27.089966 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4677ae72-c6f4-4e3f-992f-77fafeb23ae7-utilities\") pod \"certified-operators-77xk5\" (UID: \"4677ae72-c6f4-4e3f-992f-77fafeb23ae7\") " pod="openshift-marketplace/certified-operators-77xk5" Apr 02 14:07:27 crc kubenswrapper[4732]: I0402 14:07:27.090213 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4x9pn\" (UniqueName: \"kubernetes.io/projected/4677ae72-c6f4-4e3f-992f-77fafeb23ae7-kube-api-access-4x9pn\") pod \"certified-operators-77xk5\" (UID: \"4677ae72-c6f4-4e3f-992f-77fafeb23ae7\") " pod="openshift-marketplace/certified-operators-77xk5" Apr 02 14:07:27 crc kubenswrapper[4732]: I0402 14:07:27.192393 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4x9pn\" (UniqueName: \"kubernetes.io/projected/4677ae72-c6f4-4e3f-992f-77fafeb23ae7-kube-api-access-4x9pn\") pod \"certified-operators-77xk5\" (UID: 
\"4677ae72-c6f4-4e3f-992f-77fafeb23ae7\") " pod="openshift-marketplace/certified-operators-77xk5" Apr 02 14:07:27 crc kubenswrapper[4732]: I0402 14:07:27.192496 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4677ae72-c6f4-4e3f-992f-77fafeb23ae7-catalog-content\") pod \"certified-operators-77xk5\" (UID: \"4677ae72-c6f4-4e3f-992f-77fafeb23ae7\") " pod="openshift-marketplace/certified-operators-77xk5" Apr 02 14:07:27 crc kubenswrapper[4732]: I0402 14:07:27.192575 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4677ae72-c6f4-4e3f-992f-77fafeb23ae7-utilities\") pod \"certified-operators-77xk5\" (UID: \"4677ae72-c6f4-4e3f-992f-77fafeb23ae7\") " pod="openshift-marketplace/certified-operators-77xk5" Apr 02 14:07:27 crc kubenswrapper[4732]: I0402 14:07:27.193006 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4677ae72-c6f4-4e3f-992f-77fafeb23ae7-catalog-content\") pod \"certified-operators-77xk5\" (UID: \"4677ae72-c6f4-4e3f-992f-77fafeb23ae7\") " pod="openshift-marketplace/certified-operators-77xk5" Apr 02 14:07:27 crc kubenswrapper[4732]: I0402 14:07:27.193086 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4677ae72-c6f4-4e3f-992f-77fafeb23ae7-utilities\") pod \"certified-operators-77xk5\" (UID: \"4677ae72-c6f4-4e3f-992f-77fafeb23ae7\") " pod="openshift-marketplace/certified-operators-77xk5" Apr 02 14:07:27 crc kubenswrapper[4732]: I0402 14:07:27.234672 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4x9pn\" (UniqueName: \"kubernetes.io/projected/4677ae72-c6f4-4e3f-992f-77fafeb23ae7-kube-api-access-4x9pn\") pod \"certified-operators-77xk5\" (UID: \"4677ae72-c6f4-4e3f-992f-77fafeb23ae7\") " 
pod="openshift-marketplace/certified-operators-77xk5" Apr 02 14:07:27 crc kubenswrapper[4732]: I0402 14:07:27.289410 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-77xk5" Apr 02 14:07:27 crc kubenswrapper[4732]: I0402 14:07:27.690387 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-77xk5"] Apr 02 14:07:28 crc kubenswrapper[4732]: I0402 14:07:28.518295 4732 generic.go:334] "Generic (PLEG): container finished" podID="4677ae72-c6f4-4e3f-992f-77fafeb23ae7" containerID="8bb0960c473a9b76e783be77c1cd42cc148f4c71e2060d3d053808872df39f1d" exitCode=0 Apr 02 14:07:28 crc kubenswrapper[4732]: I0402 14:07:28.518359 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-77xk5" event={"ID":"4677ae72-c6f4-4e3f-992f-77fafeb23ae7","Type":"ContainerDied","Data":"8bb0960c473a9b76e783be77c1cd42cc148f4c71e2060d3d053808872df39f1d"} Apr 02 14:07:28 crc kubenswrapper[4732]: I0402 14:07:28.518661 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-77xk5" event={"ID":"4677ae72-c6f4-4e3f-992f-77fafeb23ae7","Type":"ContainerStarted","Data":"fc0325473c1942e3ca56d68c8ac4ad5d7ac12d310da562b8656d87b5f10dbb3c"} Apr 02 14:07:29 crc kubenswrapper[4732]: I0402 14:07:29.733768 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pb9g5" Apr 02 14:07:29 crc kubenswrapper[4732]: I0402 14:07:29.734033 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pb9g5" Apr 02 14:07:29 crc kubenswrapper[4732]: I0402 14:07:29.790144 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pb9g5" Apr 02 14:07:30 crc kubenswrapper[4732]: I0402 14:07:30.539520 4732 generic.go:334] "Generic (PLEG): container 
finished" podID="4677ae72-c6f4-4e3f-992f-77fafeb23ae7" containerID="da26eb48c2babdfb7e08bd85f5c8acd425b1113447523e06c6793c8b15e4536b" exitCode=0 Apr 02 14:07:30 crc kubenswrapper[4732]: I0402 14:07:30.539644 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-77xk5" event={"ID":"4677ae72-c6f4-4e3f-992f-77fafeb23ae7","Type":"ContainerDied","Data":"da26eb48c2babdfb7e08bd85f5c8acd425b1113447523e06c6793c8b15e4536b"} Apr 02 14:07:30 crc kubenswrapper[4732]: I0402 14:07:30.590843 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pb9g5" Apr 02 14:07:31 crc kubenswrapper[4732]: I0402 14:07:31.133143 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pb9g5"] Apr 02 14:07:31 crc kubenswrapper[4732]: I0402 14:07:31.551923 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-77xk5" event={"ID":"4677ae72-c6f4-4e3f-992f-77fafeb23ae7","Type":"ContainerStarted","Data":"d74ac693dc0cdfff3107c5627923712ca0fa3226e21e684fd109228b565fe8ae"} Apr 02 14:07:31 crc kubenswrapper[4732]: I0402 14:07:31.571208 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-77xk5" podStartSLOduration=3.190416478 podStartE2EDuration="5.571187957s" podCreationTimestamp="2026-04-02 14:07:26 +0000 UTC" firstStartedPulling="2026-04-02 14:07:28.520258226 +0000 UTC m=+1805.424665779" lastFinishedPulling="2026-04-02 14:07:30.901029705 +0000 UTC m=+1807.805437258" observedRunningTime="2026-04-02 14:07:31.570425307 +0000 UTC m=+1808.474832860" watchObservedRunningTime="2026-04-02 14:07:31.571187957 +0000 UTC m=+1808.475595510" Apr 02 14:07:31 crc kubenswrapper[4732]: I0402 14:07:31.924592 4732 patch_prober.go:28] interesting pod/machine-config-daemon-6vtmw container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 02 14:07:31 crc kubenswrapper[4732]: I0402 14:07:31.924979 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 02 14:07:31 crc kubenswrapper[4732]: I0402 14:07:31.925096 4732 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" Apr 02 14:07:31 crc kubenswrapper[4732]: I0402 14:07:31.925904 4732 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0f81e9ff2a9337c334b84f07419534908b39d0251d1366b4d7838bb8bbcae383"} pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Apr 02 14:07:31 crc kubenswrapper[4732]: I0402 14:07:31.926046 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" containerName="machine-config-daemon" containerID="cri-o://0f81e9ff2a9337c334b84f07419534908b39d0251d1366b4d7838bb8bbcae383" gracePeriod=600 Apr 02 14:07:32 crc kubenswrapper[4732]: E0402 14:07:32.554767 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" Apr 02 14:07:32 crc kubenswrapper[4732]: I0402 14:07:32.563685 4732 generic.go:334] "Generic (PLEG): container finished" podID="38409e5e-4545-49da-8f6c-4bfb30582878" containerID="0f81e9ff2a9337c334b84f07419534908b39d0251d1366b4d7838bb8bbcae383" exitCode=0 Apr 02 14:07:32 crc kubenswrapper[4732]: I0402 14:07:32.563758 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" event={"ID":"38409e5e-4545-49da-8f6c-4bfb30582878","Type":"ContainerDied","Data":"0f81e9ff2a9337c334b84f07419534908b39d0251d1366b4d7838bb8bbcae383"} Apr 02 14:07:32 crc kubenswrapper[4732]: I0402 14:07:32.563799 4732 scope.go:117] "RemoveContainer" containerID="de6153f9349b412a56e88983b18d3d8fdd63881d0461412cebd345d437c6871b" Apr 02 14:07:32 crc kubenswrapper[4732]: I0402 14:07:32.563934 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pb9g5" podUID="9967e637-9980-49ea-a5d3-c37dcb9faef1" containerName="registry-server" containerID="cri-o://405c12a3349a6f927c03c0c25b419f90bfd9a016b0f57a84bdeaf47cc54b7e18" gracePeriod=2 Apr 02 14:07:32 crc kubenswrapper[4732]: I0402 14:07:32.564854 4732 scope.go:117] "RemoveContainer" containerID="0f81e9ff2a9337c334b84f07419534908b39d0251d1366b4d7838bb8bbcae383" Apr 02 14:07:32 crc kubenswrapper[4732]: E0402 14:07:32.565263 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" Apr 02 14:07:33 crc kubenswrapper[4732]: I0402 14:07:33.001262 4732 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pb9g5" Apr 02 14:07:33 crc kubenswrapper[4732]: I0402 14:07:33.102276 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9967e637-9980-49ea-a5d3-c37dcb9faef1-catalog-content\") pod \"9967e637-9980-49ea-a5d3-c37dcb9faef1\" (UID: \"9967e637-9980-49ea-a5d3-c37dcb9faef1\") " Apr 02 14:07:33 crc kubenswrapper[4732]: I0402 14:07:33.102509 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xqrm\" (UniqueName: \"kubernetes.io/projected/9967e637-9980-49ea-a5d3-c37dcb9faef1-kube-api-access-4xqrm\") pod \"9967e637-9980-49ea-a5d3-c37dcb9faef1\" (UID: \"9967e637-9980-49ea-a5d3-c37dcb9faef1\") " Apr 02 14:07:33 crc kubenswrapper[4732]: I0402 14:07:33.102589 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9967e637-9980-49ea-a5d3-c37dcb9faef1-utilities\") pod \"9967e637-9980-49ea-a5d3-c37dcb9faef1\" (UID: \"9967e637-9980-49ea-a5d3-c37dcb9faef1\") " Apr 02 14:07:33 crc kubenswrapper[4732]: I0402 14:07:33.103400 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9967e637-9980-49ea-a5d3-c37dcb9faef1-utilities" (OuterVolumeSpecName: "utilities") pod "9967e637-9980-49ea-a5d3-c37dcb9faef1" (UID: "9967e637-9980-49ea-a5d3-c37dcb9faef1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 14:07:33 crc kubenswrapper[4732]: I0402 14:07:33.108891 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9967e637-9980-49ea-a5d3-c37dcb9faef1-kube-api-access-4xqrm" (OuterVolumeSpecName: "kube-api-access-4xqrm") pod "9967e637-9980-49ea-a5d3-c37dcb9faef1" (UID: "9967e637-9980-49ea-a5d3-c37dcb9faef1"). 
InnerVolumeSpecName "kube-api-access-4xqrm". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:07:33 crc kubenswrapper[4732]: I0402 14:07:33.131803 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9967e637-9980-49ea-a5d3-c37dcb9faef1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9967e637-9980-49ea-a5d3-c37dcb9faef1" (UID: "9967e637-9980-49ea-a5d3-c37dcb9faef1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 14:07:33 crc kubenswrapper[4732]: I0402 14:07:33.205308 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9967e637-9980-49ea-a5d3-c37dcb9faef1-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 02 14:07:33 crc kubenswrapper[4732]: I0402 14:07:33.205378 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xqrm\" (UniqueName: \"kubernetes.io/projected/9967e637-9980-49ea-a5d3-c37dcb9faef1-kube-api-access-4xqrm\") on node \"crc\" DevicePath \"\"" Apr 02 14:07:33 crc kubenswrapper[4732]: I0402 14:07:33.205405 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9967e637-9980-49ea-a5d3-c37dcb9faef1-utilities\") on node \"crc\" DevicePath \"\"" Apr 02 14:07:33 crc kubenswrapper[4732]: I0402 14:07:33.579105 4732 generic.go:334] "Generic (PLEG): container finished" podID="9967e637-9980-49ea-a5d3-c37dcb9faef1" containerID="405c12a3349a6f927c03c0c25b419f90bfd9a016b0f57a84bdeaf47cc54b7e18" exitCode=0 Apr 02 14:07:33 crc kubenswrapper[4732]: I0402 14:07:33.580644 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pb9g5" event={"ID":"9967e637-9980-49ea-a5d3-c37dcb9faef1","Type":"ContainerDied","Data":"405c12a3349a6f927c03c0c25b419f90bfd9a016b0f57a84bdeaf47cc54b7e18"} Apr 02 14:07:33 crc kubenswrapper[4732]: I0402 14:07:33.580763 4732 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pb9g5" event={"ID":"9967e637-9980-49ea-a5d3-c37dcb9faef1","Type":"ContainerDied","Data":"040e93d26e9e8a4a751e8465f26575ad2d3c5fbd57fbc1585982304d3f298f3d"} Apr 02 14:07:33 crc kubenswrapper[4732]: I0402 14:07:33.580912 4732 scope.go:117] "RemoveContainer" containerID="405c12a3349a6f927c03c0c25b419f90bfd9a016b0f57a84bdeaf47cc54b7e18" Apr 02 14:07:33 crc kubenswrapper[4732]: I0402 14:07:33.581164 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pb9g5" Apr 02 14:07:33 crc kubenswrapper[4732]: I0402 14:07:33.613068 4732 scope.go:117] "RemoveContainer" containerID="6b4d4c619951f8991927ee0d58c41f7909c16f1cc12a5832546513f41c96fb48" Apr 02 14:07:33 crc kubenswrapper[4732]: I0402 14:07:33.637506 4732 scope.go:117] "RemoveContainer" containerID="4d98a6eb0eee82ec4dc4405533d4d8e917f6fdcd5ec70b669308b9df60a3a1f9" Apr 02 14:07:33 crc kubenswrapper[4732]: I0402 14:07:33.637809 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pb9g5"] Apr 02 14:07:33 crc kubenswrapper[4732]: I0402 14:07:33.647541 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pb9g5"] Apr 02 14:07:33 crc kubenswrapper[4732]: I0402 14:07:33.687770 4732 scope.go:117] "RemoveContainer" containerID="405c12a3349a6f927c03c0c25b419f90bfd9a016b0f57a84bdeaf47cc54b7e18" Apr 02 14:07:33 crc kubenswrapper[4732]: E0402 14:07:33.688134 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"405c12a3349a6f927c03c0c25b419f90bfd9a016b0f57a84bdeaf47cc54b7e18\": container with ID starting with 405c12a3349a6f927c03c0c25b419f90bfd9a016b0f57a84bdeaf47cc54b7e18 not found: ID does not exist" containerID="405c12a3349a6f927c03c0c25b419f90bfd9a016b0f57a84bdeaf47cc54b7e18" Apr 02 14:07:33 crc 
kubenswrapper[4732]: I0402 14:07:33.688181 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"405c12a3349a6f927c03c0c25b419f90bfd9a016b0f57a84bdeaf47cc54b7e18"} err="failed to get container status \"405c12a3349a6f927c03c0c25b419f90bfd9a016b0f57a84bdeaf47cc54b7e18\": rpc error: code = NotFound desc = could not find container \"405c12a3349a6f927c03c0c25b419f90bfd9a016b0f57a84bdeaf47cc54b7e18\": container with ID starting with 405c12a3349a6f927c03c0c25b419f90bfd9a016b0f57a84bdeaf47cc54b7e18 not found: ID does not exist"
Apr 02 14:07:33 crc kubenswrapper[4732]: I0402 14:07:33.688208 4732 scope.go:117] "RemoveContainer" containerID="6b4d4c619951f8991927ee0d58c41f7909c16f1cc12a5832546513f41c96fb48"
Apr 02 14:07:33 crc kubenswrapper[4732]: E0402 14:07:33.688524 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b4d4c619951f8991927ee0d58c41f7909c16f1cc12a5832546513f41c96fb48\": container with ID starting with 6b4d4c619951f8991927ee0d58c41f7909c16f1cc12a5832546513f41c96fb48 not found: ID does not exist" containerID="6b4d4c619951f8991927ee0d58c41f7909c16f1cc12a5832546513f41c96fb48"
Apr 02 14:07:33 crc kubenswrapper[4732]: I0402 14:07:33.688556 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b4d4c619951f8991927ee0d58c41f7909c16f1cc12a5832546513f41c96fb48"} err="failed to get container status \"6b4d4c619951f8991927ee0d58c41f7909c16f1cc12a5832546513f41c96fb48\": rpc error: code = NotFound desc = could not find container \"6b4d4c619951f8991927ee0d58c41f7909c16f1cc12a5832546513f41c96fb48\": container with ID starting with 6b4d4c619951f8991927ee0d58c41f7909c16f1cc12a5832546513f41c96fb48 not found: ID does not exist"
Apr 02 14:07:33 crc kubenswrapper[4732]: I0402 14:07:33.688575 4732 scope.go:117] "RemoveContainer" containerID="4d98a6eb0eee82ec4dc4405533d4d8e917f6fdcd5ec70b669308b9df60a3a1f9"
Apr 02 14:07:33 crc kubenswrapper[4732]: E0402 14:07:33.688874 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d98a6eb0eee82ec4dc4405533d4d8e917f6fdcd5ec70b669308b9df60a3a1f9\": container with ID starting with 4d98a6eb0eee82ec4dc4405533d4d8e917f6fdcd5ec70b669308b9df60a3a1f9 not found: ID does not exist" containerID="4d98a6eb0eee82ec4dc4405533d4d8e917f6fdcd5ec70b669308b9df60a3a1f9"
Apr 02 14:07:33 crc kubenswrapper[4732]: I0402 14:07:33.688897 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d98a6eb0eee82ec4dc4405533d4d8e917f6fdcd5ec70b669308b9df60a3a1f9"} err="failed to get container status \"4d98a6eb0eee82ec4dc4405533d4d8e917f6fdcd5ec70b669308b9df60a3a1f9\": rpc error: code = NotFound desc = could not find container \"4d98a6eb0eee82ec4dc4405533d4d8e917f6fdcd5ec70b669308b9df60a3a1f9\": container with ID starting with 4d98a6eb0eee82ec4dc4405533d4d8e917f6fdcd5ec70b669308b9df60a3a1f9 not found: ID does not exist"
Apr 02 14:07:34 crc kubenswrapper[4732]: I0402 14:07:34.689867 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9967e637-9980-49ea-a5d3-c37dcb9faef1" path="/var/lib/kubelet/pods/9967e637-9980-49ea-a5d3-c37dcb9faef1/volumes"
Apr 02 14:07:37 crc kubenswrapper[4732]: I0402 14:07:37.291198 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-77xk5"
Apr 02 14:07:37 crc kubenswrapper[4732]: I0402 14:07:37.292513 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-77xk5"
Apr 02 14:07:37 crc kubenswrapper[4732]: I0402 14:07:37.338709 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-77xk5"
Apr 02 14:07:37 crc kubenswrapper[4732]: I0402 14:07:37.681977 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-77xk5"
Apr 02 14:07:38 crc kubenswrapper[4732]: I0402 14:07:38.130411 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-77xk5"]
Apr 02 14:07:39 crc kubenswrapper[4732]: I0402 14:07:39.641848 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-77xk5" podUID="4677ae72-c6f4-4e3f-992f-77fafeb23ae7" containerName="registry-server" containerID="cri-o://d74ac693dc0cdfff3107c5627923712ca0fa3226e21e684fd109228b565fe8ae" gracePeriod=2
Apr 02 14:07:40 crc kubenswrapper[4732]: I0402 14:07:40.095818 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-77xk5"
Apr 02 14:07:40 crc kubenswrapper[4732]: I0402 14:07:40.239763 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4677ae72-c6f4-4e3f-992f-77fafeb23ae7-catalog-content\") pod \"4677ae72-c6f4-4e3f-992f-77fafeb23ae7\" (UID: \"4677ae72-c6f4-4e3f-992f-77fafeb23ae7\") "
Apr 02 14:07:40 crc kubenswrapper[4732]: I0402 14:07:40.239877 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4677ae72-c6f4-4e3f-992f-77fafeb23ae7-utilities\") pod \"4677ae72-c6f4-4e3f-992f-77fafeb23ae7\" (UID: \"4677ae72-c6f4-4e3f-992f-77fafeb23ae7\") "
Apr 02 14:07:40 crc kubenswrapper[4732]: I0402 14:07:40.240006 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4x9pn\" (UniqueName: \"kubernetes.io/projected/4677ae72-c6f4-4e3f-992f-77fafeb23ae7-kube-api-access-4x9pn\") pod \"4677ae72-c6f4-4e3f-992f-77fafeb23ae7\" (UID: \"4677ae72-c6f4-4e3f-992f-77fafeb23ae7\") "
Apr 02 14:07:40 crc kubenswrapper[4732]: I0402 14:07:40.241227 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4677ae72-c6f4-4e3f-992f-77fafeb23ae7-utilities" (OuterVolumeSpecName: "utilities") pod "4677ae72-c6f4-4e3f-992f-77fafeb23ae7" (UID: "4677ae72-c6f4-4e3f-992f-77fafeb23ae7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 02 14:07:40 crc kubenswrapper[4732]: I0402 14:07:40.248918 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4677ae72-c6f4-4e3f-992f-77fafeb23ae7-kube-api-access-4x9pn" (OuterVolumeSpecName: "kube-api-access-4x9pn") pod "4677ae72-c6f4-4e3f-992f-77fafeb23ae7" (UID: "4677ae72-c6f4-4e3f-992f-77fafeb23ae7"). InnerVolumeSpecName "kube-api-access-4x9pn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 14:07:40 crc kubenswrapper[4732]: I0402 14:07:40.318779 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4677ae72-c6f4-4e3f-992f-77fafeb23ae7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4677ae72-c6f4-4e3f-992f-77fafeb23ae7" (UID: "4677ae72-c6f4-4e3f-992f-77fafeb23ae7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 02 14:07:40 crc kubenswrapper[4732]: I0402 14:07:40.342299 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4677ae72-c6f4-4e3f-992f-77fafeb23ae7-catalog-content\") on node \"crc\" DevicePath \"\""
Apr 02 14:07:40 crc kubenswrapper[4732]: I0402 14:07:40.342680 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4677ae72-c6f4-4e3f-992f-77fafeb23ae7-utilities\") on node \"crc\" DevicePath \"\""
Apr 02 14:07:40 crc kubenswrapper[4732]: I0402 14:07:40.342690 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4x9pn\" (UniqueName: \"kubernetes.io/projected/4677ae72-c6f4-4e3f-992f-77fafeb23ae7-kube-api-access-4x9pn\") on node \"crc\" DevicePath \"\""
Apr 02 14:07:40 crc kubenswrapper[4732]: I0402 14:07:40.657864 4732 generic.go:334] "Generic (PLEG): container finished" podID="4677ae72-c6f4-4e3f-992f-77fafeb23ae7" containerID="d74ac693dc0cdfff3107c5627923712ca0fa3226e21e684fd109228b565fe8ae" exitCode=0
Apr 02 14:07:40 crc kubenswrapper[4732]: I0402 14:07:40.657926 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-77xk5" event={"ID":"4677ae72-c6f4-4e3f-992f-77fafeb23ae7","Type":"ContainerDied","Data":"d74ac693dc0cdfff3107c5627923712ca0fa3226e21e684fd109228b565fe8ae"}
Apr 02 14:07:40 crc kubenswrapper[4732]: I0402 14:07:40.657947 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-77xk5"
Apr 02 14:07:40 crc kubenswrapper[4732]: I0402 14:07:40.658754 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-77xk5" event={"ID":"4677ae72-c6f4-4e3f-992f-77fafeb23ae7","Type":"ContainerDied","Data":"fc0325473c1942e3ca56d68c8ac4ad5d7ac12d310da562b8656d87b5f10dbb3c"}
Apr 02 14:07:40 crc kubenswrapper[4732]: I0402 14:07:40.658794 4732 scope.go:117] "RemoveContainer" containerID="d74ac693dc0cdfff3107c5627923712ca0fa3226e21e684fd109228b565fe8ae"
Apr 02 14:07:40 crc kubenswrapper[4732]: I0402 14:07:40.679055 4732 scope.go:117] "RemoveContainer" containerID="da26eb48c2babdfb7e08bd85f5c8acd425b1113447523e06c6793c8b15e4536b"
Apr 02 14:07:40 crc kubenswrapper[4732]: I0402 14:07:40.705981 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-77xk5"]
Apr 02 14:07:40 crc kubenswrapper[4732]: I0402 14:07:40.717822 4732 scope.go:117] "RemoveContainer" containerID="8bb0960c473a9b76e783be77c1cd42cc148f4c71e2060d3d053808872df39f1d"
Apr 02 14:07:40 crc kubenswrapper[4732]: I0402 14:07:40.719110 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-77xk5"]
Apr 02 14:07:40 crc kubenswrapper[4732]: I0402 14:07:40.762325 4732 scope.go:117] "RemoveContainer" containerID="d74ac693dc0cdfff3107c5627923712ca0fa3226e21e684fd109228b565fe8ae"
Apr 02 14:07:40 crc kubenswrapper[4732]: E0402 14:07:40.763506 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d74ac693dc0cdfff3107c5627923712ca0fa3226e21e684fd109228b565fe8ae\": container with ID starting with d74ac693dc0cdfff3107c5627923712ca0fa3226e21e684fd109228b565fe8ae not found: ID does not exist" containerID="d74ac693dc0cdfff3107c5627923712ca0fa3226e21e684fd109228b565fe8ae"
Apr 02 14:07:40 crc kubenswrapper[4732]: I0402 14:07:40.763552 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d74ac693dc0cdfff3107c5627923712ca0fa3226e21e684fd109228b565fe8ae"} err="failed to get container status \"d74ac693dc0cdfff3107c5627923712ca0fa3226e21e684fd109228b565fe8ae\": rpc error: code = NotFound desc = could not find container \"d74ac693dc0cdfff3107c5627923712ca0fa3226e21e684fd109228b565fe8ae\": container with ID starting with d74ac693dc0cdfff3107c5627923712ca0fa3226e21e684fd109228b565fe8ae not found: ID does not exist"
Apr 02 14:07:40 crc kubenswrapper[4732]: I0402 14:07:40.763580 4732 scope.go:117] "RemoveContainer" containerID="da26eb48c2babdfb7e08bd85f5c8acd425b1113447523e06c6793c8b15e4536b"
Apr 02 14:07:40 crc kubenswrapper[4732]: E0402 14:07:40.764288 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da26eb48c2babdfb7e08bd85f5c8acd425b1113447523e06c6793c8b15e4536b\": container with ID starting with da26eb48c2babdfb7e08bd85f5c8acd425b1113447523e06c6793c8b15e4536b not found: ID does not exist" containerID="da26eb48c2babdfb7e08bd85f5c8acd425b1113447523e06c6793c8b15e4536b"
Apr 02 14:07:40 crc kubenswrapper[4732]: I0402 14:07:40.764316 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da26eb48c2babdfb7e08bd85f5c8acd425b1113447523e06c6793c8b15e4536b"} err="failed to get container status \"da26eb48c2babdfb7e08bd85f5c8acd425b1113447523e06c6793c8b15e4536b\": rpc error: code = NotFound desc = could not find container \"da26eb48c2babdfb7e08bd85f5c8acd425b1113447523e06c6793c8b15e4536b\": container with ID starting with da26eb48c2babdfb7e08bd85f5c8acd425b1113447523e06c6793c8b15e4536b not found: ID does not exist"
Apr 02 14:07:40 crc kubenswrapper[4732]: I0402 14:07:40.764329 4732 scope.go:117] "RemoveContainer" containerID="8bb0960c473a9b76e783be77c1cd42cc148f4c71e2060d3d053808872df39f1d"
Apr 02 14:07:40 crc kubenswrapper[4732]: E0402 14:07:40.764718 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bb0960c473a9b76e783be77c1cd42cc148f4c71e2060d3d053808872df39f1d\": container with ID starting with 8bb0960c473a9b76e783be77c1cd42cc148f4c71e2060d3d053808872df39f1d not found: ID does not exist" containerID="8bb0960c473a9b76e783be77c1cd42cc148f4c71e2060d3d053808872df39f1d"
Apr 02 14:07:40 crc kubenswrapper[4732]: I0402 14:07:40.764803 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bb0960c473a9b76e783be77c1cd42cc148f4c71e2060d3d053808872df39f1d"} err="failed to get container status \"8bb0960c473a9b76e783be77c1cd42cc148f4c71e2060d3d053808872df39f1d\": rpc error: code = NotFound desc = could not find container \"8bb0960c473a9b76e783be77c1cd42cc148f4c71e2060d3d053808872df39f1d\": container with ID starting with 8bb0960c473a9b76e783be77c1cd42cc148f4c71e2060d3d053808872df39f1d not found: ID does not exist"
Apr 02 14:07:42 crc kubenswrapper[4732]: I0402 14:07:42.690770 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4677ae72-c6f4-4e3f-992f-77fafeb23ae7" path="/var/lib/kubelet/pods/4677ae72-c6f4-4e3f-992f-77fafeb23ae7/volumes"
Apr 02 14:07:43 crc kubenswrapper[4732]: I0402 14:07:43.680864 4732 scope.go:117] "RemoveContainer" containerID="0f81e9ff2a9337c334b84f07419534908b39d0251d1366b4d7838bb8bbcae383"
Apr 02 14:07:43 crc kubenswrapper[4732]: E0402 14:07:43.681410 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878"
Apr 02 14:07:54 crc kubenswrapper[4732]: I0402 14:07:54.689699 4732 scope.go:117] "RemoveContainer" containerID="0f81e9ff2a9337c334b84f07419534908b39d0251d1366b4d7838bb8bbcae383"
Apr 02 14:07:54 crc kubenswrapper[4732]: E0402 14:07:54.691401 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878"
Apr 02 14:08:00 crc kubenswrapper[4732]: I0402 14:08:00.140677 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29585648-5xpdd"]
Apr 02 14:08:00 crc kubenswrapper[4732]: E0402 14:08:00.141571 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4677ae72-c6f4-4e3f-992f-77fafeb23ae7" containerName="extract-utilities"
Apr 02 14:08:00 crc kubenswrapper[4732]: I0402 14:08:00.141584 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="4677ae72-c6f4-4e3f-992f-77fafeb23ae7" containerName="extract-utilities"
Apr 02 14:08:00 crc kubenswrapper[4732]: E0402 14:08:00.141599 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9967e637-9980-49ea-a5d3-c37dcb9faef1" containerName="extract-content"
Apr 02 14:08:00 crc kubenswrapper[4732]: I0402 14:08:00.141605 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="9967e637-9980-49ea-a5d3-c37dcb9faef1" containerName="extract-content"
Apr 02 14:08:00 crc kubenswrapper[4732]: E0402 14:08:00.141700 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4677ae72-c6f4-4e3f-992f-77fafeb23ae7" containerName="extract-content"
Apr 02 14:08:00 crc kubenswrapper[4732]: I0402 14:08:00.141709 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="4677ae72-c6f4-4e3f-992f-77fafeb23ae7" containerName="extract-content"
Apr 02 14:08:00 crc kubenswrapper[4732]: E0402 14:08:00.141720 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9967e637-9980-49ea-a5d3-c37dcb9faef1" containerName="extract-utilities"
Apr 02 14:08:00 crc kubenswrapper[4732]: I0402 14:08:00.141726 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="9967e637-9980-49ea-a5d3-c37dcb9faef1" containerName="extract-utilities"
Apr 02 14:08:00 crc kubenswrapper[4732]: E0402 14:08:00.141739 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9967e637-9980-49ea-a5d3-c37dcb9faef1" containerName="registry-server"
Apr 02 14:08:00 crc kubenswrapper[4732]: I0402 14:08:00.141744 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="9967e637-9980-49ea-a5d3-c37dcb9faef1" containerName="registry-server"
Apr 02 14:08:00 crc kubenswrapper[4732]: E0402 14:08:00.141758 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4677ae72-c6f4-4e3f-992f-77fafeb23ae7" containerName="registry-server"
Apr 02 14:08:00 crc kubenswrapper[4732]: I0402 14:08:00.141764 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="4677ae72-c6f4-4e3f-992f-77fafeb23ae7" containerName="registry-server"
Apr 02 14:08:00 crc kubenswrapper[4732]: I0402 14:08:00.141969 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="4677ae72-c6f4-4e3f-992f-77fafeb23ae7" containerName="registry-server"
Apr 02 14:08:00 crc kubenswrapper[4732]: I0402 14:08:00.141981 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="9967e637-9980-49ea-a5d3-c37dcb9faef1" containerName="registry-server"
Apr 02 14:08:00 crc kubenswrapper[4732]: I0402 14:08:00.142601 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585648-5xpdd"
Apr 02 14:08:00 crc kubenswrapper[4732]: I0402 14:08:00.145203 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Apr 02 14:08:00 crc kubenswrapper[4732]: I0402 14:08:00.145464 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Apr 02 14:08:00 crc kubenswrapper[4732]: I0402 14:08:00.145632 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-69g42"
Apr 02 14:08:00 crc kubenswrapper[4732]: I0402 14:08:00.151162 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585648-5xpdd"]
Apr 02 14:08:00 crc kubenswrapper[4732]: I0402 14:08:00.224363 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrpzq\" (UniqueName: \"kubernetes.io/projected/26576523-a1b0-4e31-9477-064954cf21a6-kube-api-access-vrpzq\") pod \"auto-csr-approver-29585648-5xpdd\" (UID: \"26576523-a1b0-4e31-9477-064954cf21a6\") " pod="openshift-infra/auto-csr-approver-29585648-5xpdd"
Apr 02 14:08:00 crc kubenswrapper[4732]: I0402 14:08:00.326164 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrpzq\" (UniqueName: \"kubernetes.io/projected/26576523-a1b0-4e31-9477-064954cf21a6-kube-api-access-vrpzq\") pod \"auto-csr-approver-29585648-5xpdd\" (UID: \"26576523-a1b0-4e31-9477-064954cf21a6\") " pod="openshift-infra/auto-csr-approver-29585648-5xpdd"
Apr 02 14:08:00 crc kubenswrapper[4732]: I0402 14:08:00.344978 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrpzq\" (UniqueName: \"kubernetes.io/projected/26576523-a1b0-4e31-9477-064954cf21a6-kube-api-access-vrpzq\") pod \"auto-csr-approver-29585648-5xpdd\" (UID: \"26576523-a1b0-4e31-9477-064954cf21a6\") " pod="openshift-infra/auto-csr-approver-29585648-5xpdd"
Apr 02 14:08:00 crc kubenswrapper[4732]: I0402 14:08:00.464426 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585648-5xpdd"
Apr 02 14:08:00 crc kubenswrapper[4732]: I0402 14:08:00.911000 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585648-5xpdd"]
Apr 02 14:08:01 crc kubenswrapper[4732]: I0402 14:08:01.866382 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585648-5xpdd" event={"ID":"26576523-a1b0-4e31-9477-064954cf21a6","Type":"ContainerStarted","Data":"cd74b9e2a6be6be9b06259ad203ec478d91afa2e2a9c947361aad3e0aa2ff5ca"}
Apr 02 14:08:02 crc kubenswrapper[4732]: I0402 14:08:02.876873 4732 generic.go:334] "Generic (PLEG): container finished" podID="26576523-a1b0-4e31-9477-064954cf21a6" containerID="c2b0a42edffbf54c0807d243ec7a9400d0398d3ebeae7e9de25537a4159fd61b" exitCode=0
Apr 02 14:08:02 crc kubenswrapper[4732]: I0402 14:08:02.876949 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585648-5xpdd" event={"ID":"26576523-a1b0-4e31-9477-064954cf21a6","Type":"ContainerDied","Data":"c2b0a42edffbf54c0807d243ec7a9400d0398d3ebeae7e9de25537a4159fd61b"}
Apr 02 14:08:04 crc kubenswrapper[4732]: I0402 14:08:04.361793 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585648-5xpdd"
Apr 02 14:08:04 crc kubenswrapper[4732]: I0402 14:08:04.400033 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrpzq\" (UniqueName: \"kubernetes.io/projected/26576523-a1b0-4e31-9477-064954cf21a6-kube-api-access-vrpzq\") pod \"26576523-a1b0-4e31-9477-064954cf21a6\" (UID: \"26576523-a1b0-4e31-9477-064954cf21a6\") "
Apr 02 14:08:04 crc kubenswrapper[4732]: I0402 14:08:04.406434 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26576523-a1b0-4e31-9477-064954cf21a6-kube-api-access-vrpzq" (OuterVolumeSpecName: "kube-api-access-vrpzq") pod "26576523-a1b0-4e31-9477-064954cf21a6" (UID: "26576523-a1b0-4e31-9477-064954cf21a6"). InnerVolumeSpecName "kube-api-access-vrpzq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 14:08:04 crc kubenswrapper[4732]: I0402 14:08:04.502274 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrpzq\" (UniqueName: \"kubernetes.io/projected/26576523-a1b0-4e31-9477-064954cf21a6-kube-api-access-vrpzq\") on node \"crc\" DevicePath \"\""
Apr 02 14:08:04 crc kubenswrapper[4732]: I0402 14:08:04.898475 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585648-5xpdd" event={"ID":"26576523-a1b0-4e31-9477-064954cf21a6","Type":"ContainerDied","Data":"cd74b9e2a6be6be9b06259ad203ec478d91afa2e2a9c947361aad3e0aa2ff5ca"}
Apr 02 14:08:04 crc kubenswrapper[4732]: I0402 14:08:04.898521 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd74b9e2a6be6be9b06259ad203ec478d91afa2e2a9c947361aad3e0aa2ff5ca"
Apr 02 14:08:04 crc kubenswrapper[4732]: I0402 14:08:04.898799 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585648-5xpdd"
Apr 02 14:08:05 crc kubenswrapper[4732]: I0402 14:08:05.434601 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29585642-b58hp"]
Apr 02 14:08:05 crc kubenswrapper[4732]: I0402 14:08:05.444422 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29585642-b58hp"]
Apr 02 14:08:06 crc kubenswrapper[4732]: I0402 14:08:06.680637 4732 scope.go:117] "RemoveContainer" containerID="0f81e9ff2a9337c334b84f07419534908b39d0251d1366b4d7838bb8bbcae383"
Apr 02 14:08:06 crc kubenswrapper[4732]: E0402 14:08:06.681042 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878"
Apr 02 14:08:06 crc kubenswrapper[4732]: I0402 14:08:06.693028 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6f8ac3c-1daf-481a-bbb8-2ecbeb7bc44f" path="/var/lib/kubelet/pods/c6f8ac3c-1daf-481a-bbb8-2ecbeb7bc44f/volumes"
Apr 02 14:08:18 crc kubenswrapper[4732]: I0402 14:08:18.681558 4732 scope.go:117] "RemoveContainer" containerID="0f81e9ff2a9337c334b84f07419534908b39d0251d1366b4d7838bb8bbcae383"
Apr 02 14:08:18 crc kubenswrapper[4732]: E0402 14:08:18.682507 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878"
Apr 02 14:08:20 crc kubenswrapper[4732]: I0402 14:08:20.043110 4732 generic.go:334] "Generic (PLEG): container finished" podID="a67d60f0-3912-4fc4-96b7-f96831ff23d3" containerID="64b1cbbf5c8dc33f37abc0fce039f671184f9c49425ba27d84d2d2122aedd2c3" exitCode=0
Apr 02 14:08:20 crc kubenswrapper[4732]: I0402 14:08:20.043178 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8g4nr" event={"ID":"a67d60f0-3912-4fc4-96b7-f96831ff23d3","Type":"ContainerDied","Data":"64b1cbbf5c8dc33f37abc0fce039f671184f9c49425ba27d84d2d2122aedd2c3"}
Apr 02 14:08:21 crc kubenswrapper[4732]: I0402 14:08:21.427248 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8g4nr"
Apr 02 14:08:21 crc kubenswrapper[4732]: I0402 14:08:21.532654 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a67d60f0-3912-4fc4-96b7-f96831ff23d3-ssh-key-openstack-edpm-ipam\") pod \"a67d60f0-3912-4fc4-96b7-f96831ff23d3\" (UID: \"a67d60f0-3912-4fc4-96b7-f96831ff23d3\") "
Apr 02 14:08:21 crc kubenswrapper[4732]: I0402 14:08:21.533081 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a67d60f0-3912-4fc4-96b7-f96831ff23d3-inventory\") pod \"a67d60f0-3912-4fc4-96b7-f96831ff23d3\" (UID: \"a67d60f0-3912-4fc4-96b7-f96831ff23d3\") "
Apr 02 14:08:21 crc kubenswrapper[4732]: I0402 14:08:21.533226 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbzv2\" (UniqueName: \"kubernetes.io/projected/a67d60f0-3912-4fc4-96b7-f96831ff23d3-kube-api-access-dbzv2\") pod \"a67d60f0-3912-4fc4-96b7-f96831ff23d3\" (UID: \"a67d60f0-3912-4fc4-96b7-f96831ff23d3\") "
Apr 02 14:08:21 crc kubenswrapper[4732]: I0402 14:08:21.533421 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a67d60f0-3912-4fc4-96b7-f96831ff23d3-bootstrap-combined-ca-bundle\") pod \"a67d60f0-3912-4fc4-96b7-f96831ff23d3\" (UID: \"a67d60f0-3912-4fc4-96b7-f96831ff23d3\") "
Apr 02 14:08:21 crc kubenswrapper[4732]: I0402 14:08:21.538627 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a67d60f0-3912-4fc4-96b7-f96831ff23d3-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "a67d60f0-3912-4fc4-96b7-f96831ff23d3" (UID: "a67d60f0-3912-4fc4-96b7-f96831ff23d3"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 14:08:21 crc kubenswrapper[4732]: I0402 14:08:21.538662 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a67d60f0-3912-4fc4-96b7-f96831ff23d3-kube-api-access-dbzv2" (OuterVolumeSpecName: "kube-api-access-dbzv2") pod "a67d60f0-3912-4fc4-96b7-f96831ff23d3" (UID: "a67d60f0-3912-4fc4-96b7-f96831ff23d3"). InnerVolumeSpecName "kube-api-access-dbzv2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 14:08:21 crc kubenswrapper[4732]: I0402 14:08:21.559494 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a67d60f0-3912-4fc4-96b7-f96831ff23d3-inventory" (OuterVolumeSpecName: "inventory") pod "a67d60f0-3912-4fc4-96b7-f96831ff23d3" (UID: "a67d60f0-3912-4fc4-96b7-f96831ff23d3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 14:08:21 crc kubenswrapper[4732]: I0402 14:08:21.560265 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a67d60f0-3912-4fc4-96b7-f96831ff23d3-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a67d60f0-3912-4fc4-96b7-f96831ff23d3" (UID: "a67d60f0-3912-4fc4-96b7-f96831ff23d3"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 14:08:21 crc kubenswrapper[4732]: I0402 14:08:21.636280 4732 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a67d60f0-3912-4fc4-96b7-f96831ff23d3-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Apr 02 14:08:21 crc kubenswrapper[4732]: I0402 14:08:21.636369 4732 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a67d60f0-3912-4fc4-96b7-f96831ff23d3-inventory\") on node \"crc\" DevicePath \"\""
Apr 02 14:08:21 crc kubenswrapper[4732]: I0402 14:08:21.636391 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbzv2\" (UniqueName: \"kubernetes.io/projected/a67d60f0-3912-4fc4-96b7-f96831ff23d3-kube-api-access-dbzv2\") on node \"crc\" DevicePath \"\""
Apr 02 14:08:21 crc kubenswrapper[4732]: I0402 14:08:21.636412 4732 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a67d60f0-3912-4fc4-96b7-f96831ff23d3-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Apr 02 14:08:22 crc kubenswrapper[4732]: I0402 14:08:22.064229 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8g4nr" event={"ID":"a67d60f0-3912-4fc4-96b7-f96831ff23d3","Type":"ContainerDied","Data":"0d082c0183e43eb12e00b62651ce5ae68da6024a1b583eea904c8d4daacc0f56"}
Apr 02 14:08:22 crc kubenswrapper[4732]: I0402 14:08:22.064284 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d082c0183e43eb12e00b62651ce5ae68da6024a1b583eea904c8d4daacc0f56"
Apr 02 14:08:22 crc kubenswrapper[4732]: I0402 14:08:22.064355 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8g4nr"
Apr 02 14:08:22 crc kubenswrapper[4732]: I0402 14:08:22.142705 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5lm4q"]
Apr 02 14:08:22 crc kubenswrapper[4732]: E0402 14:08:22.143219 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26576523-a1b0-4e31-9477-064954cf21a6" containerName="oc"
Apr 02 14:08:22 crc kubenswrapper[4732]: I0402 14:08:22.143243 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="26576523-a1b0-4e31-9477-064954cf21a6" containerName="oc"
Apr 02 14:08:22 crc kubenswrapper[4732]: E0402 14:08:22.143281 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a67d60f0-3912-4fc4-96b7-f96831ff23d3" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Apr 02 14:08:22 crc kubenswrapper[4732]: I0402 14:08:22.143290 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="a67d60f0-3912-4fc4-96b7-f96831ff23d3" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Apr 02 14:08:22 crc kubenswrapper[4732]: I0402 14:08:22.143529 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="26576523-a1b0-4e31-9477-064954cf21a6" containerName="oc"
Apr 02 14:08:22 crc kubenswrapper[4732]: I0402 14:08:22.143547 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="a67d60f0-3912-4fc4-96b7-f96831ff23d3" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Apr 02 14:08:22 crc kubenswrapper[4732]: I0402 14:08:22.144341 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5lm4q"
Apr 02 14:08:22 crc kubenswrapper[4732]: I0402 14:08:22.146359 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Apr 02 14:08:22 crc kubenswrapper[4732]: I0402 14:08:22.148251 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wdhd4"
Apr 02 14:08:22 crc kubenswrapper[4732]: I0402 14:08:22.148300 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Apr 02 14:08:22 crc kubenswrapper[4732]: I0402 14:08:22.148577 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Apr 02 14:08:22 crc kubenswrapper[4732]: I0402 14:08:22.155577 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5lm4q"]
Apr 02 14:08:22 crc kubenswrapper[4732]: I0402 14:08:22.250251 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k79sk\" (UniqueName: \"kubernetes.io/projected/240ff67d-47d5-4b2e-b744-e0e2332a9496-kube-api-access-k79sk\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5lm4q\" (UID: \"240ff67d-47d5-4b2e-b744-e0e2332a9496\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5lm4q"
Apr 02 14:08:22 crc kubenswrapper[4732]: I0402 14:08:22.250372 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/240ff67d-47d5-4b2e-b744-e0e2332a9496-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5lm4q\" (UID: \"240ff67d-47d5-4b2e-b744-e0e2332a9496\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5lm4q"
Apr 02 14:08:22 crc kubenswrapper[4732]: I0402 14:08:22.250540 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/240ff67d-47d5-4b2e-b744-e0e2332a9496-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5lm4q\" (UID: \"240ff67d-47d5-4b2e-b744-e0e2332a9496\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5lm4q"
Apr 02 14:08:22 crc kubenswrapper[4732]: I0402 14:08:22.353552 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k79sk\" (UniqueName: \"kubernetes.io/projected/240ff67d-47d5-4b2e-b744-e0e2332a9496-kube-api-access-k79sk\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5lm4q\" (UID: \"240ff67d-47d5-4b2e-b744-e0e2332a9496\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5lm4q"
Apr 02 14:08:22 crc kubenswrapper[4732]: I0402 14:08:22.353992 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/240ff67d-47d5-4b2e-b744-e0e2332a9496-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5lm4q\" (UID: \"240ff67d-47d5-4b2e-b744-e0e2332a9496\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5lm4q"
Apr 02 14:08:22 crc kubenswrapper[4732]: I0402 14:08:22.354140 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/240ff67d-47d5-4b2e-b744-e0e2332a9496-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5lm4q\" (UID: \"240ff67d-47d5-4b2e-b744-e0e2332a9496\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5lm4q"
Apr 02 14:08:22 crc kubenswrapper[4732]: I0402 14:08:22.360243 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/240ff67d-47d5-4b2e-b744-e0e2332a9496-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5lm4q\" (UID: \"240ff67d-47d5-4b2e-b744-e0e2332a9496\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5lm4q"
Apr 02 14:08:22 crc kubenswrapper[4732]: I0402 14:08:22.362702 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/240ff67d-47d5-4b2e-b744-e0e2332a9496-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5lm4q\" (UID: \"240ff67d-47d5-4b2e-b744-e0e2332a9496\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5lm4q"
Apr 02 14:08:22 crc kubenswrapper[4732]: I0402 14:08:22.381509 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k79sk\" (UniqueName: \"kubernetes.io/projected/240ff67d-47d5-4b2e-b744-e0e2332a9496-kube-api-access-k79sk\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5lm4q\" (UID: \"240ff67d-47d5-4b2e-b744-e0e2332a9496\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5lm4q"
Apr 02 14:08:22 crc kubenswrapper[4732]: I0402 14:08:22.465037 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5lm4q"
Apr 02 14:08:22 crc kubenswrapper[4732]: I0402 14:08:22.976381 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5lm4q"]
Apr 02 14:08:23 crc kubenswrapper[4732]: I0402 14:08:23.073234 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5lm4q" event={"ID":"240ff67d-47d5-4b2e-b744-e0e2332a9496","Type":"ContainerStarted","Data":"47c2945aca4463ca18d257ac5bfc8437ae944f173daf7d444af4931e0c8a8236"}
Apr 02 14:08:24 crc kubenswrapper[4732]: I0402 14:08:24.084293 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5lm4q" event={"ID":"240ff67d-47d5-4b2e-b744-e0e2332a9496","Type":"ContainerStarted","Data":"7cb4e07ba31e7603f8a0198ac3082697817e00b2d659fe39efbfe72c4a6c395e"}
Apr 02 14:08:24 crc kubenswrapper[4732]: I0402 14:08:24.111514 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5lm4q" podStartSLOduration=1.472576982 podStartE2EDuration="2.111494712s" podCreationTimestamp="2026-04-02 14:08:22 +0000 UTC" firstStartedPulling="2026-04-02 14:08:22.982776397 +0000 UTC m=+1859.887183940" lastFinishedPulling="2026-04-02 14:08:23.621694117 +0000 UTC m=+1860.526101670" observedRunningTime="2026-04-02 14:08:24.097153285 +0000 UTC m=+1861.001560858" watchObservedRunningTime="2026-04-02 14:08:24.111494712 +0000 UTC m=+1861.015902285"
Apr 02 14:08:33 crc kubenswrapper[4732]: I0402 14:08:33.680388 4732 scope.go:117] "RemoveContainer" containerID="0f81e9ff2a9337c334b84f07419534908b39d0251d1366b4d7838bb8bbcae383"
Apr 02 14:08:33 crc kubenswrapper[4732]: E0402 14:08:33.681121 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" Apr 02 14:08:47 crc kubenswrapper[4732]: I0402 14:08:47.686086 4732 scope.go:117] "RemoveContainer" containerID="0f81e9ff2a9337c334b84f07419534908b39d0251d1366b4d7838bb8bbcae383" Apr 02 14:08:47 crc kubenswrapper[4732]: E0402 14:08:47.691052 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" Apr 02 14:08:49 crc kubenswrapper[4732]: I0402 14:08:49.204941 4732 scope.go:117] "RemoveContainer" containerID="023d1061732a239ce49a7c0ad8b8c1320c94212598a48b9f7bd92b00be54b302" Apr 02 14:08:49 crc kubenswrapper[4732]: I0402 14:08:49.274525 4732 scope.go:117] "RemoveContainer" containerID="2bf3c865c9d3f80d52215bceb5acce60796625666c1002a703a1e1c4608f9b78" Apr 02 14:08:49 crc kubenswrapper[4732]: I0402 14:08:49.314359 4732 scope.go:117] "RemoveContainer" containerID="de458a620751559cc0ae0bdde07ff19ef2327c161a36efa23aae8c84591fb109" Apr 02 14:08:49 crc kubenswrapper[4732]: I0402 14:08:49.336486 4732 scope.go:117] "RemoveContainer" containerID="813df6979af898c53fe62751d376e6fa9acbf0c42d87e8d987333c16cb709bc3" Apr 02 14:08:58 crc kubenswrapper[4732]: I0402 14:08:58.051178 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-9sdcc"] Apr 02 14:08:58 crc kubenswrapper[4732]: I0402 14:08:58.063178 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/glance-8a7f-account-create-update-rcjkk"] Apr 02 14:08:58 crc kubenswrapper[4732]: I0402 14:08:58.073049 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-9sdcc"] Apr 02 14:08:58 crc kubenswrapper[4732]: I0402 14:08:58.081875 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-8a7f-account-create-update-rcjkk"] Apr 02 14:08:58 crc kubenswrapper[4732]: I0402 14:08:58.703586 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36a80dbf-30b0-40f2-b37f-01912226bb43" path="/var/lib/kubelet/pods/36a80dbf-30b0-40f2-b37f-01912226bb43/volumes" Apr 02 14:08:58 crc kubenswrapper[4732]: I0402 14:08:58.704901 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b57899ef-a548-4a0d-a677-529a876d7d68" path="/var/lib/kubelet/pods/b57899ef-a548-4a0d-a677-529a876d7d68/volumes" Apr 02 14:09:02 crc kubenswrapper[4732]: I0402 14:09:02.680279 4732 scope.go:117] "RemoveContainer" containerID="0f81e9ff2a9337c334b84f07419534908b39d0251d1366b4d7838bb8bbcae383" Apr 02 14:09:02 crc kubenswrapper[4732]: E0402 14:09:02.681053 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" Apr 02 14:09:13 crc kubenswrapper[4732]: I0402 14:09:13.680968 4732 scope.go:117] "RemoveContainer" containerID="0f81e9ff2a9337c334b84f07419534908b39d0251d1366b4d7838bb8bbcae383" Apr 02 14:09:13 crc kubenswrapper[4732]: E0402 14:09:13.681792 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" Apr 02 14:09:14 crc kubenswrapper[4732]: I0402 14:09:14.049928 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-da70-account-create-update-pl545"] Apr 02 14:09:14 crc kubenswrapper[4732]: I0402 14:09:14.063440 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-sjkqb"] Apr 02 14:09:14 crc kubenswrapper[4732]: I0402 14:09:14.076991 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-56c3-account-create-update-xg9gr"] Apr 02 14:09:14 crc kubenswrapper[4732]: I0402 14:09:14.089377 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-lmkhl"] Apr 02 14:09:14 crc kubenswrapper[4732]: I0402 14:09:14.097981 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-56c3-account-create-update-xg9gr"] Apr 02 14:09:14 crc kubenswrapper[4732]: I0402 14:09:14.105681 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-sjkqb"] Apr 02 14:09:14 crc kubenswrapper[4732]: I0402 14:09:14.114720 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-da70-account-create-update-pl545"] Apr 02 14:09:14 crc kubenswrapper[4732]: I0402 14:09:14.124535 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-lmkhl"] Apr 02 14:09:14 crc kubenswrapper[4732]: I0402 14:09:14.690713 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d495f3e-102b-45f0-88e4-fe3777c11b97" path="/var/lib/kubelet/pods/1d495f3e-102b-45f0-88e4-fe3777c11b97/volumes" Apr 02 14:09:14 crc kubenswrapper[4732]: I0402 14:09:14.691546 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="6b406167-98d8-46ae-8319-5734de01a2e0" path="/var/lib/kubelet/pods/6b406167-98d8-46ae-8319-5734de01a2e0/volumes" Apr 02 14:09:14 crc kubenswrapper[4732]: I0402 14:09:14.692124 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7907b3e6-ba65-4bc8-bf30-dd79cab266b5" path="/var/lib/kubelet/pods/7907b3e6-ba65-4bc8-bf30-dd79cab266b5/volumes" Apr 02 14:09:14 crc kubenswrapper[4732]: I0402 14:09:14.692702 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90e2fd4b-3c18-4414-93ea-6433d6c52f80" path="/var/lib/kubelet/pods/90e2fd4b-3c18-4414-93ea-6433d6c52f80/volumes" Apr 02 14:09:16 crc kubenswrapper[4732]: I0402 14:09:16.030246 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-kbmg5"] Apr 02 14:09:16 crc kubenswrapper[4732]: I0402 14:09:16.043256 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-kbmg5"] Apr 02 14:09:16 crc kubenswrapper[4732]: I0402 14:09:16.692912 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2e69ec9-8296-4cbf-a039-82cd34ed3d72" path="/var/lib/kubelet/pods/e2e69ec9-8296-4cbf-a039-82cd34ed3d72/volumes" Apr 02 14:09:23 crc kubenswrapper[4732]: I0402 14:09:23.037099 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-cs4gt"] Apr 02 14:09:23 crc kubenswrapper[4732]: I0402 14:09:23.049655 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-cs4gt"] Apr 02 14:09:24 crc kubenswrapper[4732]: I0402 14:09:24.694138 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3eee308-f9e6-4475-a2a4-2116af760963" path="/var/lib/kubelet/pods/e3eee308-f9e6-4475-a2a4-2116af760963/volumes" Apr 02 14:09:27 crc kubenswrapper[4732]: I0402 14:09:27.680217 4732 scope.go:117] "RemoveContainer" containerID="0f81e9ff2a9337c334b84f07419534908b39d0251d1366b4d7838bb8bbcae383" Apr 02 14:09:27 crc kubenswrapper[4732]: E0402 
14:09:27.681050 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" Apr 02 14:09:38 crc kubenswrapper[4732]: I0402 14:09:38.681010 4732 scope.go:117] "RemoveContainer" containerID="0f81e9ff2a9337c334b84f07419534908b39d0251d1366b4d7838bb8bbcae383" Apr 02 14:09:38 crc kubenswrapper[4732]: E0402 14:09:38.681865 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" Apr 02 14:09:41 crc kubenswrapper[4732]: I0402 14:09:41.043161 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-rvpl9"] Apr 02 14:09:41 crc kubenswrapper[4732]: I0402 14:09:41.067859 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-4269-account-create-update-nv66q"] Apr 02 14:09:41 crc kubenswrapper[4732]: I0402 14:09:41.079168 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-5e24-account-create-update-rbhwq"] Apr 02 14:09:41 crc kubenswrapper[4732]: I0402 14:09:41.086569 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-lc9bc"] Apr 02 14:09:41 crc kubenswrapper[4732]: I0402 14:09:41.094467 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-de5a-account-create-update-x9fbv"] Apr 02 14:09:41 crc 
kubenswrapper[4732]: I0402 14:09:41.102315 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-4269-account-create-update-nv66q"] Apr 02 14:09:41 crc kubenswrapper[4732]: I0402 14:09:41.111122 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-de5a-account-create-update-x9fbv"] Apr 02 14:09:41 crc kubenswrapper[4732]: I0402 14:09:41.121882 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-rvpl9"] Apr 02 14:09:41 crc kubenswrapper[4732]: I0402 14:09:41.130214 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-lc9bc"] Apr 02 14:09:41 crc kubenswrapper[4732]: I0402 14:09:41.138646 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-5e24-account-create-update-rbhwq"] Apr 02 14:09:41 crc kubenswrapper[4732]: I0402 14:09:41.147702 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-blk7m"] Apr 02 14:09:41 crc kubenswrapper[4732]: I0402 14:09:41.156385 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-blk7m"] Apr 02 14:09:42 crc kubenswrapper[4732]: I0402 14:09:42.690682 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2055195c-7029-4ba4-b6b1-7e717991cbb3" path="/var/lib/kubelet/pods/2055195c-7029-4ba4-b6b1-7e717991cbb3/volumes" Apr 02 14:09:42 crc kubenswrapper[4732]: I0402 14:09:42.691280 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="354dc245-41f8-48ca-8fef-ef66ea015690" path="/var/lib/kubelet/pods/354dc245-41f8-48ca-8fef-ef66ea015690/volumes" Apr 02 14:09:42 crc kubenswrapper[4732]: I0402 14:09:42.691811 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50683249-0922-48bc-9bea-f6ce81e3d192" path="/var/lib/kubelet/pods/50683249-0922-48bc-9bea-f6ce81e3d192/volumes" Apr 02 14:09:42 crc kubenswrapper[4732]: I0402 14:09:42.692418 4732 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="61ff3e0e-f82e-4130-afa8-739689043221" path="/var/lib/kubelet/pods/61ff3e0e-f82e-4130-afa8-739689043221/volumes" Apr 02 14:09:42 crc kubenswrapper[4732]: I0402 14:09:42.693504 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ed7061f-5bb0-4113-851f-45cb0af3e77d" path="/var/lib/kubelet/pods/7ed7061f-5bb0-4113-851f-45cb0af3e77d/volumes" Apr 02 14:09:42 crc kubenswrapper[4732]: I0402 14:09:42.694117 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7e9d6ea-5e58-46fb-a241-6db82d0abd15" path="/var/lib/kubelet/pods/f7e9d6ea-5e58-46fb-a241-6db82d0abd15/volumes" Apr 02 14:09:47 crc kubenswrapper[4732]: I0402 14:09:47.043538 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-qgp9c"] Apr 02 14:09:47 crc kubenswrapper[4732]: I0402 14:09:47.052578 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-qgp9c"] Apr 02 14:09:48 crc kubenswrapper[4732]: I0402 14:09:48.699425 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="756f0330-2838-4d2f-a92f-739ed4acab76" path="/var/lib/kubelet/pods/756f0330-2838-4d2f-a92f-739ed4acab76/volumes" Apr 02 14:09:49 crc kubenswrapper[4732]: I0402 14:09:49.435574 4732 scope.go:117] "RemoveContainer" containerID="07c87ac3e88714a95605541c92131c731ee3697e162d5ee5c1ba6b8627b54434" Apr 02 14:09:49 crc kubenswrapper[4732]: I0402 14:09:49.464314 4732 scope.go:117] "RemoveContainer" containerID="acbd44376914a5a7cce475daf23bfe53fa628560b0bb8ce2c80bec24acc6ab85" Apr 02 14:09:49 crc kubenswrapper[4732]: I0402 14:09:49.509808 4732 scope.go:117] "RemoveContainer" containerID="74926c31f3a2ffe4338b1e5734feeaa65c342309cbe58b3350a039dbe94db109" Apr 02 14:09:49 crc kubenswrapper[4732]: I0402 14:09:49.555955 4732 scope.go:117] "RemoveContainer" containerID="0a0032734f5fd52e26c5e0604d22f129133838bce5b2865c9a8ace616d48d77d" Apr 02 14:09:49 crc kubenswrapper[4732]: I0402 14:09:49.610313 4732 
scope.go:117] "RemoveContainer" containerID="b3657c45bbcc5ec422c65b30f15d71eaea3a630e99da5c680870af5be98a1057" Apr 02 14:09:49 crc kubenswrapper[4732]: I0402 14:09:49.639696 4732 scope.go:117] "RemoveContainer" containerID="bb8ea28ebf5e33ced4c2f8486f8dfe5524a7abce1a62b5e7fb5262ef8252196d" Apr 02 14:09:49 crc kubenswrapper[4732]: I0402 14:09:49.681184 4732 scope.go:117] "RemoveContainer" containerID="90301a5b3770820faf4773b54429c8ae265f19bd4a26fa5f5d0b97834e08f405" Apr 02 14:09:49 crc kubenswrapper[4732]: I0402 14:09:49.709192 4732 scope.go:117] "RemoveContainer" containerID="f49b0d66e2113c8b4a9bfd08a0ad2f5202af6eb438aeb031770c1c903d8ee3b9" Apr 02 14:09:49 crc kubenswrapper[4732]: I0402 14:09:49.731696 4732 scope.go:117] "RemoveContainer" containerID="d1d0b2dec54ab79b1b90f5a344b79ec87fdccf2c932b5d875a946933032fd5a5" Apr 02 14:09:49 crc kubenswrapper[4732]: I0402 14:09:49.754942 4732 scope.go:117] "RemoveContainer" containerID="16ed54a1e0a3aae64c8783d62ca5fc2d1642d99087645ba7ab6ae31450c56103" Apr 02 14:09:49 crc kubenswrapper[4732]: I0402 14:09:49.776202 4732 scope.go:117] "RemoveContainer" containerID="7bbca9f232721293aae826fe9b616df67da3b0c59428f8612b9b1bc2b8128d6b" Apr 02 14:09:49 crc kubenswrapper[4732]: I0402 14:09:49.794258 4732 scope.go:117] "RemoveContainer" containerID="4b0d9f22e57ea5804e170db48b53d2c96be751433714eeedd5de906812ae372a" Apr 02 14:09:49 crc kubenswrapper[4732]: I0402 14:09:49.811652 4732 scope.go:117] "RemoveContainer" containerID="96ef04e158bb2a83cd3bf3c9ac26655c113bf9e195ec6e34cfd8766d4564a311" Apr 02 14:09:49 crc kubenswrapper[4732]: I0402 14:09:49.826908 4732 scope.go:117] "RemoveContainer" containerID="339e0eb5b86d64be8bcd69bc05ae280f085ab5cd74e7c3b14ac096e9e752d90c" Apr 02 14:09:49 crc kubenswrapper[4732]: I0402 14:09:49.846288 4732 scope.go:117] "RemoveContainer" containerID="f056e082766505ebdab12624db3986b625ea9b0998b7eb20a1651ee5747de15d" Apr 02 14:09:53 crc kubenswrapper[4732]: I0402 14:09:53.680420 4732 scope.go:117] 
"RemoveContainer" containerID="0f81e9ff2a9337c334b84f07419534908b39d0251d1366b4d7838bb8bbcae383" Apr 02 14:09:53 crc kubenswrapper[4732]: E0402 14:09:53.681229 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" Apr 02 14:09:56 crc kubenswrapper[4732]: I0402 14:09:56.006936 4732 generic.go:334] "Generic (PLEG): container finished" podID="240ff67d-47d5-4b2e-b744-e0e2332a9496" containerID="7cb4e07ba31e7603f8a0198ac3082697817e00b2d659fe39efbfe72c4a6c395e" exitCode=0 Apr 02 14:09:56 crc kubenswrapper[4732]: I0402 14:09:56.007059 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5lm4q" event={"ID":"240ff67d-47d5-4b2e-b744-e0e2332a9496","Type":"ContainerDied","Data":"7cb4e07ba31e7603f8a0198ac3082697817e00b2d659fe39efbfe72c4a6c395e"} Apr 02 14:09:56 crc kubenswrapper[4732]: I0402 14:09:56.517737 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-9f57ff6c-7m8sr" podUID="7f6ffca1-ce91-4e20-8cbc-38a3eab1616e" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Apr 02 14:09:57 crc kubenswrapper[4732]: I0402 14:09:57.395859 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5lm4q" Apr 02 14:09:57 crc kubenswrapper[4732]: I0402 14:09:57.402475 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/240ff67d-47d5-4b2e-b744-e0e2332a9496-ssh-key-openstack-edpm-ipam\") pod \"240ff67d-47d5-4b2e-b744-e0e2332a9496\" (UID: \"240ff67d-47d5-4b2e-b744-e0e2332a9496\") " Apr 02 14:09:57 crc kubenswrapper[4732]: I0402 14:09:57.402597 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k79sk\" (UniqueName: \"kubernetes.io/projected/240ff67d-47d5-4b2e-b744-e0e2332a9496-kube-api-access-k79sk\") pod \"240ff67d-47d5-4b2e-b744-e0e2332a9496\" (UID: \"240ff67d-47d5-4b2e-b744-e0e2332a9496\") " Apr 02 14:09:57 crc kubenswrapper[4732]: I0402 14:09:57.402681 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/240ff67d-47d5-4b2e-b744-e0e2332a9496-inventory\") pod \"240ff67d-47d5-4b2e-b744-e0e2332a9496\" (UID: \"240ff67d-47d5-4b2e-b744-e0e2332a9496\") " Apr 02 14:09:57 crc kubenswrapper[4732]: I0402 14:09:57.407750 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/240ff67d-47d5-4b2e-b744-e0e2332a9496-kube-api-access-k79sk" (OuterVolumeSpecName: "kube-api-access-k79sk") pod "240ff67d-47d5-4b2e-b744-e0e2332a9496" (UID: "240ff67d-47d5-4b2e-b744-e0e2332a9496"). InnerVolumeSpecName "kube-api-access-k79sk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:09:57 crc kubenswrapper[4732]: I0402 14:09:57.439806 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/240ff67d-47d5-4b2e-b744-e0e2332a9496-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "240ff67d-47d5-4b2e-b744-e0e2332a9496" (UID: "240ff67d-47d5-4b2e-b744-e0e2332a9496"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:09:57 crc kubenswrapper[4732]: I0402 14:09:57.444460 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/240ff67d-47d5-4b2e-b744-e0e2332a9496-inventory" (OuterVolumeSpecName: "inventory") pod "240ff67d-47d5-4b2e-b744-e0e2332a9496" (UID: "240ff67d-47d5-4b2e-b744-e0e2332a9496"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:09:57 crc kubenswrapper[4732]: I0402 14:09:57.505588 4732 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/240ff67d-47d5-4b2e-b744-e0e2332a9496-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Apr 02 14:09:57 crc kubenswrapper[4732]: I0402 14:09:57.505627 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k79sk\" (UniqueName: \"kubernetes.io/projected/240ff67d-47d5-4b2e-b744-e0e2332a9496-kube-api-access-k79sk\") on node \"crc\" DevicePath \"\"" Apr 02 14:09:57 crc kubenswrapper[4732]: I0402 14:09:57.505637 4732 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/240ff67d-47d5-4b2e-b744-e0e2332a9496-inventory\") on node \"crc\" DevicePath \"\"" Apr 02 14:09:58 crc kubenswrapper[4732]: I0402 14:09:58.033920 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5lm4q" 
event={"ID":"240ff67d-47d5-4b2e-b744-e0e2332a9496","Type":"ContainerDied","Data":"47c2945aca4463ca18d257ac5bfc8437ae944f173daf7d444af4931e0c8a8236"} Apr 02 14:09:58 crc kubenswrapper[4732]: I0402 14:09:58.034003 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="47c2945aca4463ca18d257ac5bfc8437ae944f173daf7d444af4931e0c8a8236" Apr 02 14:09:58 crc kubenswrapper[4732]: I0402 14:09:58.033967 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5lm4q" Apr 02 14:09:58 crc kubenswrapper[4732]: I0402 14:09:58.106864 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-z2s8z"] Apr 02 14:09:58 crc kubenswrapper[4732]: E0402 14:09:58.107439 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="240ff67d-47d5-4b2e-b744-e0e2332a9496" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Apr 02 14:09:58 crc kubenswrapper[4732]: I0402 14:09:58.107456 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="240ff67d-47d5-4b2e-b744-e0e2332a9496" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Apr 02 14:09:58 crc kubenswrapper[4732]: I0402 14:09:58.107690 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="240ff67d-47d5-4b2e-b744-e0e2332a9496" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Apr 02 14:09:58 crc kubenswrapper[4732]: I0402 14:09:58.108408 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-z2s8z" Apr 02 14:09:58 crc kubenswrapper[4732]: I0402 14:09:58.113381 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wdhd4" Apr 02 14:09:58 crc kubenswrapper[4732]: I0402 14:09:58.113562 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Apr 02 14:09:58 crc kubenswrapper[4732]: I0402 14:09:58.113737 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Apr 02 14:09:58 crc kubenswrapper[4732]: I0402 14:09:58.113866 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Apr 02 14:09:58 crc kubenswrapper[4732]: I0402 14:09:58.116323 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/21ce48db-fb4b-4086-86cc-a32f30ebd002-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-z2s8z\" (UID: \"21ce48db-fb4b-4086-86cc-a32f30ebd002\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-z2s8z" Apr 02 14:09:58 crc kubenswrapper[4732]: I0402 14:09:58.116405 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/21ce48db-fb4b-4086-86cc-a32f30ebd002-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-z2s8z\" (UID: \"21ce48db-fb4b-4086-86cc-a32f30ebd002\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-z2s8z" Apr 02 14:09:58 crc kubenswrapper[4732]: I0402 14:09:58.116440 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6snd\" (UniqueName: 
\"kubernetes.io/projected/21ce48db-fb4b-4086-86cc-a32f30ebd002-kube-api-access-q6snd\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-z2s8z\" (UID: \"21ce48db-fb4b-4086-86cc-a32f30ebd002\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-z2s8z" Apr 02 14:09:58 crc kubenswrapper[4732]: I0402 14:09:58.120684 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-z2s8z"] Apr 02 14:09:58 crc kubenswrapper[4732]: I0402 14:09:58.218952 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/21ce48db-fb4b-4086-86cc-a32f30ebd002-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-z2s8z\" (UID: \"21ce48db-fb4b-4086-86cc-a32f30ebd002\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-z2s8z" Apr 02 14:09:58 crc kubenswrapper[4732]: I0402 14:09:58.219105 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/21ce48db-fb4b-4086-86cc-a32f30ebd002-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-z2s8z\" (UID: \"21ce48db-fb4b-4086-86cc-a32f30ebd002\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-z2s8z" Apr 02 14:09:58 crc kubenswrapper[4732]: I0402 14:09:58.219157 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6snd\" (UniqueName: \"kubernetes.io/projected/21ce48db-fb4b-4086-86cc-a32f30ebd002-kube-api-access-q6snd\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-z2s8z\" (UID: \"21ce48db-fb4b-4086-86cc-a32f30ebd002\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-z2s8z" Apr 02 14:09:58 crc kubenswrapper[4732]: I0402 14:09:58.223313 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/21ce48db-fb4b-4086-86cc-a32f30ebd002-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-z2s8z\" (UID: \"21ce48db-fb4b-4086-86cc-a32f30ebd002\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-z2s8z" Apr 02 14:09:58 crc kubenswrapper[4732]: I0402 14:09:58.223701 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/21ce48db-fb4b-4086-86cc-a32f30ebd002-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-z2s8z\" (UID: \"21ce48db-fb4b-4086-86cc-a32f30ebd002\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-z2s8z" Apr 02 14:09:58 crc kubenswrapper[4732]: I0402 14:09:58.235481 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6snd\" (UniqueName: \"kubernetes.io/projected/21ce48db-fb4b-4086-86cc-a32f30ebd002-kube-api-access-q6snd\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-z2s8z\" (UID: \"21ce48db-fb4b-4086-86cc-a32f30ebd002\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-z2s8z" Apr 02 14:09:58 crc kubenswrapper[4732]: I0402 14:09:58.434363 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-z2s8z" Apr 02 14:09:58 crc kubenswrapper[4732]: I0402 14:09:58.980578 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-z2s8z"] Apr 02 14:09:59 crc kubenswrapper[4732]: I0402 14:09:59.045827 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-z2s8z" event={"ID":"21ce48db-fb4b-4086-86cc-a32f30ebd002","Type":"ContainerStarted","Data":"ab64e8615f90438282085d5f8339001479ea70796056274459a3b408351ce446"} Apr 02 14:10:00 crc kubenswrapper[4732]: I0402 14:10:00.056072 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-z2s8z" event={"ID":"21ce48db-fb4b-4086-86cc-a32f30ebd002","Type":"ContainerStarted","Data":"249c01831f4b6aecdcc6168a2223258efaba958b82867300a7192d6f41b4feb1"} Apr 02 14:10:00 crc kubenswrapper[4732]: I0402 14:10:00.080135 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-z2s8z" podStartSLOduration=1.653169971 podStartE2EDuration="2.080115101s" podCreationTimestamp="2026-04-02 14:09:58 +0000 UTC" firstStartedPulling="2026-04-02 14:09:58.976850633 +0000 UTC m=+1955.881258186" lastFinishedPulling="2026-04-02 14:09:59.403795773 +0000 UTC m=+1956.308203316" observedRunningTime="2026-04-02 14:10:00.071921787 +0000 UTC m=+1956.976329350" watchObservedRunningTime="2026-04-02 14:10:00.080115101 +0000 UTC m=+1956.984522674" Apr 02 14:10:00 crc kubenswrapper[4732]: I0402 14:10:00.134469 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29585650-2gx64"] Apr 02 14:10:00 crc kubenswrapper[4732]: I0402 14:10:00.135803 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29585650-2gx64" Apr 02 14:10:00 crc kubenswrapper[4732]: I0402 14:10:00.138284 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-69g42" Apr 02 14:10:00 crc kubenswrapper[4732]: I0402 14:10:00.138473 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 02 14:10:00 crc kubenswrapper[4732]: I0402 14:10:00.141509 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 02 14:10:00 crc kubenswrapper[4732]: I0402 14:10:00.155741 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585650-2gx64"] Apr 02 14:10:00 crc kubenswrapper[4732]: I0402 14:10:00.259518 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqzvt\" (UniqueName: \"kubernetes.io/projected/37663d23-9f83-4211-80df-a8167d95f79e-kube-api-access-sqzvt\") pod \"auto-csr-approver-29585650-2gx64\" (UID: \"37663d23-9f83-4211-80df-a8167d95f79e\") " pod="openshift-infra/auto-csr-approver-29585650-2gx64" Apr 02 14:10:00 crc kubenswrapper[4732]: I0402 14:10:00.361639 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqzvt\" (UniqueName: \"kubernetes.io/projected/37663d23-9f83-4211-80df-a8167d95f79e-kube-api-access-sqzvt\") pod \"auto-csr-approver-29585650-2gx64\" (UID: \"37663d23-9f83-4211-80df-a8167d95f79e\") " pod="openshift-infra/auto-csr-approver-29585650-2gx64" Apr 02 14:10:00 crc kubenswrapper[4732]: I0402 14:10:00.394420 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqzvt\" (UniqueName: \"kubernetes.io/projected/37663d23-9f83-4211-80df-a8167d95f79e-kube-api-access-sqzvt\") pod \"auto-csr-approver-29585650-2gx64\" (UID: \"37663d23-9f83-4211-80df-a8167d95f79e\") " 
pod="openshift-infra/auto-csr-approver-29585650-2gx64" Apr 02 14:10:00 crc kubenswrapper[4732]: I0402 14:10:00.458004 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585650-2gx64" Apr 02 14:10:00 crc kubenswrapper[4732]: I0402 14:10:00.930428 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585650-2gx64"] Apr 02 14:10:00 crc kubenswrapper[4732]: W0402 14:10:00.946819 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37663d23_9f83_4211_80df_a8167d95f79e.slice/crio-e8fee0fb221fe07dd2fb16def41e97d717e93a71e73771073e35aa07c558710d WatchSource:0}: Error finding container e8fee0fb221fe07dd2fb16def41e97d717e93a71e73771073e35aa07c558710d: Status 404 returned error can't find the container with id e8fee0fb221fe07dd2fb16def41e97d717e93a71e73771073e35aa07c558710d Apr 02 14:10:01 crc kubenswrapper[4732]: I0402 14:10:01.065330 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585650-2gx64" event={"ID":"37663d23-9f83-4211-80df-a8167d95f79e","Type":"ContainerStarted","Data":"e8fee0fb221fe07dd2fb16def41e97d717e93a71e73771073e35aa07c558710d"} Apr 02 14:10:03 crc kubenswrapper[4732]: I0402 14:10:03.083085 4732 generic.go:334] "Generic (PLEG): container finished" podID="37663d23-9f83-4211-80df-a8167d95f79e" containerID="069d712375aa62b6ebf7a181cff281a8b2210a138171da75928ff6006c3aa34c" exitCode=0 Apr 02 14:10:03 crc kubenswrapper[4732]: I0402 14:10:03.083362 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585650-2gx64" event={"ID":"37663d23-9f83-4211-80df-a8167d95f79e","Type":"ContainerDied","Data":"069d712375aa62b6ebf7a181cff281a8b2210a138171da75928ff6006c3aa34c"} Apr 02 14:10:04 crc kubenswrapper[4732]: I0402 14:10:04.687671 4732 scope.go:117] "RemoveContainer" 
containerID="0f81e9ff2a9337c334b84f07419534908b39d0251d1366b4d7838bb8bbcae383" Apr 02 14:10:04 crc kubenswrapper[4732]: E0402 14:10:04.688748 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" Apr 02 14:10:04 crc kubenswrapper[4732]: I0402 14:10:04.718638 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585650-2gx64" Apr 02 14:10:04 crc kubenswrapper[4732]: I0402 14:10:04.854360 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqzvt\" (UniqueName: \"kubernetes.io/projected/37663d23-9f83-4211-80df-a8167d95f79e-kube-api-access-sqzvt\") pod \"37663d23-9f83-4211-80df-a8167d95f79e\" (UID: \"37663d23-9f83-4211-80df-a8167d95f79e\") " Apr 02 14:10:04 crc kubenswrapper[4732]: I0402 14:10:04.860220 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37663d23-9f83-4211-80df-a8167d95f79e-kube-api-access-sqzvt" (OuterVolumeSpecName: "kube-api-access-sqzvt") pod "37663d23-9f83-4211-80df-a8167d95f79e" (UID: "37663d23-9f83-4211-80df-a8167d95f79e"). InnerVolumeSpecName "kube-api-access-sqzvt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:10:04 crc kubenswrapper[4732]: I0402 14:10:04.957390 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqzvt\" (UniqueName: \"kubernetes.io/projected/37663d23-9f83-4211-80df-a8167d95f79e-kube-api-access-sqzvt\") on node \"crc\" DevicePath \"\"" Apr 02 14:10:05 crc kubenswrapper[4732]: I0402 14:10:05.101925 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585650-2gx64" event={"ID":"37663d23-9f83-4211-80df-a8167d95f79e","Type":"ContainerDied","Data":"e8fee0fb221fe07dd2fb16def41e97d717e93a71e73771073e35aa07c558710d"} Apr 02 14:10:05 crc kubenswrapper[4732]: I0402 14:10:05.101970 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8fee0fb221fe07dd2fb16def41e97d717e93a71e73771073e35aa07c558710d" Apr 02 14:10:05 crc kubenswrapper[4732]: I0402 14:10:05.101977 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585650-2gx64" Apr 02 14:10:05 crc kubenswrapper[4732]: I0402 14:10:05.773464 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29585644-vgwpc"] Apr 02 14:10:05 crc kubenswrapper[4732]: I0402 14:10:05.781563 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29585644-vgwpc"] Apr 02 14:10:06 crc kubenswrapper[4732]: I0402 14:10:06.690432 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="891746ea-0d72-4bb9-aba3-a0f140bb31b4" path="/var/lib/kubelet/pods/891746ea-0d72-4bb9-aba3-a0f140bb31b4/volumes" Apr 02 14:10:15 crc kubenswrapper[4732]: I0402 14:10:15.681908 4732 scope.go:117] "RemoveContainer" containerID="0f81e9ff2a9337c334b84f07419534908b39d0251d1366b4d7838bb8bbcae383" Apr 02 14:10:15 crc kubenswrapper[4732]: E0402 14:10:15.682698 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" Apr 02 14:10:17 crc kubenswrapper[4732]: I0402 14:10:17.043687 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-89fvh"] Apr 02 14:10:17 crc kubenswrapper[4732]: I0402 14:10:17.057086 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-89fvh"] Apr 02 14:10:18 crc kubenswrapper[4732]: I0402 14:10:18.700598 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="481f6e75-b423-4c6c-a1d6-b43674481fc1" path="/var/lib/kubelet/pods/481f6e75-b423-4c6c-a1d6-b43674481fc1/volumes" Apr 02 14:10:26 crc kubenswrapper[4732]: I0402 14:10:26.680448 4732 scope.go:117] "RemoveContainer" containerID="0f81e9ff2a9337c334b84f07419534908b39d0251d1366b4d7838bb8bbcae383" Apr 02 14:10:26 crc kubenswrapper[4732]: E0402 14:10:26.681186 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" Apr 02 14:10:32 crc kubenswrapper[4732]: I0402 14:10:32.040422 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-fb5bv"] Apr 02 14:10:32 crc kubenswrapper[4732]: I0402 14:10:32.049039 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-fb5bv"] Apr 02 14:10:32 crc kubenswrapper[4732]: I0402 14:10:32.691062 4732 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="261837b5-19d6-404f-b88f-b5b6cf88ebec" path="/var/lib/kubelet/pods/261837b5-19d6-404f-b88f-b5b6cf88ebec/volumes" Apr 02 14:10:38 crc kubenswrapper[4732]: I0402 14:10:38.680425 4732 scope.go:117] "RemoveContainer" containerID="0f81e9ff2a9337c334b84f07419534908b39d0251d1366b4d7838bb8bbcae383" Apr 02 14:10:38 crc kubenswrapper[4732]: E0402 14:10:38.681446 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" Apr 02 14:10:39 crc kubenswrapper[4732]: I0402 14:10:39.032975 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-dh5g6"] Apr 02 14:10:39 crc kubenswrapper[4732]: I0402 14:10:39.044388 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-dh5g6"] Apr 02 14:10:40 crc kubenswrapper[4732]: I0402 14:10:40.691338 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64f807d9-0af7-4723-98b2-dd3cbe55df99" path="/var/lib/kubelet/pods/64f807d9-0af7-4723-98b2-dd3cbe55df99/volumes" Apr 02 14:10:41 crc kubenswrapper[4732]: I0402 14:10:41.035347 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-d54q6"] Apr 02 14:10:41 crc kubenswrapper[4732]: I0402 14:10:41.044955 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-d54q6"] Apr 02 14:10:42 crc kubenswrapper[4732]: I0402 14:10:42.691758 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34aed337-bbff-45a7-b95f-b26c95733c82" path="/var/lib/kubelet/pods/34aed337-bbff-45a7-b95f-b26c95733c82/volumes" Apr 02 
14:10:44 crc kubenswrapper[4732]: I0402 14:10:44.048458 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-jwc6l"] Apr 02 14:10:44 crc kubenswrapper[4732]: I0402 14:10:44.059295 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-jwc6l"] Apr 02 14:10:44 crc kubenswrapper[4732]: I0402 14:10:44.696411 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10ba0697-529f-41d3-a1a8-55b50ed024a2" path="/var/lib/kubelet/pods/10ba0697-529f-41d3-a1a8-55b50ed024a2/volumes" Apr 02 14:10:50 crc kubenswrapper[4732]: I0402 14:10:50.162694 4732 scope.go:117] "RemoveContainer" containerID="d8df9e280fcecedceff628b0366a8cfd007c16737c9bc8bf5c8a30e7d6f7222a" Apr 02 14:10:50 crc kubenswrapper[4732]: I0402 14:10:50.207248 4732 scope.go:117] "RemoveContainer" containerID="d653dc1b373bc973247840f58437e041e1d7e3215ec38a57c61328d8bdf9a166" Apr 02 14:10:50 crc kubenswrapper[4732]: I0402 14:10:50.255343 4732 scope.go:117] "RemoveContainer" containerID="29099b7d93412a262eb083604ee4e20a01b86f78aa60f531a53fea918e12553a" Apr 02 14:10:50 crc kubenswrapper[4732]: I0402 14:10:50.292125 4732 scope.go:117] "RemoveContainer" containerID="640881bf9aa75a112cdc7f52c1419406cdb9d754515996f2df4a58ab10015672" Apr 02 14:10:50 crc kubenswrapper[4732]: I0402 14:10:50.321216 4732 scope.go:117] "RemoveContainer" containerID="84418e9105b2ad1c6b30a264558b8d882276562246909a9034ee9b5e81184d7a" Apr 02 14:10:50 crc kubenswrapper[4732]: I0402 14:10:50.381541 4732 scope.go:117] "RemoveContainer" containerID="c1ff334494eebb50605f5ce8c2393c63ab95edea02fd9a390bfa160974a7444f" Apr 02 14:10:53 crc kubenswrapper[4732]: I0402 14:10:53.680572 4732 scope.go:117] "RemoveContainer" containerID="0f81e9ff2a9337c334b84f07419534908b39d0251d1366b4d7838bb8bbcae383" Apr 02 14:10:53 crc kubenswrapper[4732]: E0402 14:10:53.682028 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" Apr 02 14:11:02 crc kubenswrapper[4732]: I0402 14:11:02.614039 4732 generic.go:334] "Generic (PLEG): container finished" podID="21ce48db-fb4b-4086-86cc-a32f30ebd002" containerID="249c01831f4b6aecdcc6168a2223258efaba958b82867300a7192d6f41b4feb1" exitCode=0 Apr 02 14:11:02 crc kubenswrapper[4732]: I0402 14:11:02.614119 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-z2s8z" event={"ID":"21ce48db-fb4b-4086-86cc-a32f30ebd002","Type":"ContainerDied","Data":"249c01831f4b6aecdcc6168a2223258efaba958b82867300a7192d6f41b4feb1"} Apr 02 14:11:04 crc kubenswrapper[4732]: I0402 14:11:04.695043 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-z2s8z" Apr 02 14:11:04 crc kubenswrapper[4732]: I0402 14:11:04.723581 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/21ce48db-fb4b-4086-86cc-a32f30ebd002-inventory\") pod \"21ce48db-fb4b-4086-86cc-a32f30ebd002\" (UID: \"21ce48db-fb4b-4086-86cc-a32f30ebd002\") " Apr 02 14:11:04 crc kubenswrapper[4732]: I0402 14:11:04.723677 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6snd\" (UniqueName: \"kubernetes.io/projected/21ce48db-fb4b-4086-86cc-a32f30ebd002-kube-api-access-q6snd\") pod \"21ce48db-fb4b-4086-86cc-a32f30ebd002\" (UID: \"21ce48db-fb4b-4086-86cc-a32f30ebd002\") " Apr 02 14:11:04 crc kubenswrapper[4732]: I0402 14:11:04.723978 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/21ce48db-fb4b-4086-86cc-a32f30ebd002-ssh-key-openstack-edpm-ipam\") pod \"21ce48db-fb4b-4086-86cc-a32f30ebd002\" (UID: \"21ce48db-fb4b-4086-86cc-a32f30ebd002\") " Apr 02 14:11:04 crc kubenswrapper[4732]: I0402 14:11:04.730473 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21ce48db-fb4b-4086-86cc-a32f30ebd002-kube-api-access-q6snd" (OuterVolumeSpecName: "kube-api-access-q6snd") pod "21ce48db-fb4b-4086-86cc-a32f30ebd002" (UID: "21ce48db-fb4b-4086-86cc-a32f30ebd002"). InnerVolumeSpecName "kube-api-access-q6snd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:11:04 crc kubenswrapper[4732]: I0402 14:11:04.754709 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21ce48db-fb4b-4086-86cc-a32f30ebd002-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "21ce48db-fb4b-4086-86cc-a32f30ebd002" (UID: "21ce48db-fb4b-4086-86cc-a32f30ebd002"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:11:04 crc kubenswrapper[4732]: I0402 14:11:04.766576 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21ce48db-fb4b-4086-86cc-a32f30ebd002-inventory" (OuterVolumeSpecName: "inventory") pod "21ce48db-fb4b-4086-86cc-a32f30ebd002" (UID: "21ce48db-fb4b-4086-86cc-a32f30ebd002"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:11:04 crc kubenswrapper[4732]: I0402 14:11:04.827028 4732 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/21ce48db-fb4b-4086-86cc-a32f30ebd002-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Apr 02 14:11:04 crc kubenswrapper[4732]: I0402 14:11:04.827061 4732 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/21ce48db-fb4b-4086-86cc-a32f30ebd002-inventory\") on node \"crc\" DevicePath \"\"" Apr 02 14:11:04 crc kubenswrapper[4732]: I0402 14:11:04.827070 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6snd\" (UniqueName: \"kubernetes.io/projected/21ce48db-fb4b-4086-86cc-a32f30ebd002-kube-api-access-q6snd\") on node \"crc\" DevicePath \"\"" Apr 02 14:11:05 crc kubenswrapper[4732]: I0402 14:11:05.260174 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-z2s8z" 
event={"ID":"21ce48db-fb4b-4086-86cc-a32f30ebd002","Type":"ContainerDied","Data":"ab64e8615f90438282085d5f8339001479ea70796056274459a3b408351ce446"} Apr 02 14:11:05 crc kubenswrapper[4732]: I0402 14:11:05.260213 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab64e8615f90438282085d5f8339001479ea70796056274459a3b408351ce446" Apr 02 14:11:05 crc kubenswrapper[4732]: I0402 14:11:05.260266 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-z2s8z" Apr 02 14:11:05 crc kubenswrapper[4732]: I0402 14:11:05.300364 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7z2sv"] Apr 02 14:11:05 crc kubenswrapper[4732]: E0402 14:11:05.300797 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21ce48db-fb4b-4086-86cc-a32f30ebd002" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Apr 02 14:11:05 crc kubenswrapper[4732]: I0402 14:11:05.300816 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="21ce48db-fb4b-4086-86cc-a32f30ebd002" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Apr 02 14:11:05 crc kubenswrapper[4732]: E0402 14:11:05.300831 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37663d23-9f83-4211-80df-a8167d95f79e" containerName="oc" Apr 02 14:11:05 crc kubenswrapper[4732]: I0402 14:11:05.300839 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="37663d23-9f83-4211-80df-a8167d95f79e" containerName="oc" Apr 02 14:11:05 crc kubenswrapper[4732]: I0402 14:11:05.301046 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="37663d23-9f83-4211-80df-a8167d95f79e" containerName="oc" Apr 02 14:11:05 crc kubenswrapper[4732]: I0402 14:11:05.301075 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="21ce48db-fb4b-4086-86cc-a32f30ebd002" 
containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Apr 02 14:11:05 crc kubenswrapper[4732]: I0402 14:11:05.301730 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7z2sv" Apr 02 14:11:05 crc kubenswrapper[4732]: I0402 14:11:05.306697 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wdhd4" Apr 02 14:11:05 crc kubenswrapper[4732]: I0402 14:11:05.306907 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Apr 02 14:11:05 crc kubenswrapper[4732]: I0402 14:11:05.307162 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Apr 02 14:11:05 crc kubenswrapper[4732]: I0402 14:11:05.307306 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Apr 02 14:11:05 crc kubenswrapper[4732]: I0402 14:11:05.313707 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7z2sv"] Apr 02 14:11:05 crc kubenswrapper[4732]: I0402 14:11:05.366212 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ce9af86e-92fb-4693-8af9-4d95af13b999-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7z2sv\" (UID: \"ce9af86e-92fb-4693-8af9-4d95af13b999\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7z2sv" Apr 02 14:11:05 crc kubenswrapper[4732]: I0402 14:11:05.366312 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ce9af86e-92fb-4693-8af9-4d95af13b999-ssh-key-openstack-edpm-ipam\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-7z2sv\" (UID: \"ce9af86e-92fb-4693-8af9-4d95af13b999\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7z2sv" Apr 02 14:11:05 crc kubenswrapper[4732]: I0402 14:11:05.366361 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j997z\" (UniqueName: \"kubernetes.io/projected/ce9af86e-92fb-4693-8af9-4d95af13b999-kube-api-access-j997z\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7z2sv\" (UID: \"ce9af86e-92fb-4693-8af9-4d95af13b999\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7z2sv" Apr 02 14:11:05 crc kubenswrapper[4732]: I0402 14:11:05.468264 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ce9af86e-92fb-4693-8af9-4d95af13b999-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7z2sv\" (UID: \"ce9af86e-92fb-4693-8af9-4d95af13b999\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7z2sv" Apr 02 14:11:05 crc kubenswrapper[4732]: I0402 14:11:05.468353 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j997z\" (UniqueName: \"kubernetes.io/projected/ce9af86e-92fb-4693-8af9-4d95af13b999-kube-api-access-j997z\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7z2sv\" (UID: \"ce9af86e-92fb-4693-8af9-4d95af13b999\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7z2sv" Apr 02 14:11:05 crc kubenswrapper[4732]: I0402 14:11:05.468553 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ce9af86e-92fb-4693-8af9-4d95af13b999-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7z2sv\" (UID: \"ce9af86e-92fb-4693-8af9-4d95af13b999\") " 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7z2sv" Apr 02 14:11:05 crc kubenswrapper[4732]: I0402 14:11:05.473368 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ce9af86e-92fb-4693-8af9-4d95af13b999-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7z2sv\" (UID: \"ce9af86e-92fb-4693-8af9-4d95af13b999\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7z2sv" Apr 02 14:11:05 crc kubenswrapper[4732]: I0402 14:11:05.492151 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ce9af86e-92fb-4693-8af9-4d95af13b999-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7z2sv\" (UID: \"ce9af86e-92fb-4693-8af9-4d95af13b999\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7z2sv" Apr 02 14:11:05 crc kubenswrapper[4732]: I0402 14:11:05.492450 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j997z\" (UniqueName: \"kubernetes.io/projected/ce9af86e-92fb-4693-8af9-4d95af13b999-kube-api-access-j997z\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7z2sv\" (UID: \"ce9af86e-92fb-4693-8af9-4d95af13b999\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7z2sv" Apr 02 14:11:05 crc kubenswrapper[4732]: I0402 14:11:05.675923 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7z2sv" Apr 02 14:11:06 crc kubenswrapper[4732]: I0402 14:11:06.230988 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7z2sv"] Apr 02 14:11:06 crc kubenswrapper[4732]: I0402 14:11:06.269193 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7z2sv" event={"ID":"ce9af86e-92fb-4693-8af9-4d95af13b999","Type":"ContainerStarted","Data":"1d2b68d91f0f9f3abec9d6a9195e883bbd341d35291166fa12a1e074970138ed"} Apr 02 14:11:07 crc kubenswrapper[4732]: I0402 14:11:07.279642 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7z2sv" event={"ID":"ce9af86e-92fb-4693-8af9-4d95af13b999","Type":"ContainerStarted","Data":"0fb8e49c382e983208e58f0dcc39b25942844e6f9ec999e87a0457946062c91b"} Apr 02 14:11:07 crc kubenswrapper[4732]: I0402 14:11:07.296496 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7z2sv" podStartSLOduration=1.8699887240000002 podStartE2EDuration="2.296476882s" podCreationTimestamp="2026-04-02 14:11:05 +0000 UTC" firstStartedPulling="2026-04-02 14:11:06.246085812 +0000 UTC m=+2023.150493365" lastFinishedPulling="2026-04-02 14:11:06.67257396 +0000 UTC m=+2023.576981523" observedRunningTime="2026-04-02 14:11:07.295597118 +0000 UTC m=+2024.200004681" watchObservedRunningTime="2026-04-02 14:11:07.296476882 +0000 UTC m=+2024.200884435" Apr 02 14:11:07 crc kubenswrapper[4732]: I0402 14:11:07.680307 4732 scope.go:117] "RemoveContainer" containerID="0f81e9ff2a9337c334b84f07419534908b39d0251d1366b4d7838bb8bbcae383" Apr 02 14:11:07 crc kubenswrapper[4732]: E0402 14:11:07.680637 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" Apr 02 14:11:11 crc kubenswrapper[4732]: I0402 14:11:11.320846 4732 generic.go:334] "Generic (PLEG): container finished" podID="ce9af86e-92fb-4693-8af9-4d95af13b999" containerID="0fb8e49c382e983208e58f0dcc39b25942844e6f9ec999e87a0457946062c91b" exitCode=0 Apr 02 14:11:11 crc kubenswrapper[4732]: I0402 14:11:11.320895 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7z2sv" event={"ID":"ce9af86e-92fb-4693-8af9-4d95af13b999","Type":"ContainerDied","Data":"0fb8e49c382e983208e58f0dcc39b25942844e6f9ec999e87a0457946062c91b"} Apr 02 14:11:12 crc kubenswrapper[4732]: I0402 14:11:12.778155 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7z2sv" Apr 02 14:11:12 crc kubenswrapper[4732]: I0402 14:11:12.915763 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ce9af86e-92fb-4693-8af9-4d95af13b999-ssh-key-openstack-edpm-ipam\") pod \"ce9af86e-92fb-4693-8af9-4d95af13b999\" (UID: \"ce9af86e-92fb-4693-8af9-4d95af13b999\") " Apr 02 14:11:12 crc kubenswrapper[4732]: I0402 14:11:12.915819 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ce9af86e-92fb-4693-8af9-4d95af13b999-inventory\") pod \"ce9af86e-92fb-4693-8af9-4d95af13b999\" (UID: \"ce9af86e-92fb-4693-8af9-4d95af13b999\") " Apr 02 14:11:12 crc kubenswrapper[4732]: I0402 14:11:12.916124 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j997z\" (UniqueName: \"kubernetes.io/projected/ce9af86e-92fb-4693-8af9-4d95af13b999-kube-api-access-j997z\") pod \"ce9af86e-92fb-4693-8af9-4d95af13b999\" (UID: \"ce9af86e-92fb-4693-8af9-4d95af13b999\") " Apr 02 14:11:12 crc kubenswrapper[4732]: I0402 14:11:12.922325 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce9af86e-92fb-4693-8af9-4d95af13b999-kube-api-access-j997z" (OuterVolumeSpecName: "kube-api-access-j997z") pod "ce9af86e-92fb-4693-8af9-4d95af13b999" (UID: "ce9af86e-92fb-4693-8af9-4d95af13b999"). InnerVolumeSpecName "kube-api-access-j997z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:11:12 crc kubenswrapper[4732]: I0402 14:11:12.946984 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce9af86e-92fb-4693-8af9-4d95af13b999-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ce9af86e-92fb-4693-8af9-4d95af13b999" (UID: "ce9af86e-92fb-4693-8af9-4d95af13b999"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:11:12 crc kubenswrapper[4732]: I0402 14:11:12.952822 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce9af86e-92fb-4693-8af9-4d95af13b999-inventory" (OuterVolumeSpecName: "inventory") pod "ce9af86e-92fb-4693-8af9-4d95af13b999" (UID: "ce9af86e-92fb-4693-8af9-4d95af13b999"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:11:13 crc kubenswrapper[4732]: I0402 14:11:13.018604 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j997z\" (UniqueName: \"kubernetes.io/projected/ce9af86e-92fb-4693-8af9-4d95af13b999-kube-api-access-j997z\") on node \"crc\" DevicePath \"\"" Apr 02 14:11:13 crc kubenswrapper[4732]: I0402 14:11:13.018667 4732 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ce9af86e-92fb-4693-8af9-4d95af13b999-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Apr 02 14:11:13 crc kubenswrapper[4732]: I0402 14:11:13.018684 4732 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ce9af86e-92fb-4693-8af9-4d95af13b999-inventory\") on node \"crc\" DevicePath \"\"" Apr 02 14:11:13 crc kubenswrapper[4732]: I0402 14:11:13.339448 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7z2sv" 
event={"ID":"ce9af86e-92fb-4693-8af9-4d95af13b999","Type":"ContainerDied","Data":"1d2b68d91f0f9f3abec9d6a9195e883bbd341d35291166fa12a1e074970138ed"} Apr 02 14:11:13 crc kubenswrapper[4732]: I0402 14:11:13.339822 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d2b68d91f0f9f3abec9d6a9195e883bbd341d35291166fa12a1e074970138ed" Apr 02 14:11:13 crc kubenswrapper[4732]: I0402 14:11:13.339516 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7z2sv" Apr 02 14:11:13 crc kubenswrapper[4732]: I0402 14:11:13.427138 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-49z8f"] Apr 02 14:11:13 crc kubenswrapper[4732]: E0402 14:11:13.427777 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce9af86e-92fb-4693-8af9-4d95af13b999" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Apr 02 14:11:13 crc kubenswrapper[4732]: I0402 14:11:13.427799 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce9af86e-92fb-4693-8af9-4d95af13b999" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Apr 02 14:11:13 crc kubenswrapper[4732]: I0402 14:11:13.428031 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce9af86e-92fb-4693-8af9-4d95af13b999" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Apr 02 14:11:13 crc kubenswrapper[4732]: I0402 14:11:13.428815 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-49z8f" Apr 02 14:11:13 crc kubenswrapper[4732]: I0402 14:11:13.434477 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Apr 02 14:11:13 crc kubenswrapper[4732]: I0402 14:11:13.434481 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Apr 02 14:11:13 crc kubenswrapper[4732]: I0402 14:11:13.435539 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Apr 02 14:11:13 crc kubenswrapper[4732]: I0402 14:11:13.437305 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wdhd4" Apr 02 14:11:13 crc kubenswrapper[4732]: I0402 14:11:13.439339 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-49z8f"] Apr 02 14:11:13 crc kubenswrapper[4732]: I0402 14:11:13.531403 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2072a722-772d-4379-a439-fdebfa6e219e-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-49z8f\" (UID: \"2072a722-772d-4379-a439-fdebfa6e219e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-49z8f" Apr 02 14:11:13 crc kubenswrapper[4732]: I0402 14:11:13.531471 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2072a722-772d-4379-a439-fdebfa6e219e-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-49z8f\" (UID: \"2072a722-772d-4379-a439-fdebfa6e219e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-49z8f" Apr 02 14:11:13 crc kubenswrapper[4732]: I0402 14:11:13.531565 4732 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btgs7\" (UniqueName: \"kubernetes.io/projected/2072a722-772d-4379-a439-fdebfa6e219e-kube-api-access-btgs7\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-49z8f\" (UID: \"2072a722-772d-4379-a439-fdebfa6e219e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-49z8f" Apr 02 14:11:13 crc kubenswrapper[4732]: I0402 14:11:13.633700 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2072a722-772d-4379-a439-fdebfa6e219e-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-49z8f\" (UID: \"2072a722-772d-4379-a439-fdebfa6e219e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-49z8f" Apr 02 14:11:13 crc kubenswrapper[4732]: I0402 14:11:13.633790 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2072a722-772d-4379-a439-fdebfa6e219e-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-49z8f\" (UID: \"2072a722-772d-4379-a439-fdebfa6e219e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-49z8f" Apr 02 14:11:13 crc kubenswrapper[4732]: I0402 14:11:13.633864 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btgs7\" (UniqueName: \"kubernetes.io/projected/2072a722-772d-4379-a439-fdebfa6e219e-kube-api-access-btgs7\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-49z8f\" (UID: \"2072a722-772d-4379-a439-fdebfa6e219e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-49z8f" Apr 02 14:11:13 crc kubenswrapper[4732]: I0402 14:11:13.640569 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2072a722-772d-4379-a439-fdebfa6e219e-inventory\") pod 
\"install-os-edpm-deployment-openstack-edpm-ipam-49z8f\" (UID: \"2072a722-772d-4379-a439-fdebfa6e219e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-49z8f" Apr 02 14:11:13 crc kubenswrapper[4732]: I0402 14:11:13.640692 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2072a722-772d-4379-a439-fdebfa6e219e-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-49z8f\" (UID: \"2072a722-772d-4379-a439-fdebfa6e219e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-49z8f" Apr 02 14:11:13 crc kubenswrapper[4732]: I0402 14:11:13.651026 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btgs7\" (UniqueName: \"kubernetes.io/projected/2072a722-772d-4379-a439-fdebfa6e219e-kube-api-access-btgs7\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-49z8f\" (UID: \"2072a722-772d-4379-a439-fdebfa6e219e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-49z8f" Apr 02 14:11:13 crc kubenswrapper[4732]: I0402 14:11:13.747290 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-49z8f" Apr 02 14:11:14 crc kubenswrapper[4732]: I0402 14:11:14.249587 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-49z8f"] Apr 02 14:11:14 crc kubenswrapper[4732]: I0402 14:11:14.349661 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-49z8f" event={"ID":"2072a722-772d-4379-a439-fdebfa6e219e","Type":"ContainerStarted","Data":"63c99787983910d6d0325dfea70cc3b0cab51a2abca86f7f3adad96fd5c473c7"} Apr 02 14:11:15 crc kubenswrapper[4732]: I0402 14:11:15.361129 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-49z8f" event={"ID":"2072a722-772d-4379-a439-fdebfa6e219e","Type":"ContainerStarted","Data":"766d7b475b1db701a7190f332d023475da1e7dfeadbf872e76fce8bffee9f5ae"} Apr 02 14:11:15 crc kubenswrapper[4732]: I0402 14:11:15.383644 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-49z8f" podStartSLOduration=1.987978047 podStartE2EDuration="2.38362707s" podCreationTimestamp="2026-04-02 14:11:13 +0000 UTC" firstStartedPulling="2026-04-02 14:11:14.244103359 +0000 UTC m=+2031.148510912" lastFinishedPulling="2026-04-02 14:11:14.639752382 +0000 UTC m=+2031.544159935" observedRunningTime="2026-04-02 14:11:15.377720078 +0000 UTC m=+2032.282127631" watchObservedRunningTime="2026-04-02 14:11:15.38362707 +0000 UTC m=+2032.288034623" Apr 02 14:11:16 crc kubenswrapper[4732]: I0402 14:11:16.036848 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-gcw8g"] Apr 02 14:11:16 crc kubenswrapper[4732]: I0402 14:11:16.048092 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-gcw8g"] Apr 02 14:11:16 crc kubenswrapper[4732]: I0402 14:11:16.700808 4732 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0267b36-acec-4035-9bdd-b19758f45275" path="/var/lib/kubelet/pods/d0267b36-acec-4035-9bdd-b19758f45275/volumes" Apr 02 14:11:17 crc kubenswrapper[4732]: I0402 14:11:17.040808 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-e3c0-account-create-update-5k7n4"] Apr 02 14:11:17 crc kubenswrapper[4732]: I0402 14:11:17.055982 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-e441-account-create-update-7222l"] Apr 02 14:11:17 crc kubenswrapper[4732]: I0402 14:11:17.064521 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-e3c0-account-create-update-5k7n4"] Apr 02 14:11:17 crc kubenswrapper[4732]: I0402 14:11:17.073593 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-1cfc-account-create-update-vtn6k"] Apr 02 14:11:17 crc kubenswrapper[4732]: I0402 14:11:17.081676 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-e441-account-create-update-7222l"] Apr 02 14:11:17 crc kubenswrapper[4732]: I0402 14:11:17.090090 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-1cfc-account-create-update-vtn6k"] Apr 02 14:11:18 crc kubenswrapper[4732]: I0402 14:11:18.032303 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-2bf58"] Apr 02 14:11:18 crc kubenswrapper[4732]: I0402 14:11:18.049634 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-2bf58"] Apr 02 14:11:18 crc kubenswrapper[4732]: I0402 14:11:18.059039 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-265md"] Apr 02 14:11:18 crc kubenswrapper[4732]: I0402 14:11:18.068598 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-265md"] Apr 02 14:11:18 crc kubenswrapper[4732]: I0402 14:11:18.681175 4732 scope.go:117] "RemoveContainer" 
containerID="0f81e9ff2a9337c334b84f07419534908b39d0251d1366b4d7838bb8bbcae383" Apr 02 14:11:18 crc kubenswrapper[4732]: E0402 14:11:18.681430 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" Apr 02 14:11:18 crc kubenswrapper[4732]: I0402 14:11:18.693299 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3012ae69-b011-46df-a7c0-36845efb7172" path="/var/lib/kubelet/pods/3012ae69-b011-46df-a7c0-36845efb7172/volumes" Apr 02 14:11:18 crc kubenswrapper[4732]: I0402 14:11:18.693965 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3edb7dea-2bf7-4dcc-80e0-54c59916152c" path="/var/lib/kubelet/pods/3edb7dea-2bf7-4dcc-80e0-54c59916152c/volumes" Apr 02 14:11:18 crc kubenswrapper[4732]: I0402 14:11:18.694475 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58580fa1-1d1a-4c30-9e7a-d0e464c1487f" path="/var/lib/kubelet/pods/58580fa1-1d1a-4c30-9e7a-d0e464c1487f/volumes" Apr 02 14:11:18 crc kubenswrapper[4732]: I0402 14:11:18.695033 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99e5e7a5-f66b-45c2-881c-60507bfa4c25" path="/var/lib/kubelet/pods/99e5e7a5-f66b-45c2-881c-60507bfa4c25/volumes" Apr 02 14:11:18 crc kubenswrapper[4732]: I0402 14:11:18.696211 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cca116f4-1c5a-4dc9-966d-5033ed344c2f" path="/var/lib/kubelet/pods/cca116f4-1c5a-4dc9-966d-5033ed344c2f/volumes" Apr 02 14:11:33 crc kubenswrapper[4732]: I0402 14:11:33.681313 4732 scope.go:117] "RemoveContainer" 
containerID="0f81e9ff2a9337c334b84f07419534908b39d0251d1366b4d7838bb8bbcae383" Apr 02 14:11:33 crc kubenswrapper[4732]: E0402 14:11:33.682123 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" Apr 02 14:11:44 crc kubenswrapper[4732]: I0402 14:11:44.686622 4732 scope.go:117] "RemoveContainer" containerID="0f81e9ff2a9337c334b84f07419534908b39d0251d1366b4d7838bb8bbcae383" Apr 02 14:11:44 crc kubenswrapper[4732]: E0402 14:11:44.687228 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" Apr 02 14:11:48 crc kubenswrapper[4732]: I0402 14:11:48.652331 4732 generic.go:334] "Generic (PLEG): container finished" podID="2072a722-772d-4379-a439-fdebfa6e219e" containerID="766d7b475b1db701a7190f332d023475da1e7dfeadbf872e76fce8bffee9f5ae" exitCode=0 Apr 02 14:11:48 crc kubenswrapper[4732]: I0402 14:11:48.652437 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-49z8f" event={"ID":"2072a722-772d-4379-a439-fdebfa6e219e","Type":"ContainerDied","Data":"766d7b475b1db701a7190f332d023475da1e7dfeadbf872e76fce8bffee9f5ae"} Apr 02 14:11:50 crc kubenswrapper[4732]: I0402 14:11:50.055529 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-49z8f" Apr 02 14:11:50 crc kubenswrapper[4732]: I0402 14:11:50.164191 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btgs7\" (UniqueName: \"kubernetes.io/projected/2072a722-772d-4379-a439-fdebfa6e219e-kube-api-access-btgs7\") pod \"2072a722-772d-4379-a439-fdebfa6e219e\" (UID: \"2072a722-772d-4379-a439-fdebfa6e219e\") " Apr 02 14:11:50 crc kubenswrapper[4732]: I0402 14:11:50.164332 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2072a722-772d-4379-a439-fdebfa6e219e-ssh-key-openstack-edpm-ipam\") pod \"2072a722-772d-4379-a439-fdebfa6e219e\" (UID: \"2072a722-772d-4379-a439-fdebfa6e219e\") " Apr 02 14:11:50 crc kubenswrapper[4732]: I0402 14:11:50.164487 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2072a722-772d-4379-a439-fdebfa6e219e-inventory\") pod \"2072a722-772d-4379-a439-fdebfa6e219e\" (UID: \"2072a722-772d-4379-a439-fdebfa6e219e\") " Apr 02 14:11:50 crc kubenswrapper[4732]: I0402 14:11:50.171780 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2072a722-772d-4379-a439-fdebfa6e219e-kube-api-access-btgs7" (OuterVolumeSpecName: "kube-api-access-btgs7") pod "2072a722-772d-4379-a439-fdebfa6e219e" (UID: "2072a722-772d-4379-a439-fdebfa6e219e"). InnerVolumeSpecName "kube-api-access-btgs7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:11:50 crc kubenswrapper[4732]: I0402 14:11:50.193876 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2072a722-772d-4379-a439-fdebfa6e219e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2072a722-772d-4379-a439-fdebfa6e219e" (UID: "2072a722-772d-4379-a439-fdebfa6e219e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:11:50 crc kubenswrapper[4732]: I0402 14:11:50.208914 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2072a722-772d-4379-a439-fdebfa6e219e-inventory" (OuterVolumeSpecName: "inventory") pod "2072a722-772d-4379-a439-fdebfa6e219e" (UID: "2072a722-772d-4379-a439-fdebfa6e219e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:11:50 crc kubenswrapper[4732]: I0402 14:11:50.266297 4732 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2072a722-772d-4379-a439-fdebfa6e219e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Apr 02 14:11:50 crc kubenswrapper[4732]: I0402 14:11:50.266329 4732 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2072a722-772d-4379-a439-fdebfa6e219e-inventory\") on node \"crc\" DevicePath \"\"" Apr 02 14:11:50 crc kubenswrapper[4732]: I0402 14:11:50.266339 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btgs7\" (UniqueName: \"kubernetes.io/projected/2072a722-772d-4379-a439-fdebfa6e219e-kube-api-access-btgs7\") on node \"crc\" DevicePath \"\"" Apr 02 14:11:50 crc kubenswrapper[4732]: I0402 14:11:50.543328 4732 scope.go:117] "RemoveContainer" containerID="03f75e719d435b7310f14addd9962e52eccfb55e128e4bae6bf461c6b5c968bb" Apr 02 14:11:50 crc kubenswrapper[4732]: 
I0402 14:11:50.577416 4732 scope.go:117] "RemoveContainer" containerID="d00d5c36cd8aa69862685ba7aa1b5b654c559f5edb60f47093237e2c51c84f61" Apr 02 14:11:50 crc kubenswrapper[4732]: I0402 14:11:50.641484 4732 scope.go:117] "RemoveContainer" containerID="f4284ba9e6ef759fd112fe94d44b65bc399b40334f37ac83aaf6a6a51d41137f" Apr 02 14:11:50 crc kubenswrapper[4732]: I0402 14:11:50.659418 4732 scope.go:117] "RemoveContainer" containerID="162391e4dc1cc110bb45845cf779d42622d3bba05a9caad043bdcf8f836d3d0c" Apr 02 14:11:50 crc kubenswrapper[4732]: I0402 14:11:50.674246 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-49z8f" Apr 02 14:11:50 crc kubenswrapper[4732]: I0402 14:11:50.674242 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-49z8f" event={"ID":"2072a722-772d-4379-a439-fdebfa6e219e","Type":"ContainerDied","Data":"63c99787983910d6d0325dfea70cc3b0cab51a2abca86f7f3adad96fd5c473c7"} Apr 02 14:11:50 crc kubenswrapper[4732]: I0402 14:11:50.674403 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63c99787983910d6d0325dfea70cc3b0cab51a2abca86f7f3adad96fd5c473c7" Apr 02 14:11:50 crc kubenswrapper[4732]: I0402 14:11:50.686593 4732 scope.go:117] "RemoveContainer" containerID="f8f2b2490f72d792e898a25b229cd5c8e8b10d81335981da4b9f5cac497f20a2" Apr 02 14:11:50 crc kubenswrapper[4732]: I0402 14:11:50.733712 4732 scope.go:117] "RemoveContainer" containerID="5d42fac2733be04e88650d180b8d8f50e9ee46420d5a97902e599603499b0f1a" Apr 02 14:11:50 crc kubenswrapper[4732]: I0402 14:11:50.757960 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bx2x5"] Apr 02 14:11:50 crc kubenswrapper[4732]: E0402 14:11:50.758510 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2072a722-772d-4379-a439-fdebfa6e219e" 
containerName="install-os-edpm-deployment-openstack-edpm-ipam" Apr 02 14:11:50 crc kubenswrapper[4732]: I0402 14:11:50.758528 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="2072a722-772d-4379-a439-fdebfa6e219e" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Apr 02 14:11:50 crc kubenswrapper[4732]: I0402 14:11:50.758751 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="2072a722-772d-4379-a439-fdebfa6e219e" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Apr 02 14:11:50 crc kubenswrapper[4732]: I0402 14:11:50.759522 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bx2x5" Apr 02 14:11:50 crc kubenswrapper[4732]: I0402 14:11:50.761712 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wdhd4" Apr 02 14:11:50 crc kubenswrapper[4732]: I0402 14:11:50.761949 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Apr 02 14:11:50 crc kubenswrapper[4732]: I0402 14:11:50.762085 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Apr 02 14:11:50 crc kubenswrapper[4732]: I0402 14:11:50.762275 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Apr 02 14:11:50 crc kubenswrapper[4732]: I0402 14:11:50.766896 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bx2x5"] Apr 02 14:11:50 crc kubenswrapper[4732]: I0402 14:11:50.878962 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d8bb9bae-9d09-42c5-a60a-134c907db6d5-ssh-key-openstack-edpm-ipam\") pod 
\"configure-os-edpm-deployment-openstack-edpm-ipam-bx2x5\" (UID: \"d8bb9bae-9d09-42c5-a60a-134c907db6d5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bx2x5" Apr 02 14:11:50 crc kubenswrapper[4732]: I0402 14:11:50.879007 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d8bb9bae-9d09-42c5-a60a-134c907db6d5-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bx2x5\" (UID: \"d8bb9bae-9d09-42c5-a60a-134c907db6d5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bx2x5" Apr 02 14:11:50 crc kubenswrapper[4732]: I0402 14:11:50.879263 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpbxk\" (UniqueName: \"kubernetes.io/projected/d8bb9bae-9d09-42c5-a60a-134c907db6d5-kube-api-access-kpbxk\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bx2x5\" (UID: \"d8bb9bae-9d09-42c5-a60a-134c907db6d5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bx2x5" Apr 02 14:11:50 crc kubenswrapper[4732]: I0402 14:11:50.981263 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d8bb9bae-9d09-42c5-a60a-134c907db6d5-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bx2x5\" (UID: \"d8bb9bae-9d09-42c5-a60a-134c907db6d5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bx2x5" Apr 02 14:11:50 crc kubenswrapper[4732]: I0402 14:11:50.981326 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d8bb9bae-9d09-42c5-a60a-134c907db6d5-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bx2x5\" (UID: \"d8bb9bae-9d09-42c5-a60a-134c907db6d5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bx2x5" Apr 02 
14:11:50 crc kubenswrapper[4732]: I0402 14:11:50.981409 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpbxk\" (UniqueName: \"kubernetes.io/projected/d8bb9bae-9d09-42c5-a60a-134c907db6d5-kube-api-access-kpbxk\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bx2x5\" (UID: \"d8bb9bae-9d09-42c5-a60a-134c907db6d5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bx2x5" Apr 02 14:11:50 crc kubenswrapper[4732]: I0402 14:11:50.987309 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d8bb9bae-9d09-42c5-a60a-134c907db6d5-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bx2x5\" (UID: \"d8bb9bae-9d09-42c5-a60a-134c907db6d5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bx2x5" Apr 02 14:11:50 crc kubenswrapper[4732]: I0402 14:11:50.987418 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d8bb9bae-9d09-42c5-a60a-134c907db6d5-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bx2x5\" (UID: \"d8bb9bae-9d09-42c5-a60a-134c907db6d5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bx2x5" Apr 02 14:11:51 crc kubenswrapper[4732]: I0402 14:11:51.004465 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpbxk\" (UniqueName: \"kubernetes.io/projected/d8bb9bae-9d09-42c5-a60a-134c907db6d5-kube-api-access-kpbxk\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bx2x5\" (UID: \"d8bb9bae-9d09-42c5-a60a-134c907db6d5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bx2x5" Apr 02 14:11:51 crc kubenswrapper[4732]: I0402 14:11:51.076790 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bx2x5" Apr 02 14:11:51 crc kubenswrapper[4732]: I0402 14:11:51.631072 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bx2x5"] Apr 02 14:11:51 crc kubenswrapper[4732]: I0402 14:11:51.704075 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bx2x5" event={"ID":"d8bb9bae-9d09-42c5-a60a-134c907db6d5","Type":"ContainerStarted","Data":"ac303cf1b41acbc0d058895a4c63eb952f5ba890857836730a484c061336861b"} Apr 02 14:11:52 crc kubenswrapper[4732]: I0402 14:11:52.716063 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bx2x5" event={"ID":"d8bb9bae-9d09-42c5-a60a-134c907db6d5","Type":"ContainerStarted","Data":"2a9972034992dd9ab87c672ca3a762bb50d86c8d0c21ca6dea8cf30cae69ed0f"} Apr 02 14:11:52 crc kubenswrapper[4732]: I0402 14:11:52.736899 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bx2x5" podStartSLOduration=2.3542487 podStartE2EDuration="2.736878076s" podCreationTimestamp="2026-04-02 14:11:50 +0000 UTC" firstStartedPulling="2026-04-02 14:11:51.633048323 +0000 UTC m=+2068.537455896" lastFinishedPulling="2026-04-02 14:11:52.015677719 +0000 UTC m=+2068.920085272" observedRunningTime="2026-04-02 14:11:52.731756706 +0000 UTC m=+2069.636164259" watchObservedRunningTime="2026-04-02 14:11:52.736878076 +0000 UTC m=+2069.641285639" Apr 02 14:11:57 crc kubenswrapper[4732]: I0402 14:11:57.044434 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-lcftj"] Apr 02 14:11:57 crc kubenswrapper[4732]: I0402 14:11:57.053507 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-lcftj"] Apr 02 14:11:58 crc kubenswrapper[4732]: 
I0402 14:11:58.689673 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d186e696-8cc5-4dac-b6cb-b9a5530bc57e" path="/var/lib/kubelet/pods/d186e696-8cc5-4dac-b6cb-b9a5530bc57e/volumes" Apr 02 14:11:59 crc kubenswrapper[4732]: I0402 14:11:59.680854 4732 scope.go:117] "RemoveContainer" containerID="0f81e9ff2a9337c334b84f07419534908b39d0251d1366b4d7838bb8bbcae383" Apr 02 14:11:59 crc kubenswrapper[4732]: E0402 14:11:59.681430 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" Apr 02 14:12:00 crc kubenswrapper[4732]: I0402 14:12:00.131466 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29585652-6p5dx"] Apr 02 14:12:00 crc kubenswrapper[4732]: I0402 14:12:00.133039 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29585652-6p5dx" Apr 02 14:12:00 crc kubenswrapper[4732]: I0402 14:12:00.136882 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-69g42" Apr 02 14:12:00 crc kubenswrapper[4732]: I0402 14:12:00.137158 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 02 14:12:00 crc kubenswrapper[4732]: I0402 14:12:00.143188 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585652-6p5dx"] Apr 02 14:12:00 crc kubenswrapper[4732]: I0402 14:12:00.146816 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 02 14:12:00 crc kubenswrapper[4732]: I0402 14:12:00.268919 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk4cb\" (UniqueName: \"kubernetes.io/projected/aa9c5076-d469-46c7-9467-8d7ee80a71ff-kube-api-access-rk4cb\") pod \"auto-csr-approver-29585652-6p5dx\" (UID: \"aa9c5076-d469-46c7-9467-8d7ee80a71ff\") " pod="openshift-infra/auto-csr-approver-29585652-6p5dx" Apr 02 14:12:00 crc kubenswrapper[4732]: I0402 14:12:00.371144 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rk4cb\" (UniqueName: \"kubernetes.io/projected/aa9c5076-d469-46c7-9467-8d7ee80a71ff-kube-api-access-rk4cb\") pod \"auto-csr-approver-29585652-6p5dx\" (UID: \"aa9c5076-d469-46c7-9467-8d7ee80a71ff\") " pod="openshift-infra/auto-csr-approver-29585652-6p5dx" Apr 02 14:12:00 crc kubenswrapper[4732]: I0402 14:12:00.395794 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk4cb\" (UniqueName: \"kubernetes.io/projected/aa9c5076-d469-46c7-9467-8d7ee80a71ff-kube-api-access-rk4cb\") pod \"auto-csr-approver-29585652-6p5dx\" (UID: \"aa9c5076-d469-46c7-9467-8d7ee80a71ff\") " 
pod="openshift-infra/auto-csr-approver-29585652-6p5dx" Apr 02 14:12:00 crc kubenswrapper[4732]: I0402 14:12:00.453383 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585652-6p5dx" Apr 02 14:12:00 crc kubenswrapper[4732]: I0402 14:12:00.929293 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585652-6p5dx"] Apr 02 14:12:01 crc kubenswrapper[4732]: I0402 14:12:01.802630 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585652-6p5dx" event={"ID":"aa9c5076-d469-46c7-9467-8d7ee80a71ff","Type":"ContainerStarted","Data":"cee1088eff785e7b8fe5ff29112e438624e53b89bb1108fecffb9c17665e2a9b"} Apr 02 14:12:02 crc kubenswrapper[4732]: I0402 14:12:02.813042 4732 generic.go:334] "Generic (PLEG): container finished" podID="aa9c5076-d469-46c7-9467-8d7ee80a71ff" containerID="3ad43324fbecf368f6fcc06a0ab1f3f5db739bf2832b0b8f34d462fb20763345" exitCode=0 Apr 02 14:12:02 crc kubenswrapper[4732]: I0402 14:12:02.813210 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585652-6p5dx" event={"ID":"aa9c5076-d469-46c7-9467-8d7ee80a71ff","Type":"ContainerDied","Data":"3ad43324fbecf368f6fcc06a0ab1f3f5db739bf2832b0b8f34d462fb20763345"} Apr 02 14:12:04 crc kubenswrapper[4732]: I0402 14:12:04.468746 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29585652-6p5dx" Apr 02 14:12:04 crc kubenswrapper[4732]: I0402 14:12:04.476255 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rk4cb\" (UniqueName: \"kubernetes.io/projected/aa9c5076-d469-46c7-9467-8d7ee80a71ff-kube-api-access-rk4cb\") pod \"aa9c5076-d469-46c7-9467-8d7ee80a71ff\" (UID: \"aa9c5076-d469-46c7-9467-8d7ee80a71ff\") " Apr 02 14:12:04 crc kubenswrapper[4732]: I0402 14:12:04.485186 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa9c5076-d469-46c7-9467-8d7ee80a71ff-kube-api-access-rk4cb" (OuterVolumeSpecName: "kube-api-access-rk4cb") pod "aa9c5076-d469-46c7-9467-8d7ee80a71ff" (UID: "aa9c5076-d469-46c7-9467-8d7ee80a71ff"). InnerVolumeSpecName "kube-api-access-rk4cb". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:12:04 crc kubenswrapper[4732]: I0402 14:12:04.577796 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rk4cb\" (UniqueName: \"kubernetes.io/projected/aa9c5076-d469-46c7-9467-8d7ee80a71ff-kube-api-access-rk4cb\") on node \"crc\" DevicePath \"\"" Apr 02 14:12:04 crc kubenswrapper[4732]: I0402 14:12:04.834765 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585652-6p5dx" event={"ID":"aa9c5076-d469-46c7-9467-8d7ee80a71ff","Type":"ContainerDied","Data":"cee1088eff785e7b8fe5ff29112e438624e53b89bb1108fecffb9c17665e2a9b"} Apr 02 14:12:04 crc kubenswrapper[4732]: I0402 14:12:04.834812 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cee1088eff785e7b8fe5ff29112e438624e53b89bb1108fecffb9c17665e2a9b" Apr 02 14:12:04 crc kubenswrapper[4732]: I0402 14:12:04.834834 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29585652-6p5dx" Apr 02 14:12:05 crc kubenswrapper[4732]: I0402 14:12:05.537764 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29585646-ndcmk"] Apr 02 14:12:05 crc kubenswrapper[4732]: I0402 14:12:05.547305 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29585646-ndcmk"] Apr 02 14:12:06 crc kubenswrapper[4732]: I0402 14:12:06.692519 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5484215b-23cc-462c-bfce-d0ad533381b7" path="/var/lib/kubelet/pods/5484215b-23cc-462c-bfce-d0ad533381b7/volumes" Apr 02 14:12:13 crc kubenswrapper[4732]: I0402 14:12:13.680577 4732 scope.go:117] "RemoveContainer" containerID="0f81e9ff2a9337c334b84f07419534908b39d0251d1366b4d7838bb8bbcae383" Apr 02 14:12:13 crc kubenswrapper[4732]: E0402 14:12:13.681286 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" Apr 02 14:12:26 crc kubenswrapper[4732]: I0402 14:12:26.681427 4732 scope.go:117] "RemoveContainer" containerID="0f81e9ff2a9337c334b84f07419534908b39d0251d1366b4d7838bb8bbcae383" Apr 02 14:12:26 crc kubenswrapper[4732]: E0402 14:12:26.682314 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" 
podUID="38409e5e-4545-49da-8f6c-4bfb30582878" Apr 02 14:12:34 crc kubenswrapper[4732]: I0402 14:12:34.104496 4732 generic.go:334] "Generic (PLEG): container finished" podID="d8bb9bae-9d09-42c5-a60a-134c907db6d5" containerID="2a9972034992dd9ab87c672ca3a762bb50d86c8d0c21ca6dea8cf30cae69ed0f" exitCode=0 Apr 02 14:12:34 crc kubenswrapper[4732]: I0402 14:12:34.104597 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bx2x5" event={"ID":"d8bb9bae-9d09-42c5-a60a-134c907db6d5","Type":"ContainerDied","Data":"2a9972034992dd9ab87c672ca3a762bb50d86c8d0c21ca6dea8cf30cae69ed0f"} Apr 02 14:12:35 crc kubenswrapper[4732]: I0402 14:12:35.560431 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bx2x5" Apr 02 14:12:35 crc kubenswrapper[4732]: I0402 14:12:35.680400 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpbxk\" (UniqueName: \"kubernetes.io/projected/d8bb9bae-9d09-42c5-a60a-134c907db6d5-kube-api-access-kpbxk\") pod \"d8bb9bae-9d09-42c5-a60a-134c907db6d5\" (UID: \"d8bb9bae-9d09-42c5-a60a-134c907db6d5\") " Apr 02 14:12:35 crc kubenswrapper[4732]: I0402 14:12:35.681634 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d8bb9bae-9d09-42c5-a60a-134c907db6d5-ssh-key-openstack-edpm-ipam\") pod \"d8bb9bae-9d09-42c5-a60a-134c907db6d5\" (UID: \"d8bb9bae-9d09-42c5-a60a-134c907db6d5\") " Apr 02 14:12:35 crc kubenswrapper[4732]: I0402 14:12:35.681851 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d8bb9bae-9d09-42c5-a60a-134c907db6d5-inventory\") pod \"d8bb9bae-9d09-42c5-a60a-134c907db6d5\" (UID: \"d8bb9bae-9d09-42c5-a60a-134c907db6d5\") " Apr 02 14:12:35 crc kubenswrapper[4732]: I0402 
14:12:35.687540 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8bb9bae-9d09-42c5-a60a-134c907db6d5-kube-api-access-kpbxk" (OuterVolumeSpecName: "kube-api-access-kpbxk") pod "d8bb9bae-9d09-42c5-a60a-134c907db6d5" (UID: "d8bb9bae-9d09-42c5-a60a-134c907db6d5"). InnerVolumeSpecName "kube-api-access-kpbxk". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:12:35 crc kubenswrapper[4732]: I0402 14:12:35.710096 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8bb9bae-9d09-42c5-a60a-134c907db6d5-inventory" (OuterVolumeSpecName: "inventory") pod "d8bb9bae-9d09-42c5-a60a-134c907db6d5" (UID: "d8bb9bae-9d09-42c5-a60a-134c907db6d5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:12:35 crc kubenswrapper[4732]: I0402 14:12:35.710515 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8bb9bae-9d09-42c5-a60a-134c907db6d5-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d8bb9bae-9d09-42c5-a60a-134c907db6d5" (UID: "d8bb9bae-9d09-42c5-a60a-134c907db6d5"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:12:35 crc kubenswrapper[4732]: I0402 14:12:35.784359 4732 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d8bb9bae-9d09-42c5-a60a-134c907db6d5-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Apr 02 14:12:35 crc kubenswrapper[4732]: I0402 14:12:35.784411 4732 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d8bb9bae-9d09-42c5-a60a-134c907db6d5-inventory\") on node \"crc\" DevicePath \"\"" Apr 02 14:12:35 crc kubenswrapper[4732]: I0402 14:12:35.784423 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpbxk\" (UniqueName: \"kubernetes.io/projected/d8bb9bae-9d09-42c5-a60a-134c907db6d5-kube-api-access-kpbxk\") on node \"crc\" DevicePath \"\"" Apr 02 14:12:36 crc kubenswrapper[4732]: I0402 14:12:36.125237 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bx2x5" event={"ID":"d8bb9bae-9d09-42c5-a60a-134c907db6d5","Type":"ContainerDied","Data":"ac303cf1b41acbc0d058895a4c63eb952f5ba890857836730a484c061336861b"} Apr 02 14:12:36 crc kubenswrapper[4732]: I0402 14:12:36.125519 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac303cf1b41acbc0d058895a4c63eb952f5ba890857836730a484c061336861b" Apr 02 14:12:36 crc kubenswrapper[4732]: I0402 14:12:36.125331 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bx2x5" Apr 02 14:12:36 crc kubenswrapper[4732]: I0402 14:12:36.222800 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-8ct56"] Apr 02 14:12:36 crc kubenswrapper[4732]: E0402 14:12:36.223265 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8bb9bae-9d09-42c5-a60a-134c907db6d5" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Apr 02 14:12:36 crc kubenswrapper[4732]: I0402 14:12:36.223282 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8bb9bae-9d09-42c5-a60a-134c907db6d5" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Apr 02 14:12:36 crc kubenswrapper[4732]: E0402 14:12:36.223324 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa9c5076-d469-46c7-9467-8d7ee80a71ff" containerName="oc" Apr 02 14:12:36 crc kubenswrapper[4732]: I0402 14:12:36.223332 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa9c5076-d469-46c7-9467-8d7ee80a71ff" containerName="oc" Apr 02 14:12:36 crc kubenswrapper[4732]: I0402 14:12:36.223504 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa9c5076-d469-46c7-9467-8d7ee80a71ff" containerName="oc" Apr 02 14:12:36 crc kubenswrapper[4732]: I0402 14:12:36.223525 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8bb9bae-9d09-42c5-a60a-134c907db6d5" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Apr 02 14:12:36 crc kubenswrapper[4732]: I0402 14:12:36.226283 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-8ct56" Apr 02 14:12:36 crc kubenswrapper[4732]: I0402 14:12:36.229112 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wdhd4" Apr 02 14:12:36 crc kubenswrapper[4732]: I0402 14:12:36.229155 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Apr 02 14:12:36 crc kubenswrapper[4732]: I0402 14:12:36.229288 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Apr 02 14:12:36 crc kubenswrapper[4732]: I0402 14:12:36.229958 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Apr 02 14:12:36 crc kubenswrapper[4732]: I0402 14:12:36.234124 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-8ct56"] Apr 02 14:12:36 crc kubenswrapper[4732]: I0402 14:12:36.397701 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvvx6\" (UniqueName: \"kubernetes.io/projected/ede1fe7d-16b7-41be-af74-8933aa0a1e83-kube-api-access-zvvx6\") pod \"ssh-known-hosts-edpm-deployment-8ct56\" (UID: \"ede1fe7d-16b7-41be-af74-8933aa0a1e83\") " pod="openstack/ssh-known-hosts-edpm-deployment-8ct56" Apr 02 14:12:36 crc kubenswrapper[4732]: I0402 14:12:36.397896 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ede1fe7d-16b7-41be-af74-8933aa0a1e83-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-8ct56\" (UID: \"ede1fe7d-16b7-41be-af74-8933aa0a1e83\") " pod="openstack/ssh-known-hosts-edpm-deployment-8ct56" Apr 02 14:12:36 crc kubenswrapper[4732]: I0402 14:12:36.398044 4732 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/ede1fe7d-16b7-41be-af74-8933aa0a1e83-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-8ct56\" (UID: \"ede1fe7d-16b7-41be-af74-8933aa0a1e83\") " pod="openstack/ssh-known-hosts-edpm-deployment-8ct56" Apr 02 14:12:36 crc kubenswrapper[4732]: I0402 14:12:36.499704 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvvx6\" (UniqueName: \"kubernetes.io/projected/ede1fe7d-16b7-41be-af74-8933aa0a1e83-kube-api-access-zvvx6\") pod \"ssh-known-hosts-edpm-deployment-8ct56\" (UID: \"ede1fe7d-16b7-41be-af74-8933aa0a1e83\") " pod="openstack/ssh-known-hosts-edpm-deployment-8ct56" Apr 02 14:12:36 crc kubenswrapper[4732]: I0402 14:12:36.499776 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ede1fe7d-16b7-41be-af74-8933aa0a1e83-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-8ct56\" (UID: \"ede1fe7d-16b7-41be-af74-8933aa0a1e83\") " pod="openstack/ssh-known-hosts-edpm-deployment-8ct56" Apr 02 14:12:36 crc kubenswrapper[4732]: I0402 14:12:36.499824 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/ede1fe7d-16b7-41be-af74-8933aa0a1e83-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-8ct56\" (UID: \"ede1fe7d-16b7-41be-af74-8933aa0a1e83\") " pod="openstack/ssh-known-hosts-edpm-deployment-8ct56" Apr 02 14:12:36 crc kubenswrapper[4732]: I0402 14:12:36.507172 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ede1fe7d-16b7-41be-af74-8933aa0a1e83-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-8ct56\" (UID: \"ede1fe7d-16b7-41be-af74-8933aa0a1e83\") " pod="openstack/ssh-known-hosts-edpm-deployment-8ct56" Apr 
02 14:12:36 crc kubenswrapper[4732]: I0402 14:12:36.514034 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/ede1fe7d-16b7-41be-af74-8933aa0a1e83-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-8ct56\" (UID: \"ede1fe7d-16b7-41be-af74-8933aa0a1e83\") " pod="openstack/ssh-known-hosts-edpm-deployment-8ct56" Apr 02 14:12:36 crc kubenswrapper[4732]: I0402 14:12:36.517210 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvvx6\" (UniqueName: \"kubernetes.io/projected/ede1fe7d-16b7-41be-af74-8933aa0a1e83-kube-api-access-zvvx6\") pod \"ssh-known-hosts-edpm-deployment-8ct56\" (UID: \"ede1fe7d-16b7-41be-af74-8933aa0a1e83\") " pod="openstack/ssh-known-hosts-edpm-deployment-8ct56" Apr 02 14:12:36 crc kubenswrapper[4732]: I0402 14:12:36.548266 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-8ct56" Apr 02 14:12:37 crc kubenswrapper[4732]: I0402 14:12:37.103808 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-8ct56"] Apr 02 14:12:37 crc kubenswrapper[4732]: I0402 14:12:37.110545 4732 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 02 14:12:37 crc kubenswrapper[4732]: I0402 14:12:37.137776 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-8ct56" event={"ID":"ede1fe7d-16b7-41be-af74-8933aa0a1e83","Type":"ContainerStarted","Data":"88ccb9cef75d0ba7787310aa0a55ea0e3ac545a70db3a7ea25fb173bd2530b2e"} Apr 02 14:12:39 crc kubenswrapper[4732]: I0402 14:12:39.159979 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-8ct56" event={"ID":"ede1fe7d-16b7-41be-af74-8933aa0a1e83","Type":"ContainerStarted","Data":"7cf0d1202f2d222f18cb764a8532440c828e9d69358498c176be2048be853539"} Apr 02 14:12:39 crc 
kubenswrapper[4732]: I0402 14:12:39.177168 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-8ct56" podStartSLOduration=1.7008014930000002 podStartE2EDuration="3.177148271s" podCreationTimestamp="2026-04-02 14:12:36 +0000 UTC" firstStartedPulling="2026-04-02 14:12:37.110309902 +0000 UTC m=+2114.014717455" lastFinishedPulling="2026-04-02 14:12:38.58665666 +0000 UTC m=+2115.491064233" observedRunningTime="2026-04-02 14:12:39.176351269 +0000 UTC m=+2116.080758872" watchObservedRunningTime="2026-04-02 14:12:39.177148271 +0000 UTC m=+2116.081555824" Apr 02 14:12:39 crc kubenswrapper[4732]: I0402 14:12:39.680110 4732 scope.go:117] "RemoveContainer" containerID="0f81e9ff2a9337c334b84f07419534908b39d0251d1366b4d7838bb8bbcae383" Apr 02 14:12:40 crc kubenswrapper[4732]: I0402 14:12:40.209599 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" event={"ID":"38409e5e-4545-49da-8f6c-4bfb30582878","Type":"ContainerStarted","Data":"63733271f299db2daa2f2ab3f85fd3de726a361d8aba6db8431f92a5c4cd2580"} Apr 02 14:12:45 crc kubenswrapper[4732]: I0402 14:12:45.262387 4732 generic.go:334] "Generic (PLEG): container finished" podID="ede1fe7d-16b7-41be-af74-8933aa0a1e83" containerID="7cf0d1202f2d222f18cb764a8532440c828e9d69358498c176be2048be853539" exitCode=0 Apr 02 14:12:45 crc kubenswrapper[4732]: I0402 14:12:45.263152 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-8ct56" event={"ID":"ede1fe7d-16b7-41be-af74-8933aa0a1e83","Type":"ContainerDied","Data":"7cf0d1202f2d222f18cb764a8532440c828e9d69358498c176be2048be853539"} Apr 02 14:12:46 crc kubenswrapper[4732]: I0402 14:12:46.673078 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-8ct56" Apr 02 14:12:46 crc kubenswrapper[4732]: I0402 14:12:46.794919 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/ede1fe7d-16b7-41be-af74-8933aa0a1e83-inventory-0\") pod \"ede1fe7d-16b7-41be-af74-8933aa0a1e83\" (UID: \"ede1fe7d-16b7-41be-af74-8933aa0a1e83\") " Apr 02 14:12:46 crc kubenswrapper[4732]: I0402 14:12:46.795179 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ede1fe7d-16b7-41be-af74-8933aa0a1e83-ssh-key-openstack-edpm-ipam\") pod \"ede1fe7d-16b7-41be-af74-8933aa0a1e83\" (UID: \"ede1fe7d-16b7-41be-af74-8933aa0a1e83\") " Apr 02 14:12:46 crc kubenswrapper[4732]: I0402 14:12:46.795280 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvvx6\" (UniqueName: \"kubernetes.io/projected/ede1fe7d-16b7-41be-af74-8933aa0a1e83-kube-api-access-zvvx6\") pod \"ede1fe7d-16b7-41be-af74-8933aa0a1e83\" (UID: \"ede1fe7d-16b7-41be-af74-8933aa0a1e83\") " Apr 02 14:12:46 crc kubenswrapper[4732]: I0402 14:12:46.801207 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ede1fe7d-16b7-41be-af74-8933aa0a1e83-kube-api-access-zvvx6" (OuterVolumeSpecName: "kube-api-access-zvvx6") pod "ede1fe7d-16b7-41be-af74-8933aa0a1e83" (UID: "ede1fe7d-16b7-41be-af74-8933aa0a1e83"). InnerVolumeSpecName "kube-api-access-zvvx6". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:12:46 crc kubenswrapper[4732]: I0402 14:12:46.830735 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ede1fe7d-16b7-41be-af74-8933aa0a1e83-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "ede1fe7d-16b7-41be-af74-8933aa0a1e83" (UID: "ede1fe7d-16b7-41be-af74-8933aa0a1e83"). 
InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:12:46 crc kubenswrapper[4732]: I0402 14:12:46.834734 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ede1fe7d-16b7-41be-af74-8933aa0a1e83-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ede1fe7d-16b7-41be-af74-8933aa0a1e83" (UID: "ede1fe7d-16b7-41be-af74-8933aa0a1e83"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:12:46 crc kubenswrapper[4732]: I0402 14:12:46.898115 4732 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ede1fe7d-16b7-41be-af74-8933aa0a1e83-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Apr 02 14:12:46 crc kubenswrapper[4732]: I0402 14:12:46.898165 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvvx6\" (UniqueName: \"kubernetes.io/projected/ede1fe7d-16b7-41be-af74-8933aa0a1e83-kube-api-access-zvvx6\") on node \"crc\" DevicePath \"\"" Apr 02 14:12:46 crc kubenswrapper[4732]: I0402 14:12:46.898182 4732 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/ede1fe7d-16b7-41be-af74-8933aa0a1e83-inventory-0\") on node \"crc\" DevicePath \"\"" Apr 02 14:12:47 crc kubenswrapper[4732]: I0402 14:12:47.287505 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-8ct56" event={"ID":"ede1fe7d-16b7-41be-af74-8933aa0a1e83","Type":"ContainerDied","Data":"88ccb9cef75d0ba7787310aa0a55ea0e3ac545a70db3a7ea25fb173bd2530b2e"} Apr 02 14:12:47 crc kubenswrapper[4732]: I0402 14:12:47.287561 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88ccb9cef75d0ba7787310aa0a55ea0e3ac545a70db3a7ea25fb173bd2530b2e" Apr 02 14:12:47 crc kubenswrapper[4732]: I0402 14:12:47.287561 
4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-8ct56" Apr 02 14:12:47 crc kubenswrapper[4732]: I0402 14:12:47.373739 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-sdvqf"] Apr 02 14:12:47 crc kubenswrapper[4732]: E0402 14:12:47.374145 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ede1fe7d-16b7-41be-af74-8933aa0a1e83" containerName="ssh-known-hosts-edpm-deployment" Apr 02 14:12:47 crc kubenswrapper[4732]: I0402 14:12:47.374162 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ede1fe7d-16b7-41be-af74-8933aa0a1e83" containerName="ssh-known-hosts-edpm-deployment" Apr 02 14:12:47 crc kubenswrapper[4732]: I0402 14:12:47.374575 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="ede1fe7d-16b7-41be-af74-8933aa0a1e83" containerName="ssh-known-hosts-edpm-deployment" Apr 02 14:12:47 crc kubenswrapper[4732]: I0402 14:12:47.375282 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sdvqf" Apr 02 14:12:47 crc kubenswrapper[4732]: I0402 14:12:47.378021 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Apr 02 14:12:47 crc kubenswrapper[4732]: I0402 14:12:47.378215 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wdhd4" Apr 02 14:12:47 crc kubenswrapper[4732]: I0402 14:12:47.378748 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Apr 02 14:12:47 crc kubenswrapper[4732]: I0402 14:12:47.385323 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-sdvqf"] Apr 02 14:12:47 crc kubenswrapper[4732]: I0402 14:12:47.386057 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Apr 02 14:12:47 crc kubenswrapper[4732]: I0402 14:12:47.509922 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mfns\" (UniqueName: \"kubernetes.io/projected/c5427d0c-bb3a-491e-8461-8e189da84bd9-kube-api-access-5mfns\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-sdvqf\" (UID: \"c5427d0c-bb3a-491e-8461-8e189da84bd9\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sdvqf" Apr 02 14:12:47 crc kubenswrapper[4732]: I0402 14:12:47.510016 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c5427d0c-bb3a-491e-8461-8e189da84bd9-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-sdvqf\" (UID: \"c5427d0c-bb3a-491e-8461-8e189da84bd9\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sdvqf" Apr 02 14:12:47 crc kubenswrapper[4732]: I0402 14:12:47.510039 4732 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c5427d0c-bb3a-491e-8461-8e189da84bd9-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-sdvqf\" (UID: \"c5427d0c-bb3a-491e-8461-8e189da84bd9\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sdvqf" Apr 02 14:12:47 crc kubenswrapper[4732]: I0402 14:12:47.612955 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c5427d0c-bb3a-491e-8461-8e189da84bd9-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-sdvqf\" (UID: \"c5427d0c-bb3a-491e-8461-8e189da84bd9\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sdvqf" Apr 02 14:12:47 crc kubenswrapper[4732]: I0402 14:12:47.613010 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c5427d0c-bb3a-491e-8461-8e189da84bd9-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-sdvqf\" (UID: \"c5427d0c-bb3a-491e-8461-8e189da84bd9\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sdvqf" Apr 02 14:12:47 crc kubenswrapper[4732]: I0402 14:12:47.613130 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mfns\" (UniqueName: \"kubernetes.io/projected/c5427d0c-bb3a-491e-8461-8e189da84bd9-kube-api-access-5mfns\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-sdvqf\" (UID: \"c5427d0c-bb3a-491e-8461-8e189da84bd9\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sdvqf" Apr 02 14:12:47 crc kubenswrapper[4732]: I0402 14:12:47.617288 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c5427d0c-bb3a-491e-8461-8e189da84bd9-ssh-key-openstack-edpm-ipam\") pod 
\"run-os-edpm-deployment-openstack-edpm-ipam-sdvqf\" (UID: \"c5427d0c-bb3a-491e-8461-8e189da84bd9\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sdvqf" Apr 02 14:12:47 crc kubenswrapper[4732]: I0402 14:12:47.622998 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c5427d0c-bb3a-491e-8461-8e189da84bd9-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-sdvqf\" (UID: \"c5427d0c-bb3a-491e-8461-8e189da84bd9\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sdvqf" Apr 02 14:12:47 crc kubenswrapper[4732]: I0402 14:12:47.631760 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mfns\" (UniqueName: \"kubernetes.io/projected/c5427d0c-bb3a-491e-8461-8e189da84bd9-kube-api-access-5mfns\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-sdvqf\" (UID: \"c5427d0c-bb3a-491e-8461-8e189da84bd9\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sdvqf" Apr 02 14:12:47 crc kubenswrapper[4732]: I0402 14:12:47.699148 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sdvqf" Apr 02 14:12:48 crc kubenswrapper[4732]: W0402 14:12:48.272090 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5427d0c_bb3a_491e_8461_8e189da84bd9.slice/crio-a8aa617cda315e5f941cf7b00e3bdc58ceebdea9075baa42995dd18282d00b39 WatchSource:0}: Error finding container a8aa617cda315e5f941cf7b00e3bdc58ceebdea9075baa42995dd18282d00b39: Status 404 returned error can't find the container with id a8aa617cda315e5f941cf7b00e3bdc58ceebdea9075baa42995dd18282d00b39 Apr 02 14:12:48 crc kubenswrapper[4732]: I0402 14:12:48.275212 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-sdvqf"] Apr 02 14:12:48 crc kubenswrapper[4732]: I0402 14:12:48.306877 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sdvqf" event={"ID":"c5427d0c-bb3a-491e-8461-8e189da84bd9","Type":"ContainerStarted","Data":"a8aa617cda315e5f941cf7b00e3bdc58ceebdea9075baa42995dd18282d00b39"} Apr 02 14:12:49 crc kubenswrapper[4732]: I0402 14:12:49.333535 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sdvqf" event={"ID":"c5427d0c-bb3a-491e-8461-8e189da84bd9","Type":"ContainerStarted","Data":"b24feeda5a4823843293b8172c8e8951263b14a951f7d9bf5e7cc5a1b32576b8"} Apr 02 14:12:49 crc kubenswrapper[4732]: I0402 14:12:49.362042 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sdvqf" podStartSLOduration=1.849684324 podStartE2EDuration="2.362022595s" podCreationTimestamp="2026-04-02 14:12:47 +0000 UTC" firstStartedPulling="2026-04-02 14:12:48.275978968 +0000 UTC m=+2125.180386521" lastFinishedPulling="2026-04-02 14:12:48.788317219 +0000 UTC m=+2125.692724792" observedRunningTime="2026-04-02 
14:12:49.358019802 +0000 UTC m=+2126.262427355" watchObservedRunningTime="2026-04-02 14:12:49.362022595 +0000 UTC m=+2126.266430148" Apr 02 14:12:50 crc kubenswrapper[4732]: I0402 14:12:50.040317 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bbht6"] Apr 02 14:12:50 crc kubenswrapper[4732]: I0402 14:12:50.050439 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-9pqhd"] Apr 02 14:12:50 crc kubenswrapper[4732]: I0402 14:12:50.060170 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-9pqhd"] Apr 02 14:12:50 crc kubenswrapper[4732]: I0402 14:12:50.069297 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bbht6"] Apr 02 14:12:50 crc kubenswrapper[4732]: I0402 14:12:50.690289 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83eb790e-b902-4dd9-bcf8-352d5675fbce" path="/var/lib/kubelet/pods/83eb790e-b902-4dd9-bcf8-352d5675fbce/volumes" Apr 02 14:12:50 crc kubenswrapper[4732]: I0402 14:12:50.690934 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d730389e-11ac-4fa7-86d7-efa07afdbe08" path="/var/lib/kubelet/pods/d730389e-11ac-4fa7-86d7-efa07afdbe08/volumes" Apr 02 14:12:50 crc kubenswrapper[4732]: I0402 14:12:50.879048 4732 scope.go:117] "RemoveContainer" containerID="6bafa11a5f6f06ce2459f12612464717263a6a689d0a14c463caf4b10f096b3e" Apr 02 14:12:50 crc kubenswrapper[4732]: I0402 14:12:50.921230 4732 scope.go:117] "RemoveContainer" containerID="5b83c75a78e02d4e4d6ae2df76d22ab2277c64bc8ccdd02e965158b68d5659fc" Apr 02 14:12:50 crc kubenswrapper[4732]: I0402 14:12:50.984447 4732 scope.go:117] "RemoveContainer" containerID="34479c920540c29d54b8c53909b3d77dfbd492832314b1dd4444ac9221c2967c" Apr 02 14:12:51 crc kubenswrapper[4732]: I0402 14:12:51.041266 4732 scope.go:117] "RemoveContainer" 
containerID="c25956aa6d98725a91263ba38314314c23dbb1b2d706189468ac93300a22d052" Apr 02 14:12:56 crc kubenswrapper[4732]: I0402 14:12:56.391336 4732 generic.go:334] "Generic (PLEG): container finished" podID="c5427d0c-bb3a-491e-8461-8e189da84bd9" containerID="b24feeda5a4823843293b8172c8e8951263b14a951f7d9bf5e7cc5a1b32576b8" exitCode=0 Apr 02 14:12:56 crc kubenswrapper[4732]: I0402 14:12:56.391447 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sdvqf" event={"ID":"c5427d0c-bb3a-491e-8461-8e189da84bd9","Type":"ContainerDied","Data":"b24feeda5a4823843293b8172c8e8951263b14a951f7d9bf5e7cc5a1b32576b8"} Apr 02 14:12:57 crc kubenswrapper[4732]: I0402 14:12:57.815394 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sdvqf" Apr 02 14:12:57 crc kubenswrapper[4732]: I0402 14:12:57.935399 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c5427d0c-bb3a-491e-8461-8e189da84bd9-inventory\") pod \"c5427d0c-bb3a-491e-8461-8e189da84bd9\" (UID: \"c5427d0c-bb3a-491e-8461-8e189da84bd9\") " Apr 02 14:12:57 crc kubenswrapper[4732]: I0402 14:12:57.935485 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mfns\" (UniqueName: \"kubernetes.io/projected/c5427d0c-bb3a-491e-8461-8e189da84bd9-kube-api-access-5mfns\") pod \"c5427d0c-bb3a-491e-8461-8e189da84bd9\" (UID: \"c5427d0c-bb3a-491e-8461-8e189da84bd9\") " Apr 02 14:12:57 crc kubenswrapper[4732]: I0402 14:12:57.935662 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c5427d0c-bb3a-491e-8461-8e189da84bd9-ssh-key-openstack-edpm-ipam\") pod \"c5427d0c-bb3a-491e-8461-8e189da84bd9\" (UID: \"c5427d0c-bb3a-491e-8461-8e189da84bd9\") " Apr 02 14:12:57 crc 
kubenswrapper[4732]: I0402 14:12:57.942909 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5427d0c-bb3a-491e-8461-8e189da84bd9-kube-api-access-5mfns" (OuterVolumeSpecName: "kube-api-access-5mfns") pod "c5427d0c-bb3a-491e-8461-8e189da84bd9" (UID: "c5427d0c-bb3a-491e-8461-8e189da84bd9"). InnerVolumeSpecName "kube-api-access-5mfns". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:12:57 crc kubenswrapper[4732]: I0402 14:12:57.963670 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5427d0c-bb3a-491e-8461-8e189da84bd9-inventory" (OuterVolumeSpecName: "inventory") pod "c5427d0c-bb3a-491e-8461-8e189da84bd9" (UID: "c5427d0c-bb3a-491e-8461-8e189da84bd9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:12:57 crc kubenswrapper[4732]: I0402 14:12:57.967398 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5427d0c-bb3a-491e-8461-8e189da84bd9-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c5427d0c-bb3a-491e-8461-8e189da84bd9" (UID: "c5427d0c-bb3a-491e-8461-8e189da84bd9"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:12:58 crc kubenswrapper[4732]: I0402 14:12:58.038137 4732 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c5427d0c-bb3a-491e-8461-8e189da84bd9-inventory\") on node \"crc\" DevicePath \"\"" Apr 02 14:12:58 crc kubenswrapper[4732]: I0402 14:12:58.038179 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mfns\" (UniqueName: \"kubernetes.io/projected/c5427d0c-bb3a-491e-8461-8e189da84bd9-kube-api-access-5mfns\") on node \"crc\" DevicePath \"\"" Apr 02 14:12:58 crc kubenswrapper[4732]: I0402 14:12:58.038193 4732 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c5427d0c-bb3a-491e-8461-8e189da84bd9-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Apr 02 14:12:58 crc kubenswrapper[4732]: I0402 14:12:58.436559 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sdvqf" event={"ID":"c5427d0c-bb3a-491e-8461-8e189da84bd9","Type":"ContainerDied","Data":"a8aa617cda315e5f941cf7b00e3bdc58ceebdea9075baa42995dd18282d00b39"} Apr 02 14:12:58 crc kubenswrapper[4732]: I0402 14:12:58.437110 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8aa617cda315e5f941cf7b00e3bdc58ceebdea9075baa42995dd18282d00b39" Apr 02 14:12:58 crc kubenswrapper[4732]: I0402 14:12:58.436762 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sdvqf" Apr 02 14:12:58 crc kubenswrapper[4732]: I0402 14:12:58.488000 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9sgtg"] Apr 02 14:12:58 crc kubenswrapper[4732]: E0402 14:12:58.488450 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5427d0c-bb3a-491e-8461-8e189da84bd9" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Apr 02 14:12:58 crc kubenswrapper[4732]: I0402 14:12:58.488472 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5427d0c-bb3a-491e-8461-8e189da84bd9" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Apr 02 14:12:58 crc kubenswrapper[4732]: I0402 14:12:58.488743 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5427d0c-bb3a-491e-8461-8e189da84bd9" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Apr 02 14:12:58 crc kubenswrapper[4732]: I0402 14:12:58.489449 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9sgtg" Apr 02 14:12:58 crc kubenswrapper[4732]: I0402 14:12:58.491641 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wdhd4" Apr 02 14:12:58 crc kubenswrapper[4732]: I0402 14:12:58.491694 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Apr 02 14:12:58 crc kubenswrapper[4732]: I0402 14:12:58.492489 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Apr 02 14:12:58 crc kubenswrapper[4732]: I0402 14:12:58.498563 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Apr 02 14:12:58 crc kubenswrapper[4732]: I0402 14:12:58.502188 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9sgtg"] Apr 02 14:12:58 crc kubenswrapper[4732]: I0402 14:12:58.550544 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwwrr\" (UniqueName: \"kubernetes.io/projected/160a19c0-4b2b-439a-9ea5-0f0ec2d4aede-kube-api-access-xwwrr\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9sgtg\" (UID: \"160a19c0-4b2b-439a-9ea5-0f0ec2d4aede\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9sgtg" Apr 02 14:12:58 crc kubenswrapper[4732]: I0402 14:12:58.550861 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/160a19c0-4b2b-439a-9ea5-0f0ec2d4aede-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9sgtg\" (UID: \"160a19c0-4b2b-439a-9ea5-0f0ec2d4aede\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9sgtg" Apr 02 14:12:58 crc kubenswrapper[4732]: I0402 
14:12:58.550945 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/160a19c0-4b2b-439a-9ea5-0f0ec2d4aede-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9sgtg\" (UID: \"160a19c0-4b2b-439a-9ea5-0f0ec2d4aede\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9sgtg" Apr 02 14:12:58 crc kubenswrapper[4732]: I0402 14:12:58.653345 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/160a19c0-4b2b-439a-9ea5-0f0ec2d4aede-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9sgtg\" (UID: \"160a19c0-4b2b-439a-9ea5-0f0ec2d4aede\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9sgtg" Apr 02 14:12:58 crc kubenswrapper[4732]: I0402 14:12:58.653476 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/160a19c0-4b2b-439a-9ea5-0f0ec2d4aede-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9sgtg\" (UID: \"160a19c0-4b2b-439a-9ea5-0f0ec2d4aede\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9sgtg" Apr 02 14:12:58 crc kubenswrapper[4732]: I0402 14:12:58.653520 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwwrr\" (UniqueName: \"kubernetes.io/projected/160a19c0-4b2b-439a-9ea5-0f0ec2d4aede-kube-api-access-xwwrr\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9sgtg\" (UID: \"160a19c0-4b2b-439a-9ea5-0f0ec2d4aede\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9sgtg" Apr 02 14:12:58 crc kubenswrapper[4732]: I0402 14:12:58.657721 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/160a19c0-4b2b-439a-9ea5-0f0ec2d4aede-inventory\") pod 
\"reboot-os-edpm-deployment-openstack-edpm-ipam-9sgtg\" (UID: \"160a19c0-4b2b-439a-9ea5-0f0ec2d4aede\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9sgtg" Apr 02 14:12:58 crc kubenswrapper[4732]: I0402 14:12:58.658660 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/160a19c0-4b2b-439a-9ea5-0f0ec2d4aede-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9sgtg\" (UID: \"160a19c0-4b2b-439a-9ea5-0f0ec2d4aede\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9sgtg" Apr 02 14:12:58 crc kubenswrapper[4732]: I0402 14:12:58.674317 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwwrr\" (UniqueName: \"kubernetes.io/projected/160a19c0-4b2b-439a-9ea5-0f0ec2d4aede-kube-api-access-xwwrr\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9sgtg\" (UID: \"160a19c0-4b2b-439a-9ea5-0f0ec2d4aede\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9sgtg" Apr 02 14:12:58 crc kubenswrapper[4732]: I0402 14:12:58.805655 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9sgtg" Apr 02 14:12:59 crc kubenswrapper[4732]: W0402 14:12:59.271470 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod160a19c0_4b2b_439a_9ea5_0f0ec2d4aede.slice/crio-94d511df2cc7e1d993efcd28864edde786c16fa7b8f120649c368f62e3783fc6 WatchSource:0}: Error finding container 94d511df2cc7e1d993efcd28864edde786c16fa7b8f120649c368f62e3783fc6: Status 404 returned error can't find the container with id 94d511df2cc7e1d993efcd28864edde786c16fa7b8f120649c368f62e3783fc6 Apr 02 14:12:59 crc kubenswrapper[4732]: I0402 14:12:59.274071 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9sgtg"] Apr 02 14:12:59 crc kubenswrapper[4732]: I0402 14:12:59.487323 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9sgtg" event={"ID":"160a19c0-4b2b-439a-9ea5-0f0ec2d4aede","Type":"ContainerStarted","Data":"94d511df2cc7e1d993efcd28864edde786c16fa7b8f120649c368f62e3783fc6"} Apr 02 14:13:00 crc kubenswrapper[4732]: I0402 14:13:00.499875 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9sgtg" event={"ID":"160a19c0-4b2b-439a-9ea5-0f0ec2d4aede","Type":"ContainerStarted","Data":"b37a57a6b4e5e0f80fc7c77bf432655c23486ec5695881ffc9cdc0211fab9227"} Apr 02 14:13:08 crc kubenswrapper[4732]: I0402 14:13:08.583824 4732 generic.go:334] "Generic (PLEG): container finished" podID="160a19c0-4b2b-439a-9ea5-0f0ec2d4aede" containerID="b37a57a6b4e5e0f80fc7c77bf432655c23486ec5695881ffc9cdc0211fab9227" exitCode=0 Apr 02 14:13:08 crc kubenswrapper[4732]: I0402 14:13:08.583928 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9sgtg" 
event={"ID":"160a19c0-4b2b-439a-9ea5-0f0ec2d4aede","Type":"ContainerDied","Data":"b37a57a6b4e5e0f80fc7c77bf432655c23486ec5695881ffc9cdc0211fab9227"} Apr 02 14:13:09 crc kubenswrapper[4732]: I0402 14:13:09.987718 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9sgtg" Apr 02 14:13:10 crc kubenswrapper[4732]: I0402 14:13:10.142255 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwwrr\" (UniqueName: \"kubernetes.io/projected/160a19c0-4b2b-439a-9ea5-0f0ec2d4aede-kube-api-access-xwwrr\") pod \"160a19c0-4b2b-439a-9ea5-0f0ec2d4aede\" (UID: \"160a19c0-4b2b-439a-9ea5-0f0ec2d4aede\") " Apr 02 14:13:10 crc kubenswrapper[4732]: I0402 14:13:10.142491 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/160a19c0-4b2b-439a-9ea5-0f0ec2d4aede-ssh-key-openstack-edpm-ipam\") pod \"160a19c0-4b2b-439a-9ea5-0f0ec2d4aede\" (UID: \"160a19c0-4b2b-439a-9ea5-0f0ec2d4aede\") " Apr 02 14:13:10 crc kubenswrapper[4732]: I0402 14:13:10.142530 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/160a19c0-4b2b-439a-9ea5-0f0ec2d4aede-inventory\") pod \"160a19c0-4b2b-439a-9ea5-0f0ec2d4aede\" (UID: \"160a19c0-4b2b-439a-9ea5-0f0ec2d4aede\") " Apr 02 14:13:10 crc kubenswrapper[4732]: I0402 14:13:10.149932 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/160a19c0-4b2b-439a-9ea5-0f0ec2d4aede-kube-api-access-xwwrr" (OuterVolumeSpecName: "kube-api-access-xwwrr") pod "160a19c0-4b2b-439a-9ea5-0f0ec2d4aede" (UID: "160a19c0-4b2b-439a-9ea5-0f0ec2d4aede"). InnerVolumeSpecName "kube-api-access-xwwrr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:13:10 crc kubenswrapper[4732]: I0402 14:13:10.179159 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/160a19c0-4b2b-439a-9ea5-0f0ec2d4aede-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "160a19c0-4b2b-439a-9ea5-0f0ec2d4aede" (UID: "160a19c0-4b2b-439a-9ea5-0f0ec2d4aede"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:13:10 crc kubenswrapper[4732]: I0402 14:13:10.181768 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/160a19c0-4b2b-439a-9ea5-0f0ec2d4aede-inventory" (OuterVolumeSpecName: "inventory") pod "160a19c0-4b2b-439a-9ea5-0f0ec2d4aede" (UID: "160a19c0-4b2b-439a-9ea5-0f0ec2d4aede"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:13:10 crc kubenswrapper[4732]: I0402 14:13:10.244873 4732 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/160a19c0-4b2b-439a-9ea5-0f0ec2d4aede-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Apr 02 14:13:10 crc kubenswrapper[4732]: I0402 14:13:10.244906 4732 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/160a19c0-4b2b-439a-9ea5-0f0ec2d4aede-inventory\") on node \"crc\" DevicePath \"\"" Apr 02 14:13:10 crc kubenswrapper[4732]: I0402 14:13:10.244916 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwwrr\" (UniqueName: \"kubernetes.io/projected/160a19c0-4b2b-439a-9ea5-0f0ec2d4aede-kube-api-access-xwwrr\") on node \"crc\" DevicePath \"\"" Apr 02 14:13:10 crc kubenswrapper[4732]: I0402 14:13:10.605514 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9sgtg" 
event={"ID":"160a19c0-4b2b-439a-9ea5-0f0ec2d4aede","Type":"ContainerDied","Data":"94d511df2cc7e1d993efcd28864edde786c16fa7b8f120649c368f62e3783fc6"} Apr 02 14:13:10 crc kubenswrapper[4732]: I0402 14:13:10.605572 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94d511df2cc7e1d993efcd28864edde786c16fa7b8f120649c368f62e3783fc6" Apr 02 14:13:10 crc kubenswrapper[4732]: I0402 14:13:10.605912 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9sgtg" Apr 02 14:13:10 crc kubenswrapper[4732]: I0402 14:13:10.712993 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl"] Apr 02 14:13:10 crc kubenswrapper[4732]: E0402 14:13:10.713824 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="160a19c0-4b2b-439a-9ea5-0f0ec2d4aede" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Apr 02 14:13:10 crc kubenswrapper[4732]: I0402 14:13:10.713854 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="160a19c0-4b2b-439a-9ea5-0f0ec2d4aede" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Apr 02 14:13:10 crc kubenswrapper[4732]: I0402 14:13:10.714104 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="160a19c0-4b2b-439a-9ea5-0f0ec2d4aede" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Apr 02 14:13:10 crc kubenswrapper[4732]: I0402 14:13:10.715217 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl" Apr 02 14:13:10 crc kubenswrapper[4732]: I0402 14:13:10.720805 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Apr 02 14:13:10 crc kubenswrapper[4732]: I0402 14:13:10.720906 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Apr 02 14:13:10 crc kubenswrapper[4732]: I0402 14:13:10.721003 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Apr 02 14:13:10 crc kubenswrapper[4732]: I0402 14:13:10.721181 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Apr 02 14:13:10 crc kubenswrapper[4732]: I0402 14:13:10.721303 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Apr 02 14:13:10 crc kubenswrapper[4732]: I0402 14:13:10.722451 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Apr 02 14:13:10 crc kubenswrapper[4732]: I0402 14:13:10.722570 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Apr 02 14:13:10 crc kubenswrapper[4732]: I0402 14:13:10.722709 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wdhd4" Apr 02 14:13:10 crc kubenswrapper[4732]: I0402 14:13:10.734684 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl"] Apr 02 14:13:10 crc kubenswrapper[4732]: E0402 14:13:10.867888 4732 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod160a19c0_4b2b_439a_9ea5_0f0ec2d4aede.slice\": RecentStats: unable to find data in memory cache]" Apr 02 14:13:10 crc kubenswrapper[4732]: I0402 14:13:10.874129 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl\" (UID: \"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl" Apr 02 14:13:10 crc kubenswrapper[4732]: I0402 14:13:10.874205 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl\" (UID: \"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl" Apr 02 14:13:10 crc kubenswrapper[4732]: I0402 14:13:10.874860 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl\" (UID: \"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl" Apr 02 14:13:10 crc kubenswrapper[4732]: I0402 14:13:10.874919 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl\" (UID: 
\"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl" Apr 02 14:13:10 crc kubenswrapper[4732]: I0402 14:13:10.874953 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl\" (UID: \"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl" Apr 02 14:13:10 crc kubenswrapper[4732]: I0402 14:13:10.875088 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl\" (UID: \"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl" Apr 02 14:13:10 crc kubenswrapper[4732]: I0402 14:13:10.875228 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl\" (UID: \"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl" Apr 02 14:13:10 crc kubenswrapper[4732]: I0402 14:13:10.875274 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-repo-setup-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl\" (UID: \"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl" Apr 02 14:13:10 crc kubenswrapper[4732]: I0402 14:13:10.875314 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl\" (UID: \"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl" Apr 02 14:13:10 crc kubenswrapper[4732]: I0402 14:13:10.875402 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl\" (UID: \"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl" Apr 02 14:13:10 crc kubenswrapper[4732]: I0402 14:13:10.875490 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htj7w\" (UniqueName: \"kubernetes.io/projected/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-kube-api-access-htj7w\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl\" (UID: \"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl" Apr 02 14:13:10 crc kubenswrapper[4732]: I0402 14:13:10.875730 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-neutron-metadata-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl\" (UID: \"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl" Apr 02 14:13:10 crc kubenswrapper[4732]: I0402 14:13:10.875799 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl\" (UID: \"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl" Apr 02 14:13:10 crc kubenswrapper[4732]: I0402 14:13:10.875862 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl\" (UID: \"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl" Apr 02 14:13:10 crc kubenswrapper[4732]: I0402 14:13:10.977521 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl\" (UID: \"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl" Apr 02 14:13:10 crc kubenswrapper[4732]: I0402 14:13:10.977588 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htj7w\" (UniqueName: \"kubernetes.io/projected/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-kube-api-access-htj7w\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl\" (UID: 
\"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl" Apr 02 14:13:10 crc kubenswrapper[4732]: I0402 14:13:10.977785 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl\" (UID: \"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl" Apr 02 14:13:10 crc kubenswrapper[4732]: I0402 14:13:10.977829 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl\" (UID: \"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl" Apr 02 14:13:10 crc kubenswrapper[4732]: I0402 14:13:10.977873 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl\" (UID: \"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl" Apr 02 14:13:10 crc kubenswrapper[4732]: I0402 14:13:10.977948 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl\" (UID: \"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl" Apr 02 14:13:10 crc kubenswrapper[4732]: I0402 14:13:10.977973 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl\" (UID: \"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl" Apr 02 14:13:10 crc kubenswrapper[4732]: I0402 14:13:10.978006 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl\" (UID: \"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl" Apr 02 14:13:10 crc kubenswrapper[4732]: I0402 14:13:10.978035 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl\" (UID: \"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl" Apr 02 14:13:10 crc kubenswrapper[4732]: I0402 14:13:10.978064 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl\" (UID: \"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl" Apr 02 14:13:10 crc kubenswrapper[4732]: I0402 
14:13:10.978155 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl\" (UID: \"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl" Apr 02 14:13:10 crc kubenswrapper[4732]: I0402 14:13:10.978206 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl\" (UID: \"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl" Apr 02 14:13:10 crc kubenswrapper[4732]: I0402 14:13:10.978246 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl\" (UID: \"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl" Apr 02 14:13:10 crc kubenswrapper[4732]: I0402 14:13:10.978287 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl\" (UID: \"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl" Apr 02 14:13:10 crc kubenswrapper[4732]: I0402 14:13:10.983810 4732 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl\" (UID: \"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl" Apr 02 14:13:10 crc kubenswrapper[4732]: I0402 14:13:10.983816 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl\" (UID: \"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl" Apr 02 14:13:10 crc kubenswrapper[4732]: I0402 14:13:10.984227 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl\" (UID: \"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl" Apr 02 14:13:10 crc kubenswrapper[4732]: I0402 14:13:10.984540 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl\" (UID: \"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl" Apr 02 14:13:10 crc kubenswrapper[4732]: I0402 14:13:10.985461 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-openstack-edpm-ipam-telemetry-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl\" (UID: \"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl" Apr 02 14:13:10 crc kubenswrapper[4732]: I0402 14:13:10.987680 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl\" (UID: \"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl" Apr 02 14:13:10 crc kubenswrapper[4732]: I0402 14:13:10.988573 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl\" (UID: \"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl" Apr 02 14:13:10 crc kubenswrapper[4732]: I0402 14:13:10.988583 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl\" (UID: \"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl" Apr 02 14:13:10 crc kubenswrapper[4732]: I0402 14:13:10.988866 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl\" (UID: \"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl" Apr 02 14:13:10 crc kubenswrapper[4732]: I0402 14:13:10.991795 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl\" (UID: \"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl" Apr 02 14:13:10 crc kubenswrapper[4732]: I0402 14:13:10.991870 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl\" (UID: \"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl" Apr 02 14:13:10 crc kubenswrapper[4732]: I0402 14:13:10.992395 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl\" (UID: \"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl" Apr 02 14:13:10 crc kubenswrapper[4732]: I0402 14:13:10.992677 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl\" (UID: \"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl" Apr 02 14:13:10 crc 
kubenswrapper[4732]: I0402 14:13:10.996925 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htj7w\" (UniqueName: \"kubernetes.io/projected/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-kube-api-access-htj7w\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl\" (UID: \"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl" Apr 02 14:13:11 crc kubenswrapper[4732]: I0402 14:13:11.043937 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl" Apr 02 14:13:11 crc kubenswrapper[4732]: I0402 14:13:11.586199 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl"] Apr 02 14:13:11 crc kubenswrapper[4732]: I0402 14:13:11.614733 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl" event={"ID":"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd","Type":"ContainerStarted","Data":"dd3c3c3b62cf67994c9cbbf1a9e4b138c6e88bdf8c3515f3f09aecde16123179"} Apr 02 14:13:12 crc kubenswrapper[4732]: I0402 14:13:12.626302 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl" event={"ID":"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd","Type":"ContainerStarted","Data":"2769739ff9a86f93e6a18c0172ffba588ea74fa0c9bb9b8867226dd24f398e8f"} Apr 02 14:13:12 crc kubenswrapper[4732]: I0402 14:13:12.650405 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl" podStartSLOduration=2.278003465 podStartE2EDuration="2.650385118s" podCreationTimestamp="2026-04-02 14:13:10 +0000 UTC" firstStartedPulling="2026-04-02 14:13:11.58947967 +0000 UTC m=+2148.493887223" lastFinishedPulling="2026-04-02 14:13:11.961861323 +0000 UTC 
m=+2148.866268876" observedRunningTime="2026-04-02 14:13:12.645021756 +0000 UTC m=+2149.549429349" watchObservedRunningTime="2026-04-02 14:13:12.650385118 +0000 UTC m=+2149.554792681" Apr 02 14:13:32 crc kubenswrapper[4732]: I0402 14:13:32.045765 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-mzjkp"] Apr 02 14:13:32 crc kubenswrapper[4732]: I0402 14:13:32.058224 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-mzjkp"] Apr 02 14:13:32 crc kubenswrapper[4732]: I0402 14:13:32.693180 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a66a025-735c-4c9a-b4cd-06046d7d3881" path="/var/lib/kubelet/pods/6a66a025-735c-4c9a-b4cd-06046d7d3881/volumes" Apr 02 14:13:43 crc kubenswrapper[4732]: I0402 14:13:43.897041 4732 generic.go:334] "Generic (PLEG): container finished" podID="2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd" containerID="2769739ff9a86f93e6a18c0172ffba588ea74fa0c9bb9b8867226dd24f398e8f" exitCode=0 Apr 02 14:13:43 crc kubenswrapper[4732]: I0402 14:13:43.897118 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl" event={"ID":"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd","Type":"ContainerDied","Data":"2769739ff9a86f93e6a18c0172ffba588ea74fa0c9bb9b8867226dd24f398e8f"} Apr 02 14:13:45 crc kubenswrapper[4732]: I0402 14:13:45.317341 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl" Apr 02 14:13:45 crc kubenswrapper[4732]: I0402 14:13:45.439057 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\" (UID: \"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\") " Apr 02 14:13:45 crc kubenswrapper[4732]: I0402 14:13:45.439395 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-nova-combined-ca-bundle\") pod \"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\" (UID: \"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\") " Apr 02 14:13:45 crc kubenswrapper[4732]: I0402 14:13:45.439418 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\" (UID: \"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\") " Apr 02 14:13:45 crc kubenswrapper[4732]: I0402 14:13:45.439458 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htj7w\" (UniqueName: \"kubernetes.io/projected/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-kube-api-access-htj7w\") pod \"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\" (UID: \"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\") " Apr 02 14:13:45 crc kubenswrapper[4732]: I0402 14:13:45.439484 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-ssh-key-openstack-edpm-ipam\") pod 
\"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\" (UID: \"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\") " Apr 02 14:13:45 crc kubenswrapper[4732]: I0402 14:13:45.439535 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-openstack-edpm-ipam-ovn-default-certs-0\") pod \"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\" (UID: \"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\") " Apr 02 14:13:45 crc kubenswrapper[4732]: I0402 14:13:45.439558 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\" (UID: \"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\") " Apr 02 14:13:45 crc kubenswrapper[4732]: I0402 14:13:45.439578 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-neutron-metadata-combined-ca-bundle\") pod \"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\" (UID: \"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\") " Apr 02 14:13:45 crc kubenswrapper[4732]: I0402 14:13:45.439603 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-inventory\") pod \"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\" (UID: \"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\") " Apr 02 14:13:45 crc kubenswrapper[4732]: I0402 14:13:45.439640 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-ovn-combined-ca-bundle\") pod \"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\" (UID: 
\"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\") " Apr 02 14:13:45 crc kubenswrapper[4732]: I0402 14:13:45.439688 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-repo-setup-combined-ca-bundle\") pod \"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\" (UID: \"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\") " Apr 02 14:13:45 crc kubenswrapper[4732]: I0402 14:13:45.439715 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-libvirt-combined-ca-bundle\") pod \"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\" (UID: \"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\") " Apr 02 14:13:45 crc kubenswrapper[4732]: I0402 14:13:45.439728 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-bootstrap-combined-ca-bundle\") pod \"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\" (UID: \"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\") " Apr 02 14:13:45 crc kubenswrapper[4732]: I0402 14:13:45.439750 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-telemetry-combined-ca-bundle\") pod \"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\" (UID: \"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd\") " Apr 02 14:13:45 crc kubenswrapper[4732]: I0402 14:13:45.446808 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd" (UID: "2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd"). 
InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:13:45 crc kubenswrapper[4732]: I0402 14:13:45.446857 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd" (UID: "2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:13:45 crc kubenswrapper[4732]: I0402 14:13:45.446864 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd" (UID: "2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:13:45 crc kubenswrapper[4732]: I0402 14:13:45.448691 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd" (UID: "2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:13:45 crc kubenswrapper[4732]: I0402 14:13:45.448736 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-kube-api-access-htj7w" (OuterVolumeSpecName: "kube-api-access-htj7w") pod "2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd" (UID: "2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd"). 
InnerVolumeSpecName "kube-api-access-htj7w". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:13:45 crc kubenswrapper[4732]: I0402 14:13:45.449367 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd" (UID: "2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:13:45 crc kubenswrapper[4732]: I0402 14:13:45.449451 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd" (UID: "2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:13:45 crc kubenswrapper[4732]: I0402 14:13:45.449983 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd" (UID: "2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:13:45 crc kubenswrapper[4732]: I0402 14:13:45.451224 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd" (UID: "2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd"). 
InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:13:45 crc kubenswrapper[4732]: I0402 14:13:45.452439 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd" (UID: "2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:13:45 crc kubenswrapper[4732]: I0402 14:13:45.453053 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd" (UID: "2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:13:45 crc kubenswrapper[4732]: I0402 14:13:45.460403 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd" (UID: "2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:13:45 crc kubenswrapper[4732]: I0402 14:13:45.473789 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd" (UID: "2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd"). 
InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:13:45 crc kubenswrapper[4732]: I0402 14:13:45.474768 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-inventory" (OuterVolumeSpecName: "inventory") pod "2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd" (UID: "2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:13:45 crc kubenswrapper[4732]: I0402 14:13:45.541914 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htj7w\" (UniqueName: \"kubernetes.io/projected/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-kube-api-access-htj7w\") on node \"crc\" DevicePath \"\"" Apr 02 14:13:45 crc kubenswrapper[4732]: I0402 14:13:45.541953 4732 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Apr 02 14:13:45 crc kubenswrapper[4732]: I0402 14:13:45.541963 4732 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Apr 02 14:13:45 crc kubenswrapper[4732]: I0402 14:13:45.541973 4732 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Apr 02 14:13:45 crc kubenswrapper[4732]: I0402 14:13:45.541984 4732 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 02 14:13:45 crc kubenswrapper[4732]: I0402 14:13:45.541993 4732 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-inventory\") on node \"crc\" DevicePath \"\"" Apr 02 14:13:45 crc kubenswrapper[4732]: I0402 14:13:45.542003 4732 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 02 14:13:45 crc kubenswrapper[4732]: I0402 14:13:45.542014 4732 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 02 14:13:45 crc kubenswrapper[4732]: I0402 14:13:45.542024 4732 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 02 14:13:45 crc kubenswrapper[4732]: I0402 14:13:45.542033 4732 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 02 14:13:45 crc kubenswrapper[4732]: I0402 14:13:45.542041 4732 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 02 14:13:45 crc kubenswrapper[4732]: I0402 14:13:45.542050 4732 reconciler_common.go:293] "Volume detached for volume 
\"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Apr 02 14:13:45 crc kubenswrapper[4732]: I0402 14:13:45.542059 4732 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Apr 02 14:13:45 crc kubenswrapper[4732]: I0402 14:13:45.542067 4732 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 02 14:13:45 crc kubenswrapper[4732]: I0402 14:13:45.918754 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl" event={"ID":"2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd","Type":"ContainerDied","Data":"dd3c3c3b62cf67994c9cbbf1a9e4b138c6e88bdf8c3515f3f09aecde16123179"} Apr 02 14:13:45 crc kubenswrapper[4732]: I0402 14:13:45.918798 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd3c3c3b62cf67994c9cbbf1a9e4b138c6e88bdf8c3515f3f09aecde16123179" Apr 02 14:13:45 crc kubenswrapper[4732]: I0402 14:13:45.918875 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl" Apr 02 14:13:46 crc kubenswrapper[4732]: I0402 14:13:46.014229 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc8s7"] Apr 02 14:13:46 crc kubenswrapper[4732]: E0402 14:13:46.015008 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Apr 02 14:13:46 crc kubenswrapper[4732]: I0402 14:13:46.015162 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Apr 02 14:13:46 crc kubenswrapper[4732]: I0402 14:13:46.015876 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Apr 02 14:13:46 crc kubenswrapper[4732]: I0402 14:13:46.017369 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc8s7" Apr 02 14:13:46 crc kubenswrapper[4732]: I0402 14:13:46.019462 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Apr 02 14:13:46 crc kubenswrapper[4732]: I0402 14:13:46.020063 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Apr 02 14:13:46 crc kubenswrapper[4732]: I0402 14:13:46.020687 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Apr 02 14:13:46 crc kubenswrapper[4732]: I0402 14:13:46.023472 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wdhd4" Apr 02 14:13:46 crc kubenswrapper[4732]: I0402 14:13:46.024157 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Apr 02 14:13:46 crc kubenswrapper[4732]: I0402 14:13:46.040469 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc8s7"] Apr 02 14:13:46 crc kubenswrapper[4732]: I0402 14:13:46.055542 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/e6ca7706-9083-4555-b762-1d24315b85ea-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bc8s7\" (UID: \"e6ca7706-9083-4555-b762-1d24315b85ea\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc8s7" Apr 02 14:13:46 crc kubenswrapper[4732]: I0402 14:13:46.055600 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6ca7706-9083-4555-b762-1d24315b85ea-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bc8s7\" (UID: 
\"e6ca7706-9083-4555-b762-1d24315b85ea\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc8s7" Apr 02 14:13:46 crc kubenswrapper[4732]: I0402 14:13:46.055710 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rlmr\" (UniqueName: \"kubernetes.io/projected/e6ca7706-9083-4555-b762-1d24315b85ea-kube-api-access-2rlmr\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bc8s7\" (UID: \"e6ca7706-9083-4555-b762-1d24315b85ea\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc8s7" Apr 02 14:13:46 crc kubenswrapper[4732]: I0402 14:13:46.055742 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e6ca7706-9083-4555-b762-1d24315b85ea-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bc8s7\" (UID: \"e6ca7706-9083-4555-b762-1d24315b85ea\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc8s7" Apr 02 14:13:46 crc kubenswrapper[4732]: I0402 14:13:46.055813 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e6ca7706-9083-4555-b762-1d24315b85ea-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bc8s7\" (UID: \"e6ca7706-9083-4555-b762-1d24315b85ea\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc8s7" Apr 02 14:13:46 crc kubenswrapper[4732]: I0402 14:13:46.157670 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e6ca7706-9083-4555-b762-1d24315b85ea-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bc8s7\" (UID: \"e6ca7706-9083-4555-b762-1d24315b85ea\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc8s7" Apr 02 14:13:46 crc kubenswrapper[4732]: I0402 14:13:46.158330 4732 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/e6ca7706-9083-4555-b762-1d24315b85ea-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bc8s7\" (UID: \"e6ca7706-9083-4555-b762-1d24315b85ea\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc8s7" Apr 02 14:13:46 crc kubenswrapper[4732]: I0402 14:13:46.158379 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6ca7706-9083-4555-b762-1d24315b85ea-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bc8s7\" (UID: \"e6ca7706-9083-4555-b762-1d24315b85ea\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc8s7" Apr 02 14:13:46 crc kubenswrapper[4732]: I0402 14:13:46.158446 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rlmr\" (UniqueName: \"kubernetes.io/projected/e6ca7706-9083-4555-b762-1d24315b85ea-kube-api-access-2rlmr\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bc8s7\" (UID: \"e6ca7706-9083-4555-b762-1d24315b85ea\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc8s7" Apr 02 14:13:46 crc kubenswrapper[4732]: I0402 14:13:46.158488 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e6ca7706-9083-4555-b762-1d24315b85ea-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bc8s7\" (UID: \"e6ca7706-9083-4555-b762-1d24315b85ea\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc8s7" Apr 02 14:13:46 crc kubenswrapper[4732]: I0402 14:13:46.160092 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/e6ca7706-9083-4555-b762-1d24315b85ea-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bc8s7\" (UID: \"e6ca7706-9083-4555-b762-1d24315b85ea\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc8s7" Apr 02 14:13:46 crc kubenswrapper[4732]: I0402 14:13:46.162326 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e6ca7706-9083-4555-b762-1d24315b85ea-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bc8s7\" (UID: \"e6ca7706-9083-4555-b762-1d24315b85ea\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc8s7" Apr 02 14:13:46 crc kubenswrapper[4732]: I0402 14:13:46.162668 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6ca7706-9083-4555-b762-1d24315b85ea-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bc8s7\" (UID: \"e6ca7706-9083-4555-b762-1d24315b85ea\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc8s7" Apr 02 14:13:46 crc kubenswrapper[4732]: I0402 14:13:46.162853 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e6ca7706-9083-4555-b762-1d24315b85ea-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bc8s7\" (UID: \"e6ca7706-9083-4555-b762-1d24315b85ea\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc8s7" Apr 02 14:13:46 crc kubenswrapper[4732]: I0402 14:13:46.179374 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rlmr\" (UniqueName: \"kubernetes.io/projected/e6ca7706-9083-4555-b762-1d24315b85ea-kube-api-access-2rlmr\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bc8s7\" (UID: \"e6ca7706-9083-4555-b762-1d24315b85ea\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc8s7" Apr 02 14:13:46 crc kubenswrapper[4732]: I0402 14:13:46.335345 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc8s7" Apr 02 14:13:46 crc kubenswrapper[4732]: I0402 14:13:46.899561 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc8s7"] Apr 02 14:13:46 crc kubenswrapper[4732]: I0402 14:13:46.927152 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc8s7" event={"ID":"e6ca7706-9083-4555-b762-1d24315b85ea","Type":"ContainerStarted","Data":"c2ee1409626ee5c1bdf3a93e37c989e006900721496ec322907350ac84fa4886"} Apr 02 14:13:48 crc kubenswrapper[4732]: I0402 14:13:48.946059 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc8s7" event={"ID":"e6ca7706-9083-4555-b762-1d24315b85ea","Type":"ContainerStarted","Data":"0062700fc6544fd3804e6120c648c3b743fa7c46d7dfba40f5f6895466f46114"} Apr 02 14:13:48 crc kubenswrapper[4732]: I0402 14:13:48.970505 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc8s7" podStartSLOduration=3.027757053 podStartE2EDuration="3.970480988s" podCreationTimestamp="2026-04-02 14:13:45 +0000 UTC" firstStartedPulling="2026-04-02 14:13:46.902726631 +0000 UTC m=+2183.807134194" lastFinishedPulling="2026-04-02 14:13:47.845450576 +0000 UTC m=+2184.749858129" observedRunningTime="2026-04-02 14:13:48.961877414 +0000 UTC m=+2185.866284987" watchObservedRunningTime="2026-04-02 14:13:48.970480988 +0000 UTC m=+2185.874888551" Apr 02 14:13:51 crc kubenswrapper[4732]: I0402 14:13:51.150592 4732 scope.go:117] "RemoveContainer" containerID="e8d9d5d06641186f39eedf48abb0fb9d40836867d71ca997e46a75ff5e9aa91a" Apr 02 14:14:00 crc kubenswrapper[4732]: I0402 14:14:00.138860 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29585654-np2bl"] Apr 02 14:14:00 crc kubenswrapper[4732]: I0402 14:14:00.140835 4732 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585654-np2bl" Apr 02 14:14:00 crc kubenswrapper[4732]: I0402 14:14:00.143526 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-69g42" Apr 02 14:14:00 crc kubenswrapper[4732]: I0402 14:14:00.144497 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 02 14:14:00 crc kubenswrapper[4732]: I0402 14:14:00.145538 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 02 14:14:00 crc kubenswrapper[4732]: I0402 14:14:00.154997 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585654-np2bl"] Apr 02 14:14:00 crc kubenswrapper[4732]: I0402 14:14:00.333135 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47ncd\" (UniqueName: \"kubernetes.io/projected/b24501bc-bc9f-464d-be9c-ab1dcc94bec7-kube-api-access-47ncd\") pod \"auto-csr-approver-29585654-np2bl\" (UID: \"b24501bc-bc9f-464d-be9c-ab1dcc94bec7\") " pod="openshift-infra/auto-csr-approver-29585654-np2bl" Apr 02 14:14:00 crc kubenswrapper[4732]: I0402 14:14:00.434816 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47ncd\" (UniqueName: \"kubernetes.io/projected/b24501bc-bc9f-464d-be9c-ab1dcc94bec7-kube-api-access-47ncd\") pod \"auto-csr-approver-29585654-np2bl\" (UID: \"b24501bc-bc9f-464d-be9c-ab1dcc94bec7\") " pod="openshift-infra/auto-csr-approver-29585654-np2bl" Apr 02 14:14:00 crc kubenswrapper[4732]: I0402 14:14:00.453199 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47ncd\" (UniqueName: \"kubernetes.io/projected/b24501bc-bc9f-464d-be9c-ab1dcc94bec7-kube-api-access-47ncd\") pod \"auto-csr-approver-29585654-np2bl\" (UID: 
\"b24501bc-bc9f-464d-be9c-ab1dcc94bec7\") " pod="openshift-infra/auto-csr-approver-29585654-np2bl" Apr 02 14:14:00 crc kubenswrapper[4732]: I0402 14:14:00.465161 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585654-np2bl" Apr 02 14:14:00 crc kubenswrapper[4732]: I0402 14:14:00.883515 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585654-np2bl"] Apr 02 14:14:00 crc kubenswrapper[4732]: W0402 14:14:00.884730 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb24501bc_bc9f_464d_be9c_ab1dcc94bec7.slice/crio-7fadb5dcc80cb2d12bc55121e872fa067cd36db3b08aef42ff09ee18d0c168b1 WatchSource:0}: Error finding container 7fadb5dcc80cb2d12bc55121e872fa067cd36db3b08aef42ff09ee18d0c168b1: Status 404 returned error can't find the container with id 7fadb5dcc80cb2d12bc55121e872fa067cd36db3b08aef42ff09ee18d0c168b1 Apr 02 14:14:01 crc kubenswrapper[4732]: I0402 14:14:01.043934 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585654-np2bl" event={"ID":"b24501bc-bc9f-464d-be9c-ab1dcc94bec7","Type":"ContainerStarted","Data":"7fadb5dcc80cb2d12bc55121e872fa067cd36db3b08aef42ff09ee18d0c168b1"} Apr 02 14:14:03 crc kubenswrapper[4732]: I0402 14:14:03.062968 4732 generic.go:334] "Generic (PLEG): container finished" podID="b24501bc-bc9f-464d-be9c-ab1dcc94bec7" containerID="3d9111f7026d45fd9c347915f69d6a6382d72447fc924385237d2a9409770a67" exitCode=0 Apr 02 14:14:03 crc kubenswrapper[4732]: I0402 14:14:03.062992 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585654-np2bl" event={"ID":"b24501bc-bc9f-464d-be9c-ab1dcc94bec7","Type":"ContainerDied","Data":"3d9111f7026d45fd9c347915f69d6a6382d72447fc924385237d2a9409770a67"} Apr 02 14:14:04 crc kubenswrapper[4732]: I0402 14:14:04.415428 4732 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585654-np2bl" Apr 02 14:14:04 crc kubenswrapper[4732]: I0402 14:14:04.506833 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47ncd\" (UniqueName: \"kubernetes.io/projected/b24501bc-bc9f-464d-be9c-ab1dcc94bec7-kube-api-access-47ncd\") pod \"b24501bc-bc9f-464d-be9c-ab1dcc94bec7\" (UID: \"b24501bc-bc9f-464d-be9c-ab1dcc94bec7\") " Apr 02 14:14:04 crc kubenswrapper[4732]: I0402 14:14:04.512314 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b24501bc-bc9f-464d-be9c-ab1dcc94bec7-kube-api-access-47ncd" (OuterVolumeSpecName: "kube-api-access-47ncd") pod "b24501bc-bc9f-464d-be9c-ab1dcc94bec7" (UID: "b24501bc-bc9f-464d-be9c-ab1dcc94bec7"). InnerVolumeSpecName "kube-api-access-47ncd". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:14:04 crc kubenswrapper[4732]: I0402 14:14:04.609013 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-47ncd\" (UniqueName: \"kubernetes.io/projected/b24501bc-bc9f-464d-be9c-ab1dcc94bec7-kube-api-access-47ncd\") on node \"crc\" DevicePath \"\"" Apr 02 14:14:05 crc kubenswrapper[4732]: I0402 14:14:05.084665 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585654-np2bl" event={"ID":"b24501bc-bc9f-464d-be9c-ab1dcc94bec7","Type":"ContainerDied","Data":"7fadb5dcc80cb2d12bc55121e872fa067cd36db3b08aef42ff09ee18d0c168b1"} Apr 02 14:14:05 crc kubenswrapper[4732]: I0402 14:14:05.085055 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7fadb5dcc80cb2d12bc55121e872fa067cd36db3b08aef42ff09ee18d0c168b1" Apr 02 14:14:05 crc kubenswrapper[4732]: I0402 14:14:05.084765 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29585654-np2bl" Apr 02 14:14:05 crc kubenswrapper[4732]: I0402 14:14:05.494412 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29585648-5xpdd"] Apr 02 14:14:05 crc kubenswrapper[4732]: I0402 14:14:05.502561 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29585648-5xpdd"] Apr 02 14:14:06 crc kubenswrapper[4732]: I0402 14:14:06.692267 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26576523-a1b0-4e31-9477-064954cf21a6" path="/var/lib/kubelet/pods/26576523-a1b0-4e31-9477-064954cf21a6/volumes" Apr 02 14:14:42 crc kubenswrapper[4732]: I0402 14:14:42.408176 4732 generic.go:334] "Generic (PLEG): container finished" podID="e6ca7706-9083-4555-b762-1d24315b85ea" containerID="0062700fc6544fd3804e6120c648c3b743fa7c46d7dfba40f5f6895466f46114" exitCode=0 Apr 02 14:14:42 crc kubenswrapper[4732]: I0402 14:14:42.408310 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc8s7" event={"ID":"e6ca7706-9083-4555-b762-1d24315b85ea","Type":"ContainerDied","Data":"0062700fc6544fd3804e6120c648c3b743fa7c46d7dfba40f5f6895466f46114"} Apr 02 14:14:43 crc kubenswrapper[4732]: I0402 14:14:43.806757 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc8s7" Apr 02 14:14:43 crc kubenswrapper[4732]: I0402 14:14:43.951997 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6ca7706-9083-4555-b762-1d24315b85ea-ovn-combined-ca-bundle\") pod \"e6ca7706-9083-4555-b762-1d24315b85ea\" (UID: \"e6ca7706-9083-4555-b762-1d24315b85ea\") " Apr 02 14:14:43 crc kubenswrapper[4732]: I0402 14:14:43.952064 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rlmr\" (UniqueName: \"kubernetes.io/projected/e6ca7706-9083-4555-b762-1d24315b85ea-kube-api-access-2rlmr\") pod \"e6ca7706-9083-4555-b762-1d24315b85ea\" (UID: \"e6ca7706-9083-4555-b762-1d24315b85ea\") " Apr 02 14:14:43 crc kubenswrapper[4732]: I0402 14:14:43.952088 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e6ca7706-9083-4555-b762-1d24315b85ea-inventory\") pod \"e6ca7706-9083-4555-b762-1d24315b85ea\" (UID: \"e6ca7706-9083-4555-b762-1d24315b85ea\") " Apr 02 14:14:43 crc kubenswrapper[4732]: I0402 14:14:43.952152 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/e6ca7706-9083-4555-b762-1d24315b85ea-ovncontroller-config-0\") pod \"e6ca7706-9083-4555-b762-1d24315b85ea\" (UID: \"e6ca7706-9083-4555-b762-1d24315b85ea\") " Apr 02 14:14:43 crc kubenswrapper[4732]: I0402 14:14:43.952223 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e6ca7706-9083-4555-b762-1d24315b85ea-ssh-key-openstack-edpm-ipam\") pod \"e6ca7706-9083-4555-b762-1d24315b85ea\" (UID: \"e6ca7706-9083-4555-b762-1d24315b85ea\") " Apr 02 14:14:43 crc kubenswrapper[4732]: I0402 14:14:43.959421 4732 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6ca7706-9083-4555-b762-1d24315b85ea-kube-api-access-2rlmr" (OuterVolumeSpecName: "kube-api-access-2rlmr") pod "e6ca7706-9083-4555-b762-1d24315b85ea" (UID: "e6ca7706-9083-4555-b762-1d24315b85ea"). InnerVolumeSpecName "kube-api-access-2rlmr". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:14:43 crc kubenswrapper[4732]: I0402 14:14:43.962180 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6ca7706-9083-4555-b762-1d24315b85ea-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "e6ca7706-9083-4555-b762-1d24315b85ea" (UID: "e6ca7706-9083-4555-b762-1d24315b85ea"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:14:43 crc kubenswrapper[4732]: I0402 14:14:43.983746 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6ca7706-9083-4555-b762-1d24315b85ea-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e6ca7706-9083-4555-b762-1d24315b85ea" (UID: "e6ca7706-9083-4555-b762-1d24315b85ea"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:14:43 crc kubenswrapper[4732]: I0402 14:14:43.985334 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6ca7706-9083-4555-b762-1d24315b85ea-inventory" (OuterVolumeSpecName: "inventory") pod "e6ca7706-9083-4555-b762-1d24315b85ea" (UID: "e6ca7706-9083-4555-b762-1d24315b85ea"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:14:43 crc kubenswrapper[4732]: I0402 14:14:43.988393 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6ca7706-9083-4555-b762-1d24315b85ea-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "e6ca7706-9083-4555-b762-1d24315b85ea" (UID: "e6ca7706-9083-4555-b762-1d24315b85ea"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 14:14:44 crc kubenswrapper[4732]: I0402 14:14:44.053803 4732 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6ca7706-9083-4555-b762-1d24315b85ea-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 02 14:14:44 crc kubenswrapper[4732]: I0402 14:14:44.053841 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rlmr\" (UniqueName: \"kubernetes.io/projected/e6ca7706-9083-4555-b762-1d24315b85ea-kube-api-access-2rlmr\") on node \"crc\" DevicePath \"\"" Apr 02 14:14:44 crc kubenswrapper[4732]: I0402 14:14:44.053850 4732 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e6ca7706-9083-4555-b762-1d24315b85ea-inventory\") on node \"crc\" DevicePath \"\"" Apr 02 14:14:44 crc kubenswrapper[4732]: I0402 14:14:44.053859 4732 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/e6ca7706-9083-4555-b762-1d24315b85ea-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Apr 02 14:14:44 crc kubenswrapper[4732]: I0402 14:14:44.053868 4732 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e6ca7706-9083-4555-b762-1d24315b85ea-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Apr 02 14:14:44 crc kubenswrapper[4732]: I0402 14:14:44.432557 4732 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc8s7" event={"ID":"e6ca7706-9083-4555-b762-1d24315b85ea","Type":"ContainerDied","Data":"c2ee1409626ee5c1bdf3a93e37c989e006900721496ec322907350ac84fa4886"} Apr 02 14:14:44 crc kubenswrapper[4732]: I0402 14:14:44.432645 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2ee1409626ee5c1bdf3a93e37c989e006900721496ec322907350ac84fa4886" Apr 02 14:14:44 crc kubenswrapper[4732]: I0402 14:14:44.432657 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc8s7" Apr 02 14:14:44 crc kubenswrapper[4732]: I0402 14:14:44.548732 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-sjrwt"] Apr 02 14:14:44 crc kubenswrapper[4732]: E0402 14:14:44.549169 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6ca7706-9083-4555-b762-1d24315b85ea" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Apr 02 14:14:44 crc kubenswrapper[4732]: I0402 14:14:44.549192 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6ca7706-9083-4555-b762-1d24315b85ea" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Apr 02 14:14:44 crc kubenswrapper[4732]: E0402 14:14:44.549241 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b24501bc-bc9f-464d-be9c-ab1dcc94bec7" containerName="oc" Apr 02 14:14:44 crc kubenswrapper[4732]: I0402 14:14:44.549250 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="b24501bc-bc9f-464d-be9c-ab1dcc94bec7" containerName="oc" Apr 02 14:14:44 crc kubenswrapper[4732]: I0402 14:14:44.549444 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6ca7706-9083-4555-b762-1d24315b85ea" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Apr 02 14:14:44 crc kubenswrapper[4732]: I0402 14:14:44.549474 4732 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="b24501bc-bc9f-464d-be9c-ab1dcc94bec7" containerName="oc" Apr 02 14:14:44 crc kubenswrapper[4732]: I0402 14:14:44.550094 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-sjrwt" Apr 02 14:14:44 crc kubenswrapper[4732]: I0402 14:14:44.552742 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Apr 02 14:14:44 crc kubenswrapper[4732]: I0402 14:14:44.552873 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Apr 02 14:14:44 crc kubenswrapper[4732]: I0402 14:14:44.552893 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Apr 02 14:14:44 crc kubenswrapper[4732]: I0402 14:14:44.552902 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wdhd4" Apr 02 14:14:44 crc kubenswrapper[4732]: I0402 14:14:44.552969 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Apr 02 14:14:44 crc kubenswrapper[4732]: I0402 14:14:44.559093 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-sjrwt"] Apr 02 14:14:44 crc kubenswrapper[4732]: I0402 14:14:44.560321 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Apr 02 14:14:44 crc kubenswrapper[4732]: I0402 14:14:44.564791 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9e89ed59-ef4b-44a7-b6de-d98b2319ee10-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-sjrwt\" (UID: \"9e89ed59-ef4b-44a7-b6de-d98b2319ee10\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-sjrwt" Apr 02 14:14:44 crc kubenswrapper[4732]: I0402 14:14:44.564852 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9e89ed59-ef4b-44a7-b6de-d98b2319ee10-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-sjrwt\" (UID: \"9e89ed59-ef4b-44a7-b6de-d98b2319ee10\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-sjrwt" Apr 02 14:14:44 crc kubenswrapper[4732]: I0402 14:14:44.564886 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7b78\" (UniqueName: \"kubernetes.io/projected/9e89ed59-ef4b-44a7-b6de-d98b2319ee10-kube-api-access-g7b78\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-sjrwt\" (UID: \"9e89ed59-ef4b-44a7-b6de-d98b2319ee10\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-sjrwt" Apr 02 14:14:44 crc kubenswrapper[4732]: I0402 14:14:44.564933 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e89ed59-ef4b-44a7-b6de-d98b2319ee10-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-sjrwt\" (UID: \"9e89ed59-ef4b-44a7-b6de-d98b2319ee10\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-sjrwt" Apr 02 14:14:44 crc kubenswrapper[4732]: I0402 14:14:44.564961 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9e89ed59-ef4b-44a7-b6de-d98b2319ee10-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-sjrwt\" (UID: 
\"9e89ed59-ef4b-44a7-b6de-d98b2319ee10\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-sjrwt" Apr 02 14:14:44 crc kubenswrapper[4732]: I0402 14:14:44.565065 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9e89ed59-ef4b-44a7-b6de-d98b2319ee10-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-sjrwt\" (UID: \"9e89ed59-ef4b-44a7-b6de-d98b2319ee10\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-sjrwt" Apr 02 14:14:44 crc kubenswrapper[4732]: I0402 14:14:44.666313 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e89ed59-ef4b-44a7-b6de-d98b2319ee10-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-sjrwt\" (UID: \"9e89ed59-ef4b-44a7-b6de-d98b2319ee10\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-sjrwt" Apr 02 14:14:44 crc kubenswrapper[4732]: I0402 14:14:44.666381 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9e89ed59-ef4b-44a7-b6de-d98b2319ee10-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-sjrwt\" (UID: \"9e89ed59-ef4b-44a7-b6de-d98b2319ee10\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-sjrwt" Apr 02 14:14:44 crc kubenswrapper[4732]: I0402 14:14:44.666580 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9e89ed59-ef4b-44a7-b6de-d98b2319ee10-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-sjrwt\" (UID: \"9e89ed59-ef4b-44a7-b6de-d98b2319ee10\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-sjrwt" Apr 02 14:14:44 crc kubenswrapper[4732]: I0402 14:14:44.666700 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9e89ed59-ef4b-44a7-b6de-d98b2319ee10-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-sjrwt\" (UID: \"9e89ed59-ef4b-44a7-b6de-d98b2319ee10\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-sjrwt" Apr 02 14:14:44 crc kubenswrapper[4732]: I0402 14:14:44.666749 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9e89ed59-ef4b-44a7-b6de-d98b2319ee10-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-sjrwt\" (UID: \"9e89ed59-ef4b-44a7-b6de-d98b2319ee10\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-sjrwt" Apr 02 14:14:44 crc kubenswrapper[4732]: I0402 14:14:44.666781 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7b78\" (UniqueName: \"kubernetes.io/projected/9e89ed59-ef4b-44a7-b6de-d98b2319ee10-kube-api-access-g7b78\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-sjrwt\" (UID: \"9e89ed59-ef4b-44a7-b6de-d98b2319ee10\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-sjrwt" Apr 02 14:14:44 crc kubenswrapper[4732]: I0402 14:14:44.671585 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9e89ed59-ef4b-44a7-b6de-d98b2319ee10-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-sjrwt\" (UID: \"9e89ed59-ef4b-44a7-b6de-d98b2319ee10\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-sjrwt" Apr 02 14:14:44 crc kubenswrapper[4732]: I0402 
14:14:44.671660 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9e89ed59-ef4b-44a7-b6de-d98b2319ee10-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-sjrwt\" (UID: \"9e89ed59-ef4b-44a7-b6de-d98b2319ee10\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-sjrwt" Apr 02 14:14:44 crc kubenswrapper[4732]: I0402 14:14:44.672802 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9e89ed59-ef4b-44a7-b6de-d98b2319ee10-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-sjrwt\" (UID: \"9e89ed59-ef4b-44a7-b6de-d98b2319ee10\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-sjrwt" Apr 02 14:14:44 crc kubenswrapper[4732]: I0402 14:14:44.673046 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e89ed59-ef4b-44a7-b6de-d98b2319ee10-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-sjrwt\" (UID: \"9e89ed59-ef4b-44a7-b6de-d98b2319ee10\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-sjrwt" Apr 02 14:14:44 crc kubenswrapper[4732]: I0402 14:14:44.677277 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9e89ed59-ef4b-44a7-b6de-d98b2319ee10-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-sjrwt\" (UID: \"9e89ed59-ef4b-44a7-b6de-d98b2319ee10\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-sjrwt" Apr 02 14:14:44 crc kubenswrapper[4732]: I0402 14:14:44.684635 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7b78\" 
(UniqueName: \"kubernetes.io/projected/9e89ed59-ef4b-44a7-b6de-d98b2319ee10-kube-api-access-g7b78\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-sjrwt\" (UID: \"9e89ed59-ef4b-44a7-b6de-d98b2319ee10\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-sjrwt" Apr 02 14:14:44 crc kubenswrapper[4732]: I0402 14:14:44.872423 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-sjrwt" Apr 02 14:14:45 crc kubenswrapper[4732]: I0402 14:14:45.385835 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-sjrwt"] Apr 02 14:14:45 crc kubenswrapper[4732]: I0402 14:14:45.441357 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-sjrwt" event={"ID":"9e89ed59-ef4b-44a7-b6de-d98b2319ee10","Type":"ContainerStarted","Data":"14396479b937569d57e99124bf52469046a95216f457c01e3eacdcde4612fecb"} Apr 02 14:14:46 crc kubenswrapper[4732]: I0402 14:14:46.103261 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-m5tvn"] Apr 02 14:14:46 crc kubenswrapper[4732]: I0402 14:14:46.105998 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-m5tvn" Apr 02 14:14:46 crc kubenswrapper[4732]: I0402 14:14:46.114490 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m5tvn"] Apr 02 14:14:46 crc kubenswrapper[4732]: I0402 14:14:46.297024 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzfcw\" (UniqueName: \"kubernetes.io/projected/028fb44c-e0de-4f8f-b0f6-2896a0e110e7-kube-api-access-bzfcw\") pod \"redhat-operators-m5tvn\" (UID: \"028fb44c-e0de-4f8f-b0f6-2896a0e110e7\") " pod="openshift-marketplace/redhat-operators-m5tvn" Apr 02 14:14:46 crc kubenswrapper[4732]: I0402 14:14:46.297476 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/028fb44c-e0de-4f8f-b0f6-2896a0e110e7-utilities\") pod \"redhat-operators-m5tvn\" (UID: \"028fb44c-e0de-4f8f-b0f6-2896a0e110e7\") " pod="openshift-marketplace/redhat-operators-m5tvn" Apr 02 14:14:46 crc kubenswrapper[4732]: I0402 14:14:46.297542 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/028fb44c-e0de-4f8f-b0f6-2896a0e110e7-catalog-content\") pod \"redhat-operators-m5tvn\" (UID: \"028fb44c-e0de-4f8f-b0f6-2896a0e110e7\") " pod="openshift-marketplace/redhat-operators-m5tvn" Apr 02 14:14:46 crc kubenswrapper[4732]: I0402 14:14:46.399710 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/028fb44c-e0de-4f8f-b0f6-2896a0e110e7-utilities\") pod \"redhat-operators-m5tvn\" (UID: \"028fb44c-e0de-4f8f-b0f6-2896a0e110e7\") " pod="openshift-marketplace/redhat-operators-m5tvn" Apr 02 14:14:46 crc kubenswrapper[4732]: I0402 14:14:46.400442 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/028fb44c-e0de-4f8f-b0f6-2896a0e110e7-utilities\") pod \"redhat-operators-m5tvn\" (UID: \"028fb44c-e0de-4f8f-b0f6-2896a0e110e7\") " pod="openshift-marketplace/redhat-operators-m5tvn" Apr 02 14:14:46 crc kubenswrapper[4732]: I0402 14:14:46.401412 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/028fb44c-e0de-4f8f-b0f6-2896a0e110e7-catalog-content\") pod \"redhat-operators-m5tvn\" (UID: \"028fb44c-e0de-4f8f-b0f6-2896a0e110e7\") " pod="openshift-marketplace/redhat-operators-m5tvn" Apr 02 14:14:46 crc kubenswrapper[4732]: I0402 14:14:46.401805 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/028fb44c-e0de-4f8f-b0f6-2896a0e110e7-catalog-content\") pod \"redhat-operators-m5tvn\" (UID: \"028fb44c-e0de-4f8f-b0f6-2896a0e110e7\") " pod="openshift-marketplace/redhat-operators-m5tvn" Apr 02 14:14:46 crc kubenswrapper[4732]: I0402 14:14:46.402010 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzfcw\" (UniqueName: \"kubernetes.io/projected/028fb44c-e0de-4f8f-b0f6-2896a0e110e7-kube-api-access-bzfcw\") pod \"redhat-operators-m5tvn\" (UID: \"028fb44c-e0de-4f8f-b0f6-2896a0e110e7\") " pod="openshift-marketplace/redhat-operators-m5tvn" Apr 02 14:14:46 crc kubenswrapper[4732]: I0402 14:14:46.433630 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzfcw\" (UniqueName: \"kubernetes.io/projected/028fb44c-e0de-4f8f-b0f6-2896a0e110e7-kube-api-access-bzfcw\") pod \"redhat-operators-m5tvn\" (UID: \"028fb44c-e0de-4f8f-b0f6-2896a0e110e7\") " pod="openshift-marketplace/redhat-operators-m5tvn" Apr 02 14:14:46 crc kubenswrapper[4732]: I0402 14:14:46.451851 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-sjrwt" 
event={"ID":"9e89ed59-ef4b-44a7-b6de-d98b2319ee10","Type":"ContainerStarted","Data":"b09139be10aeed3f655acbf4ad4a353e5efa19e236f2e613475fa89c92ad6326"} Apr 02 14:14:46 crc kubenswrapper[4732]: I0402 14:14:46.476571 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-sjrwt" podStartSLOduration=2.052513688 podStartE2EDuration="2.476555387s" podCreationTimestamp="2026-04-02 14:14:44 +0000 UTC" firstStartedPulling="2026-04-02 14:14:45.390496963 +0000 UTC m=+2242.294904516" lastFinishedPulling="2026-04-02 14:14:45.814538662 +0000 UTC m=+2242.718946215" observedRunningTime="2026-04-02 14:14:46.467401248 +0000 UTC m=+2243.371808801" watchObservedRunningTime="2026-04-02 14:14:46.476555387 +0000 UTC m=+2243.380962940" Apr 02 14:14:46 crc kubenswrapper[4732]: I0402 14:14:46.730841 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m5tvn" Apr 02 14:14:47 crc kubenswrapper[4732]: I0402 14:14:47.211336 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m5tvn"] Apr 02 14:14:47 crc kubenswrapper[4732]: W0402 14:14:47.224832 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod028fb44c_e0de_4f8f_b0f6_2896a0e110e7.slice/crio-06f0a6ee212867905e2e098dd88652870b9611fc75742e035ac0527e45eee4ec WatchSource:0}: Error finding container 06f0a6ee212867905e2e098dd88652870b9611fc75742e035ac0527e45eee4ec: Status 404 returned error can't find the container with id 06f0a6ee212867905e2e098dd88652870b9611fc75742e035ac0527e45eee4ec Apr 02 14:14:47 crc kubenswrapper[4732]: I0402 14:14:47.461934 4732 generic.go:334] "Generic (PLEG): container finished" podID="028fb44c-e0de-4f8f-b0f6-2896a0e110e7" containerID="5ed71a4176667817b6f6fda598598cf9a3aaa4eb7612e2f08219a7b025bbcc3b" exitCode=0 Apr 02 14:14:47 crc 
kubenswrapper[4732]: I0402 14:14:47.461981 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m5tvn" event={"ID":"028fb44c-e0de-4f8f-b0f6-2896a0e110e7","Type":"ContainerDied","Data":"5ed71a4176667817b6f6fda598598cf9a3aaa4eb7612e2f08219a7b025bbcc3b"} Apr 02 14:14:47 crc kubenswrapper[4732]: I0402 14:14:47.462035 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m5tvn" event={"ID":"028fb44c-e0de-4f8f-b0f6-2896a0e110e7","Type":"ContainerStarted","Data":"06f0a6ee212867905e2e098dd88652870b9611fc75742e035ac0527e45eee4ec"} Apr 02 14:14:48 crc kubenswrapper[4732]: I0402 14:14:48.470584 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m5tvn" event={"ID":"028fb44c-e0de-4f8f-b0f6-2896a0e110e7","Type":"ContainerStarted","Data":"b5f385fafa8ae391e7fa6dd2a19f6486c0d73528451d9e4766d4815b47d463c9"} Apr 02 14:14:50 crc kubenswrapper[4732]: I0402 14:14:50.494117 4732 generic.go:334] "Generic (PLEG): container finished" podID="028fb44c-e0de-4f8f-b0f6-2896a0e110e7" containerID="b5f385fafa8ae391e7fa6dd2a19f6486c0d73528451d9e4766d4815b47d463c9" exitCode=0 Apr 02 14:14:50 crc kubenswrapper[4732]: I0402 14:14:50.494168 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m5tvn" event={"ID":"028fb44c-e0de-4f8f-b0f6-2896a0e110e7","Type":"ContainerDied","Data":"b5f385fafa8ae391e7fa6dd2a19f6486c0d73528451d9e4766d4815b47d463c9"} Apr 02 14:14:51 crc kubenswrapper[4732]: I0402 14:14:51.273377 4732 scope.go:117] "RemoveContainer" containerID="c2b0a42edffbf54c0807d243ec7a9400d0398d3ebeae7e9de25537a4159fd61b" Apr 02 14:14:52 crc kubenswrapper[4732]: I0402 14:14:52.516445 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m5tvn" 
event={"ID":"028fb44c-e0de-4f8f-b0f6-2896a0e110e7","Type":"ContainerStarted","Data":"7b2b723916ab5aef19dde721e1f849565d1c9fa18609604dd8321039c7c52f85"} Apr 02 14:14:52 crc kubenswrapper[4732]: I0402 14:14:52.550216 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-m5tvn" podStartSLOduration=2.660844125 podStartE2EDuration="6.550199519s" podCreationTimestamp="2026-04-02 14:14:46 +0000 UTC" firstStartedPulling="2026-04-02 14:14:47.464222678 +0000 UTC m=+2244.368630231" lastFinishedPulling="2026-04-02 14:14:51.353578072 +0000 UTC m=+2248.257985625" observedRunningTime="2026-04-02 14:14:52.546012555 +0000 UTC m=+2249.450420138" watchObservedRunningTime="2026-04-02 14:14:52.550199519 +0000 UTC m=+2249.454607072" Apr 02 14:14:56 crc kubenswrapper[4732]: I0402 14:14:56.731531 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-m5tvn" Apr 02 14:14:56 crc kubenswrapper[4732]: I0402 14:14:56.732392 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-m5tvn" Apr 02 14:14:57 crc kubenswrapper[4732]: I0402 14:14:57.780014 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-m5tvn" podUID="028fb44c-e0de-4f8f-b0f6-2896a0e110e7" containerName="registry-server" probeResult="failure" output=< Apr 02 14:14:57 crc kubenswrapper[4732]: timeout: failed to connect service ":50051" within 1s Apr 02 14:14:57 crc kubenswrapper[4732]: > Apr 02 14:15:00 crc kubenswrapper[4732]: I0402 14:15:00.145037 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29585655-m7mtf"] Apr 02 14:15:00 crc kubenswrapper[4732]: I0402 14:15:00.147365 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29585655-m7mtf" Apr 02 14:15:00 crc kubenswrapper[4732]: I0402 14:15:00.150492 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Apr 02 14:15:00 crc kubenswrapper[4732]: I0402 14:15:00.150647 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Apr 02 14:15:00 crc kubenswrapper[4732]: I0402 14:15:00.155085 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29585655-m7mtf"] Apr 02 14:15:00 crc kubenswrapper[4732]: I0402 14:15:00.285696 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tsdd\" (UniqueName: \"kubernetes.io/projected/cb678697-dd92-42e8-836f-6f4510b21522-kube-api-access-7tsdd\") pod \"collect-profiles-29585655-m7mtf\" (UID: \"cb678697-dd92-42e8-836f-6f4510b21522\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29585655-m7mtf" Apr 02 14:15:00 crc kubenswrapper[4732]: I0402 14:15:00.285805 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cb678697-dd92-42e8-836f-6f4510b21522-config-volume\") pod \"collect-profiles-29585655-m7mtf\" (UID: \"cb678697-dd92-42e8-836f-6f4510b21522\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29585655-m7mtf" Apr 02 14:15:00 crc kubenswrapper[4732]: I0402 14:15:00.285849 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cb678697-dd92-42e8-836f-6f4510b21522-secret-volume\") pod \"collect-profiles-29585655-m7mtf\" (UID: \"cb678697-dd92-42e8-836f-6f4510b21522\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29585655-m7mtf" Apr 02 14:15:00 crc kubenswrapper[4732]: I0402 14:15:00.387798 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tsdd\" (UniqueName: \"kubernetes.io/projected/cb678697-dd92-42e8-836f-6f4510b21522-kube-api-access-7tsdd\") pod \"collect-profiles-29585655-m7mtf\" (UID: \"cb678697-dd92-42e8-836f-6f4510b21522\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29585655-m7mtf" Apr 02 14:15:00 crc kubenswrapper[4732]: I0402 14:15:00.387914 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cb678697-dd92-42e8-836f-6f4510b21522-config-volume\") pod \"collect-profiles-29585655-m7mtf\" (UID: \"cb678697-dd92-42e8-836f-6f4510b21522\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29585655-m7mtf" Apr 02 14:15:00 crc kubenswrapper[4732]: I0402 14:15:00.387959 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cb678697-dd92-42e8-836f-6f4510b21522-secret-volume\") pod \"collect-profiles-29585655-m7mtf\" (UID: \"cb678697-dd92-42e8-836f-6f4510b21522\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29585655-m7mtf" Apr 02 14:15:00 crc kubenswrapper[4732]: I0402 14:15:00.388966 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cb678697-dd92-42e8-836f-6f4510b21522-config-volume\") pod \"collect-profiles-29585655-m7mtf\" (UID: \"cb678697-dd92-42e8-836f-6f4510b21522\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29585655-m7mtf" Apr 02 14:15:00 crc kubenswrapper[4732]: I0402 14:15:00.394216 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/cb678697-dd92-42e8-836f-6f4510b21522-secret-volume\") pod \"collect-profiles-29585655-m7mtf\" (UID: \"cb678697-dd92-42e8-836f-6f4510b21522\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29585655-m7mtf" Apr 02 14:15:00 crc kubenswrapper[4732]: I0402 14:15:00.405572 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tsdd\" (UniqueName: \"kubernetes.io/projected/cb678697-dd92-42e8-836f-6f4510b21522-kube-api-access-7tsdd\") pod \"collect-profiles-29585655-m7mtf\" (UID: \"cb678697-dd92-42e8-836f-6f4510b21522\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29585655-m7mtf" Apr 02 14:15:00 crc kubenswrapper[4732]: I0402 14:15:00.473534 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29585655-m7mtf" Apr 02 14:15:00 crc kubenswrapper[4732]: I0402 14:15:00.932143 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29585655-m7mtf"] Apr 02 14:15:00 crc kubenswrapper[4732]: W0402 14:15:00.937377 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb678697_dd92_42e8_836f_6f4510b21522.slice/crio-f4942db9b01ef497521a8513ff2ea8b00bf571d0d6bac73839fc088a4119c4bd WatchSource:0}: Error finding container f4942db9b01ef497521a8513ff2ea8b00bf571d0d6bac73839fc088a4119c4bd: Status 404 returned error can't find the container with id f4942db9b01ef497521a8513ff2ea8b00bf571d0d6bac73839fc088a4119c4bd Apr 02 14:15:01 crc kubenswrapper[4732]: I0402 14:15:01.603342 4732 generic.go:334] "Generic (PLEG): container finished" podID="cb678697-dd92-42e8-836f-6f4510b21522" containerID="bde214fd31dc4f7822deb885fc952bf1053799b65abb9bf9011e5453f90416eb" exitCode=0 Apr 02 14:15:01 crc kubenswrapper[4732]: I0402 14:15:01.603398 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29585655-m7mtf" event={"ID":"cb678697-dd92-42e8-836f-6f4510b21522","Type":"ContainerDied","Data":"bde214fd31dc4f7822deb885fc952bf1053799b65abb9bf9011e5453f90416eb"} Apr 02 14:15:01 crc kubenswrapper[4732]: I0402 14:15:01.603429 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29585655-m7mtf" event={"ID":"cb678697-dd92-42e8-836f-6f4510b21522","Type":"ContainerStarted","Data":"f4942db9b01ef497521a8513ff2ea8b00bf571d0d6bac73839fc088a4119c4bd"} Apr 02 14:15:01 crc kubenswrapper[4732]: I0402 14:15:01.924443 4732 patch_prober.go:28] interesting pod/machine-config-daemon-6vtmw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 02 14:15:01 crc kubenswrapper[4732]: I0402 14:15:01.924530 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 02 14:15:03 crc kubenswrapper[4732]: I0402 14:15:03.018114 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29585655-m7mtf" Apr 02 14:15:03 crc kubenswrapper[4732]: I0402 14:15:03.140318 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cb678697-dd92-42e8-836f-6f4510b21522-config-volume\") pod \"cb678697-dd92-42e8-836f-6f4510b21522\" (UID: \"cb678697-dd92-42e8-836f-6f4510b21522\") " Apr 02 14:15:03 crc kubenswrapper[4732]: I0402 14:15:03.140459 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tsdd\" (UniqueName: \"kubernetes.io/projected/cb678697-dd92-42e8-836f-6f4510b21522-kube-api-access-7tsdd\") pod \"cb678697-dd92-42e8-836f-6f4510b21522\" (UID: \"cb678697-dd92-42e8-836f-6f4510b21522\") " Apr 02 14:15:03 crc kubenswrapper[4732]: I0402 14:15:03.140507 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cb678697-dd92-42e8-836f-6f4510b21522-secret-volume\") pod \"cb678697-dd92-42e8-836f-6f4510b21522\" (UID: \"cb678697-dd92-42e8-836f-6f4510b21522\") " Apr 02 14:15:03 crc kubenswrapper[4732]: I0402 14:15:03.141322 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb678697-dd92-42e8-836f-6f4510b21522-config-volume" (OuterVolumeSpecName: "config-volume") pod "cb678697-dd92-42e8-836f-6f4510b21522" (UID: "cb678697-dd92-42e8-836f-6f4510b21522"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 14:15:03 crc kubenswrapper[4732]: I0402 14:15:03.146781 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb678697-dd92-42e8-836f-6f4510b21522-kube-api-access-7tsdd" (OuterVolumeSpecName: "kube-api-access-7tsdd") pod "cb678697-dd92-42e8-836f-6f4510b21522" (UID: "cb678697-dd92-42e8-836f-6f4510b21522"). 
InnerVolumeSpecName "kube-api-access-7tsdd". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:15:03 crc kubenswrapper[4732]: I0402 14:15:03.146962 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb678697-dd92-42e8-836f-6f4510b21522-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "cb678697-dd92-42e8-836f-6f4510b21522" (UID: "cb678697-dd92-42e8-836f-6f4510b21522"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:15:03 crc kubenswrapper[4732]: I0402 14:15:03.242602 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tsdd\" (UniqueName: \"kubernetes.io/projected/cb678697-dd92-42e8-836f-6f4510b21522-kube-api-access-7tsdd\") on node \"crc\" DevicePath \"\"" Apr 02 14:15:03 crc kubenswrapper[4732]: I0402 14:15:03.242667 4732 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cb678697-dd92-42e8-836f-6f4510b21522-secret-volume\") on node \"crc\" DevicePath \"\"" Apr 02 14:15:03 crc kubenswrapper[4732]: I0402 14:15:03.242681 4732 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cb678697-dd92-42e8-836f-6f4510b21522-config-volume\") on node \"crc\" DevicePath \"\"" Apr 02 14:15:03 crc kubenswrapper[4732]: I0402 14:15:03.637224 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29585655-m7mtf" event={"ID":"cb678697-dd92-42e8-836f-6f4510b21522","Type":"ContainerDied","Data":"f4942db9b01ef497521a8513ff2ea8b00bf571d0d6bac73839fc088a4119c4bd"} Apr 02 14:15:03 crc kubenswrapper[4732]: I0402 14:15:03.637283 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4942db9b01ef497521a8513ff2ea8b00bf571d0d6bac73839fc088a4119c4bd" Apr 02 14:15:03 crc kubenswrapper[4732]: I0402 14:15:03.637344 4732 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29585655-m7mtf" Apr 02 14:15:04 crc kubenswrapper[4732]: I0402 14:15:04.101674 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29585610-wbzhk"] Apr 02 14:15:04 crc kubenswrapper[4732]: I0402 14:15:04.113179 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29585610-wbzhk"] Apr 02 14:15:04 crc kubenswrapper[4732]: I0402 14:15:04.695989 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67577cc5-6dee-4465-beee-ea424d976972" path="/var/lib/kubelet/pods/67577cc5-6dee-4465-beee-ea424d976972/volumes" Apr 02 14:15:06 crc kubenswrapper[4732]: I0402 14:15:06.778791 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-m5tvn" Apr 02 14:15:06 crc kubenswrapper[4732]: I0402 14:15:06.823856 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-m5tvn" Apr 02 14:15:07 crc kubenswrapper[4732]: I0402 14:15:07.022172 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m5tvn"] Apr 02 14:15:08 crc kubenswrapper[4732]: I0402 14:15:08.677915 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-m5tvn" podUID="028fb44c-e0de-4f8f-b0f6-2896a0e110e7" containerName="registry-server" containerID="cri-o://7b2b723916ab5aef19dde721e1f849565d1c9fa18609604dd8321039c7c52f85" gracePeriod=2 Apr 02 14:15:09 crc kubenswrapper[4732]: I0402 14:15:09.111660 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-m5tvn" Apr 02 14:15:09 crc kubenswrapper[4732]: I0402 14:15:09.167173 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/028fb44c-e0de-4f8f-b0f6-2896a0e110e7-catalog-content\") pod \"028fb44c-e0de-4f8f-b0f6-2896a0e110e7\" (UID: \"028fb44c-e0de-4f8f-b0f6-2896a0e110e7\") " Apr 02 14:15:09 crc kubenswrapper[4732]: I0402 14:15:09.167254 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/028fb44c-e0de-4f8f-b0f6-2896a0e110e7-utilities\") pod \"028fb44c-e0de-4f8f-b0f6-2896a0e110e7\" (UID: \"028fb44c-e0de-4f8f-b0f6-2896a0e110e7\") " Apr 02 14:15:09 crc kubenswrapper[4732]: I0402 14:15:09.167352 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzfcw\" (UniqueName: \"kubernetes.io/projected/028fb44c-e0de-4f8f-b0f6-2896a0e110e7-kube-api-access-bzfcw\") pod \"028fb44c-e0de-4f8f-b0f6-2896a0e110e7\" (UID: \"028fb44c-e0de-4f8f-b0f6-2896a0e110e7\") " Apr 02 14:15:09 crc kubenswrapper[4732]: I0402 14:15:09.168104 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/028fb44c-e0de-4f8f-b0f6-2896a0e110e7-utilities" (OuterVolumeSpecName: "utilities") pod "028fb44c-e0de-4f8f-b0f6-2896a0e110e7" (UID: "028fb44c-e0de-4f8f-b0f6-2896a0e110e7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 14:15:09 crc kubenswrapper[4732]: I0402 14:15:09.173476 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/028fb44c-e0de-4f8f-b0f6-2896a0e110e7-kube-api-access-bzfcw" (OuterVolumeSpecName: "kube-api-access-bzfcw") pod "028fb44c-e0de-4f8f-b0f6-2896a0e110e7" (UID: "028fb44c-e0de-4f8f-b0f6-2896a0e110e7"). InnerVolumeSpecName "kube-api-access-bzfcw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:15:09 crc kubenswrapper[4732]: I0402 14:15:09.269082 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzfcw\" (UniqueName: \"kubernetes.io/projected/028fb44c-e0de-4f8f-b0f6-2896a0e110e7-kube-api-access-bzfcw\") on node \"crc\" DevicePath \"\"" Apr 02 14:15:09 crc kubenswrapper[4732]: I0402 14:15:09.269119 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/028fb44c-e0de-4f8f-b0f6-2896a0e110e7-utilities\") on node \"crc\" DevicePath \"\"" Apr 02 14:15:09 crc kubenswrapper[4732]: I0402 14:15:09.284960 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/028fb44c-e0de-4f8f-b0f6-2896a0e110e7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "028fb44c-e0de-4f8f-b0f6-2896a0e110e7" (UID: "028fb44c-e0de-4f8f-b0f6-2896a0e110e7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 14:15:09 crc kubenswrapper[4732]: I0402 14:15:09.371676 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/028fb44c-e0de-4f8f-b0f6-2896a0e110e7-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 02 14:15:09 crc kubenswrapper[4732]: I0402 14:15:09.693917 4732 generic.go:334] "Generic (PLEG): container finished" podID="028fb44c-e0de-4f8f-b0f6-2896a0e110e7" containerID="7b2b723916ab5aef19dde721e1f849565d1c9fa18609604dd8321039c7c52f85" exitCode=0 Apr 02 14:15:09 crc kubenswrapper[4732]: I0402 14:15:09.693981 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m5tvn" event={"ID":"028fb44c-e0de-4f8f-b0f6-2896a0e110e7","Type":"ContainerDied","Data":"7b2b723916ab5aef19dde721e1f849565d1c9fa18609604dd8321039c7c52f85"} Apr 02 14:15:09 crc kubenswrapper[4732]: I0402 14:15:09.693994 4732 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m5tvn" Apr 02 14:15:09 crc kubenswrapper[4732]: I0402 14:15:09.694020 4732 scope.go:117] "RemoveContainer" containerID="7b2b723916ab5aef19dde721e1f849565d1c9fa18609604dd8321039c7c52f85" Apr 02 14:15:09 crc kubenswrapper[4732]: I0402 14:15:09.694009 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m5tvn" event={"ID":"028fb44c-e0de-4f8f-b0f6-2896a0e110e7","Type":"ContainerDied","Data":"06f0a6ee212867905e2e098dd88652870b9611fc75742e035ac0527e45eee4ec"} Apr 02 14:15:09 crc kubenswrapper[4732]: I0402 14:15:09.735199 4732 scope.go:117] "RemoveContainer" containerID="b5f385fafa8ae391e7fa6dd2a19f6486c0d73528451d9e4766d4815b47d463c9" Apr 02 14:15:09 crc kubenswrapper[4732]: I0402 14:15:09.741736 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m5tvn"] Apr 02 14:15:09 crc kubenswrapper[4732]: I0402 14:15:09.754652 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-m5tvn"] Apr 02 14:15:09 crc kubenswrapper[4732]: I0402 14:15:09.772552 4732 scope.go:117] "RemoveContainer" containerID="5ed71a4176667817b6f6fda598598cf9a3aaa4eb7612e2f08219a7b025bbcc3b" Apr 02 14:15:09 crc kubenswrapper[4732]: I0402 14:15:09.813945 4732 scope.go:117] "RemoveContainer" containerID="7b2b723916ab5aef19dde721e1f849565d1c9fa18609604dd8321039c7c52f85" Apr 02 14:15:09 crc kubenswrapper[4732]: E0402 14:15:09.814476 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b2b723916ab5aef19dde721e1f849565d1c9fa18609604dd8321039c7c52f85\": container with ID starting with 7b2b723916ab5aef19dde721e1f849565d1c9fa18609604dd8321039c7c52f85 not found: ID does not exist" containerID="7b2b723916ab5aef19dde721e1f849565d1c9fa18609604dd8321039c7c52f85" Apr 02 14:15:09 crc kubenswrapper[4732]: I0402 14:15:09.814528 4732 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b2b723916ab5aef19dde721e1f849565d1c9fa18609604dd8321039c7c52f85"} err="failed to get container status \"7b2b723916ab5aef19dde721e1f849565d1c9fa18609604dd8321039c7c52f85\": rpc error: code = NotFound desc = could not find container \"7b2b723916ab5aef19dde721e1f849565d1c9fa18609604dd8321039c7c52f85\": container with ID starting with 7b2b723916ab5aef19dde721e1f849565d1c9fa18609604dd8321039c7c52f85 not found: ID does not exist"
Apr 02 14:15:09 crc kubenswrapper[4732]: I0402 14:15:09.814565 4732 scope.go:117] "RemoveContainer" containerID="b5f385fafa8ae391e7fa6dd2a19f6486c0d73528451d9e4766d4815b47d463c9"
Apr 02 14:15:09 crc kubenswrapper[4732]: E0402 14:15:09.815163 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5f385fafa8ae391e7fa6dd2a19f6486c0d73528451d9e4766d4815b47d463c9\": container with ID starting with b5f385fafa8ae391e7fa6dd2a19f6486c0d73528451d9e4766d4815b47d463c9 not found: ID does not exist" containerID="b5f385fafa8ae391e7fa6dd2a19f6486c0d73528451d9e4766d4815b47d463c9"
Apr 02 14:15:09 crc kubenswrapper[4732]: I0402 14:15:09.815232 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5f385fafa8ae391e7fa6dd2a19f6486c0d73528451d9e4766d4815b47d463c9"} err="failed to get container status \"b5f385fafa8ae391e7fa6dd2a19f6486c0d73528451d9e4766d4815b47d463c9\": rpc error: code = NotFound desc = could not find container \"b5f385fafa8ae391e7fa6dd2a19f6486c0d73528451d9e4766d4815b47d463c9\": container with ID starting with b5f385fafa8ae391e7fa6dd2a19f6486c0d73528451d9e4766d4815b47d463c9 not found: ID does not exist"
Apr 02 14:15:09 crc kubenswrapper[4732]: I0402 14:15:09.815282 4732 scope.go:117] "RemoveContainer" containerID="5ed71a4176667817b6f6fda598598cf9a3aaa4eb7612e2f08219a7b025bbcc3b"
Apr 02 14:15:09 crc kubenswrapper[4732]: E0402 14:15:09.815681 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ed71a4176667817b6f6fda598598cf9a3aaa4eb7612e2f08219a7b025bbcc3b\": container with ID starting with 5ed71a4176667817b6f6fda598598cf9a3aaa4eb7612e2f08219a7b025bbcc3b not found: ID does not exist" containerID="5ed71a4176667817b6f6fda598598cf9a3aaa4eb7612e2f08219a7b025bbcc3b"
Apr 02 14:15:09 crc kubenswrapper[4732]: I0402 14:15:09.815749 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ed71a4176667817b6f6fda598598cf9a3aaa4eb7612e2f08219a7b025bbcc3b"} err="failed to get container status \"5ed71a4176667817b6f6fda598598cf9a3aaa4eb7612e2f08219a7b025bbcc3b\": rpc error: code = NotFound desc = could not find container \"5ed71a4176667817b6f6fda598598cf9a3aaa4eb7612e2f08219a7b025bbcc3b\": container with ID starting with 5ed71a4176667817b6f6fda598598cf9a3aaa4eb7612e2f08219a7b025bbcc3b not found: ID does not exist"
Apr 02 14:15:10 crc kubenswrapper[4732]: I0402 14:15:10.722229 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="028fb44c-e0de-4f8f-b0f6-2896a0e110e7" path="/var/lib/kubelet/pods/028fb44c-e0de-4f8f-b0f6-2896a0e110e7/volumes"
Apr 02 14:15:27 crc kubenswrapper[4732]: I0402 14:15:27.876929 4732 generic.go:334] "Generic (PLEG): container finished" podID="9e89ed59-ef4b-44a7-b6de-d98b2319ee10" containerID="b09139be10aeed3f655acbf4ad4a353e5efa19e236f2e613475fa89c92ad6326" exitCode=0
Apr 02 14:15:27 crc kubenswrapper[4732]: I0402 14:15:27.877064 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-sjrwt" event={"ID":"9e89ed59-ef4b-44a7-b6de-d98b2319ee10","Type":"ContainerDied","Data":"b09139be10aeed3f655acbf4ad4a353e5efa19e236f2e613475fa89c92ad6326"}
Apr 02 14:15:29 crc kubenswrapper[4732]: I0402 14:15:29.280448 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-sjrwt"
Apr 02 14:15:29 crc kubenswrapper[4732]: I0402 14:15:29.361665 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9e89ed59-ef4b-44a7-b6de-d98b2319ee10-neutron-ovn-metadata-agent-neutron-config-0\") pod \"9e89ed59-ef4b-44a7-b6de-d98b2319ee10\" (UID: \"9e89ed59-ef4b-44a7-b6de-d98b2319ee10\") "
Apr 02 14:15:29 crc kubenswrapper[4732]: I0402 14:15:29.362424 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9e89ed59-ef4b-44a7-b6de-d98b2319ee10-nova-metadata-neutron-config-0\") pod \"9e89ed59-ef4b-44a7-b6de-d98b2319ee10\" (UID: \"9e89ed59-ef4b-44a7-b6de-d98b2319ee10\") "
Apr 02 14:15:29 crc kubenswrapper[4732]: I0402 14:15:29.362652 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9e89ed59-ef4b-44a7-b6de-d98b2319ee10-ssh-key-openstack-edpm-ipam\") pod \"9e89ed59-ef4b-44a7-b6de-d98b2319ee10\" (UID: \"9e89ed59-ef4b-44a7-b6de-d98b2319ee10\") "
Apr 02 14:15:29 crc kubenswrapper[4732]: I0402 14:15:29.362757 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9e89ed59-ef4b-44a7-b6de-d98b2319ee10-inventory\") pod \"9e89ed59-ef4b-44a7-b6de-d98b2319ee10\" (UID: \"9e89ed59-ef4b-44a7-b6de-d98b2319ee10\") "
Apr 02 14:15:29 crc kubenswrapper[4732]: I0402 14:15:29.363065 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7b78\" (UniqueName: \"kubernetes.io/projected/9e89ed59-ef4b-44a7-b6de-d98b2319ee10-kube-api-access-g7b78\") pod \"9e89ed59-ef4b-44a7-b6de-d98b2319ee10\" (UID: \"9e89ed59-ef4b-44a7-b6de-d98b2319ee10\") "
Apr 02 14:15:29 crc kubenswrapper[4732]: I0402 14:15:29.363134 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e89ed59-ef4b-44a7-b6de-d98b2319ee10-neutron-metadata-combined-ca-bundle\") pod \"9e89ed59-ef4b-44a7-b6de-d98b2319ee10\" (UID: \"9e89ed59-ef4b-44a7-b6de-d98b2319ee10\") "
Apr 02 14:15:29 crc kubenswrapper[4732]: I0402 14:15:29.367030 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e89ed59-ef4b-44a7-b6de-d98b2319ee10-kube-api-access-g7b78" (OuterVolumeSpecName: "kube-api-access-g7b78") pod "9e89ed59-ef4b-44a7-b6de-d98b2319ee10" (UID: "9e89ed59-ef4b-44a7-b6de-d98b2319ee10"). InnerVolumeSpecName "kube-api-access-g7b78". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 14:15:29 crc kubenswrapper[4732]: I0402 14:15:29.367893 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e89ed59-ef4b-44a7-b6de-d98b2319ee10-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "9e89ed59-ef4b-44a7-b6de-d98b2319ee10" (UID: "9e89ed59-ef4b-44a7-b6de-d98b2319ee10"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 14:15:29 crc kubenswrapper[4732]: I0402 14:15:29.390627 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e89ed59-ef4b-44a7-b6de-d98b2319ee10-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "9e89ed59-ef4b-44a7-b6de-d98b2319ee10" (UID: "9e89ed59-ef4b-44a7-b6de-d98b2319ee10"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 14:15:29 crc kubenswrapper[4732]: I0402 14:15:29.391049 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e89ed59-ef4b-44a7-b6de-d98b2319ee10-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "9e89ed59-ef4b-44a7-b6de-d98b2319ee10" (UID: "9e89ed59-ef4b-44a7-b6de-d98b2319ee10"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 14:15:29 crc kubenswrapper[4732]: I0402 14:15:29.392593 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e89ed59-ef4b-44a7-b6de-d98b2319ee10-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9e89ed59-ef4b-44a7-b6de-d98b2319ee10" (UID: "9e89ed59-ef4b-44a7-b6de-d98b2319ee10"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 14:15:29 crc kubenswrapper[4732]: I0402 14:15:29.420820 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e89ed59-ef4b-44a7-b6de-d98b2319ee10-inventory" (OuterVolumeSpecName: "inventory") pod "9e89ed59-ef4b-44a7-b6de-d98b2319ee10" (UID: "9e89ed59-ef4b-44a7-b6de-d98b2319ee10"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 14:15:29 crc kubenswrapper[4732]: I0402 14:15:29.466176 4732 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9e89ed59-ef4b-44a7-b6de-d98b2319ee10-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\""
Apr 02 14:15:29 crc kubenswrapper[4732]: I0402 14:15:29.466453 4732 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9e89ed59-ef4b-44a7-b6de-d98b2319ee10-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\""
Apr 02 14:15:29 crc kubenswrapper[4732]: I0402 14:15:29.466525 4732 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9e89ed59-ef4b-44a7-b6de-d98b2319ee10-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Apr 02 14:15:29 crc kubenswrapper[4732]: I0402 14:15:29.466602 4732 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9e89ed59-ef4b-44a7-b6de-d98b2319ee10-inventory\") on node \"crc\" DevicePath \"\""
Apr 02 14:15:29 crc kubenswrapper[4732]: I0402 14:15:29.466691 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7b78\" (UniqueName: \"kubernetes.io/projected/9e89ed59-ef4b-44a7-b6de-d98b2319ee10-kube-api-access-g7b78\") on node \"crc\" DevicePath \"\""
Apr 02 14:15:29 crc kubenswrapper[4732]: I0402 14:15:29.466757 4732 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e89ed59-ef4b-44a7-b6de-d98b2319ee10-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Apr 02 14:15:29 crc kubenswrapper[4732]: I0402 14:15:29.907229 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-sjrwt" event={"ID":"9e89ed59-ef4b-44a7-b6de-d98b2319ee10","Type":"ContainerDied","Data":"14396479b937569d57e99124bf52469046a95216f457c01e3eacdcde4612fecb"}
Apr 02 14:15:29 crc kubenswrapper[4732]: I0402 14:15:29.907294 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14396479b937569d57e99124bf52469046a95216f457c01e3eacdcde4612fecb"
Apr 02 14:15:29 crc kubenswrapper[4732]: I0402 14:15:29.907599 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-sjrwt"
Apr 02 14:15:30 crc kubenswrapper[4732]: I0402 14:15:30.035661 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-t9h4h"]
Apr 02 14:15:30 crc kubenswrapper[4732]: E0402 14:15:30.037342 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="028fb44c-e0de-4f8f-b0f6-2896a0e110e7" containerName="extract-content"
Apr 02 14:15:30 crc kubenswrapper[4732]: I0402 14:15:30.037374 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="028fb44c-e0de-4f8f-b0f6-2896a0e110e7" containerName="extract-content"
Apr 02 14:15:30 crc kubenswrapper[4732]: E0402 14:15:30.037385 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="028fb44c-e0de-4f8f-b0f6-2896a0e110e7" containerName="registry-server"
Apr 02 14:15:30 crc kubenswrapper[4732]: I0402 14:15:30.037393 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="028fb44c-e0de-4f8f-b0f6-2896a0e110e7" containerName="registry-server"
Apr 02 14:15:30 crc kubenswrapper[4732]: E0402 14:15:30.037410 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e89ed59-ef4b-44a7-b6de-d98b2319ee10" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Apr 02 14:15:30 crc kubenswrapper[4732]: I0402 14:15:30.037419 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e89ed59-ef4b-44a7-b6de-d98b2319ee10" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Apr 02 14:15:30 crc kubenswrapper[4732]: E0402 14:15:30.037465 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb678697-dd92-42e8-836f-6f4510b21522" containerName="collect-profiles"
Apr 02 14:15:30 crc kubenswrapper[4732]: I0402 14:15:30.037474 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb678697-dd92-42e8-836f-6f4510b21522" containerName="collect-profiles"
Apr 02 14:15:30 crc kubenswrapper[4732]: E0402 14:15:30.037494 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="028fb44c-e0de-4f8f-b0f6-2896a0e110e7" containerName="extract-utilities"
Apr 02 14:15:30 crc kubenswrapper[4732]: I0402 14:15:30.037502 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="028fb44c-e0de-4f8f-b0f6-2896a0e110e7" containerName="extract-utilities"
Apr 02 14:15:30 crc kubenswrapper[4732]: I0402 14:15:30.037766 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb678697-dd92-42e8-836f-6f4510b21522" containerName="collect-profiles"
Apr 02 14:15:30 crc kubenswrapper[4732]: I0402 14:15:30.037805 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="028fb44c-e0de-4f8f-b0f6-2896a0e110e7" containerName="registry-server"
Apr 02 14:15:30 crc kubenswrapper[4732]: I0402 14:15:30.037823 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e89ed59-ef4b-44a7-b6de-d98b2319ee10" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Apr 02 14:15:30 crc kubenswrapper[4732]: I0402 14:15:30.038718 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-t9h4h"
Apr 02 14:15:30 crc kubenswrapper[4732]: I0402 14:15:30.040347 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret"
Apr 02 14:15:30 crc kubenswrapper[4732]: I0402 14:15:30.044452 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Apr 02 14:15:30 crc kubenswrapper[4732]: I0402 14:15:30.044475 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Apr 02 14:15:30 crc kubenswrapper[4732]: I0402 14:15:30.044477 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wdhd4"
Apr 02 14:15:30 crc kubenswrapper[4732]: I0402 14:15:30.046293 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-t9h4h"]
Apr 02 14:15:30 crc kubenswrapper[4732]: I0402 14:15:30.046715 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Apr 02 14:15:30 crc kubenswrapper[4732]: I0402 14:15:30.179472 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfjcq\" (UniqueName: \"kubernetes.io/projected/6d71fa88-324b-440b-aefd-492ac7ff7cd5-kube-api-access-cfjcq\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-t9h4h\" (UID: \"6d71fa88-324b-440b-aefd-492ac7ff7cd5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-t9h4h"
Apr 02 14:15:30 crc kubenswrapper[4732]: I0402 14:15:30.179530 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d71fa88-324b-440b-aefd-492ac7ff7cd5-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-t9h4h\" (UID: \"6d71fa88-324b-440b-aefd-492ac7ff7cd5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-t9h4h"
Apr 02 14:15:30 crc kubenswrapper[4732]: I0402 14:15:30.179555 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6d71fa88-324b-440b-aefd-492ac7ff7cd5-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-t9h4h\" (UID: \"6d71fa88-324b-440b-aefd-492ac7ff7cd5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-t9h4h"
Apr 02 14:15:30 crc kubenswrapper[4732]: I0402 14:15:30.179710 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/6d71fa88-324b-440b-aefd-492ac7ff7cd5-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-t9h4h\" (UID: \"6d71fa88-324b-440b-aefd-492ac7ff7cd5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-t9h4h"
Apr 02 14:15:30 crc kubenswrapper[4732]: I0402 14:15:30.179745 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d71fa88-324b-440b-aefd-492ac7ff7cd5-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-t9h4h\" (UID: \"6d71fa88-324b-440b-aefd-492ac7ff7cd5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-t9h4h"
Apr 02 14:15:30 crc kubenswrapper[4732]: I0402 14:15:30.281485 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfjcq\" (UniqueName: \"kubernetes.io/projected/6d71fa88-324b-440b-aefd-492ac7ff7cd5-kube-api-access-cfjcq\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-t9h4h\" (UID: \"6d71fa88-324b-440b-aefd-492ac7ff7cd5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-t9h4h"
Apr 02 14:15:30 crc kubenswrapper[4732]: I0402 14:15:30.281969 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d71fa88-324b-440b-aefd-492ac7ff7cd5-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-t9h4h\" (UID: \"6d71fa88-324b-440b-aefd-492ac7ff7cd5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-t9h4h"
Apr 02 14:15:30 crc kubenswrapper[4732]: I0402 14:15:30.282029 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6d71fa88-324b-440b-aefd-492ac7ff7cd5-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-t9h4h\" (UID: \"6d71fa88-324b-440b-aefd-492ac7ff7cd5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-t9h4h"
Apr 02 14:15:30 crc kubenswrapper[4732]: I0402 14:15:30.282400 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/6d71fa88-324b-440b-aefd-492ac7ff7cd5-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-t9h4h\" (UID: \"6d71fa88-324b-440b-aefd-492ac7ff7cd5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-t9h4h"
Apr 02 14:15:30 crc kubenswrapper[4732]: I0402 14:15:30.282530 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d71fa88-324b-440b-aefd-492ac7ff7cd5-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-t9h4h\" (UID: \"6d71fa88-324b-440b-aefd-492ac7ff7cd5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-t9h4h"
Apr 02 14:15:30 crc kubenswrapper[4732]: I0402 14:15:30.286293 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6d71fa88-324b-440b-aefd-492ac7ff7cd5-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-t9h4h\" (UID: \"6d71fa88-324b-440b-aefd-492ac7ff7cd5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-t9h4h"
Apr 02 14:15:30 crc kubenswrapper[4732]: I0402 14:15:30.286844 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d71fa88-324b-440b-aefd-492ac7ff7cd5-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-t9h4h\" (UID: \"6d71fa88-324b-440b-aefd-492ac7ff7cd5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-t9h4h"
Apr 02 14:15:30 crc kubenswrapper[4732]: I0402 14:15:30.286933 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/6d71fa88-324b-440b-aefd-492ac7ff7cd5-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-t9h4h\" (UID: \"6d71fa88-324b-440b-aefd-492ac7ff7cd5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-t9h4h"
Apr 02 14:15:30 crc kubenswrapper[4732]: I0402 14:15:30.287225 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d71fa88-324b-440b-aefd-492ac7ff7cd5-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-t9h4h\" (UID: \"6d71fa88-324b-440b-aefd-492ac7ff7cd5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-t9h4h"
Apr 02 14:15:30 crc kubenswrapper[4732]: I0402 14:15:30.300687 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfjcq\" (UniqueName: \"kubernetes.io/projected/6d71fa88-324b-440b-aefd-492ac7ff7cd5-kube-api-access-cfjcq\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-t9h4h\" (UID: \"6d71fa88-324b-440b-aefd-492ac7ff7cd5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-t9h4h"
Apr 02 14:15:30 crc kubenswrapper[4732]: I0402 14:15:30.354837 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-t9h4h"
Apr 02 14:15:30 crc kubenswrapper[4732]: I0402 14:15:30.950994 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-t9h4h"]
Apr 02 14:15:31 crc kubenswrapper[4732]: I0402 14:15:31.923708 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-t9h4h" event={"ID":"6d71fa88-324b-440b-aefd-492ac7ff7cd5","Type":"ContainerStarted","Data":"ad70dba06aded04ca498f6ccf972285ba61e7ebc5e26d36ad0f46d65969ddeb9"}
Apr 02 14:15:31 crc kubenswrapper[4732]: I0402 14:15:31.924102 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-t9h4h" event={"ID":"6d71fa88-324b-440b-aefd-492ac7ff7cd5","Type":"ContainerStarted","Data":"54f7f7fa1fc6deb2860c2d0cb5d7b2faaea53adf52c63888a82b7b6b69718c0a"}
Apr 02 14:15:31 crc kubenswrapper[4732]: I0402 14:15:31.923895 4732 patch_prober.go:28] interesting pod/machine-config-daemon-6vtmw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Apr 02 14:15:31 crc kubenswrapper[4732]: I0402 14:15:31.924150 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Apr 02 14:15:31 crc kubenswrapper[4732]: I0402 14:15:31.950108 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-t9h4h" podStartSLOduration=1.549726997 podStartE2EDuration="1.950091261s" podCreationTimestamp="2026-04-02 14:15:30 +0000 UTC" firstStartedPulling="2026-04-02 14:15:30.965266206 +0000 UTC m=+2287.869673759" lastFinishedPulling="2026-04-02 14:15:31.36563047 +0000 UTC m=+2288.270038023" observedRunningTime="2026-04-02 14:15:31.944797986 +0000 UTC m=+2288.849205539" watchObservedRunningTime="2026-04-02 14:15:31.950091261 +0000 UTC m=+2288.854498814"
Apr 02 14:15:51 crc kubenswrapper[4732]: I0402 14:15:51.362873 4732 scope.go:117] "RemoveContainer" containerID="93bddc4cc6346d65d466d01bb6e7bdfa76ee89770919cb5fa9a55eb9760f977f"
Apr 02 14:16:00 crc kubenswrapper[4732]: I0402 14:16:00.138569 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29585656-cn5h7"]
Apr 02 14:16:00 crc kubenswrapper[4732]: I0402 14:16:00.141182 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585656-cn5h7"
Apr 02 14:16:00 crc kubenswrapper[4732]: I0402 14:16:00.143439 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Apr 02 14:16:00 crc kubenswrapper[4732]: I0402 14:16:00.143481 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-69g42"
Apr 02 14:16:00 crc kubenswrapper[4732]: I0402 14:16:00.143875 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Apr 02 14:16:00 crc kubenswrapper[4732]: I0402 14:16:00.149676 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585656-cn5h7"]
Apr 02 14:16:00 crc kubenswrapper[4732]: I0402 14:16:00.245745 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rb8vt\" (UniqueName: \"kubernetes.io/projected/ce03f501-71c1-4a1b-b9d9-4f72058b93a1-kube-api-access-rb8vt\") pod \"auto-csr-approver-29585656-cn5h7\" (UID: \"ce03f501-71c1-4a1b-b9d9-4f72058b93a1\") " pod="openshift-infra/auto-csr-approver-29585656-cn5h7"
Apr 02 14:16:00 crc kubenswrapper[4732]: I0402 14:16:00.347267 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rb8vt\" (UniqueName: \"kubernetes.io/projected/ce03f501-71c1-4a1b-b9d9-4f72058b93a1-kube-api-access-rb8vt\") pod \"auto-csr-approver-29585656-cn5h7\" (UID: \"ce03f501-71c1-4a1b-b9d9-4f72058b93a1\") " pod="openshift-infra/auto-csr-approver-29585656-cn5h7"
Apr 02 14:16:00 crc kubenswrapper[4732]: I0402 14:16:00.365650 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rb8vt\" (UniqueName: \"kubernetes.io/projected/ce03f501-71c1-4a1b-b9d9-4f72058b93a1-kube-api-access-rb8vt\") pod \"auto-csr-approver-29585656-cn5h7\" (UID: \"ce03f501-71c1-4a1b-b9d9-4f72058b93a1\") " pod="openshift-infra/auto-csr-approver-29585656-cn5h7"
Apr 02 14:16:00 crc kubenswrapper[4732]: I0402 14:16:00.462434 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585656-cn5h7"
Apr 02 14:16:00 crc kubenswrapper[4732]: I0402 14:16:00.750731 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585656-cn5h7"]
Apr 02 14:16:01 crc kubenswrapper[4732]: I0402 14:16:01.162876 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585656-cn5h7" event={"ID":"ce03f501-71c1-4a1b-b9d9-4f72058b93a1","Type":"ContainerStarted","Data":"46fa0483af5728467d3a406e464c6fb939079c3de8f9762ea06a2925ca185abb"}
Apr 02 14:16:01 crc kubenswrapper[4732]: I0402 14:16:01.924453 4732 patch_prober.go:28] interesting pod/machine-config-daemon-6vtmw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Apr 02 14:16:01 crc kubenswrapper[4732]: I0402 14:16:01.925047 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Apr 02 14:16:01 crc kubenswrapper[4732]: I0402 14:16:01.925148 4732 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw"
Apr 02 14:16:01 crc kubenswrapper[4732]: I0402 14:16:01.926501 4732 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"63733271f299db2daa2f2ab3f85fd3de726a361d8aba6db8431f92a5c4cd2580"} pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Apr 02 14:16:01 crc kubenswrapper[4732]: I0402 14:16:01.926572 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" containerName="machine-config-daemon" containerID="cri-o://63733271f299db2daa2f2ab3f85fd3de726a361d8aba6db8431f92a5c4cd2580" gracePeriod=600
Apr 02 14:16:02 crc kubenswrapper[4732]: I0402 14:16:02.172679 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585656-cn5h7" event={"ID":"ce03f501-71c1-4a1b-b9d9-4f72058b93a1","Type":"ContainerStarted","Data":"8653ec8ca33f402b0f1d5ce232f46f3cab4d081240b9967e103318cb733a78f9"}
Apr 02 14:16:02 crc kubenswrapper[4732]: I0402 14:16:02.175289 4732 generic.go:334] "Generic (PLEG): container finished" podID="38409e5e-4545-49da-8f6c-4bfb30582878" containerID="63733271f299db2daa2f2ab3f85fd3de726a361d8aba6db8431f92a5c4cd2580" exitCode=0
Apr 02 14:16:02 crc kubenswrapper[4732]: I0402 14:16:02.175321 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" event={"ID":"38409e5e-4545-49da-8f6c-4bfb30582878","Type":"ContainerDied","Data":"63733271f299db2daa2f2ab3f85fd3de726a361d8aba6db8431f92a5c4cd2580"}
Apr 02 14:16:02 crc kubenswrapper[4732]: I0402 14:16:02.175343 4732 scope.go:117] "RemoveContainer" containerID="0f81e9ff2a9337c334b84f07419534908b39d0251d1366b4d7838bb8bbcae383"
Apr 02 14:16:02 crc kubenswrapper[4732]: I0402 14:16:02.195753 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29585656-cn5h7" podStartSLOduration=1.233854222 podStartE2EDuration="2.19573482s" podCreationTimestamp="2026-04-02 14:16:00 +0000 UTC" firstStartedPulling="2026-04-02 14:16:00.744994147 +0000 UTC m=+2317.649401700" lastFinishedPulling="2026-04-02 14:16:01.706874745 +0000 UTC m=+2318.611282298" observedRunningTime="2026-04-02 14:16:02.186820787 +0000 UTC m=+2319.091228350" watchObservedRunningTime="2026-04-02 14:16:02.19573482 +0000 UTC m=+2319.100142373"
Apr 02 14:16:03 crc kubenswrapper[4732]: I0402 14:16:03.187972 4732 generic.go:334] "Generic (PLEG): container finished" podID="ce03f501-71c1-4a1b-b9d9-4f72058b93a1" containerID="8653ec8ca33f402b0f1d5ce232f46f3cab4d081240b9967e103318cb733a78f9" exitCode=0
Apr 02 14:16:03 crc kubenswrapper[4732]: I0402 14:16:03.188049 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585656-cn5h7" event={"ID":"ce03f501-71c1-4a1b-b9d9-4f72058b93a1","Type":"ContainerDied","Data":"8653ec8ca33f402b0f1d5ce232f46f3cab4d081240b9967e103318cb733a78f9"}
Apr 02 14:16:03 crc kubenswrapper[4732]: I0402 14:16:03.191376 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" event={"ID":"38409e5e-4545-49da-8f6c-4bfb30582878","Type":"ContainerStarted","Data":"e3dc55b503b8e33f24aedd03972c83f64d717fd55da944dc7e898dbc4c268904"}
Apr 02 14:16:04 crc kubenswrapper[4732]: I0402 14:16:04.616834 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585656-cn5h7"
Apr 02 14:16:04 crc kubenswrapper[4732]: I0402 14:16:04.758911 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rb8vt\" (UniqueName: \"kubernetes.io/projected/ce03f501-71c1-4a1b-b9d9-4f72058b93a1-kube-api-access-rb8vt\") pod \"ce03f501-71c1-4a1b-b9d9-4f72058b93a1\" (UID: \"ce03f501-71c1-4a1b-b9d9-4f72058b93a1\") "
Apr 02 14:16:04 crc kubenswrapper[4732]: I0402 14:16:04.765394 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce03f501-71c1-4a1b-b9d9-4f72058b93a1-kube-api-access-rb8vt" (OuterVolumeSpecName: "kube-api-access-rb8vt") pod "ce03f501-71c1-4a1b-b9d9-4f72058b93a1" (UID: "ce03f501-71c1-4a1b-b9d9-4f72058b93a1"). InnerVolumeSpecName "kube-api-access-rb8vt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 14:16:04 crc kubenswrapper[4732]: I0402 14:16:04.861732 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rb8vt\" (UniqueName: \"kubernetes.io/projected/ce03f501-71c1-4a1b-b9d9-4f72058b93a1-kube-api-access-rb8vt\") on node \"crc\" DevicePath \"\""
Apr 02 14:16:05 crc kubenswrapper[4732]: I0402 14:16:05.210353 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585656-cn5h7" event={"ID":"ce03f501-71c1-4a1b-b9d9-4f72058b93a1","Type":"ContainerDied","Data":"46fa0483af5728467d3a406e464c6fb939079c3de8f9762ea06a2925ca185abb"}
Apr 02 14:16:05 crc kubenswrapper[4732]: I0402 14:16:05.210412 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46fa0483af5728467d3a406e464c6fb939079c3de8f9762ea06a2925ca185abb"
Apr 02 14:16:05 crc kubenswrapper[4732]: I0402 14:16:05.210462 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585656-cn5h7"
Apr 02 14:16:05 crc kubenswrapper[4732]: I0402 14:16:05.270653 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29585650-2gx64"]
Apr 02 14:16:05 crc kubenswrapper[4732]: I0402 14:16:05.287659 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29585650-2gx64"]
Apr 02 14:16:06 crc kubenswrapper[4732]: I0402 14:16:06.692145 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37663d23-9f83-4211-80df-a8167d95f79e" path="/var/lib/kubelet/pods/37663d23-9f83-4211-80df-a8167d95f79e/volumes"
Apr 02 14:16:51 crc kubenswrapper[4732]: I0402 14:16:51.451961 4732 scope.go:117] "RemoveContainer" containerID="069d712375aa62b6ebf7a181cff281a8b2210a138171da75928ff6006c3aa34c"
Apr 02 14:17:29 crc kubenswrapper[4732]: I0402 14:17:29.737299 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7qm4h"]
Apr 02 14:17:29 crc kubenswrapper[4732]: E0402 14:17:29.738266 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce03f501-71c1-4a1b-b9d9-4f72058b93a1" containerName="oc"
Apr 02 14:17:29 crc kubenswrapper[4732]: I0402 14:17:29.738280 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce03f501-71c1-4a1b-b9d9-4f72058b93a1" containerName="oc"
Apr 02 14:17:29 crc kubenswrapper[4732]: I0402 14:17:29.741358 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce03f501-71c1-4a1b-b9d9-4f72058b93a1" containerName="oc"
Apr 02 14:17:29 crc kubenswrapper[4732]: I0402 14:17:29.748460 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7qm4h"
Apr 02 14:17:29 crc kubenswrapper[4732]: I0402 14:17:29.762910 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7qm4h"]
Apr 02 14:17:29 crc kubenswrapper[4732]: I0402 14:17:29.879745 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad35517b-f125-441f-a982-1facae60fd2d-utilities\") pod \"redhat-marketplace-7qm4h\" (UID: \"ad35517b-f125-441f-a982-1facae60fd2d\") " pod="openshift-marketplace/redhat-marketplace-7qm4h"
Apr 02 14:17:29 crc kubenswrapper[4732]: I0402 14:17:29.879988 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad35517b-f125-441f-a982-1facae60fd2d-catalog-content\") pod \"redhat-marketplace-7qm4h\" (UID: \"ad35517b-f125-441f-a982-1facae60fd2d\") " pod="openshift-marketplace/redhat-marketplace-7qm4h"
Apr 02 14:17:29 crc kubenswrapper[4732]: I0402 14:17:29.880085 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xf6h\" (UniqueName: \"kubernetes.io/projected/ad35517b-f125-441f-a982-1facae60fd2d-kube-api-access-7xf6h\") pod \"redhat-marketplace-7qm4h\" (UID: \"ad35517b-f125-441f-a982-1facae60fd2d\") " pod="openshift-marketplace/redhat-marketplace-7qm4h"
Apr 02 14:17:29 crc kubenswrapper[4732]: I0402 14:17:29.981944 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad35517b-f125-441f-a982-1facae60fd2d-utilities\") pod \"redhat-marketplace-7qm4h\" (UID: \"ad35517b-f125-441f-a982-1facae60fd2d\") " pod="openshift-marketplace/redhat-marketplace-7qm4h"
Apr 02 14:17:29 crc kubenswrapper[4732]: I0402 14:17:29.982024 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad35517b-f125-441f-a982-1facae60fd2d-catalog-content\") pod \"redhat-marketplace-7qm4h\" (UID: \"ad35517b-f125-441f-a982-1facae60fd2d\") " pod="openshift-marketplace/redhat-marketplace-7qm4h"
Apr 02 14:17:29 crc kubenswrapper[4732]: I0402 14:17:29.982057 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xf6h\" (UniqueName: \"kubernetes.io/projected/ad35517b-f125-441f-a982-1facae60fd2d-kube-api-access-7xf6h\") pod \"redhat-marketplace-7qm4h\" (UID: \"ad35517b-f125-441f-a982-1facae60fd2d\") " pod="openshift-marketplace/redhat-marketplace-7qm4h"
Apr 02 14:17:29 crc kubenswrapper[4732]: I0402 14:17:29.982601 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad35517b-f125-441f-a982-1facae60fd2d-utilities\") pod \"redhat-marketplace-7qm4h\" (UID: \"ad35517b-f125-441f-a982-1facae60fd2d\") " pod="openshift-marketplace/redhat-marketplace-7qm4h"
Apr 02 14:17:29 crc kubenswrapper[4732]: I0402 14:17:29.982706 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad35517b-f125-441f-a982-1facae60fd2d-catalog-content\") pod \"redhat-marketplace-7qm4h\" (UID: \"ad35517b-f125-441f-a982-1facae60fd2d\") " pod="openshift-marketplace/redhat-marketplace-7qm4h"
Apr 02 14:17:30 crc kubenswrapper[4732]: I0402 14:17:30.006414 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xf6h\" (UniqueName: \"kubernetes.io/projected/ad35517b-f125-441f-a982-1facae60fd2d-kube-api-access-7xf6h\") pod \"redhat-marketplace-7qm4h\" (UID: \"ad35517b-f125-441f-a982-1facae60fd2d\") " pod="openshift-marketplace/redhat-marketplace-7qm4h"
Apr 02 14:17:30 crc kubenswrapper[4732]: I0402 14:17:30.106001 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7qm4h"
Apr 02 14:17:30 crc kubenswrapper[4732]: I0402 14:17:30.569053 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7qm4h"]
Apr 02 14:17:30 crc kubenswrapper[4732]: I0402 14:17:30.677012 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7qm4h" event={"ID":"ad35517b-f125-441f-a982-1facae60fd2d","Type":"ContainerStarted","Data":"08e5d5caa5cb4530710a83a27627c9c4a02157961f0103b6abc5a2b846c0ed01"}
Apr 02 14:17:31 crc kubenswrapper[4732]: I0402 14:17:31.686171 4732 generic.go:334] "Generic (PLEG): container finished" podID="ad35517b-f125-441f-a982-1facae60fd2d" containerID="dd8ea0acb52479554a6f2e6f4a91ff2adaf548b5e7c33d08b31ce56cee090213" exitCode=0
Apr 02 14:17:31 crc kubenswrapper[4732]: I0402 14:17:31.686222 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7qm4h" event={"ID":"ad35517b-f125-441f-a982-1facae60fd2d","Type":"ContainerDied","Data":"dd8ea0acb52479554a6f2e6f4a91ff2adaf548b5e7c33d08b31ce56cee090213"}
Apr 02 14:17:32 crc kubenswrapper[4732]: I0402 14:17:32.699549 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-7qm4h" event={"ID":"ad35517b-f125-441f-a982-1facae60fd2d","Type":"ContainerStarted","Data":"a1ed26b7640ed8c7e48080bfbe96901b15c6fb01c1e4f4aa7adbea8f35c776d6"} Apr 02 14:17:33 crc kubenswrapper[4732]: I0402 14:17:33.123498 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-488xf"] Apr 02 14:17:33 crc kubenswrapper[4732]: I0402 14:17:33.126092 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-488xf" Apr 02 14:17:33 crc kubenswrapper[4732]: I0402 14:17:33.157058 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-488xf"] Apr 02 14:17:33 crc kubenswrapper[4732]: I0402 14:17:33.247824 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlsw7\" (UniqueName: \"kubernetes.io/projected/92e4194e-e4a0-4159-a798-cc781cd91d7c-kube-api-access-hlsw7\") pod \"community-operators-488xf\" (UID: \"92e4194e-e4a0-4159-a798-cc781cd91d7c\") " pod="openshift-marketplace/community-operators-488xf" Apr 02 14:17:33 crc kubenswrapper[4732]: I0402 14:17:33.248175 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92e4194e-e4a0-4159-a798-cc781cd91d7c-catalog-content\") pod \"community-operators-488xf\" (UID: \"92e4194e-e4a0-4159-a798-cc781cd91d7c\") " pod="openshift-marketplace/community-operators-488xf" Apr 02 14:17:33 crc kubenswrapper[4732]: I0402 14:17:33.248254 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92e4194e-e4a0-4159-a798-cc781cd91d7c-utilities\") pod \"community-operators-488xf\" (UID: \"92e4194e-e4a0-4159-a798-cc781cd91d7c\") " pod="openshift-marketplace/community-operators-488xf" Apr 02 
14:17:33 crc kubenswrapper[4732]: I0402 14:17:33.350868 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92e4194e-e4a0-4159-a798-cc781cd91d7c-utilities\") pod \"community-operators-488xf\" (UID: \"92e4194e-e4a0-4159-a798-cc781cd91d7c\") " pod="openshift-marketplace/community-operators-488xf" Apr 02 14:17:33 crc kubenswrapper[4732]: I0402 14:17:33.351012 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlsw7\" (UniqueName: \"kubernetes.io/projected/92e4194e-e4a0-4159-a798-cc781cd91d7c-kube-api-access-hlsw7\") pod \"community-operators-488xf\" (UID: \"92e4194e-e4a0-4159-a798-cc781cd91d7c\") " pod="openshift-marketplace/community-operators-488xf" Apr 02 14:17:33 crc kubenswrapper[4732]: I0402 14:17:33.351115 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92e4194e-e4a0-4159-a798-cc781cd91d7c-catalog-content\") pod \"community-operators-488xf\" (UID: \"92e4194e-e4a0-4159-a798-cc781cd91d7c\") " pod="openshift-marketplace/community-operators-488xf" Apr 02 14:17:33 crc kubenswrapper[4732]: I0402 14:17:33.351419 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92e4194e-e4a0-4159-a798-cc781cd91d7c-utilities\") pod \"community-operators-488xf\" (UID: \"92e4194e-e4a0-4159-a798-cc781cd91d7c\") " pod="openshift-marketplace/community-operators-488xf" Apr 02 14:17:33 crc kubenswrapper[4732]: I0402 14:17:33.351726 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92e4194e-e4a0-4159-a798-cc781cd91d7c-catalog-content\") pod \"community-operators-488xf\" (UID: \"92e4194e-e4a0-4159-a798-cc781cd91d7c\") " pod="openshift-marketplace/community-operators-488xf" Apr 02 14:17:33 crc kubenswrapper[4732]: I0402 
14:17:33.385033 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlsw7\" (UniqueName: \"kubernetes.io/projected/92e4194e-e4a0-4159-a798-cc781cd91d7c-kube-api-access-hlsw7\") pod \"community-operators-488xf\" (UID: \"92e4194e-e4a0-4159-a798-cc781cd91d7c\") " pod="openshift-marketplace/community-operators-488xf" Apr 02 14:17:33 crc kubenswrapper[4732]: I0402 14:17:33.444720 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-488xf" Apr 02 14:17:33 crc kubenswrapper[4732]: I0402 14:17:33.717339 4732 generic.go:334] "Generic (PLEG): container finished" podID="ad35517b-f125-441f-a982-1facae60fd2d" containerID="a1ed26b7640ed8c7e48080bfbe96901b15c6fb01c1e4f4aa7adbea8f35c776d6" exitCode=0 Apr 02 14:17:33 crc kubenswrapper[4732]: I0402 14:17:33.717477 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7qm4h" event={"ID":"ad35517b-f125-441f-a982-1facae60fd2d","Type":"ContainerDied","Data":"a1ed26b7640ed8c7e48080bfbe96901b15c6fb01c1e4f4aa7adbea8f35c776d6"} Apr 02 14:17:33 crc kubenswrapper[4732]: I0402 14:17:33.987852 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-488xf"] Apr 02 14:17:34 crc kubenswrapper[4732]: I0402 14:17:34.728590 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7qm4h" event={"ID":"ad35517b-f125-441f-a982-1facae60fd2d","Type":"ContainerStarted","Data":"0046b1526cc6c7e9ef45982ac642a364a831bc76989e680becc9d5f8ef46bfea"} Apr 02 14:17:34 crc kubenswrapper[4732]: I0402 14:17:34.733328 4732 generic.go:334] "Generic (PLEG): container finished" podID="92e4194e-e4a0-4159-a798-cc781cd91d7c" containerID="67ee0a0364d5f47136bdb8d979acc358018ddd7db15f74f0c5a53fd6f5125962" exitCode=0 Apr 02 14:17:34 crc kubenswrapper[4732]: I0402 14:17:34.733484 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-488xf" event={"ID":"92e4194e-e4a0-4159-a798-cc781cd91d7c","Type":"ContainerDied","Data":"67ee0a0364d5f47136bdb8d979acc358018ddd7db15f74f0c5a53fd6f5125962"} Apr 02 14:17:34 crc kubenswrapper[4732]: I0402 14:17:34.733563 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-488xf" event={"ID":"92e4194e-e4a0-4159-a798-cc781cd91d7c","Type":"ContainerStarted","Data":"0bcb61fcefdaa92852f1dee68cb3317769111d504188bdd621c854e197c6d8b7"} Apr 02 14:17:34 crc kubenswrapper[4732]: I0402 14:17:34.762161 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7qm4h" podStartSLOduration=3.275167551 podStartE2EDuration="5.762138964s" podCreationTimestamp="2026-04-02 14:17:29 +0000 UTC" firstStartedPulling="2026-04-02 14:17:31.688568551 +0000 UTC m=+2408.592976104" lastFinishedPulling="2026-04-02 14:17:34.175539964 +0000 UTC m=+2411.079947517" observedRunningTime="2026-04-02 14:17:34.760158831 +0000 UTC m=+2411.664566414" watchObservedRunningTime="2026-04-02 14:17:34.762138964 +0000 UTC m=+2411.666546517" Apr 02 14:17:36 crc kubenswrapper[4732]: I0402 14:17:36.752500 4732 generic.go:334] "Generic (PLEG): container finished" podID="92e4194e-e4a0-4159-a798-cc781cd91d7c" containerID="6a50ccd466428813877bf44f9f4824e7cc8e713a58d0a7631c65d374c92e4c34" exitCode=0 Apr 02 14:17:36 crc kubenswrapper[4732]: I0402 14:17:36.752581 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-488xf" event={"ID":"92e4194e-e4a0-4159-a798-cc781cd91d7c","Type":"ContainerDied","Data":"6a50ccd466428813877bf44f9f4824e7cc8e713a58d0a7631c65d374c92e4c34"} Apr 02 14:17:37 crc kubenswrapper[4732]: I0402 14:17:37.769302 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-488xf" 
event={"ID":"92e4194e-e4a0-4159-a798-cc781cd91d7c","Type":"ContainerStarted","Data":"0828c4efbe06a9205b16628544ec5c2492ae1acdea6dc394e0317659e8952463"} Apr 02 14:17:37 crc kubenswrapper[4732]: I0402 14:17:37.804200 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-488xf" podStartSLOduration=2.446009622 podStartE2EDuration="4.804136331s" podCreationTimestamp="2026-04-02 14:17:33 +0000 UTC" firstStartedPulling="2026-04-02 14:17:34.735858862 +0000 UTC m=+2411.640266415" lastFinishedPulling="2026-04-02 14:17:37.093985561 +0000 UTC m=+2413.998393124" observedRunningTime="2026-04-02 14:17:37.796268907 +0000 UTC m=+2414.700676470" watchObservedRunningTime="2026-04-02 14:17:37.804136331 +0000 UTC m=+2414.708543874" Apr 02 14:17:40 crc kubenswrapper[4732]: I0402 14:17:40.106830 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7qm4h" Apr 02 14:17:40 crc kubenswrapper[4732]: I0402 14:17:40.107246 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7qm4h" Apr 02 14:17:40 crc kubenswrapper[4732]: I0402 14:17:40.178188 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7qm4h" Apr 02 14:17:40 crc kubenswrapper[4732]: I0402 14:17:40.860579 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7qm4h" Apr 02 14:17:41 crc kubenswrapper[4732]: I0402 14:17:41.297024 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7qm4h"] Apr 02 14:17:42 crc kubenswrapper[4732]: I0402 14:17:42.831338 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7qm4h" podUID="ad35517b-f125-441f-a982-1facae60fd2d" containerName="registry-server" 
containerID="cri-o://0046b1526cc6c7e9ef45982ac642a364a831bc76989e680becc9d5f8ef46bfea" gracePeriod=2 Apr 02 14:17:43 crc kubenswrapper[4732]: I0402 14:17:43.299384 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7qm4h" Apr 02 14:17:43 crc kubenswrapper[4732]: I0402 14:17:43.444877 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-488xf" Apr 02 14:17:43 crc kubenswrapper[4732]: I0402 14:17:43.445163 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-488xf" Apr 02 14:17:43 crc kubenswrapper[4732]: I0402 14:17:43.472199 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad35517b-f125-441f-a982-1facae60fd2d-catalog-content\") pod \"ad35517b-f125-441f-a982-1facae60fd2d\" (UID: \"ad35517b-f125-441f-a982-1facae60fd2d\") " Apr 02 14:17:43 crc kubenswrapper[4732]: I0402 14:17:43.472249 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xf6h\" (UniqueName: \"kubernetes.io/projected/ad35517b-f125-441f-a982-1facae60fd2d-kube-api-access-7xf6h\") pod \"ad35517b-f125-441f-a982-1facae60fd2d\" (UID: \"ad35517b-f125-441f-a982-1facae60fd2d\") " Apr 02 14:17:43 crc kubenswrapper[4732]: I0402 14:17:43.472364 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad35517b-f125-441f-a982-1facae60fd2d-utilities\") pod \"ad35517b-f125-441f-a982-1facae60fd2d\" (UID: \"ad35517b-f125-441f-a982-1facae60fd2d\") " Apr 02 14:17:43 crc kubenswrapper[4732]: I0402 14:17:43.473452 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad35517b-f125-441f-a982-1facae60fd2d-utilities" (OuterVolumeSpecName: "utilities") pod 
"ad35517b-f125-441f-a982-1facae60fd2d" (UID: "ad35517b-f125-441f-a982-1facae60fd2d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 14:17:43 crc kubenswrapper[4732]: I0402 14:17:43.478917 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad35517b-f125-441f-a982-1facae60fd2d-kube-api-access-7xf6h" (OuterVolumeSpecName: "kube-api-access-7xf6h") pod "ad35517b-f125-441f-a982-1facae60fd2d" (UID: "ad35517b-f125-441f-a982-1facae60fd2d"). InnerVolumeSpecName "kube-api-access-7xf6h". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:17:43 crc kubenswrapper[4732]: I0402 14:17:43.495642 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-488xf" Apr 02 14:17:43 crc kubenswrapper[4732]: I0402 14:17:43.509197 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad35517b-f125-441f-a982-1facae60fd2d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ad35517b-f125-441f-a982-1facae60fd2d" (UID: "ad35517b-f125-441f-a982-1facae60fd2d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 14:17:43 crc kubenswrapper[4732]: I0402 14:17:43.575215 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad35517b-f125-441f-a982-1facae60fd2d-utilities\") on node \"crc\" DevicePath \"\"" Apr 02 14:17:43 crc kubenswrapper[4732]: I0402 14:17:43.575258 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad35517b-f125-441f-a982-1facae60fd2d-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 02 14:17:43 crc kubenswrapper[4732]: I0402 14:17:43.575271 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xf6h\" (UniqueName: \"kubernetes.io/projected/ad35517b-f125-441f-a982-1facae60fd2d-kube-api-access-7xf6h\") on node \"crc\" DevicePath \"\"" Apr 02 14:17:43 crc kubenswrapper[4732]: I0402 14:17:43.845974 4732 generic.go:334] "Generic (PLEG): container finished" podID="ad35517b-f125-441f-a982-1facae60fd2d" containerID="0046b1526cc6c7e9ef45982ac642a364a831bc76989e680becc9d5f8ef46bfea" exitCode=0 Apr 02 14:17:43 crc kubenswrapper[4732]: I0402 14:17:43.846078 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7qm4h" Apr 02 14:17:43 crc kubenswrapper[4732]: I0402 14:17:43.846112 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7qm4h" event={"ID":"ad35517b-f125-441f-a982-1facae60fd2d","Type":"ContainerDied","Data":"0046b1526cc6c7e9ef45982ac642a364a831bc76989e680becc9d5f8ef46bfea"} Apr 02 14:17:43 crc kubenswrapper[4732]: I0402 14:17:43.846164 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7qm4h" event={"ID":"ad35517b-f125-441f-a982-1facae60fd2d","Type":"ContainerDied","Data":"08e5d5caa5cb4530710a83a27627c9c4a02157961f0103b6abc5a2b846c0ed01"} Apr 02 14:17:43 crc kubenswrapper[4732]: I0402 14:17:43.846187 4732 scope.go:117] "RemoveContainer" containerID="0046b1526cc6c7e9ef45982ac642a364a831bc76989e680becc9d5f8ef46bfea" Apr 02 14:17:43 crc kubenswrapper[4732]: I0402 14:17:43.890480 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7qm4h"] Apr 02 14:17:43 crc kubenswrapper[4732]: I0402 14:17:43.890871 4732 scope.go:117] "RemoveContainer" containerID="a1ed26b7640ed8c7e48080bfbe96901b15c6fb01c1e4f4aa7adbea8f35c776d6" Apr 02 14:17:43 crc kubenswrapper[4732]: I0402 14:17:43.901949 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7qm4h"] Apr 02 14:17:43 crc kubenswrapper[4732]: I0402 14:17:43.911886 4732 scope.go:117] "RemoveContainer" containerID="dd8ea0acb52479554a6f2e6f4a91ff2adaf548b5e7c33d08b31ce56cee090213" Apr 02 14:17:43 crc kubenswrapper[4732]: I0402 14:17:43.912214 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-488xf" Apr 02 14:17:43 crc kubenswrapper[4732]: I0402 14:17:43.952051 4732 scope.go:117] "RemoveContainer" containerID="0046b1526cc6c7e9ef45982ac642a364a831bc76989e680becc9d5f8ef46bfea" Apr 02 14:17:43 crc 
kubenswrapper[4732]: E0402 14:17:43.952463 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0046b1526cc6c7e9ef45982ac642a364a831bc76989e680becc9d5f8ef46bfea\": container with ID starting with 0046b1526cc6c7e9ef45982ac642a364a831bc76989e680becc9d5f8ef46bfea not found: ID does not exist" containerID="0046b1526cc6c7e9ef45982ac642a364a831bc76989e680becc9d5f8ef46bfea" Apr 02 14:17:43 crc kubenswrapper[4732]: I0402 14:17:43.952500 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0046b1526cc6c7e9ef45982ac642a364a831bc76989e680becc9d5f8ef46bfea"} err="failed to get container status \"0046b1526cc6c7e9ef45982ac642a364a831bc76989e680becc9d5f8ef46bfea\": rpc error: code = NotFound desc = could not find container \"0046b1526cc6c7e9ef45982ac642a364a831bc76989e680becc9d5f8ef46bfea\": container with ID starting with 0046b1526cc6c7e9ef45982ac642a364a831bc76989e680becc9d5f8ef46bfea not found: ID does not exist" Apr 02 14:17:43 crc kubenswrapper[4732]: I0402 14:17:43.952524 4732 scope.go:117] "RemoveContainer" containerID="a1ed26b7640ed8c7e48080bfbe96901b15c6fb01c1e4f4aa7adbea8f35c776d6" Apr 02 14:17:43 crc kubenswrapper[4732]: E0402 14:17:43.952930 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1ed26b7640ed8c7e48080bfbe96901b15c6fb01c1e4f4aa7adbea8f35c776d6\": container with ID starting with a1ed26b7640ed8c7e48080bfbe96901b15c6fb01c1e4f4aa7adbea8f35c776d6 not found: ID does not exist" containerID="a1ed26b7640ed8c7e48080bfbe96901b15c6fb01c1e4f4aa7adbea8f35c776d6" Apr 02 14:17:43 crc kubenswrapper[4732]: I0402 14:17:43.952967 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1ed26b7640ed8c7e48080bfbe96901b15c6fb01c1e4f4aa7adbea8f35c776d6"} err="failed to get container status 
\"a1ed26b7640ed8c7e48080bfbe96901b15c6fb01c1e4f4aa7adbea8f35c776d6\": rpc error: code = NotFound desc = could not find container \"a1ed26b7640ed8c7e48080bfbe96901b15c6fb01c1e4f4aa7adbea8f35c776d6\": container with ID starting with a1ed26b7640ed8c7e48080bfbe96901b15c6fb01c1e4f4aa7adbea8f35c776d6 not found: ID does not exist" Apr 02 14:17:43 crc kubenswrapper[4732]: I0402 14:17:43.952989 4732 scope.go:117] "RemoveContainer" containerID="dd8ea0acb52479554a6f2e6f4a91ff2adaf548b5e7c33d08b31ce56cee090213" Apr 02 14:17:43 crc kubenswrapper[4732]: E0402 14:17:43.953258 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd8ea0acb52479554a6f2e6f4a91ff2adaf548b5e7c33d08b31ce56cee090213\": container with ID starting with dd8ea0acb52479554a6f2e6f4a91ff2adaf548b5e7c33d08b31ce56cee090213 not found: ID does not exist" containerID="dd8ea0acb52479554a6f2e6f4a91ff2adaf548b5e7c33d08b31ce56cee090213" Apr 02 14:17:43 crc kubenswrapper[4732]: I0402 14:17:43.953284 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd8ea0acb52479554a6f2e6f4a91ff2adaf548b5e7c33d08b31ce56cee090213"} err="failed to get container status \"dd8ea0acb52479554a6f2e6f4a91ff2adaf548b5e7c33d08b31ce56cee090213\": rpc error: code = NotFound desc = could not find container \"dd8ea0acb52479554a6f2e6f4a91ff2adaf548b5e7c33d08b31ce56cee090213\": container with ID starting with dd8ea0acb52479554a6f2e6f4a91ff2adaf548b5e7c33d08b31ce56cee090213 not found: ID does not exist" Apr 02 14:17:44 crc kubenswrapper[4732]: I0402 14:17:44.696526 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad35517b-f125-441f-a982-1facae60fd2d" path="/var/lib/kubelet/pods/ad35517b-f125-441f-a982-1facae60fd2d/volumes" Apr 02 14:17:45 crc kubenswrapper[4732]: I0402 14:17:45.499935 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-488xf"] Apr 02 
14:17:45 crc kubenswrapper[4732]: I0402 14:17:45.873277 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-488xf" podUID="92e4194e-e4a0-4159-a798-cc781cd91d7c" containerName="registry-server" containerID="cri-o://0828c4efbe06a9205b16628544ec5c2492ae1acdea6dc394e0317659e8952463" gracePeriod=2 Apr 02 14:17:46 crc kubenswrapper[4732]: I0402 14:17:46.344697 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-488xf" Apr 02 14:17:46 crc kubenswrapper[4732]: I0402 14:17:46.425866 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlsw7\" (UniqueName: \"kubernetes.io/projected/92e4194e-e4a0-4159-a798-cc781cd91d7c-kube-api-access-hlsw7\") pod \"92e4194e-e4a0-4159-a798-cc781cd91d7c\" (UID: \"92e4194e-e4a0-4159-a798-cc781cd91d7c\") " Apr 02 14:17:46 crc kubenswrapper[4732]: I0402 14:17:46.425909 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92e4194e-e4a0-4159-a798-cc781cd91d7c-catalog-content\") pod \"92e4194e-e4a0-4159-a798-cc781cd91d7c\" (UID: \"92e4194e-e4a0-4159-a798-cc781cd91d7c\") " Apr 02 14:17:46 crc kubenswrapper[4732]: I0402 14:17:46.426048 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92e4194e-e4a0-4159-a798-cc781cd91d7c-utilities\") pod \"92e4194e-e4a0-4159-a798-cc781cd91d7c\" (UID: \"92e4194e-e4a0-4159-a798-cc781cd91d7c\") " Apr 02 14:17:46 crc kubenswrapper[4732]: I0402 14:17:46.427583 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92e4194e-e4a0-4159-a798-cc781cd91d7c-utilities" (OuterVolumeSpecName: "utilities") pod "92e4194e-e4a0-4159-a798-cc781cd91d7c" (UID: "92e4194e-e4a0-4159-a798-cc781cd91d7c"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 14:17:46 crc kubenswrapper[4732]: I0402 14:17:46.431765 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92e4194e-e4a0-4159-a798-cc781cd91d7c-kube-api-access-hlsw7" (OuterVolumeSpecName: "kube-api-access-hlsw7") pod "92e4194e-e4a0-4159-a798-cc781cd91d7c" (UID: "92e4194e-e4a0-4159-a798-cc781cd91d7c"). InnerVolumeSpecName "kube-api-access-hlsw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:17:46 crc kubenswrapper[4732]: I0402 14:17:46.486777 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92e4194e-e4a0-4159-a798-cc781cd91d7c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "92e4194e-e4a0-4159-a798-cc781cd91d7c" (UID: "92e4194e-e4a0-4159-a798-cc781cd91d7c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 14:17:46 crc kubenswrapper[4732]: I0402 14:17:46.528764 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92e4194e-e4a0-4159-a798-cc781cd91d7c-utilities\") on node \"crc\" DevicePath \"\"" Apr 02 14:17:46 crc kubenswrapper[4732]: I0402 14:17:46.528806 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hlsw7\" (UniqueName: \"kubernetes.io/projected/92e4194e-e4a0-4159-a798-cc781cd91d7c-kube-api-access-hlsw7\") on node \"crc\" DevicePath \"\"" Apr 02 14:17:46 crc kubenswrapper[4732]: I0402 14:17:46.528817 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92e4194e-e4a0-4159-a798-cc781cd91d7c-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 02 14:17:46 crc kubenswrapper[4732]: I0402 14:17:46.885443 4732 generic.go:334] "Generic (PLEG): container finished" podID="92e4194e-e4a0-4159-a798-cc781cd91d7c" 
containerID="0828c4efbe06a9205b16628544ec5c2492ae1acdea6dc394e0317659e8952463" exitCode=0 Apr 02 14:17:46 crc kubenswrapper[4732]: I0402 14:17:46.885572 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-488xf" Apr 02 14:17:46 crc kubenswrapper[4732]: I0402 14:17:46.885708 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-488xf" event={"ID":"92e4194e-e4a0-4159-a798-cc781cd91d7c","Type":"ContainerDied","Data":"0828c4efbe06a9205b16628544ec5c2492ae1acdea6dc394e0317659e8952463"} Apr 02 14:17:46 crc kubenswrapper[4732]: I0402 14:17:46.885935 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-488xf" event={"ID":"92e4194e-e4a0-4159-a798-cc781cd91d7c","Type":"ContainerDied","Data":"0bcb61fcefdaa92852f1dee68cb3317769111d504188bdd621c854e197c6d8b7"} Apr 02 14:17:46 crc kubenswrapper[4732]: I0402 14:17:46.885967 4732 scope.go:117] "RemoveContainer" containerID="0828c4efbe06a9205b16628544ec5c2492ae1acdea6dc394e0317659e8952463" Apr 02 14:17:46 crc kubenswrapper[4732]: I0402 14:17:46.913230 4732 scope.go:117] "RemoveContainer" containerID="6a50ccd466428813877bf44f9f4824e7cc8e713a58d0a7631c65d374c92e4c34" Apr 02 14:17:46 crc kubenswrapper[4732]: I0402 14:17:46.918083 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-488xf"] Apr 02 14:17:46 crc kubenswrapper[4732]: I0402 14:17:46.928340 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-488xf"] Apr 02 14:17:46 crc kubenswrapper[4732]: I0402 14:17:46.938186 4732 scope.go:117] "RemoveContainer" containerID="67ee0a0364d5f47136bdb8d979acc358018ddd7db15f74f0c5a53fd6f5125962" Apr 02 14:17:46 crc kubenswrapper[4732]: I0402 14:17:46.985596 4732 scope.go:117] "RemoveContainer" containerID="0828c4efbe06a9205b16628544ec5c2492ae1acdea6dc394e0317659e8952463" Apr 02 
14:17:46 crc kubenswrapper[4732]: E0402 14:17:46.986558 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0828c4efbe06a9205b16628544ec5c2492ae1acdea6dc394e0317659e8952463\": container with ID starting with 0828c4efbe06a9205b16628544ec5c2492ae1acdea6dc394e0317659e8952463 not found: ID does not exist" containerID="0828c4efbe06a9205b16628544ec5c2492ae1acdea6dc394e0317659e8952463" Apr 02 14:17:46 crc kubenswrapper[4732]: I0402 14:17:46.986681 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0828c4efbe06a9205b16628544ec5c2492ae1acdea6dc394e0317659e8952463"} err="failed to get container status \"0828c4efbe06a9205b16628544ec5c2492ae1acdea6dc394e0317659e8952463\": rpc error: code = NotFound desc = could not find container \"0828c4efbe06a9205b16628544ec5c2492ae1acdea6dc394e0317659e8952463\": container with ID starting with 0828c4efbe06a9205b16628544ec5c2492ae1acdea6dc394e0317659e8952463 not found: ID does not exist" Apr 02 14:17:46 crc kubenswrapper[4732]: I0402 14:17:46.986706 4732 scope.go:117] "RemoveContainer" containerID="6a50ccd466428813877bf44f9f4824e7cc8e713a58d0a7631c65d374c92e4c34" Apr 02 14:17:46 crc kubenswrapper[4732]: E0402 14:17:46.987236 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a50ccd466428813877bf44f9f4824e7cc8e713a58d0a7631c65d374c92e4c34\": container with ID starting with 6a50ccd466428813877bf44f9f4824e7cc8e713a58d0a7631c65d374c92e4c34 not found: ID does not exist" containerID="6a50ccd466428813877bf44f9f4824e7cc8e713a58d0a7631c65d374c92e4c34" Apr 02 14:17:46 crc kubenswrapper[4732]: I0402 14:17:46.987280 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a50ccd466428813877bf44f9f4824e7cc8e713a58d0a7631c65d374c92e4c34"} err="failed to get container status 
\"6a50ccd466428813877bf44f9f4824e7cc8e713a58d0a7631c65d374c92e4c34\": rpc error: code = NotFound desc = could not find container \"6a50ccd466428813877bf44f9f4824e7cc8e713a58d0a7631c65d374c92e4c34\": container with ID starting with 6a50ccd466428813877bf44f9f4824e7cc8e713a58d0a7631c65d374c92e4c34 not found: ID does not exist" Apr 02 14:17:46 crc kubenswrapper[4732]: I0402 14:17:46.987305 4732 scope.go:117] "RemoveContainer" containerID="67ee0a0364d5f47136bdb8d979acc358018ddd7db15f74f0c5a53fd6f5125962" Apr 02 14:17:46 crc kubenswrapper[4732]: E0402 14:17:46.987600 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67ee0a0364d5f47136bdb8d979acc358018ddd7db15f74f0c5a53fd6f5125962\": container with ID starting with 67ee0a0364d5f47136bdb8d979acc358018ddd7db15f74f0c5a53fd6f5125962 not found: ID does not exist" containerID="67ee0a0364d5f47136bdb8d979acc358018ddd7db15f74f0c5a53fd6f5125962" Apr 02 14:17:46 crc kubenswrapper[4732]: I0402 14:17:46.987669 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67ee0a0364d5f47136bdb8d979acc358018ddd7db15f74f0c5a53fd6f5125962"} err="failed to get container status \"67ee0a0364d5f47136bdb8d979acc358018ddd7db15f74f0c5a53fd6f5125962\": rpc error: code = NotFound desc = could not find container \"67ee0a0364d5f47136bdb8d979acc358018ddd7db15f74f0c5a53fd6f5125962\": container with ID starting with 67ee0a0364d5f47136bdb8d979acc358018ddd7db15f74f0c5a53fd6f5125962 not found: ID does not exist" Apr 02 14:17:48 crc kubenswrapper[4732]: I0402 14:17:48.691764 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92e4194e-e4a0-4159-a798-cc781cd91d7c" path="/var/lib/kubelet/pods/92e4194e-e4a0-4159-a798-cc781cd91d7c/volumes" Apr 02 14:17:54 crc kubenswrapper[4732]: I0402 14:17:54.887647 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8ff6q"] Apr 02 14:17:54 
crc kubenswrapper[4732]: E0402 14:17:54.888542 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92e4194e-e4a0-4159-a798-cc781cd91d7c" containerName="extract-content" Apr 02 14:17:54 crc kubenswrapper[4732]: I0402 14:17:54.888562 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="92e4194e-e4a0-4159-a798-cc781cd91d7c" containerName="extract-content" Apr 02 14:17:54 crc kubenswrapper[4732]: E0402 14:17:54.888577 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92e4194e-e4a0-4159-a798-cc781cd91d7c" containerName="extract-utilities" Apr 02 14:17:54 crc kubenswrapper[4732]: I0402 14:17:54.888585 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="92e4194e-e4a0-4159-a798-cc781cd91d7c" containerName="extract-utilities" Apr 02 14:17:54 crc kubenswrapper[4732]: E0402 14:17:54.888648 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad35517b-f125-441f-a982-1facae60fd2d" containerName="extract-utilities" Apr 02 14:17:54 crc kubenswrapper[4732]: I0402 14:17:54.888660 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad35517b-f125-441f-a982-1facae60fd2d" containerName="extract-utilities" Apr 02 14:17:54 crc kubenswrapper[4732]: E0402 14:17:54.888694 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92e4194e-e4a0-4159-a798-cc781cd91d7c" containerName="registry-server" Apr 02 14:17:54 crc kubenswrapper[4732]: I0402 14:17:54.888707 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="92e4194e-e4a0-4159-a798-cc781cd91d7c" containerName="registry-server" Apr 02 14:17:54 crc kubenswrapper[4732]: E0402 14:17:54.888727 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad35517b-f125-441f-a982-1facae60fd2d" containerName="extract-content" Apr 02 14:17:54 crc kubenswrapper[4732]: I0402 14:17:54.888736 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad35517b-f125-441f-a982-1facae60fd2d" containerName="extract-content" Apr 02 14:17:54 crc 
kubenswrapper[4732]: E0402 14:17:54.888756 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad35517b-f125-441f-a982-1facae60fd2d" containerName="registry-server" Apr 02 14:17:54 crc kubenswrapper[4732]: I0402 14:17:54.888764 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad35517b-f125-441f-a982-1facae60fd2d" containerName="registry-server" Apr 02 14:17:54 crc kubenswrapper[4732]: I0402 14:17:54.888988 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad35517b-f125-441f-a982-1facae60fd2d" containerName="registry-server" Apr 02 14:17:54 crc kubenswrapper[4732]: I0402 14:17:54.889005 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="92e4194e-e4a0-4159-a798-cc781cd91d7c" containerName="registry-server" Apr 02 14:17:54 crc kubenswrapper[4732]: I0402 14:17:54.891051 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8ff6q" Apr 02 14:17:54 crc kubenswrapper[4732]: I0402 14:17:54.901279 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8ff6q"] Apr 02 14:17:54 crc kubenswrapper[4732]: I0402 14:17:54.992079 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa789afd-a972-43a3-bd23-df0ddaa04fc9-utilities\") pod \"certified-operators-8ff6q\" (UID: \"aa789afd-a972-43a3-bd23-df0ddaa04fc9\") " pod="openshift-marketplace/certified-operators-8ff6q" Apr 02 14:17:54 crc kubenswrapper[4732]: I0402 14:17:54.992157 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zm5c2\" (UniqueName: \"kubernetes.io/projected/aa789afd-a972-43a3-bd23-df0ddaa04fc9-kube-api-access-zm5c2\") pod \"certified-operators-8ff6q\" (UID: \"aa789afd-a972-43a3-bd23-df0ddaa04fc9\") " pod="openshift-marketplace/certified-operators-8ff6q" Apr 02 14:17:54 
crc kubenswrapper[4732]: I0402 14:17:54.992187 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa789afd-a972-43a3-bd23-df0ddaa04fc9-catalog-content\") pod \"certified-operators-8ff6q\" (UID: \"aa789afd-a972-43a3-bd23-df0ddaa04fc9\") " pod="openshift-marketplace/certified-operators-8ff6q" Apr 02 14:17:55 crc kubenswrapper[4732]: I0402 14:17:55.094433 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa789afd-a972-43a3-bd23-df0ddaa04fc9-utilities\") pod \"certified-operators-8ff6q\" (UID: \"aa789afd-a972-43a3-bd23-df0ddaa04fc9\") " pod="openshift-marketplace/certified-operators-8ff6q" Apr 02 14:17:55 crc kubenswrapper[4732]: I0402 14:17:55.094546 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zm5c2\" (UniqueName: \"kubernetes.io/projected/aa789afd-a972-43a3-bd23-df0ddaa04fc9-kube-api-access-zm5c2\") pod \"certified-operators-8ff6q\" (UID: \"aa789afd-a972-43a3-bd23-df0ddaa04fc9\") " pod="openshift-marketplace/certified-operators-8ff6q" Apr 02 14:17:55 crc kubenswrapper[4732]: I0402 14:17:55.094571 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa789afd-a972-43a3-bd23-df0ddaa04fc9-catalog-content\") pod \"certified-operators-8ff6q\" (UID: \"aa789afd-a972-43a3-bd23-df0ddaa04fc9\") " pod="openshift-marketplace/certified-operators-8ff6q" Apr 02 14:17:55 crc kubenswrapper[4732]: I0402 14:17:55.094967 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa789afd-a972-43a3-bd23-df0ddaa04fc9-utilities\") pod \"certified-operators-8ff6q\" (UID: \"aa789afd-a972-43a3-bd23-df0ddaa04fc9\") " pod="openshift-marketplace/certified-operators-8ff6q" Apr 02 14:17:55 crc 
kubenswrapper[4732]: I0402 14:17:55.095066 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa789afd-a972-43a3-bd23-df0ddaa04fc9-catalog-content\") pod \"certified-operators-8ff6q\" (UID: \"aa789afd-a972-43a3-bd23-df0ddaa04fc9\") " pod="openshift-marketplace/certified-operators-8ff6q" Apr 02 14:17:55 crc kubenswrapper[4732]: I0402 14:17:55.114060 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zm5c2\" (UniqueName: \"kubernetes.io/projected/aa789afd-a972-43a3-bd23-df0ddaa04fc9-kube-api-access-zm5c2\") pod \"certified-operators-8ff6q\" (UID: \"aa789afd-a972-43a3-bd23-df0ddaa04fc9\") " pod="openshift-marketplace/certified-operators-8ff6q" Apr 02 14:17:55 crc kubenswrapper[4732]: I0402 14:17:55.211163 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8ff6q" Apr 02 14:17:55 crc kubenswrapper[4732]: I0402 14:17:55.685865 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8ff6q"] Apr 02 14:17:55 crc kubenswrapper[4732]: W0402 14:17:55.699663 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa789afd_a972_43a3_bd23_df0ddaa04fc9.slice/crio-4f843dd8e8c85a71429dceed82ddbfc81ee178ddb38e082d6e24dd85b3af673c WatchSource:0}: Error finding container 4f843dd8e8c85a71429dceed82ddbfc81ee178ddb38e082d6e24dd85b3af673c: Status 404 returned error can't find the container with id 4f843dd8e8c85a71429dceed82ddbfc81ee178ddb38e082d6e24dd85b3af673c Apr 02 14:17:55 crc kubenswrapper[4732]: I0402 14:17:55.974881 4732 generic.go:334] "Generic (PLEG): container finished" podID="aa789afd-a972-43a3-bd23-df0ddaa04fc9" containerID="29002c0773fa3d7ba700e1230605853c9a3f6d0e27be7dfe1561e030d288c2f6" exitCode=0 Apr 02 14:17:55 crc kubenswrapper[4732]: I0402 14:17:55.974930 4732 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8ff6q" event={"ID":"aa789afd-a972-43a3-bd23-df0ddaa04fc9","Type":"ContainerDied","Data":"29002c0773fa3d7ba700e1230605853c9a3f6d0e27be7dfe1561e030d288c2f6"} Apr 02 14:17:55 crc kubenswrapper[4732]: I0402 14:17:55.975181 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8ff6q" event={"ID":"aa789afd-a972-43a3-bd23-df0ddaa04fc9","Type":"ContainerStarted","Data":"4f843dd8e8c85a71429dceed82ddbfc81ee178ddb38e082d6e24dd85b3af673c"} Apr 02 14:17:55 crc kubenswrapper[4732]: I0402 14:17:55.976859 4732 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 02 14:17:56 crc kubenswrapper[4732]: I0402 14:17:56.984768 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8ff6q" event={"ID":"aa789afd-a972-43a3-bd23-df0ddaa04fc9","Type":"ContainerStarted","Data":"c46e183b4976c46c0ed75711a532413663cf1a205cc166fea654ff96a63c00bc"} Apr 02 14:17:57 crc kubenswrapper[4732]: I0402 14:17:57.996390 4732 generic.go:334] "Generic (PLEG): container finished" podID="aa789afd-a972-43a3-bd23-df0ddaa04fc9" containerID="c46e183b4976c46c0ed75711a532413663cf1a205cc166fea654ff96a63c00bc" exitCode=0 Apr 02 14:17:57 crc kubenswrapper[4732]: I0402 14:17:57.996535 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8ff6q" event={"ID":"aa789afd-a972-43a3-bd23-df0ddaa04fc9","Type":"ContainerDied","Data":"c46e183b4976c46c0ed75711a532413663cf1a205cc166fea654ff96a63c00bc"} Apr 02 14:17:59 crc kubenswrapper[4732]: I0402 14:17:59.007296 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8ff6q" event={"ID":"aa789afd-a972-43a3-bd23-df0ddaa04fc9","Type":"ContainerStarted","Data":"e5172b1c60eed9159594da968bbfa80fe8484ddb2c4afd30476087023b678e7e"} Apr 02 14:17:59 crc 
kubenswrapper[4732]: I0402 14:17:59.027816 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8ff6q" podStartSLOduration=2.608310147 podStartE2EDuration="5.02780052s" podCreationTimestamp="2026-04-02 14:17:54 +0000 UTC" firstStartedPulling="2026-04-02 14:17:55.976639525 +0000 UTC m=+2432.881047078" lastFinishedPulling="2026-04-02 14:17:58.396129868 +0000 UTC m=+2435.300537451" observedRunningTime="2026-04-02 14:17:59.026073933 +0000 UTC m=+2435.930481536" watchObservedRunningTime="2026-04-02 14:17:59.02780052 +0000 UTC m=+2435.932208073" Apr 02 14:18:00 crc kubenswrapper[4732]: I0402 14:18:00.152507 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29585658-cxm7h"] Apr 02 14:18:00 crc kubenswrapper[4732]: I0402 14:18:00.154947 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585658-cxm7h" Apr 02 14:18:00 crc kubenswrapper[4732]: I0402 14:18:00.157041 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 02 14:18:00 crc kubenswrapper[4732]: I0402 14:18:00.158751 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-69g42" Apr 02 14:18:00 crc kubenswrapper[4732]: I0402 14:18:00.159046 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 02 14:18:00 crc kubenswrapper[4732]: I0402 14:18:00.166974 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585658-cxm7h"] Apr 02 14:18:00 crc kubenswrapper[4732]: I0402 14:18:00.307369 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tr7h6\" (UniqueName: \"kubernetes.io/projected/fb1cb3c0-3800-41c1-bfa3-064d473acdc5-kube-api-access-tr7h6\") pod 
\"auto-csr-approver-29585658-cxm7h\" (UID: \"fb1cb3c0-3800-41c1-bfa3-064d473acdc5\") " pod="openshift-infra/auto-csr-approver-29585658-cxm7h" Apr 02 14:18:00 crc kubenswrapper[4732]: I0402 14:18:00.409455 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tr7h6\" (UniqueName: \"kubernetes.io/projected/fb1cb3c0-3800-41c1-bfa3-064d473acdc5-kube-api-access-tr7h6\") pod \"auto-csr-approver-29585658-cxm7h\" (UID: \"fb1cb3c0-3800-41c1-bfa3-064d473acdc5\") " pod="openshift-infra/auto-csr-approver-29585658-cxm7h" Apr 02 14:18:00 crc kubenswrapper[4732]: I0402 14:18:00.432541 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tr7h6\" (UniqueName: \"kubernetes.io/projected/fb1cb3c0-3800-41c1-bfa3-064d473acdc5-kube-api-access-tr7h6\") pod \"auto-csr-approver-29585658-cxm7h\" (UID: \"fb1cb3c0-3800-41c1-bfa3-064d473acdc5\") " pod="openshift-infra/auto-csr-approver-29585658-cxm7h" Apr 02 14:18:00 crc kubenswrapper[4732]: I0402 14:18:00.480043 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29585658-cxm7h" Apr 02 14:18:00 crc kubenswrapper[4732]: I0402 14:18:00.915968 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585658-cxm7h"] Apr 02 14:18:00 crc kubenswrapper[4732]: W0402 14:18:00.920003 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb1cb3c0_3800_41c1_bfa3_064d473acdc5.slice/crio-081e8017bdfa23c4479a35404a67777135bf6c457e2609f507daec33ac2e2f73 WatchSource:0}: Error finding container 081e8017bdfa23c4479a35404a67777135bf6c457e2609f507daec33ac2e2f73: Status 404 returned error can't find the container with id 081e8017bdfa23c4479a35404a67777135bf6c457e2609f507daec33ac2e2f73 Apr 02 14:18:01 crc kubenswrapper[4732]: I0402 14:18:01.033039 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585658-cxm7h" event={"ID":"fb1cb3c0-3800-41c1-bfa3-064d473acdc5","Type":"ContainerStarted","Data":"081e8017bdfa23c4479a35404a67777135bf6c457e2609f507daec33ac2e2f73"} Apr 02 14:18:03 crc kubenswrapper[4732]: I0402 14:18:03.066701 4732 generic.go:334] "Generic (PLEG): container finished" podID="fb1cb3c0-3800-41c1-bfa3-064d473acdc5" containerID="8dfd95432144351c7292d99043f7cfcd6c0ba1a99e2b9171a1b54729e33ffa6d" exitCode=0 Apr 02 14:18:03 crc kubenswrapper[4732]: I0402 14:18:03.066788 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585658-cxm7h" event={"ID":"fb1cb3c0-3800-41c1-bfa3-064d473acdc5","Type":"ContainerDied","Data":"8dfd95432144351c7292d99043f7cfcd6c0ba1a99e2b9171a1b54729e33ffa6d"} Apr 02 14:18:04 crc kubenswrapper[4732]: I0402 14:18:04.568657 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29585658-cxm7h" Apr 02 14:18:04 crc kubenswrapper[4732]: I0402 14:18:04.691316 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tr7h6\" (UniqueName: \"kubernetes.io/projected/fb1cb3c0-3800-41c1-bfa3-064d473acdc5-kube-api-access-tr7h6\") pod \"fb1cb3c0-3800-41c1-bfa3-064d473acdc5\" (UID: \"fb1cb3c0-3800-41c1-bfa3-064d473acdc5\") " Apr 02 14:18:04 crc kubenswrapper[4732]: I0402 14:18:04.697225 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb1cb3c0-3800-41c1-bfa3-064d473acdc5-kube-api-access-tr7h6" (OuterVolumeSpecName: "kube-api-access-tr7h6") pod "fb1cb3c0-3800-41c1-bfa3-064d473acdc5" (UID: "fb1cb3c0-3800-41c1-bfa3-064d473acdc5"). InnerVolumeSpecName "kube-api-access-tr7h6". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:18:04 crc kubenswrapper[4732]: I0402 14:18:04.794222 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tr7h6\" (UniqueName: \"kubernetes.io/projected/fb1cb3c0-3800-41c1-bfa3-064d473acdc5-kube-api-access-tr7h6\") on node \"crc\" DevicePath \"\"" Apr 02 14:18:05 crc kubenswrapper[4732]: I0402 14:18:05.086067 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585658-cxm7h" event={"ID":"fb1cb3c0-3800-41c1-bfa3-064d473acdc5","Type":"ContainerDied","Data":"081e8017bdfa23c4479a35404a67777135bf6c457e2609f507daec33ac2e2f73"} Apr 02 14:18:05 crc kubenswrapper[4732]: I0402 14:18:05.086097 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29585658-cxm7h" Apr 02 14:18:05 crc kubenswrapper[4732]: I0402 14:18:05.086110 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="081e8017bdfa23c4479a35404a67777135bf6c457e2609f507daec33ac2e2f73" Apr 02 14:18:05 crc kubenswrapper[4732]: I0402 14:18:05.211717 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8ff6q" Apr 02 14:18:05 crc kubenswrapper[4732]: I0402 14:18:05.211796 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8ff6q" Apr 02 14:18:05 crc kubenswrapper[4732]: I0402 14:18:05.270293 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8ff6q" Apr 02 14:18:05 crc kubenswrapper[4732]: I0402 14:18:05.655361 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29585652-6p5dx"] Apr 02 14:18:05 crc kubenswrapper[4732]: I0402 14:18:05.667392 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29585652-6p5dx"] Apr 02 14:18:06 crc kubenswrapper[4732]: I0402 14:18:06.145411 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8ff6q" Apr 02 14:18:06 crc kubenswrapper[4732]: I0402 14:18:06.191111 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8ff6q"] Apr 02 14:18:06 crc kubenswrapper[4732]: I0402 14:18:06.734096 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa9c5076-d469-46c7-9467-8d7ee80a71ff" path="/var/lib/kubelet/pods/aa9c5076-d469-46c7-9467-8d7ee80a71ff/volumes" Apr 02 14:18:08 crc kubenswrapper[4732]: I0402 14:18:08.111485 4732 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/certified-operators-8ff6q" podUID="aa789afd-a972-43a3-bd23-df0ddaa04fc9" containerName="registry-server" containerID="cri-o://e5172b1c60eed9159594da968bbfa80fe8484ddb2c4afd30476087023b678e7e" gracePeriod=2 Apr 02 14:18:08 crc kubenswrapper[4732]: I0402 14:18:08.593320 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8ff6q" Apr 02 14:18:08 crc kubenswrapper[4732]: I0402 14:18:08.773935 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa789afd-a972-43a3-bd23-df0ddaa04fc9-utilities\") pod \"aa789afd-a972-43a3-bd23-df0ddaa04fc9\" (UID: \"aa789afd-a972-43a3-bd23-df0ddaa04fc9\") " Apr 02 14:18:08 crc kubenswrapper[4732]: I0402 14:18:08.774156 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa789afd-a972-43a3-bd23-df0ddaa04fc9-catalog-content\") pod \"aa789afd-a972-43a3-bd23-df0ddaa04fc9\" (UID: \"aa789afd-a972-43a3-bd23-df0ddaa04fc9\") " Apr 02 14:18:08 crc kubenswrapper[4732]: I0402 14:18:08.774193 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zm5c2\" (UniqueName: \"kubernetes.io/projected/aa789afd-a972-43a3-bd23-df0ddaa04fc9-kube-api-access-zm5c2\") pod \"aa789afd-a972-43a3-bd23-df0ddaa04fc9\" (UID: \"aa789afd-a972-43a3-bd23-df0ddaa04fc9\") " Apr 02 14:18:08 crc kubenswrapper[4732]: I0402 14:18:08.777222 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa789afd-a972-43a3-bd23-df0ddaa04fc9-utilities" (OuterVolumeSpecName: "utilities") pod "aa789afd-a972-43a3-bd23-df0ddaa04fc9" (UID: "aa789afd-a972-43a3-bd23-df0ddaa04fc9"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 14:18:08 crc kubenswrapper[4732]: I0402 14:18:08.780332 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa789afd-a972-43a3-bd23-df0ddaa04fc9-kube-api-access-zm5c2" (OuterVolumeSpecName: "kube-api-access-zm5c2") pod "aa789afd-a972-43a3-bd23-df0ddaa04fc9" (UID: "aa789afd-a972-43a3-bd23-df0ddaa04fc9"). InnerVolumeSpecName "kube-api-access-zm5c2". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:18:08 crc kubenswrapper[4732]: I0402 14:18:08.830329 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa789afd-a972-43a3-bd23-df0ddaa04fc9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aa789afd-a972-43a3-bd23-df0ddaa04fc9" (UID: "aa789afd-a972-43a3-bd23-df0ddaa04fc9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 14:18:08 crc kubenswrapper[4732]: I0402 14:18:08.877113 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa789afd-a972-43a3-bd23-df0ddaa04fc9-utilities\") on node \"crc\" DevicePath \"\"" Apr 02 14:18:08 crc kubenswrapper[4732]: I0402 14:18:08.877144 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa789afd-a972-43a3-bd23-df0ddaa04fc9-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 02 14:18:08 crc kubenswrapper[4732]: I0402 14:18:08.877155 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zm5c2\" (UniqueName: \"kubernetes.io/projected/aa789afd-a972-43a3-bd23-df0ddaa04fc9-kube-api-access-zm5c2\") on node \"crc\" DevicePath \"\"" Apr 02 14:18:09 crc kubenswrapper[4732]: I0402 14:18:09.126442 4732 generic.go:334] "Generic (PLEG): container finished" podID="aa789afd-a972-43a3-bd23-df0ddaa04fc9" 
containerID="e5172b1c60eed9159594da968bbfa80fe8484ddb2c4afd30476087023b678e7e" exitCode=0 Apr 02 14:18:09 crc kubenswrapper[4732]: I0402 14:18:09.126768 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8ff6q" event={"ID":"aa789afd-a972-43a3-bd23-df0ddaa04fc9","Type":"ContainerDied","Data":"e5172b1c60eed9159594da968bbfa80fe8484ddb2c4afd30476087023b678e7e"} Apr 02 14:18:09 crc kubenswrapper[4732]: I0402 14:18:09.126800 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8ff6q" event={"ID":"aa789afd-a972-43a3-bd23-df0ddaa04fc9","Type":"ContainerDied","Data":"4f843dd8e8c85a71429dceed82ddbfc81ee178ddb38e082d6e24dd85b3af673c"} Apr 02 14:18:09 crc kubenswrapper[4732]: I0402 14:18:09.126822 4732 scope.go:117] "RemoveContainer" containerID="e5172b1c60eed9159594da968bbfa80fe8484ddb2c4afd30476087023b678e7e" Apr 02 14:18:09 crc kubenswrapper[4732]: I0402 14:18:09.127067 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8ff6q" Apr 02 14:18:09 crc kubenswrapper[4732]: I0402 14:18:09.163968 4732 scope.go:117] "RemoveContainer" containerID="c46e183b4976c46c0ed75711a532413663cf1a205cc166fea654ff96a63c00bc" Apr 02 14:18:09 crc kubenswrapper[4732]: I0402 14:18:09.165031 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8ff6q"] Apr 02 14:18:09 crc kubenswrapper[4732]: I0402 14:18:09.188726 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8ff6q"] Apr 02 14:18:09 crc kubenswrapper[4732]: I0402 14:18:09.190048 4732 scope.go:117] "RemoveContainer" containerID="29002c0773fa3d7ba700e1230605853c9a3f6d0e27be7dfe1561e030d288c2f6" Apr 02 14:18:09 crc kubenswrapper[4732]: I0402 14:18:09.228651 4732 scope.go:117] "RemoveContainer" containerID="e5172b1c60eed9159594da968bbfa80fe8484ddb2c4afd30476087023b678e7e" Apr 02 14:18:09 crc kubenswrapper[4732]: E0402 14:18:09.229715 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5172b1c60eed9159594da968bbfa80fe8484ddb2c4afd30476087023b678e7e\": container with ID starting with e5172b1c60eed9159594da968bbfa80fe8484ddb2c4afd30476087023b678e7e not found: ID does not exist" containerID="e5172b1c60eed9159594da968bbfa80fe8484ddb2c4afd30476087023b678e7e" Apr 02 14:18:09 crc kubenswrapper[4732]: I0402 14:18:09.229755 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5172b1c60eed9159594da968bbfa80fe8484ddb2c4afd30476087023b678e7e"} err="failed to get container status \"e5172b1c60eed9159594da968bbfa80fe8484ddb2c4afd30476087023b678e7e\": rpc error: code = NotFound desc = could not find container \"e5172b1c60eed9159594da968bbfa80fe8484ddb2c4afd30476087023b678e7e\": container with ID starting with e5172b1c60eed9159594da968bbfa80fe8484ddb2c4afd30476087023b678e7e not 
found: ID does not exist" Apr 02 14:18:09 crc kubenswrapper[4732]: I0402 14:18:09.229781 4732 scope.go:117] "RemoveContainer" containerID="c46e183b4976c46c0ed75711a532413663cf1a205cc166fea654ff96a63c00bc" Apr 02 14:18:09 crc kubenswrapper[4732]: E0402 14:18:09.230219 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c46e183b4976c46c0ed75711a532413663cf1a205cc166fea654ff96a63c00bc\": container with ID starting with c46e183b4976c46c0ed75711a532413663cf1a205cc166fea654ff96a63c00bc not found: ID does not exist" containerID="c46e183b4976c46c0ed75711a532413663cf1a205cc166fea654ff96a63c00bc" Apr 02 14:18:09 crc kubenswrapper[4732]: I0402 14:18:09.230267 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c46e183b4976c46c0ed75711a532413663cf1a205cc166fea654ff96a63c00bc"} err="failed to get container status \"c46e183b4976c46c0ed75711a532413663cf1a205cc166fea654ff96a63c00bc\": rpc error: code = NotFound desc = could not find container \"c46e183b4976c46c0ed75711a532413663cf1a205cc166fea654ff96a63c00bc\": container with ID starting with c46e183b4976c46c0ed75711a532413663cf1a205cc166fea654ff96a63c00bc not found: ID does not exist" Apr 02 14:18:09 crc kubenswrapper[4732]: I0402 14:18:09.230312 4732 scope.go:117] "RemoveContainer" containerID="29002c0773fa3d7ba700e1230605853c9a3f6d0e27be7dfe1561e030d288c2f6" Apr 02 14:18:09 crc kubenswrapper[4732]: E0402 14:18:09.230827 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29002c0773fa3d7ba700e1230605853c9a3f6d0e27be7dfe1561e030d288c2f6\": container with ID starting with 29002c0773fa3d7ba700e1230605853c9a3f6d0e27be7dfe1561e030d288c2f6 not found: ID does not exist" containerID="29002c0773fa3d7ba700e1230605853c9a3f6d0e27be7dfe1561e030d288c2f6" Apr 02 14:18:09 crc kubenswrapper[4732]: I0402 14:18:09.230866 4732 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29002c0773fa3d7ba700e1230605853c9a3f6d0e27be7dfe1561e030d288c2f6"} err="failed to get container status \"29002c0773fa3d7ba700e1230605853c9a3f6d0e27be7dfe1561e030d288c2f6\": rpc error: code = NotFound desc = could not find container \"29002c0773fa3d7ba700e1230605853c9a3f6d0e27be7dfe1561e030d288c2f6\": container with ID starting with 29002c0773fa3d7ba700e1230605853c9a3f6d0e27be7dfe1561e030d288c2f6 not found: ID does not exist" Apr 02 14:18:10 crc kubenswrapper[4732]: I0402 14:18:10.734697 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa789afd-a972-43a3-bd23-df0ddaa04fc9" path="/var/lib/kubelet/pods/aa789afd-a972-43a3-bd23-df0ddaa04fc9/volumes" Apr 02 14:18:31 crc kubenswrapper[4732]: I0402 14:18:31.924221 4732 patch_prober.go:28] interesting pod/machine-config-daemon-6vtmw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 02 14:18:31 crc kubenswrapper[4732]: I0402 14:18:31.924963 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 02 14:18:51 crc kubenswrapper[4732]: I0402 14:18:51.567134 4732 scope.go:117] "RemoveContainer" containerID="3ad43324fbecf368f6fcc06a0ab1f3f5db739bf2832b0b8f34d462fb20763345" Apr 02 14:19:00 crc kubenswrapper[4732]: I0402 14:19:00.606414 4732 generic.go:334] "Generic (PLEG): container finished" podID="6d71fa88-324b-440b-aefd-492ac7ff7cd5" containerID="ad70dba06aded04ca498f6ccf972285ba61e7ebc5e26d36ad0f46d65969ddeb9" exitCode=0 Apr 02 14:19:00 crc kubenswrapper[4732]: 
I0402 14:19:00.606544 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-t9h4h" event={"ID":"6d71fa88-324b-440b-aefd-492ac7ff7cd5","Type":"ContainerDied","Data":"ad70dba06aded04ca498f6ccf972285ba61e7ebc5e26d36ad0f46d65969ddeb9"} Apr 02 14:19:01 crc kubenswrapper[4732]: I0402 14:19:01.924409 4732 patch_prober.go:28] interesting pod/machine-config-daemon-6vtmw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 02 14:19:01 crc kubenswrapper[4732]: I0402 14:19:01.924838 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 02 14:19:02 crc kubenswrapper[4732]: I0402 14:19:02.033086 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-t9h4h" Apr 02 14:19:02 crc kubenswrapper[4732]: I0402 14:19:02.181629 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d71fa88-324b-440b-aefd-492ac7ff7cd5-libvirt-combined-ca-bundle\") pod \"6d71fa88-324b-440b-aefd-492ac7ff7cd5\" (UID: \"6d71fa88-324b-440b-aefd-492ac7ff7cd5\") " Apr 02 14:19:02 crc kubenswrapper[4732]: I0402 14:19:02.181796 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6d71fa88-324b-440b-aefd-492ac7ff7cd5-ssh-key-openstack-edpm-ipam\") pod \"6d71fa88-324b-440b-aefd-492ac7ff7cd5\" (UID: \"6d71fa88-324b-440b-aefd-492ac7ff7cd5\") " Apr 02 14:19:02 crc kubenswrapper[4732]: I0402 14:19:02.181820 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/6d71fa88-324b-440b-aefd-492ac7ff7cd5-libvirt-secret-0\") pod \"6d71fa88-324b-440b-aefd-492ac7ff7cd5\" (UID: \"6d71fa88-324b-440b-aefd-492ac7ff7cd5\") " Apr 02 14:19:02 crc kubenswrapper[4732]: I0402 14:19:02.181915 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfjcq\" (UniqueName: \"kubernetes.io/projected/6d71fa88-324b-440b-aefd-492ac7ff7cd5-kube-api-access-cfjcq\") pod \"6d71fa88-324b-440b-aefd-492ac7ff7cd5\" (UID: \"6d71fa88-324b-440b-aefd-492ac7ff7cd5\") " Apr 02 14:19:02 crc kubenswrapper[4732]: I0402 14:19:02.182033 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d71fa88-324b-440b-aefd-492ac7ff7cd5-inventory\") pod \"6d71fa88-324b-440b-aefd-492ac7ff7cd5\" (UID: \"6d71fa88-324b-440b-aefd-492ac7ff7cd5\") " Apr 02 14:19:02 crc kubenswrapper[4732]: I0402 14:19:02.188274 4732 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d71fa88-324b-440b-aefd-492ac7ff7cd5-kube-api-access-cfjcq" (OuterVolumeSpecName: "kube-api-access-cfjcq") pod "6d71fa88-324b-440b-aefd-492ac7ff7cd5" (UID: "6d71fa88-324b-440b-aefd-492ac7ff7cd5"). InnerVolumeSpecName "kube-api-access-cfjcq". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:19:02 crc kubenswrapper[4732]: I0402 14:19:02.189965 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d71fa88-324b-440b-aefd-492ac7ff7cd5-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "6d71fa88-324b-440b-aefd-492ac7ff7cd5" (UID: "6d71fa88-324b-440b-aefd-492ac7ff7cd5"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:19:02 crc kubenswrapper[4732]: I0402 14:19:02.220453 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d71fa88-324b-440b-aefd-492ac7ff7cd5-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "6d71fa88-324b-440b-aefd-492ac7ff7cd5" (UID: "6d71fa88-324b-440b-aefd-492ac7ff7cd5"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:19:02 crc kubenswrapper[4732]: I0402 14:19:02.222105 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d71fa88-324b-440b-aefd-492ac7ff7cd5-inventory" (OuterVolumeSpecName: "inventory") pod "6d71fa88-324b-440b-aefd-492ac7ff7cd5" (UID: "6d71fa88-324b-440b-aefd-492ac7ff7cd5"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:19:02 crc kubenswrapper[4732]: I0402 14:19:02.222798 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d71fa88-324b-440b-aefd-492ac7ff7cd5-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6d71fa88-324b-440b-aefd-492ac7ff7cd5" (UID: "6d71fa88-324b-440b-aefd-492ac7ff7cd5"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:19:02 crc kubenswrapper[4732]: I0402 14:19:02.284194 4732 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d71fa88-324b-440b-aefd-492ac7ff7cd5-inventory\") on node \"crc\" DevicePath \"\"" Apr 02 14:19:02 crc kubenswrapper[4732]: I0402 14:19:02.284238 4732 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d71fa88-324b-440b-aefd-492ac7ff7cd5-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 02 14:19:02 crc kubenswrapper[4732]: I0402 14:19:02.284254 4732 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6d71fa88-324b-440b-aefd-492ac7ff7cd5-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Apr 02 14:19:02 crc kubenswrapper[4732]: I0402 14:19:02.284267 4732 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/6d71fa88-324b-440b-aefd-492ac7ff7cd5-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Apr 02 14:19:02 crc kubenswrapper[4732]: I0402 14:19:02.284280 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfjcq\" (UniqueName: \"kubernetes.io/projected/6d71fa88-324b-440b-aefd-492ac7ff7cd5-kube-api-access-cfjcq\") on node \"crc\" DevicePath \"\"" Apr 02 14:19:02 crc kubenswrapper[4732]: I0402 14:19:02.635389 4732 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-t9h4h" event={"ID":"6d71fa88-324b-440b-aefd-492ac7ff7cd5","Type":"ContainerDied","Data":"54f7f7fa1fc6deb2860c2d0cb5d7b2faaea53adf52c63888a82b7b6b69718c0a"} Apr 02 14:19:02 crc kubenswrapper[4732]: I0402 14:19:02.635434 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54f7f7fa1fc6deb2860c2d0cb5d7b2faaea53adf52c63888a82b7b6b69718c0a" Apr 02 14:19:02 crc kubenswrapper[4732]: I0402 14:19:02.635440 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-t9h4h" Apr 02 14:19:02 crc kubenswrapper[4732]: I0402 14:19:02.729826 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-gmk6c"] Apr 02 14:19:02 crc kubenswrapper[4732]: E0402 14:19:02.730639 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa789afd-a972-43a3-bd23-df0ddaa04fc9" containerName="extract-content" Apr 02 14:19:02 crc kubenswrapper[4732]: I0402 14:19:02.730741 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa789afd-a972-43a3-bd23-df0ddaa04fc9" containerName="extract-content" Apr 02 14:19:02 crc kubenswrapper[4732]: E0402 14:19:02.730809 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d71fa88-324b-440b-aefd-492ac7ff7cd5" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Apr 02 14:19:02 crc kubenswrapper[4732]: I0402 14:19:02.730868 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d71fa88-324b-440b-aefd-492ac7ff7cd5" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Apr 02 14:19:02 crc kubenswrapper[4732]: E0402 14:19:02.730922 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb1cb3c0-3800-41c1-bfa3-064d473acdc5" containerName="oc" Apr 02 14:19:02 crc kubenswrapper[4732]: I0402 14:19:02.730980 4732 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="fb1cb3c0-3800-41c1-bfa3-064d473acdc5" containerName="oc" Apr 02 14:19:02 crc kubenswrapper[4732]: E0402 14:19:02.731076 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa789afd-a972-43a3-bd23-df0ddaa04fc9" containerName="extract-utilities" Apr 02 14:19:02 crc kubenswrapper[4732]: I0402 14:19:02.731144 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa789afd-a972-43a3-bd23-df0ddaa04fc9" containerName="extract-utilities" Apr 02 14:19:02 crc kubenswrapper[4732]: E0402 14:19:02.731215 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa789afd-a972-43a3-bd23-df0ddaa04fc9" containerName="registry-server" Apr 02 14:19:02 crc kubenswrapper[4732]: I0402 14:19:02.731279 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa789afd-a972-43a3-bd23-df0ddaa04fc9" containerName="registry-server" Apr 02 14:19:02 crc kubenswrapper[4732]: I0402 14:19:02.731534 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa789afd-a972-43a3-bd23-df0ddaa04fc9" containerName="registry-server" Apr 02 14:19:02 crc kubenswrapper[4732]: I0402 14:19:02.731597 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb1cb3c0-3800-41c1-bfa3-064d473acdc5" containerName="oc" Apr 02 14:19:02 crc kubenswrapper[4732]: I0402 14:19:02.731695 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d71fa88-324b-440b-aefd-492ac7ff7cd5" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Apr 02 14:19:02 crc kubenswrapper[4732]: I0402 14:19:02.732439 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gmk6c" Apr 02 14:19:02 crc kubenswrapper[4732]: I0402 14:19:02.738046 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Apr 02 14:19:02 crc kubenswrapper[4732]: I0402 14:19:02.738098 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Apr 02 14:19:02 crc kubenswrapper[4732]: I0402 14:19:02.738207 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Apr 02 14:19:02 crc kubenswrapper[4732]: I0402 14:19:02.738286 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wdhd4" Apr 02 14:19:02 crc kubenswrapper[4732]: I0402 14:19:02.738493 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Apr 02 14:19:02 crc kubenswrapper[4732]: I0402 14:19:02.738610 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Apr 02 14:19:02 crc kubenswrapper[4732]: I0402 14:19:02.738677 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Apr 02 14:19:02 crc kubenswrapper[4732]: I0402 14:19:02.749823 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-gmk6c"] Apr 02 14:19:02 crc kubenswrapper[4732]: I0402 14:19:02.901248 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f0eca204-c72d-4909-89ba-03d2b1976e07-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gmk6c\" (UID: \"f0eca204-c72d-4909-89ba-03d2b1976e07\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gmk6c" Apr 02 14:19:02 crc kubenswrapper[4732]: 
I0402 14:19:02.901310 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/f0eca204-c72d-4909-89ba-03d2b1976e07-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gmk6c\" (UID: \"f0eca204-c72d-4909-89ba-03d2b1976e07\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gmk6c" Apr 02 14:19:02 crc kubenswrapper[4732]: I0402 14:19:02.901348 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/f0eca204-c72d-4909-89ba-03d2b1976e07-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gmk6c\" (UID: \"f0eca204-c72d-4909-89ba-03d2b1976e07\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gmk6c" Apr 02 14:19:02 crc kubenswrapper[4732]: I0402 14:19:02.901421 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzgfw\" (UniqueName: \"kubernetes.io/projected/f0eca204-c72d-4909-89ba-03d2b1976e07-kube-api-access-gzgfw\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gmk6c\" (UID: \"f0eca204-c72d-4909-89ba-03d2b1976e07\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gmk6c" Apr 02 14:19:02 crc kubenswrapper[4732]: I0402 14:19:02.901469 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/f0eca204-c72d-4909-89ba-03d2b1976e07-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gmk6c\" (UID: \"f0eca204-c72d-4909-89ba-03d2b1976e07\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gmk6c" Apr 02 14:19:02 crc kubenswrapper[4732]: I0402 14:19:02.901582 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/f0eca204-c72d-4909-89ba-03d2b1976e07-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gmk6c\" (UID: \"f0eca204-c72d-4909-89ba-03d2b1976e07\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gmk6c" Apr 02 14:19:02 crc kubenswrapper[4732]: I0402 14:19:02.901659 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0eca204-c72d-4909-89ba-03d2b1976e07-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gmk6c\" (UID: \"f0eca204-c72d-4909-89ba-03d2b1976e07\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gmk6c" Apr 02 14:19:02 crc kubenswrapper[4732]: I0402 14:19:02.901731 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/f0eca204-c72d-4909-89ba-03d2b1976e07-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gmk6c\" (UID: \"f0eca204-c72d-4909-89ba-03d2b1976e07\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gmk6c" Apr 02 14:19:02 crc kubenswrapper[4732]: I0402 14:19:02.901782 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/f0eca204-c72d-4909-89ba-03d2b1976e07-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gmk6c\" (UID: \"f0eca204-c72d-4909-89ba-03d2b1976e07\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gmk6c" Apr 02 14:19:02 crc kubenswrapper[4732]: I0402 14:19:02.901868 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/f0eca204-c72d-4909-89ba-03d2b1976e07-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gmk6c\" (UID: 
\"f0eca204-c72d-4909-89ba-03d2b1976e07\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gmk6c" Apr 02 14:19:02 crc kubenswrapper[4732]: I0402 14:19:02.901966 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/f0eca204-c72d-4909-89ba-03d2b1976e07-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gmk6c\" (UID: \"f0eca204-c72d-4909-89ba-03d2b1976e07\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gmk6c" Apr 02 14:19:03 crc kubenswrapper[4732]: I0402 14:19:03.004265 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/f0eca204-c72d-4909-89ba-03d2b1976e07-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gmk6c\" (UID: \"f0eca204-c72d-4909-89ba-03d2b1976e07\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gmk6c" Apr 02 14:19:03 crc kubenswrapper[4732]: I0402 14:19:03.004407 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/f0eca204-c72d-4909-89ba-03d2b1976e07-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gmk6c\" (UID: \"f0eca204-c72d-4909-89ba-03d2b1976e07\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gmk6c" Apr 02 14:19:03 crc kubenswrapper[4732]: I0402 14:19:03.004475 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f0eca204-c72d-4909-89ba-03d2b1976e07-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gmk6c\" (UID: \"f0eca204-c72d-4909-89ba-03d2b1976e07\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gmk6c" Apr 02 14:19:03 crc kubenswrapper[4732]: I0402 14:19:03.004513 4732 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/f0eca204-c72d-4909-89ba-03d2b1976e07-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gmk6c\" (UID: \"f0eca204-c72d-4909-89ba-03d2b1976e07\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gmk6c" Apr 02 14:19:03 crc kubenswrapper[4732]: I0402 14:19:03.004555 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/f0eca204-c72d-4909-89ba-03d2b1976e07-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gmk6c\" (UID: \"f0eca204-c72d-4909-89ba-03d2b1976e07\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gmk6c" Apr 02 14:19:03 crc kubenswrapper[4732]: I0402 14:19:03.004597 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzgfw\" (UniqueName: \"kubernetes.io/projected/f0eca204-c72d-4909-89ba-03d2b1976e07-kube-api-access-gzgfw\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gmk6c\" (UID: \"f0eca204-c72d-4909-89ba-03d2b1976e07\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gmk6c" Apr 02 14:19:03 crc kubenswrapper[4732]: I0402 14:19:03.004857 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/f0eca204-c72d-4909-89ba-03d2b1976e07-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gmk6c\" (UID: \"f0eca204-c72d-4909-89ba-03d2b1976e07\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gmk6c" Apr 02 14:19:03 crc kubenswrapper[4732]: I0402 14:19:03.004922 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f0eca204-c72d-4909-89ba-03d2b1976e07-inventory\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-gmk6c\" (UID: \"f0eca204-c72d-4909-89ba-03d2b1976e07\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gmk6c" Apr 02 14:19:03 crc kubenswrapper[4732]: I0402 14:19:03.004988 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0eca204-c72d-4909-89ba-03d2b1976e07-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gmk6c\" (UID: \"f0eca204-c72d-4909-89ba-03d2b1976e07\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gmk6c" Apr 02 14:19:03 crc kubenswrapper[4732]: I0402 14:19:03.005022 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/f0eca204-c72d-4909-89ba-03d2b1976e07-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gmk6c\" (UID: \"f0eca204-c72d-4909-89ba-03d2b1976e07\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gmk6c" Apr 02 14:19:03 crc kubenswrapper[4732]: I0402 14:19:03.005054 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/f0eca204-c72d-4909-89ba-03d2b1976e07-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gmk6c\" (UID: \"f0eca204-c72d-4909-89ba-03d2b1976e07\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gmk6c" Apr 02 14:19:03 crc kubenswrapper[4732]: I0402 14:19:03.006876 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/f0eca204-c72d-4909-89ba-03d2b1976e07-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gmk6c\" (UID: \"f0eca204-c72d-4909-89ba-03d2b1976e07\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gmk6c" Apr 02 14:19:03 crc kubenswrapper[4732]: I0402 14:19:03.010385 4732 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/f0eca204-c72d-4909-89ba-03d2b1976e07-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gmk6c\" (UID: \"f0eca204-c72d-4909-89ba-03d2b1976e07\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gmk6c" Apr 02 14:19:03 crc kubenswrapper[4732]: I0402 14:19:03.010398 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/f0eca204-c72d-4909-89ba-03d2b1976e07-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gmk6c\" (UID: \"f0eca204-c72d-4909-89ba-03d2b1976e07\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gmk6c" Apr 02 14:19:03 crc kubenswrapper[4732]: I0402 14:19:03.010934 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/f0eca204-c72d-4909-89ba-03d2b1976e07-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gmk6c\" (UID: \"f0eca204-c72d-4909-89ba-03d2b1976e07\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gmk6c" Apr 02 14:19:03 crc kubenswrapper[4732]: I0402 14:19:03.012892 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f0eca204-c72d-4909-89ba-03d2b1976e07-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gmk6c\" (UID: \"f0eca204-c72d-4909-89ba-03d2b1976e07\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gmk6c" Apr 02 14:19:03 crc kubenswrapper[4732]: I0402 14:19:03.013116 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/f0eca204-c72d-4909-89ba-03d2b1976e07-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gmk6c\" (UID: 
\"f0eca204-c72d-4909-89ba-03d2b1976e07\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gmk6c" Apr 02 14:19:03 crc kubenswrapper[4732]: I0402 14:19:03.013157 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/f0eca204-c72d-4909-89ba-03d2b1976e07-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gmk6c\" (UID: \"f0eca204-c72d-4909-89ba-03d2b1976e07\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gmk6c" Apr 02 14:19:03 crc kubenswrapper[4732]: I0402 14:19:03.015203 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/f0eca204-c72d-4909-89ba-03d2b1976e07-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gmk6c\" (UID: \"f0eca204-c72d-4909-89ba-03d2b1976e07\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gmk6c" Apr 02 14:19:03 crc kubenswrapper[4732]: I0402 14:19:03.015643 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0eca204-c72d-4909-89ba-03d2b1976e07-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gmk6c\" (UID: \"f0eca204-c72d-4909-89ba-03d2b1976e07\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gmk6c" Apr 02 14:19:03 crc kubenswrapper[4732]: I0402 14:19:03.015909 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f0eca204-c72d-4909-89ba-03d2b1976e07-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gmk6c\" (UID: \"f0eca204-c72d-4909-89ba-03d2b1976e07\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gmk6c" Apr 02 14:19:03 crc kubenswrapper[4732]: I0402 14:19:03.024072 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-gzgfw\" (UniqueName: \"kubernetes.io/projected/f0eca204-c72d-4909-89ba-03d2b1976e07-kube-api-access-gzgfw\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gmk6c\" (UID: \"f0eca204-c72d-4909-89ba-03d2b1976e07\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gmk6c" Apr 02 14:19:03 crc kubenswrapper[4732]: I0402 14:19:03.055885 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gmk6c" Apr 02 14:19:03 crc kubenswrapper[4732]: W0402 14:19:03.600645 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0eca204_c72d_4909_89ba_03d2b1976e07.slice/crio-62a2dc5471248a5e5dcab8894977158c386b1af290e4e83674bd0b22627147f6 WatchSource:0}: Error finding container 62a2dc5471248a5e5dcab8894977158c386b1af290e4e83674bd0b22627147f6: Status 404 returned error can't find the container with id 62a2dc5471248a5e5dcab8894977158c386b1af290e4e83674bd0b22627147f6 Apr 02 14:19:03 crc kubenswrapper[4732]: I0402 14:19:03.604925 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-gmk6c"] Apr 02 14:19:03 crc kubenswrapper[4732]: I0402 14:19:03.646430 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gmk6c" event={"ID":"f0eca204-c72d-4909-89ba-03d2b1976e07","Type":"ContainerStarted","Data":"62a2dc5471248a5e5dcab8894977158c386b1af290e4e83674bd0b22627147f6"} Apr 02 14:19:04 crc kubenswrapper[4732]: I0402 14:19:04.659257 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gmk6c" event={"ID":"f0eca204-c72d-4909-89ba-03d2b1976e07","Type":"ContainerStarted","Data":"bba3e043e1b71abd3aaa6bdec31a07770ae6093f7ffc8e5988bbe2eaedcf035e"} Apr 02 14:19:31 crc kubenswrapper[4732]: I0402 14:19:31.925002 4732 patch_prober.go:28] interesting 
pod/machine-config-daemon-6vtmw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 02 14:19:31 crc kubenswrapper[4732]: I0402 14:19:31.925668 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 02 14:19:31 crc kubenswrapper[4732]: I0402 14:19:31.925744 4732 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" Apr 02 14:19:31 crc kubenswrapper[4732]: I0402 14:19:31.926809 4732 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e3dc55b503b8e33f24aedd03972c83f64d717fd55da944dc7e898dbc4c268904"} pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Apr 02 14:19:31 crc kubenswrapper[4732]: I0402 14:19:31.926903 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" containerName="machine-config-daemon" containerID="cri-o://e3dc55b503b8e33f24aedd03972c83f64d717fd55da944dc7e898dbc4c268904" gracePeriod=600 Apr 02 14:19:32 crc kubenswrapper[4732]: E0402 14:19:32.052780 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" Apr 02 14:19:32 crc kubenswrapper[4732]: I0402 14:19:32.916713 4732 generic.go:334] "Generic (PLEG): container finished" podID="38409e5e-4545-49da-8f6c-4bfb30582878" containerID="e3dc55b503b8e33f24aedd03972c83f64d717fd55da944dc7e898dbc4c268904" exitCode=0 Apr 02 14:19:32 crc kubenswrapper[4732]: I0402 14:19:32.916787 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" event={"ID":"38409e5e-4545-49da-8f6c-4bfb30582878","Type":"ContainerDied","Data":"e3dc55b503b8e33f24aedd03972c83f64d717fd55da944dc7e898dbc4c268904"} Apr 02 14:19:32 crc kubenswrapper[4732]: I0402 14:19:32.917196 4732 scope.go:117] "RemoveContainer" containerID="63733271f299db2daa2f2ab3f85fd3de726a361d8aba6db8431f92a5c4cd2580" Apr 02 14:19:32 crc kubenswrapper[4732]: I0402 14:19:32.918307 4732 scope.go:117] "RemoveContainer" containerID="e3dc55b503b8e33f24aedd03972c83f64d717fd55da944dc7e898dbc4c268904" Apr 02 14:19:32 crc kubenswrapper[4732]: E0402 14:19:32.918992 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" Apr 02 14:19:32 crc kubenswrapper[4732]: I0402 14:19:32.943570 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gmk6c" podStartSLOduration=30.416645208 podStartE2EDuration="30.94353732s" podCreationTimestamp="2026-04-02 14:19:02 +0000 UTC" 
firstStartedPulling="2026-04-02 14:19:03.605364177 +0000 UTC m=+2500.509771740" lastFinishedPulling="2026-04-02 14:19:04.132256289 +0000 UTC m=+2501.036663852" observedRunningTime="2026-04-02 14:19:04.690952032 +0000 UTC m=+2501.595359615" watchObservedRunningTime="2026-04-02 14:19:32.94353732 +0000 UTC m=+2529.847944913" Apr 02 14:19:43 crc kubenswrapper[4732]: I0402 14:19:43.680336 4732 scope.go:117] "RemoveContainer" containerID="e3dc55b503b8e33f24aedd03972c83f64d717fd55da944dc7e898dbc4c268904" Apr 02 14:19:43 crc kubenswrapper[4732]: E0402 14:19:43.681074 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" Apr 02 14:19:57 crc kubenswrapper[4732]: I0402 14:19:57.680738 4732 scope.go:117] "RemoveContainer" containerID="e3dc55b503b8e33f24aedd03972c83f64d717fd55da944dc7e898dbc4c268904" Apr 02 14:19:57 crc kubenswrapper[4732]: E0402 14:19:57.681694 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" Apr 02 14:20:00 crc kubenswrapper[4732]: I0402 14:20:00.147165 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29585660-vzv74"] Apr 02 14:20:00 crc kubenswrapper[4732]: I0402 14:20:00.149416 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29585660-vzv74" Apr 02 14:20:00 crc kubenswrapper[4732]: I0402 14:20:00.152128 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-69g42" Apr 02 14:20:00 crc kubenswrapper[4732]: I0402 14:20:00.152319 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 02 14:20:00 crc kubenswrapper[4732]: I0402 14:20:00.158398 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 02 14:20:00 crc kubenswrapper[4732]: I0402 14:20:00.167780 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585660-vzv74"] Apr 02 14:20:00 crc kubenswrapper[4732]: I0402 14:20:00.194701 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgzp8\" (UniqueName: \"kubernetes.io/projected/dc762ced-e5a5-43ce-b204-7fef2eb152e4-kube-api-access-cgzp8\") pod \"auto-csr-approver-29585660-vzv74\" (UID: \"dc762ced-e5a5-43ce-b204-7fef2eb152e4\") " pod="openshift-infra/auto-csr-approver-29585660-vzv74" Apr 02 14:20:00 crc kubenswrapper[4732]: I0402 14:20:00.296654 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgzp8\" (UniqueName: \"kubernetes.io/projected/dc762ced-e5a5-43ce-b204-7fef2eb152e4-kube-api-access-cgzp8\") pod \"auto-csr-approver-29585660-vzv74\" (UID: \"dc762ced-e5a5-43ce-b204-7fef2eb152e4\") " pod="openshift-infra/auto-csr-approver-29585660-vzv74" Apr 02 14:20:00 crc kubenswrapper[4732]: I0402 14:20:00.314684 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgzp8\" (UniqueName: \"kubernetes.io/projected/dc762ced-e5a5-43ce-b204-7fef2eb152e4-kube-api-access-cgzp8\") pod \"auto-csr-approver-29585660-vzv74\" (UID: \"dc762ced-e5a5-43ce-b204-7fef2eb152e4\") " 
pod="openshift-infra/auto-csr-approver-29585660-vzv74" Apr 02 14:20:00 crc kubenswrapper[4732]: I0402 14:20:00.496410 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585660-vzv74" Apr 02 14:20:00 crc kubenswrapper[4732]: I0402 14:20:00.960309 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585660-vzv74"] Apr 02 14:20:01 crc kubenswrapper[4732]: I0402 14:20:01.190150 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585660-vzv74" event={"ID":"dc762ced-e5a5-43ce-b204-7fef2eb152e4","Type":"ContainerStarted","Data":"c364d98fbb6c0977a48cf7ef17e1fe772004c9aca3d8bd02420a78d7082aa4bf"} Apr 02 14:20:03 crc kubenswrapper[4732]: I0402 14:20:03.215482 4732 generic.go:334] "Generic (PLEG): container finished" podID="dc762ced-e5a5-43ce-b204-7fef2eb152e4" containerID="3645625a5c2ba46e55d1e0ddd8c4fd8bafefbe5d0cc6f8572696bb0e80e8d0a6" exitCode=0 Apr 02 14:20:03 crc kubenswrapper[4732]: I0402 14:20:03.215589 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585660-vzv74" event={"ID":"dc762ced-e5a5-43ce-b204-7fef2eb152e4","Type":"ContainerDied","Data":"3645625a5c2ba46e55d1e0ddd8c4fd8bafefbe5d0cc6f8572696bb0e80e8d0a6"} Apr 02 14:20:04 crc kubenswrapper[4732]: I0402 14:20:04.651021 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29585660-vzv74" Apr 02 14:20:04 crc kubenswrapper[4732]: I0402 14:20:04.781731 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgzp8\" (UniqueName: \"kubernetes.io/projected/dc762ced-e5a5-43ce-b204-7fef2eb152e4-kube-api-access-cgzp8\") pod \"dc762ced-e5a5-43ce-b204-7fef2eb152e4\" (UID: \"dc762ced-e5a5-43ce-b204-7fef2eb152e4\") " Apr 02 14:20:04 crc kubenswrapper[4732]: I0402 14:20:04.787608 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc762ced-e5a5-43ce-b204-7fef2eb152e4-kube-api-access-cgzp8" (OuterVolumeSpecName: "kube-api-access-cgzp8") pod "dc762ced-e5a5-43ce-b204-7fef2eb152e4" (UID: "dc762ced-e5a5-43ce-b204-7fef2eb152e4"). InnerVolumeSpecName "kube-api-access-cgzp8". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:20:04 crc kubenswrapper[4732]: I0402 14:20:04.884774 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgzp8\" (UniqueName: \"kubernetes.io/projected/dc762ced-e5a5-43ce-b204-7fef2eb152e4-kube-api-access-cgzp8\") on node \"crc\" DevicePath \"\"" Apr 02 14:20:05 crc kubenswrapper[4732]: I0402 14:20:05.239693 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585660-vzv74" event={"ID":"dc762ced-e5a5-43ce-b204-7fef2eb152e4","Type":"ContainerDied","Data":"c364d98fbb6c0977a48cf7ef17e1fe772004c9aca3d8bd02420a78d7082aa4bf"} Apr 02 14:20:05 crc kubenswrapper[4732]: I0402 14:20:05.239756 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c364d98fbb6c0977a48cf7ef17e1fe772004c9aca3d8bd02420a78d7082aa4bf" Apr 02 14:20:05 crc kubenswrapper[4732]: I0402 14:20:05.240262 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29585660-vzv74" Apr 02 14:20:05 crc kubenswrapper[4732]: I0402 14:20:05.744715 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29585654-np2bl"] Apr 02 14:20:05 crc kubenswrapper[4732]: I0402 14:20:05.754057 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29585654-np2bl"] Apr 02 14:20:06 crc kubenswrapper[4732]: I0402 14:20:06.699888 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b24501bc-bc9f-464d-be9c-ab1dcc94bec7" path="/var/lib/kubelet/pods/b24501bc-bc9f-464d-be9c-ab1dcc94bec7/volumes" Apr 02 14:20:10 crc kubenswrapper[4732]: I0402 14:20:10.683002 4732 scope.go:117] "RemoveContainer" containerID="e3dc55b503b8e33f24aedd03972c83f64d717fd55da944dc7e898dbc4c268904" Apr 02 14:20:10 crc kubenswrapper[4732]: E0402 14:20:10.684052 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" Apr 02 14:20:23 crc kubenswrapper[4732]: I0402 14:20:23.680079 4732 scope.go:117] "RemoveContainer" containerID="e3dc55b503b8e33f24aedd03972c83f64d717fd55da944dc7e898dbc4c268904" Apr 02 14:20:23 crc kubenswrapper[4732]: E0402 14:20:23.681162 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" 
podUID="38409e5e-4545-49da-8f6c-4bfb30582878" Apr 02 14:20:38 crc kubenswrapper[4732]: I0402 14:20:38.680507 4732 scope.go:117] "RemoveContainer" containerID="e3dc55b503b8e33f24aedd03972c83f64d717fd55da944dc7e898dbc4c268904" Apr 02 14:20:38 crc kubenswrapper[4732]: E0402 14:20:38.681203 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" Apr 02 14:20:49 crc kubenswrapper[4732]: I0402 14:20:49.680838 4732 scope.go:117] "RemoveContainer" containerID="e3dc55b503b8e33f24aedd03972c83f64d717fd55da944dc7e898dbc4c268904" Apr 02 14:20:49 crc kubenswrapper[4732]: E0402 14:20:49.681539 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" Apr 02 14:20:51 crc kubenswrapper[4732]: I0402 14:20:51.665145 4732 scope.go:117] "RemoveContainer" containerID="3d9111f7026d45fd9c347915f69d6a6382d72447fc924385237d2a9409770a67" Apr 02 14:21:01 crc kubenswrapper[4732]: I0402 14:21:01.680692 4732 scope.go:117] "RemoveContainer" containerID="e3dc55b503b8e33f24aedd03972c83f64d717fd55da944dc7e898dbc4c268904" Apr 02 14:21:01 crc kubenswrapper[4732]: E0402 14:21:01.681449 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" Apr 02 14:21:12 crc kubenswrapper[4732]: I0402 14:21:12.680547 4732 scope.go:117] "RemoveContainer" containerID="e3dc55b503b8e33f24aedd03972c83f64d717fd55da944dc7e898dbc4c268904" Apr 02 14:21:12 crc kubenswrapper[4732]: E0402 14:21:12.681392 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" Apr 02 14:21:15 crc kubenswrapper[4732]: I0402 14:21:15.907069 4732 generic.go:334] "Generic (PLEG): container finished" podID="f0eca204-c72d-4909-89ba-03d2b1976e07" containerID="bba3e043e1b71abd3aaa6bdec31a07770ae6093f7ffc8e5988bbe2eaedcf035e" exitCode=0 Apr 02 14:21:15 crc kubenswrapper[4732]: I0402 14:21:15.907383 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gmk6c" event={"ID":"f0eca204-c72d-4909-89ba-03d2b1976e07","Type":"ContainerDied","Data":"bba3e043e1b71abd3aaa6bdec31a07770ae6093f7ffc8e5988bbe2eaedcf035e"} Apr 02 14:21:17 crc kubenswrapper[4732]: I0402 14:21:17.326540 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gmk6c" Apr 02 14:21:17 crc kubenswrapper[4732]: I0402 14:21:17.418368 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/f0eca204-c72d-4909-89ba-03d2b1976e07-nova-migration-ssh-key-0\") pod \"f0eca204-c72d-4909-89ba-03d2b1976e07\" (UID: \"f0eca204-c72d-4909-89ba-03d2b1976e07\") " Apr 02 14:21:17 crc kubenswrapper[4732]: I0402 14:21:17.418494 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/f0eca204-c72d-4909-89ba-03d2b1976e07-nova-cell1-compute-config-0\") pod \"f0eca204-c72d-4909-89ba-03d2b1976e07\" (UID: \"f0eca204-c72d-4909-89ba-03d2b1976e07\") " Apr 02 14:21:17 crc kubenswrapper[4732]: I0402 14:21:17.418523 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/f0eca204-c72d-4909-89ba-03d2b1976e07-nova-cell1-compute-config-2\") pod \"f0eca204-c72d-4909-89ba-03d2b1976e07\" (UID: \"f0eca204-c72d-4909-89ba-03d2b1976e07\") " Apr 02 14:21:17 crc kubenswrapper[4732]: I0402 14:21:17.418551 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f0eca204-c72d-4909-89ba-03d2b1976e07-inventory\") pod \"f0eca204-c72d-4909-89ba-03d2b1976e07\" (UID: \"f0eca204-c72d-4909-89ba-03d2b1976e07\") " Apr 02 14:21:17 crc kubenswrapper[4732]: I0402 14:21:17.418574 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/f0eca204-c72d-4909-89ba-03d2b1976e07-nova-migration-ssh-key-1\") pod \"f0eca204-c72d-4909-89ba-03d2b1976e07\" (UID: \"f0eca204-c72d-4909-89ba-03d2b1976e07\") " Apr 02 14:21:17 crc kubenswrapper[4732]: I0402 14:21:17.418630 4732 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/f0eca204-c72d-4909-89ba-03d2b1976e07-nova-cell1-compute-config-3\") pod \"f0eca204-c72d-4909-89ba-03d2b1976e07\" (UID: \"f0eca204-c72d-4909-89ba-03d2b1976e07\") " Apr 02 14:21:17 crc kubenswrapper[4732]: I0402 14:21:17.418697 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f0eca204-c72d-4909-89ba-03d2b1976e07-ssh-key-openstack-edpm-ipam\") pod \"f0eca204-c72d-4909-89ba-03d2b1976e07\" (UID: \"f0eca204-c72d-4909-89ba-03d2b1976e07\") " Apr 02 14:21:17 crc kubenswrapper[4732]: I0402 14:21:17.418716 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/f0eca204-c72d-4909-89ba-03d2b1976e07-nova-extra-config-0\") pod \"f0eca204-c72d-4909-89ba-03d2b1976e07\" (UID: \"f0eca204-c72d-4909-89ba-03d2b1976e07\") " Apr 02 14:21:17 crc kubenswrapper[4732]: I0402 14:21:17.418751 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/f0eca204-c72d-4909-89ba-03d2b1976e07-nova-cell1-compute-config-1\") pod \"f0eca204-c72d-4909-89ba-03d2b1976e07\" (UID: \"f0eca204-c72d-4909-89ba-03d2b1976e07\") " Apr 02 14:21:17 crc kubenswrapper[4732]: I0402 14:21:17.418768 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzgfw\" (UniqueName: \"kubernetes.io/projected/f0eca204-c72d-4909-89ba-03d2b1976e07-kube-api-access-gzgfw\") pod \"f0eca204-c72d-4909-89ba-03d2b1976e07\" (UID: \"f0eca204-c72d-4909-89ba-03d2b1976e07\") " Apr 02 14:21:17 crc kubenswrapper[4732]: I0402 14:21:17.418818 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f0eca204-c72d-4909-89ba-03d2b1976e07-nova-combined-ca-bundle\") pod \"f0eca204-c72d-4909-89ba-03d2b1976e07\" (UID: \"f0eca204-c72d-4909-89ba-03d2b1976e07\") " Apr 02 14:21:17 crc kubenswrapper[4732]: I0402 14:21:17.433020 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0eca204-c72d-4909-89ba-03d2b1976e07-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "f0eca204-c72d-4909-89ba-03d2b1976e07" (UID: "f0eca204-c72d-4909-89ba-03d2b1976e07"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:21:17 crc kubenswrapper[4732]: I0402 14:21:17.433039 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0eca204-c72d-4909-89ba-03d2b1976e07-kube-api-access-gzgfw" (OuterVolumeSpecName: "kube-api-access-gzgfw") pod "f0eca204-c72d-4909-89ba-03d2b1976e07" (UID: "f0eca204-c72d-4909-89ba-03d2b1976e07"). InnerVolumeSpecName "kube-api-access-gzgfw". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:21:17 crc kubenswrapper[4732]: I0402 14:21:17.456420 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0eca204-c72d-4909-89ba-03d2b1976e07-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "f0eca204-c72d-4909-89ba-03d2b1976e07" (UID: "f0eca204-c72d-4909-89ba-03d2b1976e07"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:21:17 crc kubenswrapper[4732]: I0402 14:21:17.463729 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0eca204-c72d-4909-89ba-03d2b1976e07-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f0eca204-c72d-4909-89ba-03d2b1976e07" (UID: "f0eca204-c72d-4909-89ba-03d2b1976e07"). 
InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:21:17 crc kubenswrapper[4732]: I0402 14:21:17.468764 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0eca204-c72d-4909-89ba-03d2b1976e07-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "f0eca204-c72d-4909-89ba-03d2b1976e07" (UID: "f0eca204-c72d-4909-89ba-03d2b1976e07"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 14:21:17 crc kubenswrapper[4732]: I0402 14:21:17.472105 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0eca204-c72d-4909-89ba-03d2b1976e07-inventory" (OuterVolumeSpecName: "inventory") pod "f0eca204-c72d-4909-89ba-03d2b1976e07" (UID: "f0eca204-c72d-4909-89ba-03d2b1976e07"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:21:17 crc kubenswrapper[4732]: I0402 14:21:17.472142 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0eca204-c72d-4909-89ba-03d2b1976e07-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "f0eca204-c72d-4909-89ba-03d2b1976e07" (UID: "f0eca204-c72d-4909-89ba-03d2b1976e07"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:21:17 crc kubenswrapper[4732]: I0402 14:21:17.474325 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0eca204-c72d-4909-89ba-03d2b1976e07-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "f0eca204-c72d-4909-89ba-03d2b1976e07" (UID: "f0eca204-c72d-4909-89ba-03d2b1976e07"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:21:17 crc kubenswrapper[4732]: I0402 14:21:17.488104 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0eca204-c72d-4909-89ba-03d2b1976e07-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "f0eca204-c72d-4909-89ba-03d2b1976e07" (UID: "f0eca204-c72d-4909-89ba-03d2b1976e07"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:21:17 crc kubenswrapper[4732]: I0402 14:21:17.510283 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0eca204-c72d-4909-89ba-03d2b1976e07-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "f0eca204-c72d-4909-89ba-03d2b1976e07" (UID: "f0eca204-c72d-4909-89ba-03d2b1976e07"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:21:17 crc kubenswrapper[4732]: I0402 14:21:17.515035 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0eca204-c72d-4909-89ba-03d2b1976e07-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "f0eca204-c72d-4909-89ba-03d2b1976e07" (UID: "f0eca204-c72d-4909-89ba-03d2b1976e07"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:21:17 crc kubenswrapper[4732]: I0402 14:21:17.523865 4732 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0eca204-c72d-4909-89ba-03d2b1976e07-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 02 14:21:17 crc kubenswrapper[4732]: I0402 14:21:17.523897 4732 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/f0eca204-c72d-4909-89ba-03d2b1976e07-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Apr 02 14:21:17 crc kubenswrapper[4732]: I0402 14:21:17.523908 4732 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/f0eca204-c72d-4909-89ba-03d2b1976e07-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Apr 02 14:21:17 crc kubenswrapper[4732]: I0402 14:21:17.523917 4732 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/f0eca204-c72d-4909-89ba-03d2b1976e07-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Apr 02 14:21:17 crc kubenswrapper[4732]: I0402 14:21:17.523928 4732 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f0eca204-c72d-4909-89ba-03d2b1976e07-inventory\") on node \"crc\" DevicePath \"\"" Apr 02 14:21:17 crc kubenswrapper[4732]: I0402 14:21:17.523937 4732 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/f0eca204-c72d-4909-89ba-03d2b1976e07-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Apr 02 14:21:17 crc kubenswrapper[4732]: I0402 14:21:17.523945 4732 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: 
\"kubernetes.io/secret/f0eca204-c72d-4909-89ba-03d2b1976e07-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Apr 02 14:21:17 crc kubenswrapper[4732]: I0402 14:21:17.523956 4732 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f0eca204-c72d-4909-89ba-03d2b1976e07-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Apr 02 14:21:17 crc kubenswrapper[4732]: I0402 14:21:17.523966 4732 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/f0eca204-c72d-4909-89ba-03d2b1976e07-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Apr 02 14:21:17 crc kubenswrapper[4732]: I0402 14:21:17.523975 4732 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/f0eca204-c72d-4909-89ba-03d2b1976e07-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Apr 02 14:21:17 crc kubenswrapper[4732]: I0402 14:21:17.523983 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzgfw\" (UniqueName: \"kubernetes.io/projected/f0eca204-c72d-4909-89ba-03d2b1976e07-kube-api-access-gzgfw\") on node \"crc\" DevicePath \"\"" Apr 02 14:21:17 crc kubenswrapper[4732]: I0402 14:21:17.933079 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gmk6c" event={"ID":"f0eca204-c72d-4909-89ba-03d2b1976e07","Type":"ContainerDied","Data":"62a2dc5471248a5e5dcab8894977158c386b1af290e4e83674bd0b22627147f6"} Apr 02 14:21:17 crc kubenswrapper[4732]: I0402 14:21:17.933125 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62a2dc5471248a5e5dcab8894977158c386b1af290e4e83674bd0b22627147f6" Apr 02 14:21:17 crc kubenswrapper[4732]: I0402 14:21:17.933145 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gmk6c" Apr 02 14:21:18 crc kubenswrapper[4732]: I0402 14:21:18.056078 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9frcg"] Apr 02 14:21:18 crc kubenswrapper[4732]: E0402 14:21:18.057248 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc762ced-e5a5-43ce-b204-7fef2eb152e4" containerName="oc" Apr 02 14:21:18 crc kubenswrapper[4732]: I0402 14:21:18.057355 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc762ced-e5a5-43ce-b204-7fef2eb152e4" containerName="oc" Apr 02 14:21:18 crc kubenswrapper[4732]: E0402 14:21:18.057442 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0eca204-c72d-4909-89ba-03d2b1976e07" containerName="nova-edpm-deployment-openstack-edpm-ipam" Apr 02 14:21:18 crc kubenswrapper[4732]: I0402 14:21:18.057509 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0eca204-c72d-4909-89ba-03d2b1976e07" containerName="nova-edpm-deployment-openstack-edpm-ipam" Apr 02 14:21:18 crc kubenswrapper[4732]: I0402 14:21:18.057964 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0eca204-c72d-4909-89ba-03d2b1976e07" containerName="nova-edpm-deployment-openstack-edpm-ipam" Apr 02 14:21:18 crc kubenswrapper[4732]: I0402 14:21:18.058038 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc762ced-e5a5-43ce-b204-7fef2eb152e4" containerName="oc" Apr 02 14:21:18 crc kubenswrapper[4732]: I0402 14:21:18.058911 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9frcg" Apr 02 14:21:18 crc kubenswrapper[4732]: I0402 14:21:18.062990 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Apr 02 14:21:18 crc kubenswrapper[4732]: I0402 14:21:18.063210 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wdhd4" Apr 02 14:21:18 crc kubenswrapper[4732]: I0402 14:21:18.063821 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Apr 02 14:21:18 crc kubenswrapper[4732]: I0402 14:21:18.063969 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Apr 02 14:21:18 crc kubenswrapper[4732]: I0402 14:21:18.064024 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Apr 02 14:21:18 crc kubenswrapper[4732]: I0402 14:21:18.068927 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9frcg"] Apr 02 14:21:18 crc kubenswrapper[4732]: I0402 14:21:18.145397 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e3946af-2a00-4313-9a3b-79acd9152f58-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9frcg\" (UID: \"0e3946af-2a00-4313-9a3b-79acd9152f58\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9frcg" Apr 02 14:21:18 crc kubenswrapper[4732]: I0402 14:21:18.145494 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/0e3946af-2a00-4313-9a3b-79acd9152f58-ceilometer-compute-config-data-1\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-9frcg\" (UID: \"0e3946af-2a00-4313-9a3b-79acd9152f58\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9frcg" Apr 02 14:21:18 crc kubenswrapper[4732]: I0402 14:21:18.145737 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0e3946af-2a00-4313-9a3b-79acd9152f58-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9frcg\" (UID: \"0e3946af-2a00-4313-9a3b-79acd9152f58\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9frcg" Apr 02 14:21:18 crc kubenswrapper[4732]: I0402 14:21:18.145860 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plz5f\" (UniqueName: \"kubernetes.io/projected/0e3946af-2a00-4313-9a3b-79acd9152f58-kube-api-access-plz5f\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9frcg\" (UID: \"0e3946af-2a00-4313-9a3b-79acd9152f58\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9frcg" Apr 02 14:21:18 crc kubenswrapper[4732]: I0402 14:21:18.146053 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e3946af-2a00-4313-9a3b-79acd9152f58-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9frcg\" (UID: \"0e3946af-2a00-4313-9a3b-79acd9152f58\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9frcg" Apr 02 14:21:18 crc kubenswrapper[4732]: I0402 14:21:18.146125 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/0e3946af-2a00-4313-9a3b-79acd9152f58-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9frcg\" (UID: \"0e3946af-2a00-4313-9a3b-79acd9152f58\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9frcg"
Apr 02 14:21:18 crc kubenswrapper[4732]: I0402 14:21:18.146303 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/0e3946af-2a00-4313-9a3b-79acd9152f58-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9frcg\" (UID: \"0e3946af-2a00-4313-9a3b-79acd9152f58\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9frcg"
Apr 02 14:21:18 crc kubenswrapper[4732]: I0402 14:21:18.248726 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0e3946af-2a00-4313-9a3b-79acd9152f58-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9frcg\" (UID: \"0e3946af-2a00-4313-9a3b-79acd9152f58\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9frcg"
Apr 02 14:21:18 crc kubenswrapper[4732]: I0402 14:21:18.248794 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plz5f\" (UniqueName: \"kubernetes.io/projected/0e3946af-2a00-4313-9a3b-79acd9152f58-kube-api-access-plz5f\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9frcg\" (UID: \"0e3946af-2a00-4313-9a3b-79acd9152f58\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9frcg"
Apr 02 14:21:18 crc kubenswrapper[4732]: I0402 14:21:18.248825 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e3946af-2a00-4313-9a3b-79acd9152f58-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9frcg\" (UID: \"0e3946af-2a00-4313-9a3b-79acd9152f58\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9frcg"
Apr 02 14:21:18 crc kubenswrapper[4732]: I0402 14:21:18.248844 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/0e3946af-2a00-4313-9a3b-79acd9152f58-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9frcg\" (UID: \"0e3946af-2a00-4313-9a3b-79acd9152f58\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9frcg"
Apr 02 14:21:18 crc kubenswrapper[4732]: I0402 14:21:18.248873 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/0e3946af-2a00-4313-9a3b-79acd9152f58-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9frcg\" (UID: \"0e3946af-2a00-4313-9a3b-79acd9152f58\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9frcg"
Apr 02 14:21:18 crc kubenswrapper[4732]: I0402 14:21:18.248936 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e3946af-2a00-4313-9a3b-79acd9152f58-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9frcg\" (UID: \"0e3946af-2a00-4313-9a3b-79acd9152f58\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9frcg"
Apr 02 14:21:18 crc kubenswrapper[4732]: I0402 14:21:18.249017 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/0e3946af-2a00-4313-9a3b-79acd9152f58-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9frcg\" (UID: \"0e3946af-2a00-4313-9a3b-79acd9152f58\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9frcg"
Apr 02 14:21:18 crc kubenswrapper[4732]: I0402 14:21:18.253435 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/0e3946af-2a00-4313-9a3b-79acd9152f58-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9frcg\" (UID: \"0e3946af-2a00-4313-9a3b-79acd9152f58\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9frcg"
Apr 02 14:21:18 crc kubenswrapper[4732]: I0402 14:21:18.253441 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e3946af-2a00-4313-9a3b-79acd9152f58-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9frcg\" (UID: \"0e3946af-2a00-4313-9a3b-79acd9152f58\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9frcg"
Apr 02 14:21:18 crc kubenswrapper[4732]: I0402 14:21:18.253927 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/0e3946af-2a00-4313-9a3b-79acd9152f58-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9frcg\" (UID: \"0e3946af-2a00-4313-9a3b-79acd9152f58\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9frcg"
Apr 02 14:21:18 crc kubenswrapper[4732]: I0402 14:21:18.254445 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0e3946af-2a00-4313-9a3b-79acd9152f58-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9frcg\" (UID: \"0e3946af-2a00-4313-9a3b-79acd9152f58\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9frcg"
Apr 02 14:21:18 crc kubenswrapper[4732]: I0402 14:21:18.254532 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e3946af-2a00-4313-9a3b-79acd9152f58-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9frcg\" (UID: \"0e3946af-2a00-4313-9a3b-79acd9152f58\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9frcg"
Apr 02 14:21:18 crc kubenswrapper[4732]: I0402 14:21:18.257001 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/0e3946af-2a00-4313-9a3b-79acd9152f58-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9frcg\" (UID: \"0e3946af-2a00-4313-9a3b-79acd9152f58\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9frcg"
Apr 02 14:21:18 crc kubenswrapper[4732]: I0402 14:21:18.268056 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plz5f\" (UniqueName: \"kubernetes.io/projected/0e3946af-2a00-4313-9a3b-79acd9152f58-kube-api-access-plz5f\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9frcg\" (UID: \"0e3946af-2a00-4313-9a3b-79acd9152f58\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9frcg"
Apr 02 14:21:18 crc kubenswrapper[4732]: I0402 14:21:18.408135 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9frcg"
Apr 02 14:21:18 crc kubenswrapper[4732]: I0402 14:21:18.958127 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9frcg"]
Apr 02 14:21:19 crc kubenswrapper[4732]: I0402 14:21:19.950057 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9frcg" event={"ID":"0e3946af-2a00-4313-9a3b-79acd9152f58","Type":"ContainerStarted","Data":"4541c7ac1ce2a9794bbb1a0cc0d7c986eadfe5c0a3a238ce5c826faa41562df1"}
Apr 02 14:21:19 crc kubenswrapper[4732]: I0402 14:21:19.950511 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9frcg" event={"ID":"0e3946af-2a00-4313-9a3b-79acd9152f58","Type":"ContainerStarted","Data":"680eefb9143605d27fecca95cde730311d32b6b086dfffd6e089e052fdf8fd53"}
Apr 02 14:21:19 crc kubenswrapper[4732]: I0402 14:21:19.965777 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9frcg" podStartSLOduration=1.486243503 podStartE2EDuration="1.965760346s" podCreationTimestamp="2026-04-02 14:21:18 +0000 UTC" firstStartedPulling="2026-04-02 14:21:18.963602311 +0000 UTC m=+2635.868009864" lastFinishedPulling="2026-04-02 14:21:19.443119154 +0000 UTC m=+2636.347526707" observedRunningTime="2026-04-02 14:21:19.965753406 +0000 UTC m=+2636.870160979" watchObservedRunningTime="2026-04-02 14:21:19.965760346 +0000 UTC m=+2636.870167899"
Apr 02 14:21:27 crc kubenswrapper[4732]: I0402 14:21:27.681460 4732 scope.go:117] "RemoveContainer" containerID="e3dc55b503b8e33f24aedd03972c83f64d717fd55da944dc7e898dbc4c268904"
Apr 02 14:21:27 crc kubenswrapper[4732]: E0402 14:21:27.682664 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878"
Apr 02 14:21:39 crc kubenswrapper[4732]: I0402 14:21:39.681118 4732 scope.go:117] "RemoveContainer" containerID="e3dc55b503b8e33f24aedd03972c83f64d717fd55da944dc7e898dbc4c268904"
Apr 02 14:21:39 crc kubenswrapper[4732]: E0402 14:21:39.682397 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878"
Apr 02 14:21:53 crc kubenswrapper[4732]: I0402 14:21:53.680559 4732 scope.go:117] "RemoveContainer" containerID="e3dc55b503b8e33f24aedd03972c83f64d717fd55da944dc7e898dbc4c268904"
Apr 02 14:21:53 crc kubenswrapper[4732]: E0402 14:21:53.681565 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878"
Apr 02 14:22:00 crc kubenswrapper[4732]: I0402 14:22:00.160324 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29585662-g8zxb"]
Apr 02 14:22:00 crc kubenswrapper[4732]: I0402 14:22:00.162908 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585662-g8zxb"
Apr 02 14:22:00 crc kubenswrapper[4732]: I0402 14:22:00.164632 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Apr 02 14:22:00 crc kubenswrapper[4732]: I0402 14:22:00.166162 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Apr 02 14:22:00 crc kubenswrapper[4732]: I0402 14:22:00.166260 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-69g42"
Apr 02 14:22:00 crc kubenswrapper[4732]: I0402 14:22:00.171560 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585662-g8zxb"]
Apr 02 14:22:00 crc kubenswrapper[4732]: I0402 14:22:00.260385 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4pd2\" (UniqueName: \"kubernetes.io/projected/11cf71a3-dcfe-48e1-a37f-064849f9af8c-kube-api-access-d4pd2\") pod \"auto-csr-approver-29585662-g8zxb\" (UID: \"11cf71a3-dcfe-48e1-a37f-064849f9af8c\") " pod="openshift-infra/auto-csr-approver-29585662-g8zxb"
Apr 02 14:22:00 crc kubenswrapper[4732]: I0402 14:22:00.362839 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4pd2\" (UniqueName: \"kubernetes.io/projected/11cf71a3-dcfe-48e1-a37f-064849f9af8c-kube-api-access-d4pd2\") pod \"auto-csr-approver-29585662-g8zxb\" (UID: \"11cf71a3-dcfe-48e1-a37f-064849f9af8c\") " pod="openshift-infra/auto-csr-approver-29585662-g8zxb"
Apr 02 14:22:00 crc kubenswrapper[4732]: I0402 14:22:00.382925 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4pd2\" (UniqueName: \"kubernetes.io/projected/11cf71a3-dcfe-48e1-a37f-064849f9af8c-kube-api-access-d4pd2\") pod \"auto-csr-approver-29585662-g8zxb\" (UID: \"11cf71a3-dcfe-48e1-a37f-064849f9af8c\") " pod="openshift-infra/auto-csr-approver-29585662-g8zxb"
Apr 02 14:22:00 crc kubenswrapper[4732]: I0402 14:22:00.488563 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585662-g8zxb"
Apr 02 14:22:00 crc kubenswrapper[4732]: I0402 14:22:00.939056 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585662-g8zxb"]
Apr 02 14:22:01 crc kubenswrapper[4732]: I0402 14:22:01.357569 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585662-g8zxb" event={"ID":"11cf71a3-dcfe-48e1-a37f-064849f9af8c","Type":"ContainerStarted","Data":"a55076831a9eb7fe48ddfc3ee41567820cd16ff3a689622763a9f1616785f4fb"}
Apr 02 14:22:02 crc kubenswrapper[4732]: I0402 14:22:02.367176 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585662-g8zxb" event={"ID":"11cf71a3-dcfe-48e1-a37f-064849f9af8c","Type":"ContainerStarted","Data":"74ee7f47e875de8d4547eff19fe786f49f38d4ce0193a1f3cda4dc35cd73caf4"}
Apr 02 14:22:02 crc kubenswrapper[4732]: I0402 14:22:02.384554 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29585662-g8zxb" podStartSLOduration=1.302969836 podStartE2EDuration="2.384535344s" podCreationTimestamp="2026-04-02 14:22:00 +0000 UTC" firstStartedPulling="2026-04-02 14:22:00.944426183 +0000 UTC m=+2677.848833736" lastFinishedPulling="2026-04-02 14:22:02.025991691 +0000 UTC m=+2678.930399244" observedRunningTime="2026-04-02 14:22:02.382753506 +0000 UTC m=+2679.287161089" watchObservedRunningTime="2026-04-02 14:22:02.384535344 +0000 UTC m=+2679.288942897"
Apr 02 14:22:03 crc kubenswrapper[4732]: I0402 14:22:03.377778 4732 generic.go:334] "Generic (PLEG): container finished" podID="11cf71a3-dcfe-48e1-a37f-064849f9af8c" containerID="74ee7f47e875de8d4547eff19fe786f49f38d4ce0193a1f3cda4dc35cd73caf4" exitCode=0
Apr 02 14:22:03 crc kubenswrapper[4732]: I0402 14:22:03.377852 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585662-g8zxb" event={"ID":"11cf71a3-dcfe-48e1-a37f-064849f9af8c","Type":"ContainerDied","Data":"74ee7f47e875de8d4547eff19fe786f49f38d4ce0193a1f3cda4dc35cd73caf4"}
Apr 02 14:22:04 crc kubenswrapper[4732]: I0402 14:22:04.727403 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585662-g8zxb"
Apr 02 14:22:04 crc kubenswrapper[4732]: I0402 14:22:04.849223 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4pd2\" (UniqueName: \"kubernetes.io/projected/11cf71a3-dcfe-48e1-a37f-064849f9af8c-kube-api-access-d4pd2\") pod \"11cf71a3-dcfe-48e1-a37f-064849f9af8c\" (UID: \"11cf71a3-dcfe-48e1-a37f-064849f9af8c\") "
Apr 02 14:22:04 crc kubenswrapper[4732]: I0402 14:22:04.857420 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11cf71a3-dcfe-48e1-a37f-064849f9af8c-kube-api-access-d4pd2" (OuterVolumeSpecName: "kube-api-access-d4pd2") pod "11cf71a3-dcfe-48e1-a37f-064849f9af8c" (UID: "11cf71a3-dcfe-48e1-a37f-064849f9af8c"). InnerVolumeSpecName "kube-api-access-d4pd2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 14:22:04 crc kubenswrapper[4732]: I0402 14:22:04.952125 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4pd2\" (UniqueName: \"kubernetes.io/projected/11cf71a3-dcfe-48e1-a37f-064849f9af8c-kube-api-access-d4pd2\") on node \"crc\" DevicePath \"\""
Apr 02 14:22:05 crc kubenswrapper[4732]: I0402 14:22:05.404973 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585662-g8zxb" event={"ID":"11cf71a3-dcfe-48e1-a37f-064849f9af8c","Type":"ContainerDied","Data":"a55076831a9eb7fe48ddfc3ee41567820cd16ff3a689622763a9f1616785f4fb"}
Apr 02 14:22:05 crc kubenswrapper[4732]: I0402 14:22:05.405767 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a55076831a9eb7fe48ddfc3ee41567820cd16ff3a689622763a9f1616785f4fb"
Apr 02 14:22:05 crc kubenswrapper[4732]: I0402 14:22:05.405093 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585662-g8zxb"
Apr 02 14:22:05 crc kubenswrapper[4732]: I0402 14:22:05.460222 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29585656-cn5h7"]
Apr 02 14:22:05 crc kubenswrapper[4732]: I0402 14:22:05.469289 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29585656-cn5h7"]
Apr 02 14:22:06 crc kubenswrapper[4732]: I0402 14:22:06.693278 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce03f501-71c1-4a1b-b9d9-4f72058b93a1" path="/var/lib/kubelet/pods/ce03f501-71c1-4a1b-b9d9-4f72058b93a1/volumes"
Apr 02 14:22:08 crc kubenswrapper[4732]: I0402 14:22:08.681058 4732 scope.go:117] "RemoveContainer" containerID="e3dc55b503b8e33f24aedd03972c83f64d717fd55da944dc7e898dbc4c268904"
Apr 02 14:22:08 crc kubenswrapper[4732]: E0402 14:22:08.681983 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878"
Apr 02 14:22:20 crc kubenswrapper[4732]: I0402 14:22:20.680825 4732 scope.go:117] "RemoveContainer" containerID="e3dc55b503b8e33f24aedd03972c83f64d717fd55da944dc7e898dbc4c268904"
Apr 02 14:22:20 crc kubenswrapper[4732]: E0402 14:22:20.681684 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878"
Apr 02 14:22:33 crc kubenswrapper[4732]: I0402 14:22:33.680741 4732 scope.go:117] "RemoveContainer" containerID="e3dc55b503b8e33f24aedd03972c83f64d717fd55da944dc7e898dbc4c268904"
Apr 02 14:22:33 crc kubenswrapper[4732]: E0402 14:22:33.681839 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878"
Apr 02 14:22:44 crc kubenswrapper[4732]: I0402 14:22:44.689022 4732 scope.go:117] "RemoveContainer" containerID="e3dc55b503b8e33f24aedd03972c83f64d717fd55da944dc7e898dbc4c268904"
Apr 02 14:22:44 crc kubenswrapper[4732]: E0402 14:22:44.689649 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878"
Apr 02 14:22:51 crc kubenswrapper[4732]: I0402 14:22:51.780055 4732 scope.go:117] "RemoveContainer" containerID="8653ec8ca33f402b0f1d5ce232f46f3cab4d081240b9967e103318cb733a78f9"
Apr 02 14:22:57 crc kubenswrapper[4732]: I0402 14:22:57.680201 4732 scope.go:117] "RemoveContainer" containerID="e3dc55b503b8e33f24aedd03972c83f64d717fd55da944dc7e898dbc4c268904"
Apr 02 14:22:57 crc kubenswrapper[4732]: E0402 14:22:57.681256 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878"
Apr 02 14:23:09 crc kubenswrapper[4732]: I0402 14:23:09.680950 4732 scope.go:117] "RemoveContainer" containerID="e3dc55b503b8e33f24aedd03972c83f64d717fd55da944dc7e898dbc4c268904"
Apr 02 14:23:09 crc kubenswrapper[4732]: E0402 14:23:09.682150 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878"
Apr 02 14:23:23 crc kubenswrapper[4732]: I0402 14:23:23.680288 4732 scope.go:117] "RemoveContainer" containerID="e3dc55b503b8e33f24aedd03972c83f64d717fd55da944dc7e898dbc4c268904"
Apr 02 14:23:23 crc kubenswrapper[4732]: E0402 14:23:23.681142 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878"
Apr 02 14:23:32 crc kubenswrapper[4732]: I0402 14:23:32.695747 4732 generic.go:334] "Generic (PLEG): container finished" podID="0e3946af-2a00-4313-9a3b-79acd9152f58" containerID="4541c7ac1ce2a9794bbb1a0cc0d7c986eadfe5c0a3a238ce5c826faa41562df1" exitCode=0
Apr 02 14:23:32 crc kubenswrapper[4732]: I0402 14:23:32.695815 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9frcg" event={"ID":"0e3946af-2a00-4313-9a3b-79acd9152f58","Type":"ContainerDied","Data":"4541c7ac1ce2a9794bbb1a0cc0d7c986eadfe5c0a3a238ce5c826faa41562df1"}
Apr 02 14:23:34 crc kubenswrapper[4732]: I0402 14:23:34.090471 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9frcg"
Apr 02 14:23:34 crc kubenswrapper[4732]: I0402 14:23:34.276316 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plz5f\" (UniqueName: \"kubernetes.io/projected/0e3946af-2a00-4313-9a3b-79acd9152f58-kube-api-access-plz5f\") pod \"0e3946af-2a00-4313-9a3b-79acd9152f58\" (UID: \"0e3946af-2a00-4313-9a3b-79acd9152f58\") "
Apr 02 14:23:34 crc kubenswrapper[4732]: I0402 14:23:34.276405 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/0e3946af-2a00-4313-9a3b-79acd9152f58-ceilometer-compute-config-data-2\") pod \"0e3946af-2a00-4313-9a3b-79acd9152f58\" (UID: \"0e3946af-2a00-4313-9a3b-79acd9152f58\") "
Apr 02 14:23:34 crc kubenswrapper[4732]: I0402 14:23:34.276462 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/0e3946af-2a00-4313-9a3b-79acd9152f58-ceilometer-compute-config-data-0\") pod \"0e3946af-2a00-4313-9a3b-79acd9152f58\" (UID: \"0e3946af-2a00-4313-9a3b-79acd9152f58\") "
Apr 02 14:23:34 crc kubenswrapper[4732]: I0402 14:23:34.276550 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0e3946af-2a00-4313-9a3b-79acd9152f58-ssh-key-openstack-edpm-ipam\") pod \"0e3946af-2a00-4313-9a3b-79acd9152f58\" (UID: \"0e3946af-2a00-4313-9a3b-79acd9152f58\") "
Apr 02 14:23:34 crc kubenswrapper[4732]: I0402 14:23:34.276602 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/0e3946af-2a00-4313-9a3b-79acd9152f58-ceilometer-compute-config-data-1\") pod \"0e3946af-2a00-4313-9a3b-79acd9152f58\" (UID: \"0e3946af-2a00-4313-9a3b-79acd9152f58\") "
Apr 02 14:23:34 crc kubenswrapper[4732]: I0402 14:23:34.276684 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e3946af-2a00-4313-9a3b-79acd9152f58-inventory\") pod \"0e3946af-2a00-4313-9a3b-79acd9152f58\" (UID: \"0e3946af-2a00-4313-9a3b-79acd9152f58\") "
Apr 02 14:23:34 crc kubenswrapper[4732]: I0402 14:23:34.277536 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e3946af-2a00-4313-9a3b-79acd9152f58-telemetry-combined-ca-bundle\") pod \"0e3946af-2a00-4313-9a3b-79acd9152f58\" (UID: \"0e3946af-2a00-4313-9a3b-79acd9152f58\") "
Apr 02 14:23:34 crc kubenswrapper[4732]: I0402 14:23:34.282139 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e3946af-2a00-4313-9a3b-79acd9152f58-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "0e3946af-2a00-4313-9a3b-79acd9152f58" (UID: "0e3946af-2a00-4313-9a3b-79acd9152f58"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 14:23:34 crc kubenswrapper[4732]: I0402 14:23:34.282547 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e3946af-2a00-4313-9a3b-79acd9152f58-kube-api-access-plz5f" (OuterVolumeSpecName: "kube-api-access-plz5f") pod "0e3946af-2a00-4313-9a3b-79acd9152f58" (UID: "0e3946af-2a00-4313-9a3b-79acd9152f58"). InnerVolumeSpecName "kube-api-access-plz5f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 14:23:34 crc kubenswrapper[4732]: I0402 14:23:34.304394 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e3946af-2a00-4313-9a3b-79acd9152f58-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "0e3946af-2a00-4313-9a3b-79acd9152f58" (UID: "0e3946af-2a00-4313-9a3b-79acd9152f58"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 14:23:34 crc kubenswrapper[4732]: I0402 14:23:34.306764 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e3946af-2a00-4313-9a3b-79acd9152f58-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "0e3946af-2a00-4313-9a3b-79acd9152f58" (UID: "0e3946af-2a00-4313-9a3b-79acd9152f58"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 14:23:34 crc kubenswrapper[4732]: I0402 14:23:34.308865 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e3946af-2a00-4313-9a3b-79acd9152f58-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "0e3946af-2a00-4313-9a3b-79acd9152f58" (UID: "0e3946af-2a00-4313-9a3b-79acd9152f58"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 14:23:34 crc kubenswrapper[4732]: I0402 14:23:34.311884 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e3946af-2a00-4313-9a3b-79acd9152f58-inventory" (OuterVolumeSpecName: "inventory") pod "0e3946af-2a00-4313-9a3b-79acd9152f58" (UID: "0e3946af-2a00-4313-9a3b-79acd9152f58"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 14:23:34 crc kubenswrapper[4732]: I0402 14:23:34.323367 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e3946af-2a00-4313-9a3b-79acd9152f58-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0e3946af-2a00-4313-9a3b-79acd9152f58" (UID: "0e3946af-2a00-4313-9a3b-79acd9152f58"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 14:23:34 crc kubenswrapper[4732]: I0402 14:23:34.379865 4732 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/0e3946af-2a00-4313-9a3b-79acd9152f58-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\""
Apr 02 14:23:34 crc kubenswrapper[4732]: I0402 14:23:34.379899 4732 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/0e3946af-2a00-4313-9a3b-79acd9152f58-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\""
Apr 02 14:23:34 crc kubenswrapper[4732]: I0402 14:23:34.379911 4732 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0e3946af-2a00-4313-9a3b-79acd9152f58-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Apr 02 14:23:34 crc kubenswrapper[4732]: I0402 14:23:34.379921 4732 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/0e3946af-2a00-4313-9a3b-79acd9152f58-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\""
Apr 02 14:23:34 crc kubenswrapper[4732]: I0402 14:23:34.379933 4732 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e3946af-2a00-4313-9a3b-79acd9152f58-inventory\") on node \"crc\" DevicePath \"\""
Apr 02 14:23:34 crc kubenswrapper[4732]: I0402 14:23:34.379944 4732 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e3946af-2a00-4313-9a3b-79acd9152f58-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Apr 02 14:23:34 crc kubenswrapper[4732]: I0402 14:23:34.379954 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plz5f\" (UniqueName: \"kubernetes.io/projected/0e3946af-2a00-4313-9a3b-79acd9152f58-kube-api-access-plz5f\") on node \"crc\" DevicePath \"\""
Apr 02 14:23:34 crc kubenswrapper[4732]: I0402 14:23:34.735964 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9frcg" event={"ID":"0e3946af-2a00-4313-9a3b-79acd9152f58","Type":"ContainerDied","Data":"680eefb9143605d27fecca95cde730311d32b6b086dfffd6e089e052fdf8fd53"}
Apr 02 14:23:34 crc kubenswrapper[4732]: I0402 14:23:34.736029 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="680eefb9143605d27fecca95cde730311d32b6b086dfffd6e089e052fdf8fd53"
Apr 02 14:23:34 crc kubenswrapper[4732]: I0402 14:23:34.736164 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9frcg"
Apr 02 14:23:37 crc kubenswrapper[4732]: I0402 14:23:37.679864 4732 scope.go:117] "RemoveContainer" containerID="e3dc55b503b8e33f24aedd03972c83f64d717fd55da944dc7e898dbc4c268904"
Apr 02 14:23:37 crc kubenswrapper[4732]: E0402 14:23:37.682046 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878"
Apr 02 14:23:51 crc kubenswrapper[4732]: I0402 14:23:51.680261 4732 scope.go:117] "RemoveContainer" containerID="e3dc55b503b8e33f24aedd03972c83f64d717fd55da944dc7e898dbc4c268904"
Apr 02 14:23:51 crc kubenswrapper[4732]: E0402 14:23:51.681081 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878"
Apr 02 14:24:00 crc kubenswrapper[4732]: I0402 14:24:00.163764 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29585664-kmmbv"]
Apr 02 14:24:00 crc kubenswrapper[4732]: E0402 14:24:00.164912 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e3946af-2a00-4313-9a3b-79acd9152f58" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Apr 02 14:24:00 crc kubenswrapper[4732]: I0402 14:24:00.164933 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e3946af-2a00-4313-9a3b-79acd9152f58" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Apr 02 14:24:00 crc kubenswrapper[4732]: E0402 14:24:00.164964 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11cf71a3-dcfe-48e1-a37f-064849f9af8c" containerName="oc"
Apr 02 14:24:00 crc kubenswrapper[4732]: I0402 14:24:00.164977 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="11cf71a3-dcfe-48e1-a37f-064849f9af8c" containerName="oc"
Apr 02 14:24:00 crc kubenswrapper[4732]: I0402 14:24:00.165311 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e3946af-2a00-4313-9a3b-79acd9152f58" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Apr 02 14:24:00 crc kubenswrapper[4732]: I0402 14:24:00.165518 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="11cf71a3-dcfe-48e1-a37f-064849f9af8c" containerName="oc"
Apr 02 14:24:00 crc kubenswrapper[4732]: I0402 14:24:00.166539 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585664-kmmbv"
Apr 02 14:24:00 crc kubenswrapper[4732]: I0402 14:24:00.170236 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Apr 02 14:24:00 crc kubenswrapper[4732]: I0402 14:24:00.170326 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-69g42"
Apr 02 14:24:00 crc kubenswrapper[4732]: I0402 14:24:00.171274 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Apr 02 14:24:00 crc kubenswrapper[4732]: I0402 14:24:00.177400 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585664-kmmbv"]
Apr 02 14:24:00 crc kubenswrapper[4732]: I0402 14:24:00.222018 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76zzg\" (UniqueName: \"kubernetes.io/projected/38bd5a65-41cb-4390-933d-6df35e2438c7-kube-api-access-76zzg\") pod \"auto-csr-approver-29585664-kmmbv\" (UID: \"38bd5a65-41cb-4390-933d-6df35e2438c7\") " pod="openshift-infra/auto-csr-approver-29585664-kmmbv"
Apr 02 14:24:00 crc kubenswrapper[4732]: I0402 14:24:00.323584 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76zzg\" (UniqueName: \"kubernetes.io/projected/38bd5a65-41cb-4390-933d-6df35e2438c7-kube-api-access-76zzg\") pod \"auto-csr-approver-29585664-kmmbv\" (UID: \"38bd5a65-41cb-4390-933d-6df35e2438c7\") " pod="openshift-infra/auto-csr-approver-29585664-kmmbv"
Apr 02 14:24:00 crc kubenswrapper[4732]: I0402 14:24:00.361482 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76zzg\" (UniqueName: \"kubernetes.io/projected/38bd5a65-41cb-4390-933d-6df35e2438c7-kube-api-access-76zzg\") pod \"auto-csr-approver-29585664-kmmbv\" (UID: \"38bd5a65-41cb-4390-933d-6df35e2438c7\") " pod="openshift-infra/auto-csr-approver-29585664-kmmbv"
Apr 02 14:24:00 crc kubenswrapper[4732]: I0402 14:24:00.488060 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585664-kmmbv"
Apr 02 14:24:00 crc kubenswrapper[4732]: I0402 14:24:00.964042 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585664-kmmbv"]
Apr 02 14:24:00 crc kubenswrapper[4732]: I0402 14:24:00.972986 4732 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 02 14:24:01 crc kubenswrapper[4732]: I0402 14:24:01.037927 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585664-kmmbv" event={"ID":"38bd5a65-41cb-4390-933d-6df35e2438c7","Type":"ContainerStarted","Data":"69e71f427b01e9ffc8ea1914f38bb8504d9336725d7f9d30b3d133f76019366f"}
Apr 02 14:24:03 crc kubenswrapper[4732]: I0402 14:24:03.069877 4732 generic.go:334] "Generic (PLEG): container finished" podID="38bd5a65-41cb-4390-933d-6df35e2438c7" containerID="a8b2676b1287709c5661be7b28fbfa8d6b460cf24ca56ca91aef3a433cb1c1a4" exitCode=0
Apr 02 14:24:03 crc kubenswrapper[4732]: I0402 14:24:03.070212 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585664-kmmbv" event={"ID":"38bd5a65-41cb-4390-933d-6df35e2438c7","Type":"ContainerDied","Data":"a8b2676b1287709c5661be7b28fbfa8d6b460cf24ca56ca91aef3a433cb1c1a4"}
Apr 02 14:24:03 crc kubenswrapper[4732]: I0402 14:24:03.680956 4732 scope.go:117] "RemoveContainer" containerID="e3dc55b503b8e33f24aedd03972c83f64d717fd55da944dc7e898dbc4c268904"
Apr 02 14:24:03 crc kubenswrapper[4732]: E0402 14:24:03.681675 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878"
Apr 02 14:24:04 crc kubenswrapper[4732]: I0402 14:24:04.398245 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585664-kmmbv"
Apr 02 14:24:04 crc kubenswrapper[4732]: I0402 14:24:04.501983 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76zzg\" (UniqueName: \"kubernetes.io/projected/38bd5a65-41cb-4390-933d-6df35e2438c7-kube-api-access-76zzg\") pod \"38bd5a65-41cb-4390-933d-6df35e2438c7\" (UID: \"38bd5a65-41cb-4390-933d-6df35e2438c7\") "
Apr 02 14:24:04 crc kubenswrapper[4732]: I0402 14:24:04.508292 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38bd5a65-41cb-4390-933d-6df35e2438c7-kube-api-access-76zzg" (OuterVolumeSpecName: "kube-api-access-76zzg") pod "38bd5a65-41cb-4390-933d-6df35e2438c7" (UID: "38bd5a65-41cb-4390-933d-6df35e2438c7"). InnerVolumeSpecName "kube-api-access-76zzg".
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:24:04 crc kubenswrapper[4732]: I0402 14:24:04.605021 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76zzg\" (UniqueName: \"kubernetes.io/projected/38bd5a65-41cb-4390-933d-6df35e2438c7-kube-api-access-76zzg\") on node \"crc\" DevicePath \"\"" Apr 02 14:24:05 crc kubenswrapper[4732]: I0402 14:24:05.092541 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585664-kmmbv" event={"ID":"38bd5a65-41cb-4390-933d-6df35e2438c7","Type":"ContainerDied","Data":"69e71f427b01e9ffc8ea1914f38bb8504d9336725d7f9d30b3d133f76019366f"} Apr 02 14:24:05 crc kubenswrapper[4732]: I0402 14:24:05.093022 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69e71f427b01e9ffc8ea1914f38bb8504d9336725d7f9d30b3d133f76019366f" Apr 02 14:24:05 crc kubenswrapper[4732]: I0402 14:24:05.092635 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585664-kmmbv" Apr 02 14:24:05 crc kubenswrapper[4732]: I0402 14:24:05.474435 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29585658-cxm7h"] Apr 02 14:24:05 crc kubenswrapper[4732]: I0402 14:24:05.485638 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29585658-cxm7h"] Apr 02 14:24:06 crc kubenswrapper[4732]: I0402 14:24:06.694244 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb1cb3c0-3800-41c1-bfa3-064d473acdc5" path="/var/lib/kubelet/pods/fb1cb3c0-3800-41c1-bfa3-064d473acdc5/volumes" Apr 02 14:24:15 crc kubenswrapper[4732]: I0402 14:24:15.680887 4732 scope.go:117] "RemoveContainer" containerID="e3dc55b503b8e33f24aedd03972c83f64d717fd55da944dc7e898dbc4c268904" Apr 02 14:24:15 crc kubenswrapper[4732]: E0402 14:24:15.681896 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" Apr 02 14:24:24 crc kubenswrapper[4732]: I0402 14:24:24.470640 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Apr 02 14:24:24 crc kubenswrapper[4732]: E0402 14:24:24.471918 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38bd5a65-41cb-4390-933d-6df35e2438c7" containerName="oc" Apr 02 14:24:24 crc kubenswrapper[4732]: I0402 14:24:24.471942 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="38bd5a65-41cb-4390-933d-6df35e2438c7" containerName="oc" Apr 02 14:24:24 crc kubenswrapper[4732]: I0402 14:24:24.472246 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="38bd5a65-41cb-4390-933d-6df35e2438c7" containerName="oc" Apr 02 14:24:24 crc kubenswrapper[4732]: I0402 14:24:24.473116 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Apr 02 14:24:24 crc kubenswrapper[4732]: I0402 14:24:24.475705 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Apr 02 14:24:24 crc kubenswrapper[4732]: I0402 14:24:24.476035 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-8fmdn" Apr 02 14:24:24 crc kubenswrapper[4732]: I0402 14:24:24.477517 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Apr 02 14:24:24 crc kubenswrapper[4732]: I0402 14:24:24.477768 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Apr 02 14:24:24 crc kubenswrapper[4732]: I0402 14:24:24.495942 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Apr 02 14:24:24 crc kubenswrapper[4732]: I0402 14:24:24.640531 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: \"17645883-477c-437a-b87a-b412f9bbe29e\") " pod="openstack/tempest-tests-tempest" Apr 02 14:24:24 crc kubenswrapper[4732]: I0402 14:24:24.640596 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/17645883-477c-437a-b87a-b412f9bbe29e-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"17645883-477c-437a-b87a-b412f9bbe29e\") " pod="openstack/tempest-tests-tempest" Apr 02 14:24:24 crc kubenswrapper[4732]: I0402 14:24:24.640666 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/17645883-477c-437a-b87a-b412f9bbe29e-ca-certs\") pod 
\"tempest-tests-tempest\" (UID: \"17645883-477c-437a-b87a-b412f9bbe29e\") " pod="openstack/tempest-tests-tempest" Apr 02 14:24:24 crc kubenswrapper[4732]: I0402 14:24:24.640906 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/17645883-477c-437a-b87a-b412f9bbe29e-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"17645883-477c-437a-b87a-b412f9bbe29e\") " pod="openstack/tempest-tests-tempest" Apr 02 14:24:24 crc kubenswrapper[4732]: I0402 14:24:24.641004 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/17645883-477c-437a-b87a-b412f9bbe29e-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"17645883-477c-437a-b87a-b412f9bbe29e\") " pod="openstack/tempest-tests-tempest" Apr 02 14:24:24 crc kubenswrapper[4732]: I0402 14:24:24.641027 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/17645883-477c-437a-b87a-b412f9bbe29e-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"17645883-477c-437a-b87a-b412f9bbe29e\") " pod="openstack/tempest-tests-tempest" Apr 02 14:24:24 crc kubenswrapper[4732]: I0402 14:24:24.641153 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/17645883-477c-437a-b87a-b412f9bbe29e-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"17645883-477c-437a-b87a-b412f9bbe29e\") " pod="openstack/tempest-tests-tempest" Apr 02 14:24:24 crc kubenswrapper[4732]: I0402 14:24:24.641187 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xm6j\" (UniqueName: 
\"kubernetes.io/projected/17645883-477c-437a-b87a-b412f9bbe29e-kube-api-access-6xm6j\") pod \"tempest-tests-tempest\" (UID: \"17645883-477c-437a-b87a-b412f9bbe29e\") " pod="openstack/tempest-tests-tempest" Apr 02 14:24:24 crc kubenswrapper[4732]: I0402 14:24:24.641279 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/17645883-477c-437a-b87a-b412f9bbe29e-config-data\") pod \"tempest-tests-tempest\" (UID: \"17645883-477c-437a-b87a-b412f9bbe29e\") " pod="openstack/tempest-tests-tempest" Apr 02 14:24:24 crc kubenswrapper[4732]: I0402 14:24:24.742849 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/17645883-477c-437a-b87a-b412f9bbe29e-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"17645883-477c-437a-b87a-b412f9bbe29e\") " pod="openstack/tempest-tests-tempest" Apr 02 14:24:24 crc kubenswrapper[4732]: I0402 14:24:24.742983 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/17645883-477c-437a-b87a-b412f9bbe29e-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"17645883-477c-437a-b87a-b412f9bbe29e\") " pod="openstack/tempest-tests-tempest" Apr 02 14:24:24 crc kubenswrapper[4732]: I0402 14:24:24.743039 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/17645883-477c-437a-b87a-b412f9bbe29e-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"17645883-477c-437a-b87a-b412f9bbe29e\") " pod="openstack/tempest-tests-tempest" Apr 02 14:24:24 crc kubenswrapper[4732]: I0402 14:24:24.743069 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: 
\"kubernetes.io/empty-dir/17645883-477c-437a-b87a-b412f9bbe29e-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"17645883-477c-437a-b87a-b412f9bbe29e\") " pod="openstack/tempest-tests-tempest" Apr 02 14:24:24 crc kubenswrapper[4732]: I0402 14:24:24.743129 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/17645883-477c-437a-b87a-b412f9bbe29e-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"17645883-477c-437a-b87a-b412f9bbe29e\") " pod="openstack/tempest-tests-tempest" Apr 02 14:24:24 crc kubenswrapper[4732]: I0402 14:24:24.743163 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xm6j\" (UniqueName: \"kubernetes.io/projected/17645883-477c-437a-b87a-b412f9bbe29e-kube-api-access-6xm6j\") pod \"tempest-tests-tempest\" (UID: \"17645883-477c-437a-b87a-b412f9bbe29e\") " pod="openstack/tempest-tests-tempest" Apr 02 14:24:24 crc kubenswrapper[4732]: I0402 14:24:24.743210 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/17645883-477c-437a-b87a-b412f9bbe29e-config-data\") pod \"tempest-tests-tempest\" (UID: \"17645883-477c-437a-b87a-b412f9bbe29e\") " pod="openstack/tempest-tests-tempest" Apr 02 14:24:24 crc kubenswrapper[4732]: I0402 14:24:24.743278 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: \"17645883-477c-437a-b87a-b412f9bbe29e\") " pod="openstack/tempest-tests-tempest" Apr 02 14:24:24 crc kubenswrapper[4732]: I0402 14:24:24.743586 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/17645883-477c-437a-b87a-b412f9bbe29e-test-operator-ephemeral-temporary\") pod 
\"tempest-tests-tempest\" (UID: \"17645883-477c-437a-b87a-b412f9bbe29e\") " pod="openstack/tempest-tests-tempest" Apr 02 14:24:24 crc kubenswrapper[4732]: I0402 14:24:24.743886 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/17645883-477c-437a-b87a-b412f9bbe29e-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"17645883-477c-437a-b87a-b412f9bbe29e\") " pod="openstack/tempest-tests-tempest" Apr 02 14:24:24 crc kubenswrapper[4732]: I0402 14:24:24.744127 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/17645883-477c-437a-b87a-b412f9bbe29e-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"17645883-477c-437a-b87a-b412f9bbe29e\") " pod="openstack/tempest-tests-tempest" Apr 02 14:24:24 crc kubenswrapper[4732]: I0402 14:24:24.744776 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/17645883-477c-437a-b87a-b412f9bbe29e-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"17645883-477c-437a-b87a-b412f9bbe29e\") " pod="openstack/tempest-tests-tempest" Apr 02 14:24:24 crc kubenswrapper[4732]: I0402 14:24:24.745955 4732 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: \"17645883-477c-437a-b87a-b412f9bbe29e\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/tempest-tests-tempest" Apr 02 14:24:24 crc kubenswrapper[4732]: I0402 14:24:24.746671 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Apr 02 14:24:24 crc kubenswrapper[4732]: I0402 14:24:24.746826 4732 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"test-operator-controller-priv-key" Apr 02 14:24:24 crc kubenswrapper[4732]: I0402 14:24:24.752120 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/17645883-477c-437a-b87a-b412f9bbe29e-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"17645883-477c-437a-b87a-b412f9bbe29e\") " pod="openstack/tempest-tests-tempest" Apr 02 14:24:24 crc kubenswrapper[4732]: I0402 14:24:24.755899 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/17645883-477c-437a-b87a-b412f9bbe29e-config-data\") pod \"tempest-tests-tempest\" (UID: \"17645883-477c-437a-b87a-b412f9bbe29e\") " pod="openstack/tempest-tests-tempest" Apr 02 14:24:24 crc kubenswrapper[4732]: I0402 14:24:24.757141 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/17645883-477c-437a-b87a-b412f9bbe29e-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"17645883-477c-437a-b87a-b412f9bbe29e\") " pod="openstack/tempest-tests-tempest" Apr 02 14:24:24 crc kubenswrapper[4732]: I0402 14:24:24.760116 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xm6j\" (UniqueName: \"kubernetes.io/projected/17645883-477c-437a-b87a-b412f9bbe29e-kube-api-access-6xm6j\") pod \"tempest-tests-tempest\" (UID: \"17645883-477c-437a-b87a-b412f9bbe29e\") " pod="openstack/tempest-tests-tempest" Apr 02 14:24:24 crc kubenswrapper[4732]: I0402 14:24:24.762224 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/17645883-477c-437a-b87a-b412f9bbe29e-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"17645883-477c-437a-b87a-b412f9bbe29e\") " pod="openstack/tempest-tests-tempest" Apr 02 14:24:24 crc kubenswrapper[4732]: I0402 14:24:24.801588 4732 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: \"17645883-477c-437a-b87a-b412f9bbe29e\") " pod="openstack/tempest-tests-tempest" Apr 02 14:24:25 crc kubenswrapper[4732]: I0402 14:24:25.106159 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-8fmdn" Apr 02 14:24:25 crc kubenswrapper[4732]: I0402 14:24:25.114141 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Apr 02 14:24:25 crc kubenswrapper[4732]: I0402 14:24:25.647067 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Apr 02 14:24:26 crc kubenswrapper[4732]: I0402 14:24:26.299931 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"17645883-477c-437a-b87a-b412f9bbe29e","Type":"ContainerStarted","Data":"b0d5f795305bf6232df4d60677049908699b9f3e11c852cf61d3c8afbd703a82"} Apr 02 14:24:29 crc kubenswrapper[4732]: I0402 14:24:29.680989 4732 scope.go:117] "RemoveContainer" containerID="e3dc55b503b8e33f24aedd03972c83f64d717fd55da944dc7e898dbc4c268904" Apr 02 14:24:29 crc kubenswrapper[4732]: E0402 14:24:29.682867 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" Apr 02 14:24:42 crc kubenswrapper[4732]: I0402 14:24:42.681031 4732 scope.go:117] "RemoveContainer" containerID="e3dc55b503b8e33f24aedd03972c83f64d717fd55da944dc7e898dbc4c268904" Apr 02 14:24:50 crc kubenswrapper[4732]: E0402 14:24:50.369939 4732 log.go:32] "PullImage from image service 
failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Apr 02 14:24:50 crc kubenswrapper[4732]: E0402 14:24:50.370797 4732 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_
ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6xm6j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[NET_ADMIN NET_RAW],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(17645883-477c-437a-b87a-b412f9bbe29e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Apr 02 14:24:50 crc kubenswrapper[4732]: E0402 14:24:50.371998 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="17645883-477c-437a-b87a-b412f9bbe29e" Apr 02 14:24:50 crc kubenswrapper[4732]: I0402 14:24:50.562683 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" event={"ID":"38409e5e-4545-49da-8f6c-4bfb30582878","Type":"ContainerStarted","Data":"16d76f892a3a796d1d3cc9570734bbadcf82aeca2b7902fb2000aee8a3ef6008"} Apr 02 14:24:50 crc kubenswrapper[4732]: E0402 14:24:50.565609 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="17645883-477c-437a-b87a-b412f9bbe29e" Apr 02 14:24:51 crc kubenswrapper[4732]: I0402 14:24:51.887361 4732 scope.go:117] "RemoveContainer" containerID="8dfd95432144351c7292d99043f7cfcd6c0ba1a99e2b9171a1b54729e33ffa6d" Apr 02 14:25:05 crc kubenswrapper[4732]: I0402 14:25:05.148398 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Apr 02 14:25:06 crc kubenswrapper[4732]: I0402 14:25:06.752893 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"17645883-477c-437a-b87a-b412f9bbe29e","Type":"ContainerStarted","Data":"34150bd36365edf1c470c6c9391be59b88169f3c84fe78522f74eecbaea9e9be"} Apr 02 14:25:06 crc kubenswrapper[4732]: I0402 14:25:06.773813 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.284902292 podStartE2EDuration="43.773791622s" podCreationTimestamp="2026-04-02 14:24:23 +0000 UTC" firstStartedPulling="2026-04-02 14:24:25.656262926 +0000 UTC m=+2822.560670519" lastFinishedPulling="2026-04-02 14:25:05.145152296 +0000 UTC m=+2862.049559849" observedRunningTime="2026-04-02 14:25:06.768381446 +0000 UTC m=+2863.672789009" watchObservedRunningTime="2026-04-02 14:25:06.773791622 +0000 UTC m=+2863.678199195" Apr 02 14:25:16 crc kubenswrapper[4732]: I0402 14:25:16.307952 4732 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zh74x"] Apr 02 14:25:16 crc kubenswrapper[4732]: I0402 14:25:16.322850 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zh74x" Apr 02 14:25:16 crc kubenswrapper[4732]: I0402 14:25:16.328021 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zh74x"] Apr 02 14:25:16 crc kubenswrapper[4732]: I0402 14:25:16.412376 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75c0273e-e7e0-475c-8192-c7164dbd2fee-catalog-content\") pod \"redhat-operators-zh74x\" (UID: \"75c0273e-e7e0-475c-8192-c7164dbd2fee\") " pod="openshift-marketplace/redhat-operators-zh74x" Apr 02 14:25:16 crc kubenswrapper[4732]: I0402 14:25:16.412522 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rhtk\" (UniqueName: \"kubernetes.io/projected/75c0273e-e7e0-475c-8192-c7164dbd2fee-kube-api-access-9rhtk\") pod \"redhat-operators-zh74x\" (UID: \"75c0273e-e7e0-475c-8192-c7164dbd2fee\") " pod="openshift-marketplace/redhat-operators-zh74x" Apr 02 14:25:16 crc kubenswrapper[4732]: I0402 14:25:16.412815 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75c0273e-e7e0-475c-8192-c7164dbd2fee-utilities\") pod \"redhat-operators-zh74x\" (UID: \"75c0273e-e7e0-475c-8192-c7164dbd2fee\") " pod="openshift-marketplace/redhat-operators-zh74x" Apr 02 14:25:16 crc kubenswrapper[4732]: I0402 14:25:16.514420 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75c0273e-e7e0-475c-8192-c7164dbd2fee-utilities\") pod \"redhat-operators-zh74x\" (UID: 
\"75c0273e-e7e0-475c-8192-c7164dbd2fee\") " pod="openshift-marketplace/redhat-operators-zh74x" Apr 02 14:25:16 crc kubenswrapper[4732]: I0402 14:25:16.514919 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75c0273e-e7e0-475c-8192-c7164dbd2fee-catalog-content\") pod \"redhat-operators-zh74x\" (UID: \"75c0273e-e7e0-475c-8192-c7164dbd2fee\") " pod="openshift-marketplace/redhat-operators-zh74x" Apr 02 14:25:16 crc kubenswrapper[4732]: I0402 14:25:16.514943 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75c0273e-e7e0-475c-8192-c7164dbd2fee-utilities\") pod \"redhat-operators-zh74x\" (UID: \"75c0273e-e7e0-475c-8192-c7164dbd2fee\") " pod="openshift-marketplace/redhat-operators-zh74x" Apr 02 14:25:16 crc kubenswrapper[4732]: I0402 14:25:16.514969 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rhtk\" (UniqueName: \"kubernetes.io/projected/75c0273e-e7e0-475c-8192-c7164dbd2fee-kube-api-access-9rhtk\") pod \"redhat-operators-zh74x\" (UID: \"75c0273e-e7e0-475c-8192-c7164dbd2fee\") " pod="openshift-marketplace/redhat-operators-zh74x" Apr 02 14:25:16 crc kubenswrapper[4732]: I0402 14:25:16.515270 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75c0273e-e7e0-475c-8192-c7164dbd2fee-catalog-content\") pod \"redhat-operators-zh74x\" (UID: \"75c0273e-e7e0-475c-8192-c7164dbd2fee\") " pod="openshift-marketplace/redhat-operators-zh74x" Apr 02 14:25:16 crc kubenswrapper[4732]: I0402 14:25:16.533064 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rhtk\" (UniqueName: \"kubernetes.io/projected/75c0273e-e7e0-475c-8192-c7164dbd2fee-kube-api-access-9rhtk\") pod \"redhat-operators-zh74x\" (UID: \"75c0273e-e7e0-475c-8192-c7164dbd2fee\") " 
pod="openshift-marketplace/redhat-operators-zh74x" Apr 02 14:25:16 crc kubenswrapper[4732]: I0402 14:25:16.655215 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zh74x" Apr 02 14:25:17 crc kubenswrapper[4732]: I0402 14:25:17.178940 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zh74x"] Apr 02 14:25:17 crc kubenswrapper[4732]: I0402 14:25:17.874268 4732 generic.go:334] "Generic (PLEG): container finished" podID="75c0273e-e7e0-475c-8192-c7164dbd2fee" containerID="e734bc6c346bd1c18653c23dc7b448976ea8fce58431f65081a10a542b04d94d" exitCode=0 Apr 02 14:25:17 crc kubenswrapper[4732]: I0402 14:25:17.874325 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zh74x" event={"ID":"75c0273e-e7e0-475c-8192-c7164dbd2fee","Type":"ContainerDied","Data":"e734bc6c346bd1c18653c23dc7b448976ea8fce58431f65081a10a542b04d94d"} Apr 02 14:25:17 crc kubenswrapper[4732]: I0402 14:25:17.874350 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zh74x" event={"ID":"75c0273e-e7e0-475c-8192-c7164dbd2fee","Type":"ContainerStarted","Data":"49e64ce98b51b2a387afc18c296890fb11f8c9142ac0d82c80404a2c6b49dc6c"} Apr 02 14:25:18 crc kubenswrapper[4732]: I0402 14:25:18.886669 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zh74x" event={"ID":"75c0273e-e7e0-475c-8192-c7164dbd2fee","Type":"ContainerStarted","Data":"a7abce640a2635822b5821dfb42449b7f1e478a62285edde5fb314ef4aab1bf3"} Apr 02 14:25:19 crc kubenswrapper[4732]: I0402 14:25:19.902047 4732 generic.go:334] "Generic (PLEG): container finished" podID="75c0273e-e7e0-475c-8192-c7164dbd2fee" containerID="a7abce640a2635822b5821dfb42449b7f1e478a62285edde5fb314ef4aab1bf3" exitCode=0 Apr 02 14:25:19 crc kubenswrapper[4732]: I0402 14:25:19.902390 4732 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-zh74x" event={"ID":"75c0273e-e7e0-475c-8192-c7164dbd2fee","Type":"ContainerDied","Data":"a7abce640a2635822b5821dfb42449b7f1e478a62285edde5fb314ef4aab1bf3"} Apr 02 14:25:21 crc kubenswrapper[4732]: I0402 14:25:21.923316 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zh74x" event={"ID":"75c0273e-e7e0-475c-8192-c7164dbd2fee","Type":"ContainerStarted","Data":"f855b9a40e41669825980be66d2cda66bbf65fe972e6822d310479ae54f60c58"} Apr 02 14:25:21 crc kubenswrapper[4732]: I0402 14:25:21.954380 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zh74x" podStartSLOduration=2.814617531 podStartE2EDuration="5.9543624s" podCreationTimestamp="2026-04-02 14:25:16 +0000 UTC" firstStartedPulling="2026-04-02 14:25:17.878825716 +0000 UTC m=+2874.783233269" lastFinishedPulling="2026-04-02 14:25:21.018570575 +0000 UTC m=+2877.922978138" observedRunningTime="2026-04-02 14:25:21.946042975 +0000 UTC m=+2878.850450528" watchObservedRunningTime="2026-04-02 14:25:21.9543624 +0000 UTC m=+2878.858769953" Apr 02 14:25:26 crc kubenswrapper[4732]: I0402 14:25:26.655584 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zh74x" Apr 02 14:25:26 crc kubenswrapper[4732]: I0402 14:25:26.657200 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zh74x" Apr 02 14:25:27 crc kubenswrapper[4732]: I0402 14:25:27.717446 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zh74x" podUID="75c0273e-e7e0-475c-8192-c7164dbd2fee" containerName="registry-server" probeResult="failure" output=< Apr 02 14:25:27 crc kubenswrapper[4732]: timeout: failed to connect service ":50051" within 1s Apr 02 14:25:27 crc kubenswrapper[4732]: > Apr 02 14:25:36 crc kubenswrapper[4732]: I0402 
14:25:36.713408 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zh74x" Apr 02 14:25:36 crc kubenswrapper[4732]: I0402 14:25:36.786977 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zh74x" Apr 02 14:25:36 crc kubenswrapper[4732]: I0402 14:25:36.961755 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zh74x"] Apr 02 14:25:38 crc kubenswrapper[4732]: I0402 14:25:38.090392 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zh74x" podUID="75c0273e-e7e0-475c-8192-c7164dbd2fee" containerName="registry-server" containerID="cri-o://f855b9a40e41669825980be66d2cda66bbf65fe972e6822d310479ae54f60c58" gracePeriod=2 Apr 02 14:25:38 crc kubenswrapper[4732]: I0402 14:25:38.632799 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zh74x" Apr 02 14:25:38 crc kubenswrapper[4732]: I0402 14:25:38.673381 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75c0273e-e7e0-475c-8192-c7164dbd2fee-catalog-content\") pod \"75c0273e-e7e0-475c-8192-c7164dbd2fee\" (UID: \"75c0273e-e7e0-475c-8192-c7164dbd2fee\") " Apr 02 14:25:38 crc kubenswrapper[4732]: I0402 14:25:38.673527 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75c0273e-e7e0-475c-8192-c7164dbd2fee-utilities\") pod \"75c0273e-e7e0-475c-8192-c7164dbd2fee\" (UID: \"75c0273e-e7e0-475c-8192-c7164dbd2fee\") " Apr 02 14:25:38 crc kubenswrapper[4732]: I0402 14:25:38.673851 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rhtk\" (UniqueName: 
\"kubernetes.io/projected/75c0273e-e7e0-475c-8192-c7164dbd2fee-kube-api-access-9rhtk\") pod \"75c0273e-e7e0-475c-8192-c7164dbd2fee\" (UID: \"75c0273e-e7e0-475c-8192-c7164dbd2fee\") " Apr 02 14:25:38 crc kubenswrapper[4732]: I0402 14:25:38.674168 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75c0273e-e7e0-475c-8192-c7164dbd2fee-utilities" (OuterVolumeSpecName: "utilities") pod "75c0273e-e7e0-475c-8192-c7164dbd2fee" (UID: "75c0273e-e7e0-475c-8192-c7164dbd2fee"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 14:25:38 crc kubenswrapper[4732]: I0402 14:25:38.674608 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75c0273e-e7e0-475c-8192-c7164dbd2fee-utilities\") on node \"crc\" DevicePath \"\"" Apr 02 14:25:38 crc kubenswrapper[4732]: I0402 14:25:38.695888 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75c0273e-e7e0-475c-8192-c7164dbd2fee-kube-api-access-9rhtk" (OuterVolumeSpecName: "kube-api-access-9rhtk") pod "75c0273e-e7e0-475c-8192-c7164dbd2fee" (UID: "75c0273e-e7e0-475c-8192-c7164dbd2fee"). InnerVolumeSpecName "kube-api-access-9rhtk". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:25:38 crc kubenswrapper[4732]: I0402 14:25:38.775976 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rhtk\" (UniqueName: \"kubernetes.io/projected/75c0273e-e7e0-475c-8192-c7164dbd2fee-kube-api-access-9rhtk\") on node \"crc\" DevicePath \"\"" Apr 02 14:25:38 crc kubenswrapper[4732]: I0402 14:25:38.812243 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75c0273e-e7e0-475c-8192-c7164dbd2fee-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "75c0273e-e7e0-475c-8192-c7164dbd2fee" (UID: "75c0273e-e7e0-475c-8192-c7164dbd2fee"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 14:25:38 crc kubenswrapper[4732]: I0402 14:25:38.877414 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75c0273e-e7e0-475c-8192-c7164dbd2fee-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 02 14:25:39 crc kubenswrapper[4732]: I0402 14:25:39.102953 4732 generic.go:334] "Generic (PLEG): container finished" podID="75c0273e-e7e0-475c-8192-c7164dbd2fee" containerID="f855b9a40e41669825980be66d2cda66bbf65fe972e6822d310479ae54f60c58" exitCode=0 Apr 02 14:25:39 crc kubenswrapper[4732]: I0402 14:25:39.102992 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zh74x" event={"ID":"75c0273e-e7e0-475c-8192-c7164dbd2fee","Type":"ContainerDied","Data":"f855b9a40e41669825980be66d2cda66bbf65fe972e6822d310479ae54f60c58"} Apr 02 14:25:39 crc kubenswrapper[4732]: I0402 14:25:39.103032 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zh74x" Apr 02 14:25:39 crc kubenswrapper[4732]: I0402 14:25:39.103052 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zh74x" event={"ID":"75c0273e-e7e0-475c-8192-c7164dbd2fee","Type":"ContainerDied","Data":"49e64ce98b51b2a387afc18c296890fb11f8c9142ac0d82c80404a2c6b49dc6c"} Apr 02 14:25:39 crc kubenswrapper[4732]: I0402 14:25:39.103072 4732 scope.go:117] "RemoveContainer" containerID="f855b9a40e41669825980be66d2cda66bbf65fe972e6822d310479ae54f60c58" Apr 02 14:25:39 crc kubenswrapper[4732]: I0402 14:25:39.124493 4732 scope.go:117] "RemoveContainer" containerID="a7abce640a2635822b5821dfb42449b7f1e478a62285edde5fb314ef4aab1bf3" Apr 02 14:25:39 crc kubenswrapper[4732]: I0402 14:25:39.150996 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zh74x"] Apr 02 14:25:39 crc kubenswrapper[4732]: I0402 14:25:39.159533 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zh74x"] Apr 02 14:25:39 crc kubenswrapper[4732]: I0402 14:25:39.165524 4732 scope.go:117] "RemoveContainer" containerID="e734bc6c346bd1c18653c23dc7b448976ea8fce58431f65081a10a542b04d94d" Apr 02 14:25:39 crc kubenswrapper[4732]: I0402 14:25:39.200621 4732 scope.go:117] "RemoveContainer" containerID="f855b9a40e41669825980be66d2cda66bbf65fe972e6822d310479ae54f60c58" Apr 02 14:25:39 crc kubenswrapper[4732]: E0402 14:25:39.201006 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f855b9a40e41669825980be66d2cda66bbf65fe972e6822d310479ae54f60c58\": container with ID starting with f855b9a40e41669825980be66d2cda66bbf65fe972e6822d310479ae54f60c58 not found: ID does not exist" containerID="f855b9a40e41669825980be66d2cda66bbf65fe972e6822d310479ae54f60c58" Apr 02 14:25:39 crc kubenswrapper[4732]: I0402 14:25:39.201049 4732 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f855b9a40e41669825980be66d2cda66bbf65fe972e6822d310479ae54f60c58"} err="failed to get container status \"f855b9a40e41669825980be66d2cda66bbf65fe972e6822d310479ae54f60c58\": rpc error: code = NotFound desc = could not find container \"f855b9a40e41669825980be66d2cda66bbf65fe972e6822d310479ae54f60c58\": container with ID starting with f855b9a40e41669825980be66d2cda66bbf65fe972e6822d310479ae54f60c58 not found: ID does not exist" Apr 02 14:25:39 crc kubenswrapper[4732]: I0402 14:25:39.201077 4732 scope.go:117] "RemoveContainer" containerID="a7abce640a2635822b5821dfb42449b7f1e478a62285edde5fb314ef4aab1bf3" Apr 02 14:25:39 crc kubenswrapper[4732]: E0402 14:25:39.201303 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7abce640a2635822b5821dfb42449b7f1e478a62285edde5fb314ef4aab1bf3\": container with ID starting with a7abce640a2635822b5821dfb42449b7f1e478a62285edde5fb314ef4aab1bf3 not found: ID does not exist" containerID="a7abce640a2635822b5821dfb42449b7f1e478a62285edde5fb314ef4aab1bf3" Apr 02 14:25:39 crc kubenswrapper[4732]: I0402 14:25:39.201323 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7abce640a2635822b5821dfb42449b7f1e478a62285edde5fb314ef4aab1bf3"} err="failed to get container status \"a7abce640a2635822b5821dfb42449b7f1e478a62285edde5fb314ef4aab1bf3\": rpc error: code = NotFound desc = could not find container \"a7abce640a2635822b5821dfb42449b7f1e478a62285edde5fb314ef4aab1bf3\": container with ID starting with a7abce640a2635822b5821dfb42449b7f1e478a62285edde5fb314ef4aab1bf3 not found: ID does not exist" Apr 02 14:25:39 crc kubenswrapper[4732]: I0402 14:25:39.201344 4732 scope.go:117] "RemoveContainer" containerID="e734bc6c346bd1c18653c23dc7b448976ea8fce58431f65081a10a542b04d94d" Apr 02 14:25:39 crc kubenswrapper[4732]: E0402 
14:25:39.201578 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e734bc6c346bd1c18653c23dc7b448976ea8fce58431f65081a10a542b04d94d\": container with ID starting with e734bc6c346bd1c18653c23dc7b448976ea8fce58431f65081a10a542b04d94d not found: ID does not exist" containerID="e734bc6c346bd1c18653c23dc7b448976ea8fce58431f65081a10a542b04d94d" Apr 02 14:25:39 crc kubenswrapper[4732]: I0402 14:25:39.201599 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e734bc6c346bd1c18653c23dc7b448976ea8fce58431f65081a10a542b04d94d"} err="failed to get container status \"e734bc6c346bd1c18653c23dc7b448976ea8fce58431f65081a10a542b04d94d\": rpc error: code = NotFound desc = could not find container \"e734bc6c346bd1c18653c23dc7b448976ea8fce58431f65081a10a542b04d94d\": container with ID starting with e734bc6c346bd1c18653c23dc7b448976ea8fce58431f65081a10a542b04d94d not found: ID does not exist" Apr 02 14:25:40 crc kubenswrapper[4732]: I0402 14:25:40.697355 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75c0273e-e7e0-475c-8192-c7164dbd2fee" path="/var/lib/kubelet/pods/75c0273e-e7e0-475c-8192-c7164dbd2fee/volumes" Apr 02 14:26:00 crc kubenswrapper[4732]: I0402 14:26:00.145705 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29585666-9z9sf"] Apr 02 14:26:00 crc kubenswrapper[4732]: E0402 14:26:00.146642 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75c0273e-e7e0-475c-8192-c7164dbd2fee" containerName="extract-content" Apr 02 14:26:00 crc kubenswrapper[4732]: I0402 14:26:00.146657 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="75c0273e-e7e0-475c-8192-c7164dbd2fee" containerName="extract-content" Apr 02 14:26:00 crc kubenswrapper[4732]: E0402 14:26:00.146669 4732 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="75c0273e-e7e0-475c-8192-c7164dbd2fee" containerName="registry-server" Apr 02 14:26:00 crc kubenswrapper[4732]: I0402 14:26:00.146677 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="75c0273e-e7e0-475c-8192-c7164dbd2fee" containerName="registry-server" Apr 02 14:26:00 crc kubenswrapper[4732]: E0402 14:26:00.146706 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75c0273e-e7e0-475c-8192-c7164dbd2fee" containerName="extract-utilities" Apr 02 14:26:00 crc kubenswrapper[4732]: I0402 14:26:00.146715 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="75c0273e-e7e0-475c-8192-c7164dbd2fee" containerName="extract-utilities" Apr 02 14:26:00 crc kubenswrapper[4732]: I0402 14:26:00.146932 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="75c0273e-e7e0-475c-8192-c7164dbd2fee" containerName="registry-server" Apr 02 14:26:00 crc kubenswrapper[4732]: I0402 14:26:00.147828 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585666-9z9sf" Apr 02 14:26:00 crc kubenswrapper[4732]: I0402 14:26:00.149976 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-69g42" Apr 02 14:26:00 crc kubenswrapper[4732]: I0402 14:26:00.150206 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 02 14:26:00 crc kubenswrapper[4732]: I0402 14:26:00.150278 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 02 14:26:00 crc kubenswrapper[4732]: I0402 14:26:00.156207 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585666-9z9sf"] Apr 02 14:26:00 crc kubenswrapper[4732]: I0402 14:26:00.193924 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rq4v2\" (UniqueName: 
\"kubernetes.io/projected/f72fbefc-1812-464a-8993-75c02e51514f-kube-api-access-rq4v2\") pod \"auto-csr-approver-29585666-9z9sf\" (UID: \"f72fbefc-1812-464a-8993-75c02e51514f\") " pod="openshift-infra/auto-csr-approver-29585666-9z9sf" Apr 02 14:26:00 crc kubenswrapper[4732]: I0402 14:26:00.295041 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rq4v2\" (UniqueName: \"kubernetes.io/projected/f72fbefc-1812-464a-8993-75c02e51514f-kube-api-access-rq4v2\") pod \"auto-csr-approver-29585666-9z9sf\" (UID: \"f72fbefc-1812-464a-8993-75c02e51514f\") " pod="openshift-infra/auto-csr-approver-29585666-9z9sf" Apr 02 14:26:00 crc kubenswrapper[4732]: I0402 14:26:00.319731 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rq4v2\" (UniqueName: \"kubernetes.io/projected/f72fbefc-1812-464a-8993-75c02e51514f-kube-api-access-rq4v2\") pod \"auto-csr-approver-29585666-9z9sf\" (UID: \"f72fbefc-1812-464a-8993-75c02e51514f\") " pod="openshift-infra/auto-csr-approver-29585666-9z9sf" Apr 02 14:26:00 crc kubenswrapper[4732]: I0402 14:26:00.465014 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29585666-9z9sf" Apr 02 14:26:00 crc kubenswrapper[4732]: I0402 14:26:00.901877 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585666-9z9sf"] Apr 02 14:26:01 crc kubenswrapper[4732]: I0402 14:26:01.315555 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585666-9z9sf" event={"ID":"f72fbefc-1812-464a-8993-75c02e51514f","Type":"ContainerStarted","Data":"b54b5c909208b3696a6a7b352b591eb88baef70e848fd581ae8b436b57e9d80a"} Apr 02 14:26:02 crc kubenswrapper[4732]: I0402 14:26:02.325184 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585666-9z9sf" event={"ID":"f72fbefc-1812-464a-8993-75c02e51514f","Type":"ContainerStarted","Data":"02a1ff0a1b427720eeee8e741ed3cbed54d1320afd56a18898886be8d068b9c2"} Apr 02 14:26:02 crc kubenswrapper[4732]: I0402 14:26:02.347125 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29585666-9z9sf" podStartSLOduration=1.459520819 podStartE2EDuration="2.347110668s" podCreationTimestamp="2026-04-02 14:26:00 +0000 UTC" firstStartedPulling="2026-04-02 14:26:00.90856593 +0000 UTC m=+2917.812973483" lastFinishedPulling="2026-04-02 14:26:01.796155759 +0000 UTC m=+2918.700563332" observedRunningTime="2026-04-02 14:26:02.339687097 +0000 UTC m=+2919.244094670" watchObservedRunningTime="2026-04-02 14:26:02.347110668 +0000 UTC m=+2919.251518221" Apr 02 14:26:03 crc kubenswrapper[4732]: I0402 14:26:03.337840 4732 generic.go:334] "Generic (PLEG): container finished" podID="f72fbefc-1812-464a-8993-75c02e51514f" containerID="02a1ff0a1b427720eeee8e741ed3cbed54d1320afd56a18898886be8d068b9c2" exitCode=0 Apr 02 14:26:03 crc kubenswrapper[4732]: I0402 14:26:03.337922 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585666-9z9sf" 
event={"ID":"f72fbefc-1812-464a-8993-75c02e51514f","Type":"ContainerDied","Data":"02a1ff0a1b427720eeee8e741ed3cbed54d1320afd56a18898886be8d068b9c2"} Apr 02 14:26:04 crc kubenswrapper[4732]: I0402 14:26:04.737925 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585666-9z9sf" Apr 02 14:26:04 crc kubenswrapper[4732]: I0402 14:26:04.788067 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rq4v2\" (UniqueName: \"kubernetes.io/projected/f72fbefc-1812-464a-8993-75c02e51514f-kube-api-access-rq4v2\") pod \"f72fbefc-1812-464a-8993-75c02e51514f\" (UID: \"f72fbefc-1812-464a-8993-75c02e51514f\") " Apr 02 14:26:04 crc kubenswrapper[4732]: I0402 14:26:04.799010 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f72fbefc-1812-464a-8993-75c02e51514f-kube-api-access-rq4v2" (OuterVolumeSpecName: "kube-api-access-rq4v2") pod "f72fbefc-1812-464a-8993-75c02e51514f" (UID: "f72fbefc-1812-464a-8993-75c02e51514f"). InnerVolumeSpecName "kube-api-access-rq4v2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:26:04 crc kubenswrapper[4732]: I0402 14:26:04.890933 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rq4v2\" (UniqueName: \"kubernetes.io/projected/f72fbefc-1812-464a-8993-75c02e51514f-kube-api-access-rq4v2\") on node \"crc\" DevicePath \"\"" Apr 02 14:26:05 crc kubenswrapper[4732]: I0402 14:26:05.358444 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585666-9z9sf" event={"ID":"f72fbefc-1812-464a-8993-75c02e51514f","Type":"ContainerDied","Data":"b54b5c909208b3696a6a7b352b591eb88baef70e848fd581ae8b436b57e9d80a"} Apr 02 14:26:05 crc kubenswrapper[4732]: I0402 14:26:05.358519 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b54b5c909208b3696a6a7b352b591eb88baef70e848fd581ae8b436b57e9d80a" Apr 02 14:26:05 crc kubenswrapper[4732]: I0402 14:26:05.358596 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585666-9z9sf" Apr 02 14:26:05 crc kubenswrapper[4732]: I0402 14:26:05.426666 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29585660-vzv74"] Apr 02 14:26:05 crc kubenswrapper[4732]: I0402 14:26:05.444571 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29585660-vzv74"] Apr 02 14:26:06 crc kubenswrapper[4732]: I0402 14:26:06.701476 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc762ced-e5a5-43ce-b204-7fef2eb152e4" path="/var/lib/kubelet/pods/dc762ced-e5a5-43ce-b204-7fef2eb152e4/volumes" Apr 02 14:26:52 crc kubenswrapper[4732]: I0402 14:26:52.031747 4732 scope.go:117] "RemoveContainer" containerID="3645625a5c2ba46e55d1e0ddd8c4fd8bafefbe5d0cc6f8572696bb0e80e8d0a6" Apr 02 14:27:02 crc kubenswrapper[4732]: I0402 14:27:01.924146 4732 patch_prober.go:28] interesting pod/machine-config-daemon-6vtmw 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 02 14:27:02 crc kubenswrapper[4732]: I0402 14:27:01.925083 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 02 14:27:31 crc kubenswrapper[4732]: I0402 14:27:31.924820 4732 patch_prober.go:28] interesting pod/machine-config-daemon-6vtmw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 02 14:27:31 crc kubenswrapper[4732]: I0402 14:27:31.925337 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 02 14:27:56 crc kubenswrapper[4732]: I0402 14:27:56.813367 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tqggn"] Apr 02 14:27:56 crc kubenswrapper[4732]: E0402 14:27:56.814690 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f72fbefc-1812-464a-8993-75c02e51514f" containerName="oc" Apr 02 14:27:56 crc kubenswrapper[4732]: I0402 14:27:56.814708 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="f72fbefc-1812-464a-8993-75c02e51514f" containerName="oc" Apr 02 14:27:56 crc kubenswrapper[4732]: I0402 14:27:56.814924 4732 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="f72fbefc-1812-464a-8993-75c02e51514f" containerName="oc" Apr 02 14:27:56 crc kubenswrapper[4732]: I0402 14:27:56.817156 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tqggn" Apr 02 14:27:56 crc kubenswrapper[4732]: I0402 14:27:56.827633 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tqggn"] Apr 02 14:27:57 crc kubenswrapper[4732]: I0402 14:27:57.004294 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfpkp\" (UniqueName: \"kubernetes.io/projected/803381a8-7c12-4270-bf63-f3121ec1b870-kube-api-access-mfpkp\") pod \"community-operators-tqggn\" (UID: \"803381a8-7c12-4270-bf63-f3121ec1b870\") " pod="openshift-marketplace/community-operators-tqggn" Apr 02 14:27:57 crc kubenswrapper[4732]: I0402 14:27:57.004349 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/803381a8-7c12-4270-bf63-f3121ec1b870-catalog-content\") pod \"community-operators-tqggn\" (UID: \"803381a8-7c12-4270-bf63-f3121ec1b870\") " pod="openshift-marketplace/community-operators-tqggn" Apr 02 14:27:57 crc kubenswrapper[4732]: I0402 14:27:57.004514 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/803381a8-7c12-4270-bf63-f3121ec1b870-utilities\") pod \"community-operators-tqggn\" (UID: \"803381a8-7c12-4270-bf63-f3121ec1b870\") " pod="openshift-marketplace/community-operators-tqggn" Apr 02 14:27:57 crc kubenswrapper[4732]: I0402 14:27:57.106241 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/803381a8-7c12-4270-bf63-f3121ec1b870-utilities\") pod \"community-operators-tqggn\" (UID: 
\"803381a8-7c12-4270-bf63-f3121ec1b870\") " pod="openshift-marketplace/community-operators-tqggn" Apr 02 14:27:57 crc kubenswrapper[4732]: I0402 14:27:57.106410 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfpkp\" (UniqueName: \"kubernetes.io/projected/803381a8-7c12-4270-bf63-f3121ec1b870-kube-api-access-mfpkp\") pod \"community-operators-tqggn\" (UID: \"803381a8-7c12-4270-bf63-f3121ec1b870\") " pod="openshift-marketplace/community-operators-tqggn" Apr 02 14:27:57 crc kubenswrapper[4732]: I0402 14:27:57.106482 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/803381a8-7c12-4270-bf63-f3121ec1b870-catalog-content\") pod \"community-operators-tqggn\" (UID: \"803381a8-7c12-4270-bf63-f3121ec1b870\") " pod="openshift-marketplace/community-operators-tqggn" Apr 02 14:27:57 crc kubenswrapper[4732]: I0402 14:27:57.106850 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/803381a8-7c12-4270-bf63-f3121ec1b870-utilities\") pod \"community-operators-tqggn\" (UID: \"803381a8-7c12-4270-bf63-f3121ec1b870\") " pod="openshift-marketplace/community-operators-tqggn" Apr 02 14:27:57 crc kubenswrapper[4732]: I0402 14:27:57.107245 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/803381a8-7c12-4270-bf63-f3121ec1b870-catalog-content\") pod \"community-operators-tqggn\" (UID: \"803381a8-7c12-4270-bf63-f3121ec1b870\") " pod="openshift-marketplace/community-operators-tqggn" Apr 02 14:27:57 crc kubenswrapper[4732]: I0402 14:27:57.125720 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfpkp\" (UniqueName: \"kubernetes.io/projected/803381a8-7c12-4270-bf63-f3121ec1b870-kube-api-access-mfpkp\") pod \"community-operators-tqggn\" (UID: 
\"803381a8-7c12-4270-bf63-f3121ec1b870\") " pod="openshift-marketplace/community-operators-tqggn" Apr 02 14:27:57 crc kubenswrapper[4732]: I0402 14:27:57.156554 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tqggn" Apr 02 14:27:57 crc kubenswrapper[4732]: I0402 14:27:57.673094 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tqggn"] Apr 02 14:27:58 crc kubenswrapper[4732]: I0402 14:27:58.452692 4732 generic.go:334] "Generic (PLEG): container finished" podID="803381a8-7c12-4270-bf63-f3121ec1b870" containerID="e58dbe27718570d86c79df08c243948e267da50bcb50da81dc81cae6e2f09ea8" exitCode=0 Apr 02 14:27:58 crc kubenswrapper[4732]: I0402 14:27:58.452798 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tqggn" event={"ID":"803381a8-7c12-4270-bf63-f3121ec1b870","Type":"ContainerDied","Data":"e58dbe27718570d86c79df08c243948e267da50bcb50da81dc81cae6e2f09ea8"} Apr 02 14:27:58 crc kubenswrapper[4732]: I0402 14:27:58.453133 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tqggn" event={"ID":"803381a8-7c12-4270-bf63-f3121ec1b870","Type":"ContainerStarted","Data":"54a3cab5b097bb59efc13ecf4732303a4a5415df25cb9da5a841ccc3c324ca85"} Apr 02 14:28:00 crc kubenswrapper[4732]: I0402 14:28:00.149522 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29585668-p66xv"] Apr 02 14:28:00 crc kubenswrapper[4732]: I0402 14:28:00.151461 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29585668-p66xv" Apr 02 14:28:00 crc kubenswrapper[4732]: I0402 14:28:00.153644 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-69g42" Apr 02 14:28:00 crc kubenswrapper[4732]: I0402 14:28:00.154287 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 02 14:28:00 crc kubenswrapper[4732]: I0402 14:28:00.157290 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 02 14:28:00 crc kubenswrapper[4732]: I0402 14:28:00.159398 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585668-p66xv"] Apr 02 14:28:00 crc kubenswrapper[4732]: I0402 14:28:00.279581 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkd8j\" (UniqueName: \"kubernetes.io/projected/2e48ef20-b922-4518-98fa-3286d5da4b2e-kube-api-access-xkd8j\") pod \"auto-csr-approver-29585668-p66xv\" (UID: \"2e48ef20-b922-4518-98fa-3286d5da4b2e\") " pod="openshift-infra/auto-csr-approver-29585668-p66xv" Apr 02 14:28:00 crc kubenswrapper[4732]: I0402 14:28:00.381754 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkd8j\" (UniqueName: \"kubernetes.io/projected/2e48ef20-b922-4518-98fa-3286d5da4b2e-kube-api-access-xkd8j\") pod \"auto-csr-approver-29585668-p66xv\" (UID: \"2e48ef20-b922-4518-98fa-3286d5da4b2e\") " pod="openshift-infra/auto-csr-approver-29585668-p66xv" Apr 02 14:28:00 crc kubenswrapper[4732]: I0402 14:28:00.403476 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkd8j\" (UniqueName: \"kubernetes.io/projected/2e48ef20-b922-4518-98fa-3286d5da4b2e-kube-api-access-xkd8j\") pod \"auto-csr-approver-29585668-p66xv\" (UID: \"2e48ef20-b922-4518-98fa-3286d5da4b2e\") " 
pod="openshift-infra/auto-csr-approver-29585668-p66xv" Apr 02 14:28:00 crc kubenswrapper[4732]: I0402 14:28:00.474523 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tqggn" event={"ID":"803381a8-7c12-4270-bf63-f3121ec1b870","Type":"ContainerStarted","Data":"e1115c4caf2acb31826842f326979f8b1ff9603e8808ab0645099f64e18bb220"} Apr 02 14:28:00 crc kubenswrapper[4732]: I0402 14:28:00.479781 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585668-p66xv" Apr 02 14:28:00 crc kubenswrapper[4732]: I0402 14:28:00.942771 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585668-p66xv"] Apr 02 14:28:01 crc kubenswrapper[4732]: I0402 14:28:01.485563 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585668-p66xv" event={"ID":"2e48ef20-b922-4518-98fa-3286d5da4b2e","Type":"ContainerStarted","Data":"f52195501aa3bd44f385fecba9917094bfbb462c17cb7391fab70ce7c8e1e221"} Apr 02 14:28:01 crc kubenswrapper[4732]: I0402 14:28:01.490104 4732 generic.go:334] "Generic (PLEG): container finished" podID="803381a8-7c12-4270-bf63-f3121ec1b870" containerID="e1115c4caf2acb31826842f326979f8b1ff9603e8808ab0645099f64e18bb220" exitCode=0 Apr 02 14:28:01 crc kubenswrapper[4732]: I0402 14:28:01.490153 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tqggn" event={"ID":"803381a8-7c12-4270-bf63-f3121ec1b870","Type":"ContainerDied","Data":"e1115c4caf2acb31826842f326979f8b1ff9603e8808ab0645099f64e18bb220"} Apr 02 14:28:01 crc kubenswrapper[4732]: I0402 14:28:01.924781 4732 patch_prober.go:28] interesting pod/machine-config-daemon-6vtmw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Apr 02 14:28:01 crc kubenswrapper[4732]: I0402 14:28:01.924860 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 02 14:28:01 crc kubenswrapper[4732]: I0402 14:28:01.924906 4732 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" Apr 02 14:28:01 crc kubenswrapper[4732]: I0402 14:28:01.925546 4732 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"16d76f892a3a796d1d3cc9570734bbadcf82aeca2b7902fb2000aee8a3ef6008"} pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Apr 02 14:28:01 crc kubenswrapper[4732]: I0402 14:28:01.925624 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" containerName="machine-config-daemon" containerID="cri-o://16d76f892a3a796d1d3cc9570734bbadcf82aeca2b7902fb2000aee8a3ef6008" gracePeriod=600 Apr 02 14:28:02 crc kubenswrapper[4732]: I0402 14:28:02.503070 4732 generic.go:334] "Generic (PLEG): container finished" podID="38409e5e-4545-49da-8f6c-4bfb30582878" containerID="16d76f892a3a796d1d3cc9570734bbadcf82aeca2b7902fb2000aee8a3ef6008" exitCode=0 Apr 02 14:28:02 crc kubenswrapper[4732]: I0402 14:28:02.503157 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" 
event={"ID":"38409e5e-4545-49da-8f6c-4bfb30582878","Type":"ContainerDied","Data":"16d76f892a3a796d1d3cc9570734bbadcf82aeca2b7902fb2000aee8a3ef6008"} Apr 02 14:28:02 crc kubenswrapper[4732]: I0402 14:28:02.503443 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" event={"ID":"38409e5e-4545-49da-8f6c-4bfb30582878","Type":"ContainerStarted","Data":"709a640877bb61587cf466971cf30393cd3affce375c74cd38fe9de80c188cd2"} Apr 02 14:28:02 crc kubenswrapper[4732]: I0402 14:28:02.503466 4732 scope.go:117] "RemoveContainer" containerID="e3dc55b503b8e33f24aedd03972c83f64d717fd55da944dc7e898dbc4c268904" Apr 02 14:28:02 crc kubenswrapper[4732]: I0402 14:28:02.507037 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tqggn" event={"ID":"803381a8-7c12-4270-bf63-f3121ec1b870","Type":"ContainerStarted","Data":"7d6b5f5604d4751eeb7a9e7d094cf69a2b490f9fda78a796be5caceb853b4049"} Apr 02 14:28:02 crc kubenswrapper[4732]: I0402 14:28:02.558471 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tqggn" podStartSLOduration=2.889695094 podStartE2EDuration="6.558448017s" podCreationTimestamp="2026-04-02 14:27:56 +0000 UTC" firstStartedPulling="2026-04-02 14:27:58.457500644 +0000 UTC m=+3035.361908207" lastFinishedPulling="2026-04-02 14:28:02.126253547 +0000 UTC m=+3039.030661130" observedRunningTime="2026-04-02 14:28:02.541492638 +0000 UTC m=+3039.445900221" watchObservedRunningTime="2026-04-02 14:28:02.558448017 +0000 UTC m=+3039.462855580" Apr 02 14:28:04 crc kubenswrapper[4732]: I0402 14:28:04.531372 4732 generic.go:334] "Generic (PLEG): container finished" podID="2e48ef20-b922-4518-98fa-3286d5da4b2e" containerID="4c64b775d49333a517532eef0f7ae34aa9ff3d3c83b6f79a0fc07badcd3cc13f" exitCode=0 Apr 02 14:28:04 crc kubenswrapper[4732]: I0402 14:28:04.531428 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29585668-p66xv" event={"ID":"2e48ef20-b922-4518-98fa-3286d5da4b2e","Type":"ContainerDied","Data":"4c64b775d49333a517532eef0f7ae34aa9ff3d3c83b6f79a0fc07badcd3cc13f"} Apr 02 14:28:05 crc kubenswrapper[4732]: I0402 14:28:05.988814 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585668-p66xv" Apr 02 14:28:06 crc kubenswrapper[4732]: I0402 14:28:06.097963 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkd8j\" (UniqueName: \"kubernetes.io/projected/2e48ef20-b922-4518-98fa-3286d5da4b2e-kube-api-access-xkd8j\") pod \"2e48ef20-b922-4518-98fa-3286d5da4b2e\" (UID: \"2e48ef20-b922-4518-98fa-3286d5da4b2e\") " Apr 02 14:28:06 crc kubenswrapper[4732]: I0402 14:28:06.107908 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e48ef20-b922-4518-98fa-3286d5da4b2e-kube-api-access-xkd8j" (OuterVolumeSpecName: "kube-api-access-xkd8j") pod "2e48ef20-b922-4518-98fa-3286d5da4b2e" (UID: "2e48ef20-b922-4518-98fa-3286d5da4b2e"). InnerVolumeSpecName "kube-api-access-xkd8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:28:06 crc kubenswrapper[4732]: I0402 14:28:06.200687 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkd8j\" (UniqueName: \"kubernetes.io/projected/2e48ef20-b922-4518-98fa-3286d5da4b2e-kube-api-access-xkd8j\") on node \"crc\" DevicePath \"\"" Apr 02 14:28:06 crc kubenswrapper[4732]: I0402 14:28:06.554754 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585668-p66xv" event={"ID":"2e48ef20-b922-4518-98fa-3286d5da4b2e","Type":"ContainerDied","Data":"f52195501aa3bd44f385fecba9917094bfbb462c17cb7391fab70ce7c8e1e221"} Apr 02 14:28:06 crc kubenswrapper[4732]: I0402 14:28:06.555040 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f52195501aa3bd44f385fecba9917094bfbb462c17cb7391fab70ce7c8e1e221" Apr 02 14:28:06 crc kubenswrapper[4732]: I0402 14:28:06.554828 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585668-p66xv" Apr 02 14:28:07 crc kubenswrapper[4732]: I0402 14:28:07.085104 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29585662-g8zxb"] Apr 02 14:28:07 crc kubenswrapper[4732]: I0402 14:28:07.094288 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29585662-g8zxb"] Apr 02 14:28:07 crc kubenswrapper[4732]: I0402 14:28:07.156818 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tqggn" Apr 02 14:28:07 crc kubenswrapper[4732]: I0402 14:28:07.156863 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tqggn" Apr 02 14:28:07 crc kubenswrapper[4732]: I0402 14:28:07.205132 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tqggn" Apr 
02 14:28:07 crc kubenswrapper[4732]: I0402 14:28:07.619227 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tqggn" Apr 02 14:28:07 crc kubenswrapper[4732]: I0402 14:28:07.672380 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tqggn"] Apr 02 14:28:08 crc kubenswrapper[4732]: I0402 14:28:08.700284 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11cf71a3-dcfe-48e1-a37f-064849f9af8c" path="/var/lib/kubelet/pods/11cf71a3-dcfe-48e1-a37f-064849f9af8c/volumes" Apr 02 14:28:09 crc kubenswrapper[4732]: I0402 14:28:09.581905 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tqggn" podUID="803381a8-7c12-4270-bf63-f3121ec1b870" containerName="registry-server" containerID="cri-o://7d6b5f5604d4751eeb7a9e7d094cf69a2b490f9fda78a796be5caceb853b4049" gracePeriod=2 Apr 02 14:28:10 crc kubenswrapper[4732]: I0402 14:28:10.098412 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tqggn" Apr 02 14:28:10 crc kubenswrapper[4732]: I0402 14:28:10.201560 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfpkp\" (UniqueName: \"kubernetes.io/projected/803381a8-7c12-4270-bf63-f3121ec1b870-kube-api-access-mfpkp\") pod \"803381a8-7c12-4270-bf63-f3121ec1b870\" (UID: \"803381a8-7c12-4270-bf63-f3121ec1b870\") " Apr 02 14:28:10 crc kubenswrapper[4732]: I0402 14:28:10.201735 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/803381a8-7c12-4270-bf63-f3121ec1b870-utilities\") pod \"803381a8-7c12-4270-bf63-f3121ec1b870\" (UID: \"803381a8-7c12-4270-bf63-f3121ec1b870\") " Apr 02 14:28:10 crc kubenswrapper[4732]: I0402 14:28:10.201794 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/803381a8-7c12-4270-bf63-f3121ec1b870-catalog-content\") pod \"803381a8-7c12-4270-bf63-f3121ec1b870\" (UID: \"803381a8-7c12-4270-bf63-f3121ec1b870\") " Apr 02 14:28:10 crc kubenswrapper[4732]: I0402 14:28:10.202548 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/803381a8-7c12-4270-bf63-f3121ec1b870-utilities" (OuterVolumeSpecName: "utilities") pod "803381a8-7c12-4270-bf63-f3121ec1b870" (UID: "803381a8-7c12-4270-bf63-f3121ec1b870"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 14:28:10 crc kubenswrapper[4732]: I0402 14:28:10.207717 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/803381a8-7c12-4270-bf63-f3121ec1b870-kube-api-access-mfpkp" (OuterVolumeSpecName: "kube-api-access-mfpkp") pod "803381a8-7c12-4270-bf63-f3121ec1b870" (UID: "803381a8-7c12-4270-bf63-f3121ec1b870"). InnerVolumeSpecName "kube-api-access-mfpkp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:28:10 crc kubenswrapper[4732]: I0402 14:28:10.269106 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/803381a8-7c12-4270-bf63-f3121ec1b870-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "803381a8-7c12-4270-bf63-f3121ec1b870" (UID: "803381a8-7c12-4270-bf63-f3121ec1b870"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 14:28:10 crc kubenswrapper[4732]: I0402 14:28:10.303958 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/803381a8-7c12-4270-bf63-f3121ec1b870-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 02 14:28:10 crc kubenswrapper[4732]: I0402 14:28:10.303998 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfpkp\" (UniqueName: \"kubernetes.io/projected/803381a8-7c12-4270-bf63-f3121ec1b870-kube-api-access-mfpkp\") on node \"crc\" DevicePath \"\"" Apr 02 14:28:10 crc kubenswrapper[4732]: I0402 14:28:10.304019 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/803381a8-7c12-4270-bf63-f3121ec1b870-utilities\") on node \"crc\" DevicePath \"\"" Apr 02 14:28:10 crc kubenswrapper[4732]: I0402 14:28:10.616366 4732 generic.go:334] "Generic (PLEG): container finished" podID="803381a8-7c12-4270-bf63-f3121ec1b870" containerID="7d6b5f5604d4751eeb7a9e7d094cf69a2b490f9fda78a796be5caceb853b4049" exitCode=0 Apr 02 14:28:10 crc kubenswrapper[4732]: I0402 14:28:10.616471 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tqggn" event={"ID":"803381a8-7c12-4270-bf63-f3121ec1b870","Type":"ContainerDied","Data":"7d6b5f5604d4751eeb7a9e7d094cf69a2b490f9fda78a796be5caceb853b4049"} Apr 02 14:28:10 crc kubenswrapper[4732]: I0402 14:28:10.617032 4732 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-tqggn" event={"ID":"803381a8-7c12-4270-bf63-f3121ec1b870","Type":"ContainerDied","Data":"54a3cab5b097bb59efc13ecf4732303a4a5415df25cb9da5a841ccc3c324ca85"} Apr 02 14:28:10 crc kubenswrapper[4732]: I0402 14:28:10.617057 4732 scope.go:117] "RemoveContainer" containerID="7d6b5f5604d4751eeb7a9e7d094cf69a2b490f9fda78a796be5caceb853b4049" Apr 02 14:28:10 crc kubenswrapper[4732]: I0402 14:28:10.616565 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tqggn" Apr 02 14:28:10 crc kubenswrapper[4732]: I0402 14:28:10.642489 4732 scope.go:117] "RemoveContainer" containerID="e1115c4caf2acb31826842f326979f8b1ff9603e8808ab0645099f64e18bb220" Apr 02 14:28:10 crc kubenswrapper[4732]: I0402 14:28:10.661966 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tqggn"] Apr 02 14:28:10 crc kubenswrapper[4732]: I0402 14:28:10.673674 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tqggn"] Apr 02 14:28:10 crc kubenswrapper[4732]: I0402 14:28:10.700999 4732 scope.go:117] "RemoveContainer" containerID="e58dbe27718570d86c79df08c243948e267da50bcb50da81dc81cae6e2f09ea8" Apr 02 14:28:10 crc kubenswrapper[4732]: I0402 14:28:10.715739 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="803381a8-7c12-4270-bf63-f3121ec1b870" path="/var/lib/kubelet/pods/803381a8-7c12-4270-bf63-f3121ec1b870/volumes" Apr 02 14:28:10 crc kubenswrapper[4732]: I0402 14:28:10.742825 4732 scope.go:117] "RemoveContainer" containerID="7d6b5f5604d4751eeb7a9e7d094cf69a2b490f9fda78a796be5caceb853b4049" Apr 02 14:28:10 crc kubenswrapper[4732]: E0402 14:28:10.745413 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d6b5f5604d4751eeb7a9e7d094cf69a2b490f9fda78a796be5caceb853b4049\": container with ID 
starting with 7d6b5f5604d4751eeb7a9e7d094cf69a2b490f9fda78a796be5caceb853b4049 not found: ID does not exist" containerID="7d6b5f5604d4751eeb7a9e7d094cf69a2b490f9fda78a796be5caceb853b4049" Apr 02 14:28:10 crc kubenswrapper[4732]: I0402 14:28:10.745508 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d6b5f5604d4751eeb7a9e7d094cf69a2b490f9fda78a796be5caceb853b4049"} err="failed to get container status \"7d6b5f5604d4751eeb7a9e7d094cf69a2b490f9fda78a796be5caceb853b4049\": rpc error: code = NotFound desc = could not find container \"7d6b5f5604d4751eeb7a9e7d094cf69a2b490f9fda78a796be5caceb853b4049\": container with ID starting with 7d6b5f5604d4751eeb7a9e7d094cf69a2b490f9fda78a796be5caceb853b4049 not found: ID does not exist" Apr 02 14:28:10 crc kubenswrapper[4732]: I0402 14:28:10.745540 4732 scope.go:117] "RemoveContainer" containerID="e1115c4caf2acb31826842f326979f8b1ff9603e8808ab0645099f64e18bb220" Apr 02 14:28:10 crc kubenswrapper[4732]: E0402 14:28:10.746177 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1115c4caf2acb31826842f326979f8b1ff9603e8808ab0645099f64e18bb220\": container with ID starting with e1115c4caf2acb31826842f326979f8b1ff9603e8808ab0645099f64e18bb220 not found: ID does not exist" containerID="e1115c4caf2acb31826842f326979f8b1ff9603e8808ab0645099f64e18bb220" Apr 02 14:28:10 crc kubenswrapper[4732]: I0402 14:28:10.746202 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1115c4caf2acb31826842f326979f8b1ff9603e8808ab0645099f64e18bb220"} err="failed to get container status \"e1115c4caf2acb31826842f326979f8b1ff9603e8808ab0645099f64e18bb220\": rpc error: code = NotFound desc = could not find container \"e1115c4caf2acb31826842f326979f8b1ff9603e8808ab0645099f64e18bb220\": container with ID starting with e1115c4caf2acb31826842f326979f8b1ff9603e8808ab0645099f64e18bb220 not found: 
ID does not exist" Apr 02 14:28:10 crc kubenswrapper[4732]: I0402 14:28:10.746235 4732 scope.go:117] "RemoveContainer" containerID="e58dbe27718570d86c79df08c243948e267da50bcb50da81dc81cae6e2f09ea8" Apr 02 14:28:10 crc kubenswrapper[4732]: E0402 14:28:10.746532 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e58dbe27718570d86c79df08c243948e267da50bcb50da81dc81cae6e2f09ea8\": container with ID starting with e58dbe27718570d86c79df08c243948e267da50bcb50da81dc81cae6e2f09ea8 not found: ID does not exist" containerID="e58dbe27718570d86c79df08c243948e267da50bcb50da81dc81cae6e2f09ea8" Apr 02 14:28:10 crc kubenswrapper[4732]: I0402 14:28:10.746569 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e58dbe27718570d86c79df08c243948e267da50bcb50da81dc81cae6e2f09ea8"} err="failed to get container status \"e58dbe27718570d86c79df08c243948e267da50bcb50da81dc81cae6e2f09ea8\": rpc error: code = NotFound desc = could not find container \"e58dbe27718570d86c79df08c243948e267da50bcb50da81dc81cae6e2f09ea8\": container with ID starting with e58dbe27718570d86c79df08c243948e267da50bcb50da81dc81cae6e2f09ea8 not found: ID does not exist" Apr 02 14:28:40 crc kubenswrapper[4732]: I0402 14:28:40.121572 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pjzhz"] Apr 02 14:28:40 crc kubenswrapper[4732]: E0402 14:28:40.122544 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="803381a8-7c12-4270-bf63-f3121ec1b870" containerName="extract-content" Apr 02 14:28:40 crc kubenswrapper[4732]: I0402 14:28:40.122557 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="803381a8-7c12-4270-bf63-f3121ec1b870" containerName="extract-content" Apr 02 14:28:40 crc kubenswrapper[4732]: E0402 14:28:40.122573 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="803381a8-7c12-4270-bf63-f3121ec1b870" 
containerName="extract-utilities" Apr 02 14:28:40 crc kubenswrapper[4732]: I0402 14:28:40.122579 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="803381a8-7c12-4270-bf63-f3121ec1b870" containerName="extract-utilities" Apr 02 14:28:40 crc kubenswrapper[4732]: E0402 14:28:40.122598 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="803381a8-7c12-4270-bf63-f3121ec1b870" containerName="registry-server" Apr 02 14:28:40 crc kubenswrapper[4732]: I0402 14:28:40.122604 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="803381a8-7c12-4270-bf63-f3121ec1b870" containerName="registry-server" Apr 02 14:28:40 crc kubenswrapper[4732]: E0402 14:28:40.122883 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e48ef20-b922-4518-98fa-3286d5da4b2e" containerName="oc" Apr 02 14:28:40 crc kubenswrapper[4732]: I0402 14:28:40.122893 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e48ef20-b922-4518-98fa-3286d5da4b2e" containerName="oc" Apr 02 14:28:40 crc kubenswrapper[4732]: I0402 14:28:40.123135 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="803381a8-7c12-4270-bf63-f3121ec1b870" containerName="registry-server" Apr 02 14:28:40 crc kubenswrapper[4732]: I0402 14:28:40.123147 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e48ef20-b922-4518-98fa-3286d5da4b2e" containerName="oc" Apr 02 14:28:40 crc kubenswrapper[4732]: I0402 14:28:40.124772 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pjzhz" Apr 02 14:28:40 crc kubenswrapper[4732]: I0402 14:28:40.141035 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pjzhz"] Apr 02 14:28:40 crc kubenswrapper[4732]: I0402 14:28:40.294243 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/def95779-2cb9-49f6-844a-a1cc92bd3821-utilities\") pod \"redhat-marketplace-pjzhz\" (UID: \"def95779-2cb9-49f6-844a-a1cc92bd3821\") " pod="openshift-marketplace/redhat-marketplace-pjzhz" Apr 02 14:28:40 crc kubenswrapper[4732]: I0402 14:28:40.294294 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j72zt\" (UniqueName: \"kubernetes.io/projected/def95779-2cb9-49f6-844a-a1cc92bd3821-kube-api-access-j72zt\") pod \"redhat-marketplace-pjzhz\" (UID: \"def95779-2cb9-49f6-844a-a1cc92bd3821\") " pod="openshift-marketplace/redhat-marketplace-pjzhz" Apr 02 14:28:40 crc kubenswrapper[4732]: I0402 14:28:40.294447 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/def95779-2cb9-49f6-844a-a1cc92bd3821-catalog-content\") pod \"redhat-marketplace-pjzhz\" (UID: \"def95779-2cb9-49f6-844a-a1cc92bd3821\") " pod="openshift-marketplace/redhat-marketplace-pjzhz" Apr 02 14:28:40 crc kubenswrapper[4732]: I0402 14:28:40.396518 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/def95779-2cb9-49f6-844a-a1cc92bd3821-utilities\") pod \"redhat-marketplace-pjzhz\" (UID: \"def95779-2cb9-49f6-844a-a1cc92bd3821\") " pod="openshift-marketplace/redhat-marketplace-pjzhz" Apr 02 14:28:40 crc kubenswrapper[4732]: I0402 14:28:40.396584 4732 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-j72zt\" (UniqueName: \"kubernetes.io/projected/def95779-2cb9-49f6-844a-a1cc92bd3821-kube-api-access-j72zt\") pod \"redhat-marketplace-pjzhz\" (UID: \"def95779-2cb9-49f6-844a-a1cc92bd3821\") " pod="openshift-marketplace/redhat-marketplace-pjzhz" Apr 02 14:28:40 crc kubenswrapper[4732]: I0402 14:28:40.396759 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/def95779-2cb9-49f6-844a-a1cc92bd3821-catalog-content\") pod \"redhat-marketplace-pjzhz\" (UID: \"def95779-2cb9-49f6-844a-a1cc92bd3821\") " pod="openshift-marketplace/redhat-marketplace-pjzhz" Apr 02 14:28:40 crc kubenswrapper[4732]: I0402 14:28:40.397071 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/def95779-2cb9-49f6-844a-a1cc92bd3821-utilities\") pod \"redhat-marketplace-pjzhz\" (UID: \"def95779-2cb9-49f6-844a-a1cc92bd3821\") " pod="openshift-marketplace/redhat-marketplace-pjzhz" Apr 02 14:28:40 crc kubenswrapper[4732]: I0402 14:28:40.397106 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/def95779-2cb9-49f6-844a-a1cc92bd3821-catalog-content\") pod \"redhat-marketplace-pjzhz\" (UID: \"def95779-2cb9-49f6-844a-a1cc92bd3821\") " pod="openshift-marketplace/redhat-marketplace-pjzhz" Apr 02 14:28:40 crc kubenswrapper[4732]: I0402 14:28:40.418547 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j72zt\" (UniqueName: \"kubernetes.io/projected/def95779-2cb9-49f6-844a-a1cc92bd3821-kube-api-access-j72zt\") pod \"redhat-marketplace-pjzhz\" (UID: \"def95779-2cb9-49f6-844a-a1cc92bd3821\") " pod="openshift-marketplace/redhat-marketplace-pjzhz" Apr 02 14:28:40 crc kubenswrapper[4732]: I0402 14:28:40.459543 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pjzhz" Apr 02 14:28:40 crc kubenswrapper[4732]: I0402 14:28:40.951647 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pjzhz"] Apr 02 14:28:41 crc kubenswrapper[4732]: I0402 14:28:41.939825 4732 generic.go:334] "Generic (PLEG): container finished" podID="def95779-2cb9-49f6-844a-a1cc92bd3821" containerID="94d19481eff2cd6a53bf189aac4e35b8fe91d15d4964ddbb98c05a3218ca7899" exitCode=0 Apr 02 14:28:41 crc kubenswrapper[4732]: I0402 14:28:41.939873 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pjzhz" event={"ID":"def95779-2cb9-49f6-844a-a1cc92bd3821","Type":"ContainerDied","Data":"94d19481eff2cd6a53bf189aac4e35b8fe91d15d4964ddbb98c05a3218ca7899"} Apr 02 14:28:41 crc kubenswrapper[4732]: I0402 14:28:41.940179 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pjzhz" event={"ID":"def95779-2cb9-49f6-844a-a1cc92bd3821","Type":"ContainerStarted","Data":"f92272ae1f95fb00256bcff651598e21ebe555c2370736587d1b8a147b626d5f"} Apr 02 14:28:42 crc kubenswrapper[4732]: I0402 14:28:42.950426 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pjzhz" event={"ID":"def95779-2cb9-49f6-844a-a1cc92bd3821","Type":"ContainerStarted","Data":"57a318bec13eaa4e8929a815ac8946da5d7f7f45b54c54fe7a3be00936a5899a"} Apr 02 14:28:43 crc kubenswrapper[4732]: I0402 14:28:43.962511 4732 generic.go:334] "Generic (PLEG): container finished" podID="def95779-2cb9-49f6-844a-a1cc92bd3821" containerID="57a318bec13eaa4e8929a815ac8946da5d7f7f45b54c54fe7a3be00936a5899a" exitCode=0 Apr 02 14:28:43 crc kubenswrapper[4732]: I0402 14:28:43.962645 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pjzhz" 
event={"ID":"def95779-2cb9-49f6-844a-a1cc92bd3821","Type":"ContainerDied","Data":"57a318bec13eaa4e8929a815ac8946da5d7f7f45b54c54fe7a3be00936a5899a"} Apr 02 14:28:44 crc kubenswrapper[4732]: I0402 14:28:44.973791 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pjzhz" event={"ID":"def95779-2cb9-49f6-844a-a1cc92bd3821","Type":"ContainerStarted","Data":"ee8c7d127155d5d4615f5554c694fa1c48118a2312b4c5804a09cc1b4f0ce465"} Apr 02 14:28:44 crc kubenswrapper[4732]: I0402 14:28:44.999241 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pjzhz" podStartSLOduration=2.520674212 podStartE2EDuration="4.999214017s" podCreationTimestamp="2026-04-02 14:28:40 +0000 UTC" firstStartedPulling="2026-04-02 14:28:41.941957482 +0000 UTC m=+3078.846365035" lastFinishedPulling="2026-04-02 14:28:44.420497287 +0000 UTC m=+3081.324904840" observedRunningTime="2026-04-02 14:28:44.990292555 +0000 UTC m=+3081.894700158" watchObservedRunningTime="2026-04-02 14:28:44.999214017 +0000 UTC m=+3081.903621570" Apr 02 14:28:50 crc kubenswrapper[4732]: I0402 14:28:50.460981 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pjzhz" Apr 02 14:28:50 crc kubenswrapper[4732]: I0402 14:28:50.461467 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pjzhz" Apr 02 14:28:50 crc kubenswrapper[4732]: I0402 14:28:50.538967 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pjzhz" Apr 02 14:28:51 crc kubenswrapper[4732]: I0402 14:28:51.075894 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pjzhz" Apr 02 14:28:51 crc kubenswrapper[4732]: I0402 14:28:51.122086 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-pjzhz"] Apr 02 14:28:52 crc kubenswrapper[4732]: I0402 14:28:52.152782 4732 scope.go:117] "RemoveContainer" containerID="74ee7f47e875de8d4547eff19fe786f49f38d4ce0193a1f3cda4dc35cd73caf4" Apr 02 14:28:53 crc kubenswrapper[4732]: I0402 14:28:53.051799 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pjzhz" podUID="def95779-2cb9-49f6-844a-a1cc92bd3821" containerName="registry-server" containerID="cri-o://ee8c7d127155d5d4615f5554c694fa1c48118a2312b4c5804a09cc1b4f0ce465" gracePeriod=2 Apr 02 14:28:53 crc kubenswrapper[4732]: I0402 14:28:53.553909 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pjzhz" Apr 02 14:28:53 crc kubenswrapper[4732]: I0402 14:28:53.677448 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/def95779-2cb9-49f6-844a-a1cc92bd3821-catalog-content\") pod \"def95779-2cb9-49f6-844a-a1cc92bd3821\" (UID: \"def95779-2cb9-49f6-844a-a1cc92bd3821\") " Apr 02 14:28:53 crc kubenswrapper[4732]: I0402 14:28:53.677656 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/def95779-2cb9-49f6-844a-a1cc92bd3821-utilities\") pod \"def95779-2cb9-49f6-844a-a1cc92bd3821\" (UID: \"def95779-2cb9-49f6-844a-a1cc92bd3821\") " Apr 02 14:28:53 crc kubenswrapper[4732]: I0402 14:28:53.677704 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j72zt\" (UniqueName: \"kubernetes.io/projected/def95779-2cb9-49f6-844a-a1cc92bd3821-kube-api-access-j72zt\") pod \"def95779-2cb9-49f6-844a-a1cc92bd3821\" (UID: \"def95779-2cb9-49f6-844a-a1cc92bd3821\") " Apr 02 14:28:53 crc kubenswrapper[4732]: I0402 14:28:53.679249 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/empty-dir/def95779-2cb9-49f6-844a-a1cc92bd3821-utilities" (OuterVolumeSpecName: "utilities") pod "def95779-2cb9-49f6-844a-a1cc92bd3821" (UID: "def95779-2cb9-49f6-844a-a1cc92bd3821"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 14:28:53 crc kubenswrapper[4732]: I0402 14:28:53.682978 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/def95779-2cb9-49f6-844a-a1cc92bd3821-kube-api-access-j72zt" (OuterVolumeSpecName: "kube-api-access-j72zt") pod "def95779-2cb9-49f6-844a-a1cc92bd3821" (UID: "def95779-2cb9-49f6-844a-a1cc92bd3821"). InnerVolumeSpecName "kube-api-access-j72zt". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:28:53 crc kubenswrapper[4732]: I0402 14:28:53.701666 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/def95779-2cb9-49f6-844a-a1cc92bd3821-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "def95779-2cb9-49f6-844a-a1cc92bd3821" (UID: "def95779-2cb9-49f6-844a-a1cc92bd3821"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 14:28:53 crc kubenswrapper[4732]: I0402 14:28:53.779901 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/def95779-2cb9-49f6-844a-a1cc92bd3821-utilities\") on node \"crc\" DevicePath \"\"" Apr 02 14:28:53 crc kubenswrapper[4732]: I0402 14:28:53.780166 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j72zt\" (UniqueName: \"kubernetes.io/projected/def95779-2cb9-49f6-844a-a1cc92bd3821-kube-api-access-j72zt\") on node \"crc\" DevicePath \"\"" Apr 02 14:28:53 crc kubenswrapper[4732]: I0402 14:28:53.780196 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/def95779-2cb9-49f6-844a-a1cc92bd3821-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 02 14:28:54 crc kubenswrapper[4732]: I0402 14:28:54.062719 4732 generic.go:334] "Generic (PLEG): container finished" podID="def95779-2cb9-49f6-844a-a1cc92bd3821" containerID="ee8c7d127155d5d4615f5554c694fa1c48118a2312b4c5804a09cc1b4f0ce465" exitCode=0 Apr 02 14:28:54 crc kubenswrapper[4732]: I0402 14:28:54.062767 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pjzhz" event={"ID":"def95779-2cb9-49f6-844a-a1cc92bd3821","Type":"ContainerDied","Data":"ee8c7d127155d5d4615f5554c694fa1c48118a2312b4c5804a09cc1b4f0ce465"} Apr 02 14:28:54 crc kubenswrapper[4732]: I0402 14:28:54.062808 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pjzhz" event={"ID":"def95779-2cb9-49f6-844a-a1cc92bd3821","Type":"ContainerDied","Data":"f92272ae1f95fb00256bcff651598e21ebe555c2370736587d1b8a147b626d5f"} Apr 02 14:28:54 crc kubenswrapper[4732]: I0402 14:28:54.062805 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pjzhz" Apr 02 14:28:54 crc kubenswrapper[4732]: I0402 14:28:54.062829 4732 scope.go:117] "RemoveContainer" containerID="ee8c7d127155d5d4615f5554c694fa1c48118a2312b4c5804a09cc1b4f0ce465" Apr 02 14:28:54 crc kubenswrapper[4732]: I0402 14:28:54.089461 4732 scope.go:117] "RemoveContainer" containerID="57a318bec13eaa4e8929a815ac8946da5d7f7f45b54c54fe7a3be00936a5899a" Apr 02 14:28:54 crc kubenswrapper[4732]: I0402 14:28:54.139764 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pjzhz"] Apr 02 14:28:54 crc kubenswrapper[4732]: I0402 14:28:54.148700 4732 scope.go:117] "RemoveContainer" containerID="94d19481eff2cd6a53bf189aac4e35b8fe91d15d4964ddbb98c05a3218ca7899" Apr 02 14:28:54 crc kubenswrapper[4732]: I0402 14:28:54.178241 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pjzhz"] Apr 02 14:28:54 crc kubenswrapper[4732]: I0402 14:28:54.259603 4732 scope.go:117] "RemoveContainer" containerID="ee8c7d127155d5d4615f5554c694fa1c48118a2312b4c5804a09cc1b4f0ce465" Apr 02 14:28:54 crc kubenswrapper[4732]: E0402 14:28:54.263524 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee8c7d127155d5d4615f5554c694fa1c48118a2312b4c5804a09cc1b4f0ce465\": container with ID starting with ee8c7d127155d5d4615f5554c694fa1c48118a2312b4c5804a09cc1b4f0ce465 not found: ID does not exist" containerID="ee8c7d127155d5d4615f5554c694fa1c48118a2312b4c5804a09cc1b4f0ce465" Apr 02 14:28:54 crc kubenswrapper[4732]: I0402 14:28:54.263566 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee8c7d127155d5d4615f5554c694fa1c48118a2312b4c5804a09cc1b4f0ce465"} err="failed to get container status \"ee8c7d127155d5d4615f5554c694fa1c48118a2312b4c5804a09cc1b4f0ce465\": rpc error: code = NotFound desc = could not find container 
\"ee8c7d127155d5d4615f5554c694fa1c48118a2312b4c5804a09cc1b4f0ce465\": container with ID starting with ee8c7d127155d5d4615f5554c694fa1c48118a2312b4c5804a09cc1b4f0ce465 not found: ID does not exist" Apr 02 14:28:54 crc kubenswrapper[4732]: I0402 14:28:54.263591 4732 scope.go:117] "RemoveContainer" containerID="57a318bec13eaa4e8929a815ac8946da5d7f7f45b54c54fe7a3be00936a5899a" Apr 02 14:28:54 crc kubenswrapper[4732]: E0402 14:28:54.264025 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57a318bec13eaa4e8929a815ac8946da5d7f7f45b54c54fe7a3be00936a5899a\": container with ID starting with 57a318bec13eaa4e8929a815ac8946da5d7f7f45b54c54fe7a3be00936a5899a not found: ID does not exist" containerID="57a318bec13eaa4e8929a815ac8946da5d7f7f45b54c54fe7a3be00936a5899a" Apr 02 14:28:54 crc kubenswrapper[4732]: I0402 14:28:54.264147 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57a318bec13eaa4e8929a815ac8946da5d7f7f45b54c54fe7a3be00936a5899a"} err="failed to get container status \"57a318bec13eaa4e8929a815ac8946da5d7f7f45b54c54fe7a3be00936a5899a\": rpc error: code = NotFound desc = could not find container \"57a318bec13eaa4e8929a815ac8946da5d7f7f45b54c54fe7a3be00936a5899a\": container with ID starting with 57a318bec13eaa4e8929a815ac8946da5d7f7f45b54c54fe7a3be00936a5899a not found: ID does not exist" Apr 02 14:28:54 crc kubenswrapper[4732]: I0402 14:28:54.264242 4732 scope.go:117] "RemoveContainer" containerID="94d19481eff2cd6a53bf189aac4e35b8fe91d15d4964ddbb98c05a3218ca7899" Apr 02 14:28:54 crc kubenswrapper[4732]: E0402 14:28:54.264710 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94d19481eff2cd6a53bf189aac4e35b8fe91d15d4964ddbb98c05a3218ca7899\": container with ID starting with 94d19481eff2cd6a53bf189aac4e35b8fe91d15d4964ddbb98c05a3218ca7899 not found: ID does not exist" 
containerID="94d19481eff2cd6a53bf189aac4e35b8fe91d15d4964ddbb98c05a3218ca7899" Apr 02 14:28:54 crc kubenswrapper[4732]: I0402 14:28:54.264733 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94d19481eff2cd6a53bf189aac4e35b8fe91d15d4964ddbb98c05a3218ca7899"} err="failed to get container status \"94d19481eff2cd6a53bf189aac4e35b8fe91d15d4964ddbb98c05a3218ca7899\": rpc error: code = NotFound desc = could not find container \"94d19481eff2cd6a53bf189aac4e35b8fe91d15d4964ddbb98c05a3218ca7899\": container with ID starting with 94d19481eff2cd6a53bf189aac4e35b8fe91d15d4964ddbb98c05a3218ca7899 not found: ID does not exist" Apr 02 14:28:54 crc kubenswrapper[4732]: I0402 14:28:54.691733 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="def95779-2cb9-49f6-844a-a1cc92bd3821" path="/var/lib/kubelet/pods/def95779-2cb9-49f6-844a-a1cc92bd3821/volumes" Apr 02 14:29:12 crc kubenswrapper[4732]: I0402 14:29:12.659132 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-n6wbf"] Apr 02 14:29:12 crc kubenswrapper[4732]: E0402 14:29:12.661510 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="def95779-2cb9-49f6-844a-a1cc92bd3821" containerName="extract-content" Apr 02 14:29:12 crc kubenswrapper[4732]: I0402 14:29:12.661647 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="def95779-2cb9-49f6-844a-a1cc92bd3821" containerName="extract-content" Apr 02 14:29:12 crc kubenswrapper[4732]: E0402 14:29:12.661762 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="def95779-2cb9-49f6-844a-a1cc92bd3821" containerName="extract-utilities" Apr 02 14:29:12 crc kubenswrapper[4732]: I0402 14:29:12.661851 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="def95779-2cb9-49f6-844a-a1cc92bd3821" containerName="extract-utilities" Apr 02 14:29:12 crc kubenswrapper[4732]: E0402 14:29:12.661999 4732 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="def95779-2cb9-49f6-844a-a1cc92bd3821" containerName="registry-server" Apr 02 14:29:12 crc kubenswrapper[4732]: I0402 14:29:12.662015 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="def95779-2cb9-49f6-844a-a1cc92bd3821" containerName="registry-server" Apr 02 14:29:12 crc kubenswrapper[4732]: I0402 14:29:12.662361 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="def95779-2cb9-49f6-844a-a1cc92bd3821" containerName="registry-server" Apr 02 14:29:12 crc kubenswrapper[4732]: I0402 14:29:12.663812 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-n6wbf" Apr 02 14:29:12 crc kubenswrapper[4732]: I0402 14:29:12.675386 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-n6wbf"] Apr 02 14:29:12 crc kubenswrapper[4732]: I0402 14:29:12.766167 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggrvt\" (UniqueName: \"kubernetes.io/projected/94804cde-d995-4ba4-82cc-376fbf5c49ee-kube-api-access-ggrvt\") pod \"certified-operators-n6wbf\" (UID: \"94804cde-d995-4ba4-82cc-376fbf5c49ee\") " pod="openshift-marketplace/certified-operators-n6wbf" Apr 02 14:29:12 crc kubenswrapper[4732]: I0402 14:29:12.766226 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94804cde-d995-4ba4-82cc-376fbf5c49ee-utilities\") pod \"certified-operators-n6wbf\" (UID: \"94804cde-d995-4ba4-82cc-376fbf5c49ee\") " pod="openshift-marketplace/certified-operators-n6wbf" Apr 02 14:29:12 crc kubenswrapper[4732]: I0402 14:29:12.766325 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94804cde-d995-4ba4-82cc-376fbf5c49ee-catalog-content\") pod 
\"certified-operators-n6wbf\" (UID: \"94804cde-d995-4ba4-82cc-376fbf5c49ee\") " pod="openshift-marketplace/certified-operators-n6wbf" Apr 02 14:29:12 crc kubenswrapper[4732]: I0402 14:29:12.868859 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94804cde-d995-4ba4-82cc-376fbf5c49ee-catalog-content\") pod \"certified-operators-n6wbf\" (UID: \"94804cde-d995-4ba4-82cc-376fbf5c49ee\") " pod="openshift-marketplace/certified-operators-n6wbf" Apr 02 14:29:12 crc kubenswrapper[4732]: I0402 14:29:12.869141 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggrvt\" (UniqueName: \"kubernetes.io/projected/94804cde-d995-4ba4-82cc-376fbf5c49ee-kube-api-access-ggrvt\") pod \"certified-operators-n6wbf\" (UID: \"94804cde-d995-4ba4-82cc-376fbf5c49ee\") " pod="openshift-marketplace/certified-operators-n6wbf" Apr 02 14:29:12 crc kubenswrapper[4732]: I0402 14:29:12.869181 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94804cde-d995-4ba4-82cc-376fbf5c49ee-utilities\") pod \"certified-operators-n6wbf\" (UID: \"94804cde-d995-4ba4-82cc-376fbf5c49ee\") " pod="openshift-marketplace/certified-operators-n6wbf" Apr 02 14:29:12 crc kubenswrapper[4732]: I0402 14:29:12.869668 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94804cde-d995-4ba4-82cc-376fbf5c49ee-utilities\") pod \"certified-operators-n6wbf\" (UID: \"94804cde-d995-4ba4-82cc-376fbf5c49ee\") " pod="openshift-marketplace/certified-operators-n6wbf" Apr 02 14:29:12 crc kubenswrapper[4732]: I0402 14:29:12.869750 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94804cde-d995-4ba4-82cc-376fbf5c49ee-catalog-content\") pod \"certified-operators-n6wbf\" (UID: 
\"94804cde-d995-4ba4-82cc-376fbf5c49ee\") " pod="openshift-marketplace/certified-operators-n6wbf" Apr 02 14:29:12 crc kubenswrapper[4732]: I0402 14:29:12.887855 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggrvt\" (UniqueName: \"kubernetes.io/projected/94804cde-d995-4ba4-82cc-376fbf5c49ee-kube-api-access-ggrvt\") pod \"certified-operators-n6wbf\" (UID: \"94804cde-d995-4ba4-82cc-376fbf5c49ee\") " pod="openshift-marketplace/certified-operators-n6wbf" Apr 02 14:29:12 crc kubenswrapper[4732]: I0402 14:29:12.984213 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-n6wbf" Apr 02 14:29:13 crc kubenswrapper[4732]: I0402 14:29:13.495816 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-n6wbf"] Apr 02 14:29:14 crc kubenswrapper[4732]: I0402 14:29:14.278078 4732 generic.go:334] "Generic (PLEG): container finished" podID="94804cde-d995-4ba4-82cc-376fbf5c49ee" containerID="e7fc8e2cff5e1fde1913f541295b8dc9cd7c83017478c53100a5ba48f7e7f3d4" exitCode=0 Apr 02 14:29:14 crc kubenswrapper[4732]: I0402 14:29:14.278148 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n6wbf" event={"ID":"94804cde-d995-4ba4-82cc-376fbf5c49ee","Type":"ContainerDied","Data":"e7fc8e2cff5e1fde1913f541295b8dc9cd7c83017478c53100a5ba48f7e7f3d4"} Apr 02 14:29:14 crc kubenswrapper[4732]: I0402 14:29:14.278189 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n6wbf" event={"ID":"94804cde-d995-4ba4-82cc-376fbf5c49ee","Type":"ContainerStarted","Data":"28b39f1a30aa5d9d577176236f9d05fdc5f32e67961831854f196a9ca9043b33"} Apr 02 14:29:14 crc kubenswrapper[4732]: I0402 14:29:14.281262 4732 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 02 14:29:15 crc kubenswrapper[4732]: I0402 14:29:15.288545 4732 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n6wbf" event={"ID":"94804cde-d995-4ba4-82cc-376fbf5c49ee","Type":"ContainerStarted","Data":"b9a38dcef6f1904fa376f14680950375753eef1e57413c302b3c048be55d5afb"} Apr 02 14:29:16 crc kubenswrapper[4732]: I0402 14:29:16.303332 4732 generic.go:334] "Generic (PLEG): container finished" podID="94804cde-d995-4ba4-82cc-376fbf5c49ee" containerID="b9a38dcef6f1904fa376f14680950375753eef1e57413c302b3c048be55d5afb" exitCode=0 Apr 02 14:29:16 crc kubenswrapper[4732]: I0402 14:29:16.303382 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n6wbf" event={"ID":"94804cde-d995-4ba4-82cc-376fbf5c49ee","Type":"ContainerDied","Data":"b9a38dcef6f1904fa376f14680950375753eef1e57413c302b3c048be55d5afb"} Apr 02 14:29:17 crc kubenswrapper[4732]: I0402 14:29:17.314518 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n6wbf" event={"ID":"94804cde-d995-4ba4-82cc-376fbf5c49ee","Type":"ContainerStarted","Data":"8d6347dc7a33aa03e931ef6b2fec4c0e6a82f64b34e45688a3af9496c6b716d5"} Apr 02 14:29:17 crc kubenswrapper[4732]: I0402 14:29:17.335214 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-n6wbf" podStartSLOduration=2.86069625 podStartE2EDuration="5.335155372s" podCreationTimestamp="2026-04-02 14:29:12 +0000 UTC" firstStartedPulling="2026-04-02 14:29:14.2809593 +0000 UTC m=+3111.185366873" lastFinishedPulling="2026-04-02 14:29:16.755418432 +0000 UTC m=+3113.659825995" observedRunningTime="2026-04-02 14:29:17.33213831 +0000 UTC m=+3114.236545883" watchObservedRunningTime="2026-04-02 14:29:17.335155372 +0000 UTC m=+3114.239562965" Apr 02 14:29:22 crc kubenswrapper[4732]: I0402 14:29:22.984408 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-n6wbf" Apr 02 14:29:22 
crc kubenswrapper[4732]: I0402 14:29:22.985449 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-n6wbf" Apr 02 14:29:23 crc kubenswrapper[4732]: I0402 14:29:23.031731 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-n6wbf" Apr 02 14:29:23 crc kubenswrapper[4732]: I0402 14:29:23.430333 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-n6wbf" Apr 02 14:29:23 crc kubenswrapper[4732]: I0402 14:29:23.504366 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-n6wbf"] Apr 02 14:29:25 crc kubenswrapper[4732]: I0402 14:29:25.397173 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-n6wbf" podUID="94804cde-d995-4ba4-82cc-376fbf5c49ee" containerName="registry-server" containerID="cri-o://8d6347dc7a33aa03e931ef6b2fec4c0e6a82f64b34e45688a3af9496c6b716d5" gracePeriod=2 Apr 02 14:29:25 crc kubenswrapper[4732]: I0402 14:29:25.908445 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-n6wbf" Apr 02 14:29:26 crc kubenswrapper[4732]: I0402 14:29:26.013659 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94804cde-d995-4ba4-82cc-376fbf5c49ee-utilities\") pod \"94804cde-d995-4ba4-82cc-376fbf5c49ee\" (UID: \"94804cde-d995-4ba4-82cc-376fbf5c49ee\") " Apr 02 14:29:26 crc kubenswrapper[4732]: I0402 14:29:26.013732 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94804cde-d995-4ba4-82cc-376fbf5c49ee-catalog-content\") pod \"94804cde-d995-4ba4-82cc-376fbf5c49ee\" (UID: \"94804cde-d995-4ba4-82cc-376fbf5c49ee\") " Apr 02 14:29:26 crc kubenswrapper[4732]: I0402 14:29:26.013811 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggrvt\" (UniqueName: \"kubernetes.io/projected/94804cde-d995-4ba4-82cc-376fbf5c49ee-kube-api-access-ggrvt\") pod \"94804cde-d995-4ba4-82cc-376fbf5c49ee\" (UID: \"94804cde-d995-4ba4-82cc-376fbf5c49ee\") " Apr 02 14:29:26 crc kubenswrapper[4732]: I0402 14:29:26.014872 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94804cde-d995-4ba4-82cc-376fbf5c49ee-utilities" (OuterVolumeSpecName: "utilities") pod "94804cde-d995-4ba4-82cc-376fbf5c49ee" (UID: "94804cde-d995-4ba4-82cc-376fbf5c49ee"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 14:29:26 crc kubenswrapper[4732]: I0402 14:29:26.022927 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94804cde-d995-4ba4-82cc-376fbf5c49ee-kube-api-access-ggrvt" (OuterVolumeSpecName: "kube-api-access-ggrvt") pod "94804cde-d995-4ba4-82cc-376fbf5c49ee" (UID: "94804cde-d995-4ba4-82cc-376fbf5c49ee"). InnerVolumeSpecName "kube-api-access-ggrvt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:29:26 crc kubenswrapper[4732]: I0402 14:29:26.073422 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94804cde-d995-4ba4-82cc-376fbf5c49ee-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "94804cde-d995-4ba4-82cc-376fbf5c49ee" (UID: "94804cde-d995-4ba4-82cc-376fbf5c49ee"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 14:29:26 crc kubenswrapper[4732]: I0402 14:29:26.116261 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94804cde-d995-4ba4-82cc-376fbf5c49ee-utilities\") on node \"crc\" DevicePath \"\"" Apr 02 14:29:26 crc kubenswrapper[4732]: I0402 14:29:26.116304 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94804cde-d995-4ba4-82cc-376fbf5c49ee-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 02 14:29:26 crc kubenswrapper[4732]: I0402 14:29:26.116320 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ggrvt\" (UniqueName: \"kubernetes.io/projected/94804cde-d995-4ba4-82cc-376fbf5c49ee-kube-api-access-ggrvt\") on node \"crc\" DevicePath \"\"" Apr 02 14:29:26 crc kubenswrapper[4732]: I0402 14:29:26.415311 4732 generic.go:334] "Generic (PLEG): container finished" podID="94804cde-d995-4ba4-82cc-376fbf5c49ee" containerID="8d6347dc7a33aa03e931ef6b2fec4c0e6a82f64b34e45688a3af9496c6b716d5" exitCode=0 Apr 02 14:29:26 crc kubenswrapper[4732]: I0402 14:29:26.415366 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n6wbf" event={"ID":"94804cde-d995-4ba4-82cc-376fbf5c49ee","Type":"ContainerDied","Data":"8d6347dc7a33aa03e931ef6b2fec4c0e6a82f64b34e45688a3af9496c6b716d5"} Apr 02 14:29:26 crc kubenswrapper[4732]: I0402 14:29:26.415397 4732 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-n6wbf" event={"ID":"94804cde-d995-4ba4-82cc-376fbf5c49ee","Type":"ContainerDied","Data":"28b39f1a30aa5d9d577176236f9d05fdc5f32e67961831854f196a9ca9043b33"} Apr 02 14:29:26 crc kubenswrapper[4732]: I0402 14:29:26.415417 4732 scope.go:117] "RemoveContainer" containerID="8d6347dc7a33aa03e931ef6b2fec4c0e6a82f64b34e45688a3af9496c6b716d5" Apr 02 14:29:26 crc kubenswrapper[4732]: I0402 14:29:26.415567 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-n6wbf" Apr 02 14:29:26 crc kubenswrapper[4732]: I0402 14:29:26.464865 4732 scope.go:117] "RemoveContainer" containerID="b9a38dcef6f1904fa376f14680950375753eef1e57413c302b3c048be55d5afb" Apr 02 14:29:26 crc kubenswrapper[4732]: I0402 14:29:26.469488 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-n6wbf"] Apr 02 14:29:26 crc kubenswrapper[4732]: I0402 14:29:26.481263 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-n6wbf"] Apr 02 14:29:26 crc kubenswrapper[4732]: I0402 14:29:26.500470 4732 scope.go:117] "RemoveContainer" containerID="e7fc8e2cff5e1fde1913f541295b8dc9cd7c83017478c53100a5ba48f7e7f3d4" Apr 02 14:29:26 crc kubenswrapper[4732]: I0402 14:29:26.548018 4732 scope.go:117] "RemoveContainer" containerID="8d6347dc7a33aa03e931ef6b2fec4c0e6a82f64b34e45688a3af9496c6b716d5" Apr 02 14:29:26 crc kubenswrapper[4732]: E0402 14:29:26.548357 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d6347dc7a33aa03e931ef6b2fec4c0e6a82f64b34e45688a3af9496c6b716d5\": container with ID starting with 8d6347dc7a33aa03e931ef6b2fec4c0e6a82f64b34e45688a3af9496c6b716d5 not found: ID does not exist" containerID="8d6347dc7a33aa03e931ef6b2fec4c0e6a82f64b34e45688a3af9496c6b716d5" Apr 02 14:29:26 crc kubenswrapper[4732]: I0402 
14:29:26.548398 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d6347dc7a33aa03e931ef6b2fec4c0e6a82f64b34e45688a3af9496c6b716d5"} err="failed to get container status \"8d6347dc7a33aa03e931ef6b2fec4c0e6a82f64b34e45688a3af9496c6b716d5\": rpc error: code = NotFound desc = could not find container \"8d6347dc7a33aa03e931ef6b2fec4c0e6a82f64b34e45688a3af9496c6b716d5\": container with ID starting with 8d6347dc7a33aa03e931ef6b2fec4c0e6a82f64b34e45688a3af9496c6b716d5 not found: ID does not exist" Apr 02 14:29:26 crc kubenswrapper[4732]: I0402 14:29:26.548428 4732 scope.go:117] "RemoveContainer" containerID="b9a38dcef6f1904fa376f14680950375753eef1e57413c302b3c048be55d5afb" Apr 02 14:29:26 crc kubenswrapper[4732]: E0402 14:29:26.548847 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9a38dcef6f1904fa376f14680950375753eef1e57413c302b3c048be55d5afb\": container with ID starting with b9a38dcef6f1904fa376f14680950375753eef1e57413c302b3c048be55d5afb not found: ID does not exist" containerID="b9a38dcef6f1904fa376f14680950375753eef1e57413c302b3c048be55d5afb" Apr 02 14:29:26 crc kubenswrapper[4732]: I0402 14:29:26.548874 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9a38dcef6f1904fa376f14680950375753eef1e57413c302b3c048be55d5afb"} err="failed to get container status \"b9a38dcef6f1904fa376f14680950375753eef1e57413c302b3c048be55d5afb\": rpc error: code = NotFound desc = could not find container \"b9a38dcef6f1904fa376f14680950375753eef1e57413c302b3c048be55d5afb\": container with ID starting with b9a38dcef6f1904fa376f14680950375753eef1e57413c302b3c048be55d5afb not found: ID does not exist" Apr 02 14:29:26 crc kubenswrapper[4732]: I0402 14:29:26.548896 4732 scope.go:117] "RemoveContainer" containerID="e7fc8e2cff5e1fde1913f541295b8dc9cd7c83017478c53100a5ba48f7e7f3d4" Apr 02 14:29:26 crc 
kubenswrapper[4732]: E0402 14:29:26.549175 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7fc8e2cff5e1fde1913f541295b8dc9cd7c83017478c53100a5ba48f7e7f3d4\": container with ID starting with e7fc8e2cff5e1fde1913f541295b8dc9cd7c83017478c53100a5ba48f7e7f3d4 not found: ID does not exist" containerID="e7fc8e2cff5e1fde1913f541295b8dc9cd7c83017478c53100a5ba48f7e7f3d4" Apr 02 14:29:26 crc kubenswrapper[4732]: I0402 14:29:26.549219 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7fc8e2cff5e1fde1913f541295b8dc9cd7c83017478c53100a5ba48f7e7f3d4"} err="failed to get container status \"e7fc8e2cff5e1fde1913f541295b8dc9cd7c83017478c53100a5ba48f7e7f3d4\": rpc error: code = NotFound desc = could not find container \"e7fc8e2cff5e1fde1913f541295b8dc9cd7c83017478c53100a5ba48f7e7f3d4\": container with ID starting with e7fc8e2cff5e1fde1913f541295b8dc9cd7c83017478c53100a5ba48f7e7f3d4 not found: ID does not exist" Apr 02 14:29:26 crc kubenswrapper[4732]: I0402 14:29:26.700949 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94804cde-d995-4ba4-82cc-376fbf5c49ee" path="/var/lib/kubelet/pods/94804cde-d995-4ba4-82cc-376fbf5c49ee/volumes" Apr 02 14:30:00 crc kubenswrapper[4732]: I0402 14:30:00.147293 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29585670-xrnq5"] Apr 02 14:30:00 crc kubenswrapper[4732]: E0402 14:30:00.148606 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94804cde-d995-4ba4-82cc-376fbf5c49ee" containerName="registry-server" Apr 02 14:30:00 crc kubenswrapper[4732]: I0402 14:30:00.148652 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="94804cde-d995-4ba4-82cc-376fbf5c49ee" containerName="registry-server" Apr 02 14:30:00 crc kubenswrapper[4732]: E0402 14:30:00.148681 4732 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="94804cde-d995-4ba4-82cc-376fbf5c49ee" containerName="extract-utilities" Apr 02 14:30:00 crc kubenswrapper[4732]: I0402 14:30:00.148692 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="94804cde-d995-4ba4-82cc-376fbf5c49ee" containerName="extract-utilities" Apr 02 14:30:00 crc kubenswrapper[4732]: E0402 14:30:00.148725 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94804cde-d995-4ba4-82cc-376fbf5c49ee" containerName="extract-content" Apr 02 14:30:00 crc kubenswrapper[4732]: I0402 14:30:00.148737 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="94804cde-d995-4ba4-82cc-376fbf5c49ee" containerName="extract-content" Apr 02 14:30:00 crc kubenswrapper[4732]: I0402 14:30:00.149115 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="94804cde-d995-4ba4-82cc-376fbf5c49ee" containerName="registry-server" Apr 02 14:30:00 crc kubenswrapper[4732]: I0402 14:30:00.150201 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585670-xrnq5" Apr 02 14:30:00 crc kubenswrapper[4732]: I0402 14:30:00.153364 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 02 14:30:00 crc kubenswrapper[4732]: I0402 14:30:00.153443 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-69g42" Apr 02 14:30:00 crc kubenswrapper[4732]: I0402 14:30:00.154284 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 02 14:30:00 crc kubenswrapper[4732]: I0402 14:30:00.160939 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29585670-wrh58"] Apr 02 14:30:00 crc kubenswrapper[4732]: I0402 14:30:00.162772 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29585670-wrh58" Apr 02 14:30:00 crc kubenswrapper[4732]: I0402 14:30:00.164601 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Apr 02 14:30:00 crc kubenswrapper[4732]: I0402 14:30:00.165213 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Apr 02 14:30:00 crc kubenswrapper[4732]: I0402 14:30:00.171001 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585670-xrnq5"] Apr 02 14:30:00 crc kubenswrapper[4732]: I0402 14:30:00.182097 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29585670-wrh58"] Apr 02 14:30:00 crc kubenswrapper[4732]: I0402 14:30:00.282745 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6a987f5d-b61e-43ba-a548-91b3dd5ed700-secret-volume\") pod \"collect-profiles-29585670-wrh58\" (UID: \"6a987f5d-b61e-43ba-a548-91b3dd5ed700\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29585670-wrh58" Apr 02 14:30:00 crc kubenswrapper[4732]: I0402 14:30:00.282876 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6a987f5d-b61e-43ba-a548-91b3dd5ed700-config-volume\") pod \"collect-profiles-29585670-wrh58\" (UID: \"6a987f5d-b61e-43ba-a548-91b3dd5ed700\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29585670-wrh58" Apr 02 14:30:00 crc kubenswrapper[4732]: I0402 14:30:00.282944 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqxmq\" (UniqueName: 
\"kubernetes.io/projected/6a987f5d-b61e-43ba-a548-91b3dd5ed700-kube-api-access-vqxmq\") pod \"collect-profiles-29585670-wrh58\" (UID: \"6a987f5d-b61e-43ba-a548-91b3dd5ed700\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29585670-wrh58"
Apr 02 14:30:00 crc kubenswrapper[4732]: I0402 14:30:00.283003 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xchrw\" (UniqueName: \"kubernetes.io/projected/17812719-d9ec-4abb-9a8b-d65e27c6ef30-kube-api-access-xchrw\") pod \"auto-csr-approver-29585670-xrnq5\" (UID: \"17812719-d9ec-4abb-9a8b-d65e27c6ef30\") " pod="openshift-infra/auto-csr-approver-29585670-xrnq5"
Apr 02 14:30:00 crc kubenswrapper[4732]: I0402 14:30:00.385008 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqxmq\" (UniqueName: \"kubernetes.io/projected/6a987f5d-b61e-43ba-a548-91b3dd5ed700-kube-api-access-vqxmq\") pod \"collect-profiles-29585670-wrh58\" (UID: \"6a987f5d-b61e-43ba-a548-91b3dd5ed700\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29585670-wrh58"
Apr 02 14:30:00 crc kubenswrapper[4732]: I0402 14:30:00.385115 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xchrw\" (UniqueName: \"kubernetes.io/projected/17812719-d9ec-4abb-9a8b-d65e27c6ef30-kube-api-access-xchrw\") pod \"auto-csr-approver-29585670-xrnq5\" (UID: \"17812719-d9ec-4abb-9a8b-d65e27c6ef30\") " pod="openshift-infra/auto-csr-approver-29585670-xrnq5"
Apr 02 14:30:00 crc kubenswrapper[4732]: I0402 14:30:00.385202 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6a987f5d-b61e-43ba-a548-91b3dd5ed700-secret-volume\") pod \"collect-profiles-29585670-wrh58\" (UID: \"6a987f5d-b61e-43ba-a548-91b3dd5ed700\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29585670-wrh58"
Apr 02 14:30:00 crc kubenswrapper[4732]: I0402 14:30:00.385284 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6a987f5d-b61e-43ba-a548-91b3dd5ed700-config-volume\") pod \"collect-profiles-29585670-wrh58\" (UID: \"6a987f5d-b61e-43ba-a548-91b3dd5ed700\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29585670-wrh58"
Apr 02 14:30:00 crc kubenswrapper[4732]: I0402 14:30:00.386865 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6a987f5d-b61e-43ba-a548-91b3dd5ed700-config-volume\") pod \"collect-profiles-29585670-wrh58\" (UID: \"6a987f5d-b61e-43ba-a548-91b3dd5ed700\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29585670-wrh58"
Apr 02 14:30:00 crc kubenswrapper[4732]: I0402 14:30:00.393681 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6a987f5d-b61e-43ba-a548-91b3dd5ed700-secret-volume\") pod \"collect-profiles-29585670-wrh58\" (UID: \"6a987f5d-b61e-43ba-a548-91b3dd5ed700\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29585670-wrh58"
Apr 02 14:30:00 crc kubenswrapper[4732]: I0402 14:30:00.409348 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqxmq\" (UniqueName: \"kubernetes.io/projected/6a987f5d-b61e-43ba-a548-91b3dd5ed700-kube-api-access-vqxmq\") pod \"collect-profiles-29585670-wrh58\" (UID: \"6a987f5d-b61e-43ba-a548-91b3dd5ed700\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29585670-wrh58"
Apr 02 14:30:00 crc kubenswrapper[4732]: I0402 14:30:00.411906 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xchrw\" (UniqueName: \"kubernetes.io/projected/17812719-d9ec-4abb-9a8b-d65e27c6ef30-kube-api-access-xchrw\") pod \"auto-csr-approver-29585670-xrnq5\" (UID: \"17812719-d9ec-4abb-9a8b-d65e27c6ef30\") " pod="openshift-infra/auto-csr-approver-29585670-xrnq5"
Apr 02 14:30:00 crc kubenswrapper[4732]: I0402 14:30:00.481025 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585670-xrnq5"
Apr 02 14:30:00 crc kubenswrapper[4732]: I0402 14:30:00.492952 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29585670-wrh58"
Apr 02 14:30:00 crc kubenswrapper[4732]: I0402 14:30:00.950831 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585670-xrnq5"]
Apr 02 14:30:01 crc kubenswrapper[4732]: I0402 14:30:01.019717 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29585670-wrh58"]
Apr 02 14:30:01 crc kubenswrapper[4732]: W0402 14:30:01.026362 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a987f5d_b61e_43ba_a548_91b3dd5ed700.slice/crio-4f35dd396f1f9aa01d9d56527624e6402afcc0ea8f1eae7e5ead56910d040331 WatchSource:0}: Error finding container 4f35dd396f1f9aa01d9d56527624e6402afcc0ea8f1eae7e5ead56910d040331: Status 404 returned error can't find the container with id 4f35dd396f1f9aa01d9d56527624e6402afcc0ea8f1eae7e5ead56910d040331
Apr 02 14:30:01 crc kubenswrapper[4732]: I0402 14:30:01.806208 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585670-xrnq5" event={"ID":"17812719-d9ec-4abb-9a8b-d65e27c6ef30","Type":"ContainerStarted","Data":"dd2cabcd08654b42fb34463e561c9e2d2ee8e950a6395be0781e03bfecc3b650"}
Apr 02 14:30:01 crc kubenswrapper[4732]: I0402 14:30:01.808378 4732 generic.go:334] "Generic (PLEG): container finished" podID="6a987f5d-b61e-43ba-a548-91b3dd5ed700" containerID="5ab4a8821cce61427dcbcb760065b6392f4c79fe887645c95d4fa05513f05af6" exitCode=0
Apr 02 14:30:01 crc kubenswrapper[4732]: I0402 14:30:01.808416 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29585670-wrh58" event={"ID":"6a987f5d-b61e-43ba-a548-91b3dd5ed700","Type":"ContainerDied","Data":"5ab4a8821cce61427dcbcb760065b6392f4c79fe887645c95d4fa05513f05af6"}
Apr 02 14:30:01 crc kubenswrapper[4732]: I0402 14:30:01.808546 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29585670-wrh58" event={"ID":"6a987f5d-b61e-43ba-a548-91b3dd5ed700","Type":"ContainerStarted","Data":"4f35dd396f1f9aa01d9d56527624e6402afcc0ea8f1eae7e5ead56910d040331"}
Apr 02 14:30:03 crc kubenswrapper[4732]: I0402 14:30:03.266684 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29585670-wrh58"
Apr 02 14:30:03 crc kubenswrapper[4732]: I0402 14:30:03.343608 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqxmq\" (UniqueName: \"kubernetes.io/projected/6a987f5d-b61e-43ba-a548-91b3dd5ed700-kube-api-access-vqxmq\") pod \"6a987f5d-b61e-43ba-a548-91b3dd5ed700\" (UID: \"6a987f5d-b61e-43ba-a548-91b3dd5ed700\") "
Apr 02 14:30:03 crc kubenswrapper[4732]: I0402 14:30:03.343715 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6a987f5d-b61e-43ba-a548-91b3dd5ed700-config-volume\") pod \"6a987f5d-b61e-43ba-a548-91b3dd5ed700\" (UID: \"6a987f5d-b61e-43ba-a548-91b3dd5ed700\") "
Apr 02 14:30:03 crc kubenswrapper[4732]: I0402 14:30:03.343848 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6a987f5d-b61e-43ba-a548-91b3dd5ed700-secret-volume\") pod \"6a987f5d-b61e-43ba-a548-91b3dd5ed700\" (UID: \"6a987f5d-b61e-43ba-a548-91b3dd5ed700\") "
Apr 02 14:30:03 crc kubenswrapper[4732]: I0402 14:30:03.344976 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a987f5d-b61e-43ba-a548-91b3dd5ed700-config-volume" (OuterVolumeSpecName: "config-volume") pod "6a987f5d-b61e-43ba-a548-91b3dd5ed700" (UID: "6a987f5d-b61e-43ba-a548-91b3dd5ed700"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 14:30:03 crc kubenswrapper[4732]: I0402 14:30:03.354776 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a987f5d-b61e-43ba-a548-91b3dd5ed700-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6a987f5d-b61e-43ba-a548-91b3dd5ed700" (UID: "6a987f5d-b61e-43ba-a548-91b3dd5ed700"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 14:30:03 crc kubenswrapper[4732]: I0402 14:30:03.355077 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a987f5d-b61e-43ba-a548-91b3dd5ed700-kube-api-access-vqxmq" (OuterVolumeSpecName: "kube-api-access-vqxmq") pod "6a987f5d-b61e-43ba-a548-91b3dd5ed700" (UID: "6a987f5d-b61e-43ba-a548-91b3dd5ed700"). InnerVolumeSpecName "kube-api-access-vqxmq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 14:30:03 crc kubenswrapper[4732]: I0402 14:30:03.445783 4732 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6a987f5d-b61e-43ba-a548-91b3dd5ed700-secret-volume\") on node \"crc\" DevicePath \"\""
Apr 02 14:30:03 crc kubenswrapper[4732]: I0402 14:30:03.446228 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqxmq\" (UniqueName: \"kubernetes.io/projected/6a987f5d-b61e-43ba-a548-91b3dd5ed700-kube-api-access-vqxmq\") on node \"crc\" DevicePath \"\""
Apr 02 14:30:03 crc kubenswrapper[4732]: I0402 14:30:03.446298 4732 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6a987f5d-b61e-43ba-a548-91b3dd5ed700-config-volume\") on node \"crc\" DevicePath \"\""
Apr 02 14:30:03 crc kubenswrapper[4732]: I0402 14:30:03.827519 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29585670-wrh58"
Apr 02 14:30:03 crc kubenswrapper[4732]: I0402 14:30:03.827515 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29585670-wrh58" event={"ID":"6a987f5d-b61e-43ba-a548-91b3dd5ed700","Type":"ContainerDied","Data":"4f35dd396f1f9aa01d9d56527624e6402afcc0ea8f1eae7e5ead56910d040331"}
Apr 02 14:30:03 crc kubenswrapper[4732]: I0402 14:30:03.827638 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f35dd396f1f9aa01d9d56527624e6402afcc0ea8f1eae7e5ead56910d040331"
Apr 02 14:30:03 crc kubenswrapper[4732]: I0402 14:30:03.829722 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585670-xrnq5" event={"ID":"17812719-d9ec-4abb-9a8b-d65e27c6ef30","Type":"ContainerDied","Data":"8c490903641df2d47a2f5c37dea59bed6e7b91f01c5256efa117fed3111d2c00"}
Apr 02 14:30:03 crc kubenswrapper[4732]: I0402 14:30:03.829576 4732 generic.go:334] "Generic (PLEG): container finished" podID="17812719-d9ec-4abb-9a8b-d65e27c6ef30" containerID="8c490903641df2d47a2f5c37dea59bed6e7b91f01c5256efa117fed3111d2c00" exitCode=0
Apr 02 14:30:04 crc kubenswrapper[4732]: I0402 14:30:04.339931 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29585625-5cdrx"]
Apr 02 14:30:04 crc kubenswrapper[4732]: I0402 14:30:04.348418 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29585625-5cdrx"]
Apr 02 14:30:04 crc kubenswrapper[4732]: I0402 14:30:04.697919 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48fbf37e-5802-4553-8b9a-ecb5a5a1a379" path="/var/lib/kubelet/pods/48fbf37e-5802-4553-8b9a-ecb5a5a1a379/volumes"
Apr 02 14:30:05 crc kubenswrapper[4732]: I0402 14:30:05.156068 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585670-xrnq5"
Apr 02 14:30:05 crc kubenswrapper[4732]: I0402 14:30:05.295787 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xchrw\" (UniqueName: \"kubernetes.io/projected/17812719-d9ec-4abb-9a8b-d65e27c6ef30-kube-api-access-xchrw\") pod \"17812719-d9ec-4abb-9a8b-d65e27c6ef30\" (UID: \"17812719-d9ec-4abb-9a8b-d65e27c6ef30\") "
Apr 02 14:30:05 crc kubenswrapper[4732]: I0402 14:30:05.301019 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17812719-d9ec-4abb-9a8b-d65e27c6ef30-kube-api-access-xchrw" (OuterVolumeSpecName: "kube-api-access-xchrw") pod "17812719-d9ec-4abb-9a8b-d65e27c6ef30" (UID: "17812719-d9ec-4abb-9a8b-d65e27c6ef30"). InnerVolumeSpecName "kube-api-access-xchrw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 14:30:05 crc kubenswrapper[4732]: I0402 14:30:05.397416 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xchrw\" (UniqueName: \"kubernetes.io/projected/17812719-d9ec-4abb-9a8b-d65e27c6ef30-kube-api-access-xchrw\") on node \"crc\" DevicePath \"\""
Apr 02 14:30:05 crc kubenswrapper[4732]: I0402 14:30:05.851280 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585670-xrnq5" event={"ID":"17812719-d9ec-4abb-9a8b-d65e27c6ef30","Type":"ContainerDied","Data":"dd2cabcd08654b42fb34463e561c9e2d2ee8e950a6395be0781e03bfecc3b650"}
Apr 02 14:30:05 crc kubenswrapper[4732]: I0402 14:30:05.851605 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd2cabcd08654b42fb34463e561c9e2d2ee8e950a6395be0781e03bfecc3b650"
Apr 02 14:30:05 crc kubenswrapper[4732]: I0402 14:30:05.851424 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585670-xrnq5"
Apr 02 14:30:06 crc kubenswrapper[4732]: I0402 14:30:06.233958 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29585664-kmmbv"]
Apr 02 14:30:06 crc kubenswrapper[4732]: I0402 14:30:06.248042 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29585664-kmmbv"]
Apr 02 14:30:06 crc kubenswrapper[4732]: I0402 14:30:06.706802 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38bd5a65-41cb-4390-933d-6df35e2438c7" path="/var/lib/kubelet/pods/38bd5a65-41cb-4390-933d-6df35e2438c7/volumes"
Apr 02 14:30:31 crc kubenswrapper[4732]: I0402 14:30:31.925165 4732 patch_prober.go:28] interesting pod/machine-config-daemon-6vtmw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Apr 02 14:30:31 crc kubenswrapper[4732]: I0402 14:30:31.925752 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Apr 02 14:30:52 crc kubenswrapper[4732]: I0402 14:30:52.291841 4732 scope.go:117] "RemoveContainer" containerID="a8b2676b1287709c5661be7b28fbfa8d6b460cf24ca56ca91aef3a433cb1c1a4"
Apr 02 14:30:52 crc kubenswrapper[4732]: I0402 14:30:52.354843 4732 scope.go:117] "RemoveContainer" containerID="d9e3c354334015dbdb65c55a464a9570a2373e628e8e687fe6fa57065771d204"
Apr 02 14:31:01 crc kubenswrapper[4732]: I0402 14:31:01.924930 4732 patch_prober.go:28] interesting pod/machine-config-daemon-6vtmw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Apr 02 14:31:01 crc kubenswrapper[4732]: I0402 14:31:01.925526 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Apr 02 14:31:31 crc kubenswrapper[4732]: I0402 14:31:31.925009 4732 patch_prober.go:28] interesting pod/machine-config-daemon-6vtmw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Apr 02 14:31:31 crc kubenswrapper[4732]: I0402 14:31:31.925570 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Apr 02 14:31:31 crc kubenswrapper[4732]: I0402 14:31:31.925636 4732 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw"
Apr 02 14:31:31 crc kubenswrapper[4732]: I0402 14:31:31.926397 4732 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"709a640877bb61587cf466971cf30393cd3affce375c74cd38fe9de80c188cd2"} pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Apr 02 14:31:31 crc kubenswrapper[4732]: I0402 14:31:31.926443 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" containerName="machine-config-daemon" containerID="cri-o://709a640877bb61587cf466971cf30393cd3affce375c74cd38fe9de80c188cd2" gracePeriod=600
Apr 02 14:31:32 crc kubenswrapper[4732]: E0402 14:31:32.052435 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878"
Apr 02 14:31:32 crc kubenswrapper[4732]: I0402 14:31:32.767810 4732 generic.go:334] "Generic (PLEG): container finished" podID="38409e5e-4545-49da-8f6c-4bfb30582878" containerID="709a640877bb61587cf466971cf30393cd3affce375c74cd38fe9de80c188cd2" exitCode=0
Apr 02 14:31:32 crc kubenswrapper[4732]: I0402 14:31:32.767885 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" event={"ID":"38409e5e-4545-49da-8f6c-4bfb30582878","Type":"ContainerDied","Data":"709a640877bb61587cf466971cf30393cd3affce375c74cd38fe9de80c188cd2"}
Apr 02 14:31:32 crc kubenswrapper[4732]: I0402 14:31:32.767944 4732 scope.go:117] "RemoveContainer" containerID="16d76f892a3a796d1d3cc9570734bbadcf82aeca2b7902fb2000aee8a3ef6008"
Apr 02 14:31:32 crc kubenswrapper[4732]: I0402 14:31:32.768839 4732 scope.go:117] "RemoveContainer" containerID="709a640877bb61587cf466971cf30393cd3affce375c74cd38fe9de80c188cd2"
Apr 02 14:31:32 crc kubenswrapper[4732]: E0402 14:31:32.769498 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878"
Apr 02 14:31:48 crc kubenswrapper[4732]: I0402 14:31:48.680520 4732 scope.go:117] "RemoveContainer" containerID="709a640877bb61587cf466971cf30393cd3affce375c74cd38fe9de80c188cd2"
Apr 02 14:31:48 crc kubenswrapper[4732]: E0402 14:31:48.681434 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878"
Apr 02 14:32:00 crc kubenswrapper[4732]: I0402 14:32:00.137797 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29585672-z2vvj"]
Apr 02 14:32:00 crc kubenswrapper[4732]: E0402 14:32:00.138864 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17812719-d9ec-4abb-9a8b-d65e27c6ef30" containerName="oc"
Apr 02 14:32:00 crc kubenswrapper[4732]: I0402 14:32:00.138881 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="17812719-d9ec-4abb-9a8b-d65e27c6ef30" containerName="oc"
Apr 02 14:32:00 crc kubenswrapper[4732]: E0402 14:32:00.138908 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a987f5d-b61e-43ba-a548-91b3dd5ed700" containerName="collect-profiles"
Apr 02 14:32:00 crc kubenswrapper[4732]: I0402 14:32:00.138915 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a987f5d-b61e-43ba-a548-91b3dd5ed700" containerName="collect-profiles"
Apr 02 14:32:00 crc kubenswrapper[4732]: I0402 14:32:00.139168 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a987f5d-b61e-43ba-a548-91b3dd5ed700" containerName="collect-profiles"
Apr 02 14:32:00 crc kubenswrapper[4732]: I0402 14:32:00.139200 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="17812719-d9ec-4abb-9a8b-d65e27c6ef30" containerName="oc"
Apr 02 14:32:00 crc kubenswrapper[4732]: I0402 14:32:00.140028 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585672-z2vvj"
Apr 02 14:32:00 crc kubenswrapper[4732]: I0402 14:32:00.142140 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Apr 02 14:32:00 crc kubenswrapper[4732]: I0402 14:32:00.142564 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Apr 02 14:32:00 crc kubenswrapper[4732]: I0402 14:32:00.143127 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-69g42"
Apr 02 14:32:00 crc kubenswrapper[4732]: I0402 14:32:00.149816 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585672-z2vvj"]
Apr 02 14:32:00 crc kubenswrapper[4732]: I0402 14:32:00.247071 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xvr9\" (UniqueName: \"kubernetes.io/projected/16ee98f5-a231-400e-89c4-6f663c21da1c-kube-api-access-6xvr9\") pod \"auto-csr-approver-29585672-z2vvj\" (UID: \"16ee98f5-a231-400e-89c4-6f663c21da1c\") " pod="openshift-infra/auto-csr-approver-29585672-z2vvj"
Apr 02 14:32:00 crc kubenswrapper[4732]: I0402 14:32:00.349301 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xvr9\" (UniqueName: \"kubernetes.io/projected/16ee98f5-a231-400e-89c4-6f663c21da1c-kube-api-access-6xvr9\") pod \"auto-csr-approver-29585672-z2vvj\" (UID: \"16ee98f5-a231-400e-89c4-6f663c21da1c\") " pod="openshift-infra/auto-csr-approver-29585672-z2vvj"
Apr 02 14:32:00 crc kubenswrapper[4732]: I0402 14:32:00.372553 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xvr9\" (UniqueName: \"kubernetes.io/projected/16ee98f5-a231-400e-89c4-6f663c21da1c-kube-api-access-6xvr9\") pod \"auto-csr-approver-29585672-z2vvj\" (UID: \"16ee98f5-a231-400e-89c4-6f663c21da1c\") " pod="openshift-infra/auto-csr-approver-29585672-z2vvj"
Apr 02 14:32:00 crc kubenswrapper[4732]: I0402 14:32:00.473212 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585672-z2vvj"
Apr 02 14:32:00 crc kubenswrapper[4732]: I0402 14:32:00.901886 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585672-z2vvj"]
Apr 02 14:32:01 crc kubenswrapper[4732]: I0402 14:32:01.030121 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585672-z2vvj" event={"ID":"16ee98f5-a231-400e-89c4-6f663c21da1c","Type":"ContainerStarted","Data":"994cf63f63b591cbb59e8ec5c24f7fac2aa3ffef3a7d20d7ed80005cd40b66e1"}
Apr 02 14:32:01 crc kubenswrapper[4732]: I0402 14:32:01.679982 4732 scope.go:117] "RemoveContainer" containerID="709a640877bb61587cf466971cf30393cd3affce375c74cd38fe9de80c188cd2"
Apr 02 14:32:01 crc kubenswrapper[4732]: E0402 14:32:01.680550 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878"
Apr 02 14:32:03 crc kubenswrapper[4732]: I0402 14:32:03.048799 4732 generic.go:334] "Generic (PLEG): container finished" podID="16ee98f5-a231-400e-89c4-6f663c21da1c" containerID="e991865fb2a57a5eb8462921eda79aaad441dba5696c26ba39b12cd683b2ecc4" exitCode=0
Apr 02 14:32:03 crc kubenswrapper[4732]: I0402 14:32:03.048886 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585672-z2vvj" event={"ID":"16ee98f5-a231-400e-89c4-6f663c21da1c","Type":"ContainerDied","Data":"e991865fb2a57a5eb8462921eda79aaad441dba5696c26ba39b12cd683b2ecc4"}
Apr 02 14:32:04 crc kubenswrapper[4732]: I0402 14:32:04.421510 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585672-z2vvj"
Apr 02 14:32:04 crc kubenswrapper[4732]: I0402 14:32:04.531403 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xvr9\" (UniqueName: \"kubernetes.io/projected/16ee98f5-a231-400e-89c4-6f663c21da1c-kube-api-access-6xvr9\") pod \"16ee98f5-a231-400e-89c4-6f663c21da1c\" (UID: \"16ee98f5-a231-400e-89c4-6f663c21da1c\") "
Apr 02 14:32:04 crc kubenswrapper[4732]: I0402 14:32:04.536762 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16ee98f5-a231-400e-89c4-6f663c21da1c-kube-api-access-6xvr9" (OuterVolumeSpecName: "kube-api-access-6xvr9") pod "16ee98f5-a231-400e-89c4-6f663c21da1c" (UID: "16ee98f5-a231-400e-89c4-6f663c21da1c"). InnerVolumeSpecName "kube-api-access-6xvr9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 14:32:04 crc kubenswrapper[4732]: I0402 14:32:04.634738 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xvr9\" (UniqueName: \"kubernetes.io/projected/16ee98f5-a231-400e-89c4-6f663c21da1c-kube-api-access-6xvr9\") on node \"crc\" DevicePath \"\""
Apr 02 14:32:05 crc kubenswrapper[4732]: I0402 14:32:05.068557 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585672-z2vvj" event={"ID":"16ee98f5-a231-400e-89c4-6f663c21da1c","Type":"ContainerDied","Data":"994cf63f63b591cbb59e8ec5c24f7fac2aa3ffef3a7d20d7ed80005cd40b66e1"}
Apr 02 14:32:05 crc kubenswrapper[4732]: I0402 14:32:05.068607 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="994cf63f63b591cbb59e8ec5c24f7fac2aa3ffef3a7d20d7ed80005cd40b66e1"
Apr 02 14:32:05 crc kubenswrapper[4732]: I0402 14:32:05.068687 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585672-z2vvj"
Apr 02 14:32:05 crc kubenswrapper[4732]: I0402 14:32:05.520674 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29585666-9z9sf"]
Apr 02 14:32:05 crc kubenswrapper[4732]: I0402 14:32:05.534014 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29585666-9z9sf"]
Apr 02 14:32:06 crc kubenswrapper[4732]: I0402 14:32:06.692408 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f72fbefc-1812-464a-8993-75c02e51514f" path="/var/lib/kubelet/pods/f72fbefc-1812-464a-8993-75c02e51514f/volumes"
Apr 02 14:32:14 crc kubenswrapper[4732]: I0402 14:32:14.689142 4732 scope.go:117] "RemoveContainer" containerID="709a640877bb61587cf466971cf30393cd3affce375c74cd38fe9de80c188cd2"
Apr 02 14:32:14 crc kubenswrapper[4732]: E0402 14:32:14.689962 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878"
Apr 02 14:32:28 crc kubenswrapper[4732]: I0402 14:32:28.681047 4732 scope.go:117] "RemoveContainer" containerID="709a640877bb61587cf466971cf30393cd3affce375c74cd38fe9de80c188cd2"
Apr 02 14:32:28 crc kubenswrapper[4732]: E0402 14:32:28.682069 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878"
Apr 02 14:32:41 crc kubenswrapper[4732]: I0402 14:32:41.680468 4732 scope.go:117] "RemoveContainer" containerID="709a640877bb61587cf466971cf30393cd3affce375c74cd38fe9de80c188cd2"
Apr 02 14:32:41 crc kubenswrapper[4732]: E0402 14:32:41.681280 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878"
Apr 02 14:32:52 crc kubenswrapper[4732]: I0402 14:32:52.446427 4732 scope.go:117] "RemoveContainer" containerID="02a1ff0a1b427720eeee8e741ed3cbed54d1320afd56a18898886be8d068b9c2"
Apr 02 14:32:53 crc kubenswrapper[4732]: I0402 14:32:53.680826 4732 scope.go:117] "RemoveContainer" containerID="709a640877bb61587cf466971cf30393cd3affce375c74cd38fe9de80c188cd2"
Apr 02 14:32:53 crc kubenswrapper[4732]: E0402 14:32:53.681420 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878"
Apr 02 14:33:05 crc kubenswrapper[4732]: I0402 14:33:05.680799 4732 scope.go:117] "RemoveContainer" containerID="709a640877bb61587cf466971cf30393cd3affce375c74cd38fe9de80c188cd2"
Apr 02 14:33:05 crc kubenswrapper[4732]: E0402 14:33:05.681804 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878"
Apr 02 14:33:19 crc kubenswrapper[4732]: I0402 14:33:19.680397 4732 scope.go:117] "RemoveContainer" containerID="709a640877bb61587cf466971cf30393cd3affce375c74cd38fe9de80c188cd2"
Apr 02 14:33:19 crc kubenswrapper[4732]: E0402 14:33:19.681322 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878"
Apr 02 14:33:32 crc kubenswrapper[4732]: I0402 14:33:32.681317 4732 scope.go:117] "RemoveContainer" containerID="709a640877bb61587cf466971cf30393cd3affce375c74cd38fe9de80c188cd2"
Apr 02 14:33:32 crc kubenswrapper[4732]: E0402 14:33:32.682229 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878"
Apr 02 14:33:44 crc kubenswrapper[4732]: I0402 14:33:44.686786 4732 scope.go:117] "RemoveContainer" containerID="709a640877bb61587cf466971cf30393cd3affce375c74cd38fe9de80c188cd2"
Apr 02 14:33:44 crc kubenswrapper[4732]: E0402 14:33:44.687570 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878"
Apr 02 14:33:55 crc kubenswrapper[4732]: I0402 14:33:55.680286 4732 scope.go:117] "RemoveContainer" containerID="709a640877bb61587cf466971cf30393cd3affce375c74cd38fe9de80c188cd2"
Apr 02 14:33:55 crc kubenswrapper[4732]: E0402 14:33:55.681300 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878"
Apr 02 14:34:00 crc kubenswrapper[4732]: I0402 14:34:00.168945 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29585674-jn5d6"]
Apr 02 14:34:00 crc kubenswrapper[4732]: E0402 14:34:00.170750 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16ee98f5-a231-400e-89c4-6f663c21da1c" containerName="oc"
Apr 02 14:34:00 crc kubenswrapper[4732]: I0402 14:34:00.170783 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="16ee98f5-a231-400e-89c4-6f663c21da1c" containerName="oc"
Apr 02 14:34:00 crc kubenswrapper[4732]: I0402 14:34:00.171248 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="16ee98f5-a231-400e-89c4-6f663c21da1c" containerName="oc"
Apr 02 14:34:00 crc kubenswrapper[4732]: I0402 14:34:00.172715 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585674-jn5d6"
Apr 02 14:34:00 crc kubenswrapper[4732]: I0402 14:34:00.174922 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-69g42"
Apr 02 14:34:00 crc kubenswrapper[4732]: I0402 14:34:00.175298 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Apr 02 14:34:00 crc kubenswrapper[4732]: I0402 14:34:00.175529 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Apr 02 14:34:00 crc kubenswrapper[4732]: I0402 14:34:00.183641 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585674-jn5d6"]
Apr 02 14:34:00 crc kubenswrapper[4732]: I0402 14:34:00.283806 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gt6gx\" (UniqueName: \"kubernetes.io/projected/b6cfcb44-41f6-400e-b7b5-a881187eb46f-kube-api-access-gt6gx\") pod \"auto-csr-approver-29585674-jn5d6\" (UID: \"b6cfcb44-41f6-400e-b7b5-a881187eb46f\") " pod="openshift-infra/auto-csr-approver-29585674-jn5d6"
Apr 02 14:34:00 crc kubenswrapper[4732]: I0402 14:34:00.386836 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gt6gx\" (UniqueName: \"kubernetes.io/projected/b6cfcb44-41f6-400e-b7b5-a881187eb46f-kube-api-access-gt6gx\") pod \"auto-csr-approver-29585674-jn5d6\" (UID: \"b6cfcb44-41f6-400e-b7b5-a881187eb46f\") " pod="openshift-infra/auto-csr-approver-29585674-jn5d6"
Apr 02 14:34:00 crc kubenswrapper[4732]: I0402 14:34:00.405982 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gt6gx\" (UniqueName: \"kubernetes.io/projected/b6cfcb44-41f6-400e-b7b5-a881187eb46f-kube-api-access-gt6gx\") pod \"auto-csr-approver-29585674-jn5d6\" (UID: \"b6cfcb44-41f6-400e-b7b5-a881187eb46f\") " pod="openshift-infra/auto-csr-approver-29585674-jn5d6"
Apr 02 14:34:00 crc kubenswrapper[4732]: I0402 14:34:00.515630 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585674-jn5d6"
Apr 02 14:34:01 crc kubenswrapper[4732]: I0402 14:34:01.015050 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585674-jn5d6"]
Apr 02 14:34:01 crc kubenswrapper[4732]: I0402 14:34:01.234157 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585674-jn5d6" event={"ID":"b6cfcb44-41f6-400e-b7b5-a881187eb46f","Type":"ContainerStarted","Data":"0a26de0d1ef1e0c8c7d94b990aaf9ceb4a5d3c9c690a215a1fedcc041db9118a"}
Apr 02 14:34:03 crc kubenswrapper[4732]: I0402 14:34:03.263999 4732 generic.go:334] "Generic (PLEG): container finished" podID="b6cfcb44-41f6-400e-b7b5-a881187eb46f" containerID="644e12a213d36dbaf61adacd3d46a4851e3728404d3bfd3873c17c62266ec012" exitCode=0
Apr 02 14:34:03 crc kubenswrapper[4732]: I0402 14:34:03.264127 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585674-jn5d6" event={"ID":"b6cfcb44-41f6-400e-b7b5-a881187eb46f","Type":"ContainerDied","Data":"644e12a213d36dbaf61adacd3d46a4851e3728404d3bfd3873c17c62266ec012"}
Apr 02 14:34:04 crc kubenswrapper[4732]: I0402 14:34:04.641206 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29585674-jn5d6" Apr 02 14:34:04 crc kubenswrapper[4732]: I0402 14:34:04.784544 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gt6gx\" (UniqueName: \"kubernetes.io/projected/b6cfcb44-41f6-400e-b7b5-a881187eb46f-kube-api-access-gt6gx\") pod \"b6cfcb44-41f6-400e-b7b5-a881187eb46f\" (UID: \"b6cfcb44-41f6-400e-b7b5-a881187eb46f\") " Apr 02 14:34:04 crc kubenswrapper[4732]: I0402 14:34:04.790698 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cfcb44-41f6-400e-b7b5-a881187eb46f-kube-api-access-gt6gx" (OuterVolumeSpecName: "kube-api-access-gt6gx") pod "b6cfcb44-41f6-400e-b7b5-a881187eb46f" (UID: "b6cfcb44-41f6-400e-b7b5-a881187eb46f"). InnerVolumeSpecName "kube-api-access-gt6gx". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:34:04 crc kubenswrapper[4732]: I0402 14:34:04.887362 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gt6gx\" (UniqueName: \"kubernetes.io/projected/b6cfcb44-41f6-400e-b7b5-a881187eb46f-kube-api-access-gt6gx\") on node \"crc\" DevicePath \"\"" Apr 02 14:34:05 crc kubenswrapper[4732]: I0402 14:34:05.291737 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585674-jn5d6" event={"ID":"b6cfcb44-41f6-400e-b7b5-a881187eb46f","Type":"ContainerDied","Data":"0a26de0d1ef1e0c8c7d94b990aaf9ceb4a5d3c9c690a215a1fedcc041db9118a"} Apr 02 14:34:05 crc kubenswrapper[4732]: I0402 14:34:05.291777 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a26de0d1ef1e0c8c7d94b990aaf9ceb4a5d3c9c690a215a1fedcc041db9118a" Apr 02 14:34:05 crc kubenswrapper[4732]: I0402 14:34:05.291839 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29585674-jn5d6" Apr 02 14:34:05 crc kubenswrapper[4732]: I0402 14:34:05.714066 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29585668-p66xv"] Apr 02 14:34:05 crc kubenswrapper[4732]: I0402 14:34:05.722302 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29585668-p66xv"] Apr 02 14:34:06 crc kubenswrapper[4732]: I0402 14:34:06.697268 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e48ef20-b922-4518-98fa-3286d5da4b2e" path="/var/lib/kubelet/pods/2e48ef20-b922-4518-98fa-3286d5da4b2e/volumes" Apr 02 14:34:08 crc kubenswrapper[4732]: I0402 14:34:08.680551 4732 scope.go:117] "RemoveContainer" containerID="709a640877bb61587cf466971cf30393cd3affce375c74cd38fe9de80c188cd2" Apr 02 14:34:08 crc kubenswrapper[4732]: E0402 14:34:08.680934 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" Apr 02 14:34:23 crc kubenswrapper[4732]: I0402 14:34:23.680480 4732 scope.go:117] "RemoveContainer" containerID="709a640877bb61587cf466971cf30393cd3affce375c74cd38fe9de80c188cd2" Apr 02 14:34:23 crc kubenswrapper[4732]: E0402 14:34:23.681566 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" 
podUID="38409e5e-4545-49da-8f6c-4bfb30582878" Apr 02 14:34:34 crc kubenswrapper[4732]: I0402 14:34:34.680452 4732 scope.go:117] "RemoveContainer" containerID="709a640877bb61587cf466971cf30393cd3affce375c74cd38fe9de80c188cd2" Apr 02 14:34:34 crc kubenswrapper[4732]: E0402 14:34:34.681238 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" Apr 02 14:34:45 crc kubenswrapper[4732]: I0402 14:34:45.680845 4732 scope.go:117] "RemoveContainer" containerID="709a640877bb61587cf466971cf30393cd3affce375c74cd38fe9de80c188cd2" Apr 02 14:34:45 crc kubenswrapper[4732]: E0402 14:34:45.681680 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" Apr 02 14:34:52 crc kubenswrapper[4732]: I0402 14:34:52.564370 4732 scope.go:117] "RemoveContainer" containerID="4c64b775d49333a517532eef0f7ae34aa9ff3d3c83b6f79a0fc07badcd3cc13f" Apr 02 14:34:57 crc kubenswrapper[4732]: I0402 14:34:57.680582 4732 scope.go:117] "RemoveContainer" containerID="709a640877bb61587cf466971cf30393cd3affce375c74cd38fe9de80c188cd2" Apr 02 14:34:57 crc kubenswrapper[4732]: E0402 14:34:57.681379 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" Apr 02 14:35:09 crc kubenswrapper[4732]: I0402 14:35:09.680263 4732 scope.go:117] "RemoveContainer" containerID="709a640877bb61587cf466971cf30393cd3affce375c74cd38fe9de80c188cd2" Apr 02 14:35:09 crc kubenswrapper[4732]: E0402 14:35:09.681037 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" Apr 02 14:35:20 crc kubenswrapper[4732]: I0402 14:35:20.681475 4732 scope.go:117] "RemoveContainer" containerID="709a640877bb61587cf466971cf30393cd3affce375c74cd38fe9de80c188cd2" Apr 02 14:35:20 crc kubenswrapper[4732]: E0402 14:35:20.682341 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" Apr 02 14:35:28 crc kubenswrapper[4732]: I0402 14:35:28.084978 4732 generic.go:334] "Generic (PLEG): container finished" podID="17645883-477c-437a-b87a-b412f9bbe29e" containerID="34150bd36365edf1c470c6c9391be59b88169f3c84fe78522f74eecbaea9e9be" exitCode=0 Apr 02 14:35:28 crc kubenswrapper[4732]: I0402 14:35:28.085090 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/tempest-tests-tempest" event={"ID":"17645883-477c-437a-b87a-b412f9bbe29e","Type":"ContainerDied","Data":"34150bd36365edf1c470c6c9391be59b88169f3c84fe78522f74eecbaea9e9be"} Apr 02 14:35:29 crc kubenswrapper[4732]: I0402 14:35:29.470388 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Apr 02 14:35:29 crc kubenswrapper[4732]: I0402 14:35:29.653017 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/17645883-477c-437a-b87a-b412f9bbe29e-test-operator-ephemeral-workdir\") pod \"17645883-477c-437a-b87a-b412f9bbe29e\" (UID: \"17645883-477c-437a-b87a-b412f9bbe29e\") " Apr 02 14:35:29 crc kubenswrapper[4732]: I0402 14:35:29.653087 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"17645883-477c-437a-b87a-b412f9bbe29e\" (UID: \"17645883-477c-437a-b87a-b412f9bbe29e\") " Apr 02 14:35:29 crc kubenswrapper[4732]: I0402 14:35:29.653226 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/17645883-477c-437a-b87a-b412f9bbe29e-test-operator-ephemeral-temporary\") pod \"17645883-477c-437a-b87a-b412f9bbe29e\" (UID: \"17645883-477c-437a-b87a-b412f9bbe29e\") " Apr 02 14:35:29 crc kubenswrapper[4732]: I0402 14:35:29.653805 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17645883-477c-437a-b87a-b412f9bbe29e-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "17645883-477c-437a-b87a-b412f9bbe29e" (UID: "17645883-477c-437a-b87a-b412f9bbe29e"). InnerVolumeSpecName "test-operator-ephemeral-temporary". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 14:35:29 crc kubenswrapper[4732]: I0402 14:35:29.653872 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/17645883-477c-437a-b87a-b412f9bbe29e-openstack-config\") pod \"17645883-477c-437a-b87a-b412f9bbe29e\" (UID: \"17645883-477c-437a-b87a-b412f9bbe29e\") " Apr 02 14:35:29 crc kubenswrapper[4732]: I0402 14:35:29.654617 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17645883-477c-437a-b87a-b412f9bbe29e-config-data" (OuterVolumeSpecName: "config-data") pod "17645883-477c-437a-b87a-b412f9bbe29e" (UID: "17645883-477c-437a-b87a-b412f9bbe29e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 14:35:29 crc kubenswrapper[4732]: I0402 14:35:29.654735 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/17645883-477c-437a-b87a-b412f9bbe29e-config-data\") pod \"17645883-477c-437a-b87a-b412f9bbe29e\" (UID: \"17645883-477c-437a-b87a-b412f9bbe29e\") " Apr 02 14:35:29 crc kubenswrapper[4732]: I0402 14:35:29.654866 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xm6j\" (UniqueName: \"kubernetes.io/projected/17645883-477c-437a-b87a-b412f9bbe29e-kube-api-access-6xm6j\") pod \"17645883-477c-437a-b87a-b412f9bbe29e\" (UID: \"17645883-477c-437a-b87a-b412f9bbe29e\") " Apr 02 14:35:29 crc kubenswrapper[4732]: I0402 14:35:29.654929 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/17645883-477c-437a-b87a-b412f9bbe29e-ssh-key\") pod \"17645883-477c-437a-b87a-b412f9bbe29e\" (UID: \"17645883-477c-437a-b87a-b412f9bbe29e\") " Apr 02 14:35:29 crc kubenswrapper[4732]: I0402 14:35:29.655868 4732 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/17645883-477c-437a-b87a-b412f9bbe29e-ca-certs\") pod \"17645883-477c-437a-b87a-b412f9bbe29e\" (UID: \"17645883-477c-437a-b87a-b412f9bbe29e\") " Apr 02 14:35:29 crc kubenswrapper[4732]: I0402 14:35:29.655947 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/17645883-477c-437a-b87a-b412f9bbe29e-openstack-config-secret\") pod \"17645883-477c-437a-b87a-b412f9bbe29e\" (UID: \"17645883-477c-437a-b87a-b412f9bbe29e\") " Apr 02 14:35:29 crc kubenswrapper[4732]: I0402 14:35:29.656974 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/17645883-477c-437a-b87a-b412f9bbe29e-config-data\") on node \"crc\" DevicePath \"\"" Apr 02 14:35:29 crc kubenswrapper[4732]: I0402 14:35:29.657001 4732 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/17645883-477c-437a-b87a-b412f9bbe29e-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Apr 02 14:35:29 crc kubenswrapper[4732]: I0402 14:35:29.658392 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "test-operator-logs") pod "17645883-477c-437a-b87a-b412f9bbe29e" (UID: "17645883-477c-437a-b87a-b412f9bbe29e"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Apr 02 14:35:29 crc kubenswrapper[4732]: I0402 14:35:29.668594 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17645883-477c-437a-b87a-b412f9bbe29e-kube-api-access-6xm6j" (OuterVolumeSpecName: "kube-api-access-6xm6j") pod "17645883-477c-437a-b87a-b412f9bbe29e" (UID: "17645883-477c-437a-b87a-b412f9bbe29e"). 
InnerVolumeSpecName "kube-api-access-6xm6j". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:35:29 crc kubenswrapper[4732]: I0402 14:35:29.669307 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17645883-477c-437a-b87a-b412f9bbe29e-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "17645883-477c-437a-b87a-b412f9bbe29e" (UID: "17645883-477c-437a-b87a-b412f9bbe29e"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 14:35:29 crc kubenswrapper[4732]: I0402 14:35:29.680772 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17645883-477c-437a-b87a-b412f9bbe29e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "17645883-477c-437a-b87a-b412f9bbe29e" (UID: "17645883-477c-437a-b87a-b412f9bbe29e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:35:29 crc kubenswrapper[4732]: I0402 14:35:29.681189 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17645883-477c-437a-b87a-b412f9bbe29e-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "17645883-477c-437a-b87a-b412f9bbe29e" (UID: "17645883-477c-437a-b87a-b412f9bbe29e"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:35:29 crc kubenswrapper[4732]: I0402 14:35:29.681200 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17645883-477c-437a-b87a-b412f9bbe29e-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "17645883-477c-437a-b87a-b412f9bbe29e" (UID: "17645883-477c-437a-b87a-b412f9bbe29e"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 02 14:35:29 crc kubenswrapper[4732]: I0402 14:35:29.700781 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17645883-477c-437a-b87a-b412f9bbe29e-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "17645883-477c-437a-b87a-b412f9bbe29e" (UID: "17645883-477c-437a-b87a-b412f9bbe29e"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 02 14:35:29 crc kubenswrapper[4732]: I0402 14:35:29.759909 4732 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/17645883-477c-437a-b87a-b412f9bbe29e-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Apr 02 14:35:29 crc kubenswrapper[4732]: I0402 14:35:29.760060 4732 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Apr 02 14:35:29 crc kubenswrapper[4732]: I0402 14:35:29.760200 4732 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/17645883-477c-437a-b87a-b412f9bbe29e-openstack-config\") on node \"crc\" DevicePath \"\"" Apr 02 14:35:29 crc kubenswrapper[4732]: I0402 14:35:29.760226 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xm6j\" (UniqueName: \"kubernetes.io/projected/17645883-477c-437a-b87a-b412f9bbe29e-kube-api-access-6xm6j\") on node \"crc\" DevicePath \"\"" Apr 02 14:35:29 crc kubenswrapper[4732]: I0402 14:35:29.760238 4732 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/17645883-477c-437a-b87a-b412f9bbe29e-ssh-key\") on node \"crc\" DevicePath \"\"" Apr 02 14:35:29 crc kubenswrapper[4732]: I0402 14:35:29.760250 4732 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" 
(UniqueName: \"kubernetes.io/secret/17645883-477c-437a-b87a-b412f9bbe29e-ca-certs\") on node \"crc\" DevicePath \"\"" Apr 02 14:35:29 crc kubenswrapper[4732]: I0402 14:35:29.760261 4732 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/17645883-477c-437a-b87a-b412f9bbe29e-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Apr 02 14:35:29 crc kubenswrapper[4732]: I0402 14:35:29.779159 4732 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Apr 02 14:35:29 crc kubenswrapper[4732]: I0402 14:35:29.862453 4732 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Apr 02 14:35:30 crc kubenswrapper[4732]: I0402 14:35:30.108954 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"17645883-477c-437a-b87a-b412f9bbe29e","Type":"ContainerDied","Data":"b0d5f795305bf6232df4d60677049908699b9f3e11c852cf61d3c8afbd703a82"} Apr 02 14:35:30 crc kubenswrapper[4732]: I0402 14:35:30.108993 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0d5f795305bf6232df4d60677049908699b9f3e11c852cf61d3c8afbd703a82" Apr 02 14:35:30 crc kubenswrapper[4732]: I0402 14:35:30.109044 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Apr 02 14:35:32 crc kubenswrapper[4732]: I0402 14:35:32.534861 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Apr 02 14:35:32 crc kubenswrapper[4732]: E0402 14:35:32.536918 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17645883-477c-437a-b87a-b412f9bbe29e" containerName="tempest-tests-tempest-tests-runner" Apr 02 14:35:32 crc kubenswrapper[4732]: I0402 14:35:32.536939 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="17645883-477c-437a-b87a-b412f9bbe29e" containerName="tempest-tests-tempest-tests-runner" Apr 02 14:35:32 crc kubenswrapper[4732]: E0402 14:35:32.536970 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6cfcb44-41f6-400e-b7b5-a881187eb46f" containerName="oc" Apr 02 14:35:32 crc kubenswrapper[4732]: I0402 14:35:32.536978 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6cfcb44-41f6-400e-b7b5-a881187eb46f" containerName="oc" Apr 02 14:35:32 crc kubenswrapper[4732]: I0402 14:35:32.537164 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="17645883-477c-437a-b87a-b412f9bbe29e" containerName="tempest-tests-tempest-tests-runner" Apr 02 14:35:32 crc kubenswrapper[4732]: I0402 14:35:32.537183 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6cfcb44-41f6-400e-b7b5-a881187eb46f" containerName="oc" Apr 02 14:35:32 crc kubenswrapper[4732]: I0402 14:35:32.537794 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Apr 02 14:35:32 crc kubenswrapper[4732]: I0402 14:35:32.539804 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-8fmdn" Apr 02 14:35:32 crc kubenswrapper[4732]: I0402 14:35:32.562334 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Apr 02 14:35:32 crc kubenswrapper[4732]: I0402 14:35:32.712529 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"c4323eb5-6b45-4766-961f-eef53306dad0\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Apr 02 14:35:32 crc kubenswrapper[4732]: I0402 14:35:32.712696 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bzdk\" (UniqueName: \"kubernetes.io/projected/c4323eb5-6b45-4766-961f-eef53306dad0-kube-api-access-6bzdk\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"c4323eb5-6b45-4766-961f-eef53306dad0\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Apr 02 14:35:32 crc kubenswrapper[4732]: I0402 14:35:32.814552 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bzdk\" (UniqueName: \"kubernetes.io/projected/c4323eb5-6b45-4766-961f-eef53306dad0-kube-api-access-6bzdk\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"c4323eb5-6b45-4766-961f-eef53306dad0\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Apr 02 14:35:32 crc kubenswrapper[4732]: I0402 14:35:32.814697 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"c4323eb5-6b45-4766-961f-eef53306dad0\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Apr 02 14:35:32 crc kubenswrapper[4732]: I0402 14:35:32.815009 4732 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"c4323eb5-6b45-4766-961f-eef53306dad0\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Apr 02 14:35:32 crc kubenswrapper[4732]: I0402 14:35:32.838813 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bzdk\" (UniqueName: \"kubernetes.io/projected/c4323eb5-6b45-4766-961f-eef53306dad0-kube-api-access-6bzdk\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"c4323eb5-6b45-4766-961f-eef53306dad0\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Apr 02 14:35:32 crc kubenswrapper[4732]: I0402 14:35:32.842569 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"c4323eb5-6b45-4766-961f-eef53306dad0\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Apr 02 14:35:32 crc kubenswrapper[4732]: I0402 14:35:32.868368 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Apr 02 14:35:33 crc kubenswrapper[4732]: I0402 14:35:33.381687 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Apr 02 14:35:33 crc kubenswrapper[4732]: I0402 14:35:33.381705 4732 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 02 14:35:33 crc kubenswrapper[4732]: I0402 14:35:33.607099 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rzj69"] Apr 02 14:35:33 crc kubenswrapper[4732]: I0402 14:35:33.609965 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rzj69" Apr 02 14:35:33 crc kubenswrapper[4732]: I0402 14:35:33.618280 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rzj69"] Apr 02 14:35:33 crc kubenswrapper[4732]: I0402 14:35:33.680728 4732 scope.go:117] "RemoveContainer" containerID="709a640877bb61587cf466971cf30393cd3affce375c74cd38fe9de80c188cd2" Apr 02 14:35:33 crc kubenswrapper[4732]: E0402 14:35:33.681022 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" Apr 02 14:35:33 crc kubenswrapper[4732]: I0402 14:35:33.732887 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njmdd\" (UniqueName: \"kubernetes.io/projected/12b8342a-3c31-430b-8199-35a66ff857e0-kube-api-access-njmdd\") pod \"redhat-operators-rzj69\" (UID: 
\"12b8342a-3c31-430b-8199-35a66ff857e0\") " pod="openshift-marketplace/redhat-operators-rzj69" Apr 02 14:35:33 crc kubenswrapper[4732]: I0402 14:35:33.733005 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12b8342a-3c31-430b-8199-35a66ff857e0-catalog-content\") pod \"redhat-operators-rzj69\" (UID: \"12b8342a-3c31-430b-8199-35a66ff857e0\") " pod="openshift-marketplace/redhat-operators-rzj69" Apr 02 14:35:33 crc kubenswrapper[4732]: I0402 14:35:33.733052 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12b8342a-3c31-430b-8199-35a66ff857e0-utilities\") pod \"redhat-operators-rzj69\" (UID: \"12b8342a-3c31-430b-8199-35a66ff857e0\") " pod="openshift-marketplace/redhat-operators-rzj69" Apr 02 14:35:33 crc kubenswrapper[4732]: I0402 14:35:33.834618 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12b8342a-3c31-430b-8199-35a66ff857e0-catalog-content\") pod \"redhat-operators-rzj69\" (UID: \"12b8342a-3c31-430b-8199-35a66ff857e0\") " pod="openshift-marketplace/redhat-operators-rzj69" Apr 02 14:35:33 crc kubenswrapper[4732]: I0402 14:35:33.834699 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12b8342a-3c31-430b-8199-35a66ff857e0-utilities\") pod \"redhat-operators-rzj69\" (UID: \"12b8342a-3c31-430b-8199-35a66ff857e0\") " pod="openshift-marketplace/redhat-operators-rzj69" Apr 02 14:35:33 crc kubenswrapper[4732]: I0402 14:35:33.834851 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njmdd\" (UniqueName: \"kubernetes.io/projected/12b8342a-3c31-430b-8199-35a66ff857e0-kube-api-access-njmdd\") pod \"redhat-operators-rzj69\" (UID: 
\"12b8342a-3c31-430b-8199-35a66ff857e0\") " pod="openshift-marketplace/redhat-operators-rzj69" Apr 02 14:35:33 crc kubenswrapper[4732]: I0402 14:35:33.835357 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12b8342a-3c31-430b-8199-35a66ff857e0-catalog-content\") pod \"redhat-operators-rzj69\" (UID: \"12b8342a-3c31-430b-8199-35a66ff857e0\") " pod="openshift-marketplace/redhat-operators-rzj69" Apr 02 14:35:33 crc kubenswrapper[4732]: I0402 14:35:33.835365 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12b8342a-3c31-430b-8199-35a66ff857e0-utilities\") pod \"redhat-operators-rzj69\" (UID: \"12b8342a-3c31-430b-8199-35a66ff857e0\") " pod="openshift-marketplace/redhat-operators-rzj69" Apr 02 14:35:33 crc kubenswrapper[4732]: I0402 14:35:33.854011 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njmdd\" (UniqueName: \"kubernetes.io/projected/12b8342a-3c31-430b-8199-35a66ff857e0-kube-api-access-njmdd\") pod \"redhat-operators-rzj69\" (UID: \"12b8342a-3c31-430b-8199-35a66ff857e0\") " pod="openshift-marketplace/redhat-operators-rzj69" Apr 02 14:35:33 crc kubenswrapper[4732]: I0402 14:35:33.936344 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rzj69" Apr 02 14:35:34 crc kubenswrapper[4732]: I0402 14:35:34.150339 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"c4323eb5-6b45-4766-961f-eef53306dad0","Type":"ContainerStarted","Data":"03ca46bb3712b69269b827c3a6e888180db6e50027b82eb5b412f701c0d83ae8"} Apr 02 14:35:34 crc kubenswrapper[4732]: I0402 14:35:34.421988 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rzj69"] Apr 02 14:35:34 crc kubenswrapper[4732]: W0402 14:35:34.446500 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12b8342a_3c31_430b_8199_35a66ff857e0.slice/crio-4c3b92a6d698c5387f93dc005e498d9b0165c5167a7c1261abca0e43eb1f05db WatchSource:0}: Error finding container 4c3b92a6d698c5387f93dc005e498d9b0165c5167a7c1261abca0e43eb1f05db: Status 404 returned error can't find the container with id 4c3b92a6d698c5387f93dc005e498d9b0165c5167a7c1261abca0e43eb1f05db Apr 02 14:35:35 crc kubenswrapper[4732]: I0402 14:35:35.160034 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"c4323eb5-6b45-4766-961f-eef53306dad0","Type":"ContainerStarted","Data":"9648a5d9a4299a2f7404f9148811e0fbbb89041b3f4512c215c28e9c682c9684"} Apr 02 14:35:35 crc kubenswrapper[4732]: I0402 14:35:35.161683 4732 generic.go:334] "Generic (PLEG): container finished" podID="12b8342a-3c31-430b-8199-35a66ff857e0" containerID="0acaeb2ebff395b7dbe095de49a43ab311b60cb3316918e2f29c4ce5a79c66e0" exitCode=0 Apr 02 14:35:35 crc kubenswrapper[4732]: I0402 14:35:35.161718 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rzj69" 
event={"ID":"12b8342a-3c31-430b-8199-35a66ff857e0","Type":"ContainerDied","Data":"0acaeb2ebff395b7dbe095de49a43ab311b60cb3316918e2f29c4ce5a79c66e0"} Apr 02 14:35:35 crc kubenswrapper[4732]: I0402 14:35:35.161737 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rzj69" event={"ID":"12b8342a-3c31-430b-8199-35a66ff857e0","Type":"ContainerStarted","Data":"4c3b92a6d698c5387f93dc005e498d9b0165c5167a7c1261abca0e43eb1f05db"} Apr 02 14:35:35 crc kubenswrapper[4732]: I0402 14:35:35.179429 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.258346086 podStartE2EDuration="3.17940968s" podCreationTimestamp="2026-04-02 14:35:32 +0000 UTC" firstStartedPulling="2026-04-02 14:35:33.38150925 +0000 UTC m=+3490.285916803" lastFinishedPulling="2026-04-02 14:35:34.302572844 +0000 UTC m=+3491.206980397" observedRunningTime="2026-04-02 14:35:35.170731914 +0000 UTC m=+3492.075139487" watchObservedRunningTime="2026-04-02 14:35:35.17940968 +0000 UTC m=+3492.083817233" Apr 02 14:35:36 crc kubenswrapper[4732]: I0402 14:35:36.175289 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rzj69" event={"ID":"12b8342a-3c31-430b-8199-35a66ff857e0","Type":"ContainerStarted","Data":"52f8d4405f0e221b5e3c5e0206c6f78c163626e1282860d9a768f7cd4e1918b8"} Apr 02 14:35:38 crc kubenswrapper[4732]: I0402 14:35:38.203641 4732 generic.go:334] "Generic (PLEG): container finished" podID="12b8342a-3c31-430b-8199-35a66ff857e0" containerID="52f8d4405f0e221b5e3c5e0206c6f78c163626e1282860d9a768f7cd4e1918b8" exitCode=0 Apr 02 14:35:38 crc kubenswrapper[4732]: I0402 14:35:38.203673 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rzj69" 
event={"ID":"12b8342a-3c31-430b-8199-35a66ff857e0","Type":"ContainerDied","Data":"52f8d4405f0e221b5e3c5e0206c6f78c163626e1282860d9a768f7cd4e1918b8"} Apr 02 14:35:39 crc kubenswrapper[4732]: I0402 14:35:39.214681 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rzj69" event={"ID":"12b8342a-3c31-430b-8199-35a66ff857e0","Type":"ContainerStarted","Data":"87ccd57a46cc7e3198717f16d979c9000c4e9e76e93ce10b5b0ce47c0f25e11e"} Apr 02 14:35:39 crc kubenswrapper[4732]: I0402 14:35:39.242058 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rzj69" podStartSLOduration=2.766037715 podStartE2EDuration="6.242017886s" podCreationTimestamp="2026-04-02 14:35:33 +0000 UTC" firstStartedPulling="2026-04-02 14:35:35.1635757 +0000 UTC m=+3492.067983253" lastFinishedPulling="2026-04-02 14:35:38.639555841 +0000 UTC m=+3495.543963424" observedRunningTime="2026-04-02 14:35:39.232439517 +0000 UTC m=+3496.136847070" watchObservedRunningTime="2026-04-02 14:35:39.242017886 +0000 UTC m=+3496.146425439" Apr 02 14:35:43 crc kubenswrapper[4732]: I0402 14:35:43.937127 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rzj69" Apr 02 14:35:43 crc kubenswrapper[4732]: I0402 14:35:43.937791 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rzj69" Apr 02 14:35:44 crc kubenswrapper[4732]: I0402 14:35:44.686761 4732 scope.go:117] "RemoveContainer" containerID="709a640877bb61587cf466971cf30393cd3affce375c74cd38fe9de80c188cd2" Apr 02 14:35:44 crc kubenswrapper[4732]: E0402 14:35:44.687391 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" Apr 02 14:35:44 crc kubenswrapper[4732]: I0402 14:35:44.986398 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rzj69" podUID="12b8342a-3c31-430b-8199-35a66ff857e0" containerName="registry-server" probeResult="failure" output=< Apr 02 14:35:44 crc kubenswrapper[4732]: timeout: failed to connect service ":50051" within 1s Apr 02 14:35:44 crc kubenswrapper[4732]: > Apr 02 14:35:54 crc kubenswrapper[4732]: I0402 14:35:54.007997 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rzj69" Apr 02 14:35:54 crc kubenswrapper[4732]: I0402 14:35:54.069815 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rzj69" Apr 02 14:35:54 crc kubenswrapper[4732]: I0402 14:35:54.260812 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rzj69"] Apr 02 14:35:55 crc kubenswrapper[4732]: I0402 14:35:55.386824 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rzj69" podUID="12b8342a-3c31-430b-8199-35a66ff857e0" containerName="registry-server" containerID="cri-o://87ccd57a46cc7e3198717f16d979c9000c4e9e76e93ce10b5b0ce47c0f25e11e" gracePeriod=2 Apr 02 14:35:55 crc kubenswrapper[4732]: I0402 14:35:55.870124 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rzj69" Apr 02 14:35:55 crc kubenswrapper[4732]: I0402 14:35:55.893701 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njmdd\" (UniqueName: \"kubernetes.io/projected/12b8342a-3c31-430b-8199-35a66ff857e0-kube-api-access-njmdd\") pod \"12b8342a-3c31-430b-8199-35a66ff857e0\" (UID: \"12b8342a-3c31-430b-8199-35a66ff857e0\") " Apr 02 14:35:55 crc kubenswrapper[4732]: I0402 14:35:55.893871 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12b8342a-3c31-430b-8199-35a66ff857e0-utilities\") pod \"12b8342a-3c31-430b-8199-35a66ff857e0\" (UID: \"12b8342a-3c31-430b-8199-35a66ff857e0\") " Apr 02 14:35:55 crc kubenswrapper[4732]: I0402 14:35:55.893935 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12b8342a-3c31-430b-8199-35a66ff857e0-catalog-content\") pod \"12b8342a-3c31-430b-8199-35a66ff857e0\" (UID: \"12b8342a-3c31-430b-8199-35a66ff857e0\") " Apr 02 14:35:55 crc kubenswrapper[4732]: I0402 14:35:55.894808 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12b8342a-3c31-430b-8199-35a66ff857e0-utilities" (OuterVolumeSpecName: "utilities") pod "12b8342a-3c31-430b-8199-35a66ff857e0" (UID: "12b8342a-3c31-430b-8199-35a66ff857e0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 14:35:55 crc kubenswrapper[4732]: I0402 14:35:55.900507 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12b8342a-3c31-430b-8199-35a66ff857e0-kube-api-access-njmdd" (OuterVolumeSpecName: "kube-api-access-njmdd") pod "12b8342a-3c31-430b-8199-35a66ff857e0" (UID: "12b8342a-3c31-430b-8199-35a66ff857e0"). InnerVolumeSpecName "kube-api-access-njmdd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:35:55 crc kubenswrapper[4732]: I0402 14:35:55.995387 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njmdd\" (UniqueName: \"kubernetes.io/projected/12b8342a-3c31-430b-8199-35a66ff857e0-kube-api-access-njmdd\") on node \"crc\" DevicePath \"\"" Apr 02 14:35:55 crc kubenswrapper[4732]: I0402 14:35:55.995429 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12b8342a-3c31-430b-8199-35a66ff857e0-utilities\") on node \"crc\" DevicePath \"\"" Apr 02 14:35:56 crc kubenswrapper[4732]: I0402 14:35:56.029737 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12b8342a-3c31-430b-8199-35a66ff857e0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "12b8342a-3c31-430b-8199-35a66ff857e0" (UID: "12b8342a-3c31-430b-8199-35a66ff857e0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 14:35:56 crc kubenswrapper[4732]: I0402 14:35:56.096973 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12b8342a-3c31-430b-8199-35a66ff857e0-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 02 14:35:56 crc kubenswrapper[4732]: I0402 14:35:56.398370 4732 generic.go:334] "Generic (PLEG): container finished" podID="12b8342a-3c31-430b-8199-35a66ff857e0" containerID="87ccd57a46cc7e3198717f16d979c9000c4e9e76e93ce10b5b0ce47c0f25e11e" exitCode=0 Apr 02 14:35:56 crc kubenswrapper[4732]: I0402 14:35:56.398420 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rzj69" event={"ID":"12b8342a-3c31-430b-8199-35a66ff857e0","Type":"ContainerDied","Data":"87ccd57a46cc7e3198717f16d979c9000c4e9e76e93ce10b5b0ce47c0f25e11e"} Apr 02 14:35:56 crc kubenswrapper[4732]: I0402 14:35:56.398426 4732 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rzj69" Apr 02 14:35:56 crc kubenswrapper[4732]: I0402 14:35:56.398466 4732 scope.go:117] "RemoveContainer" containerID="87ccd57a46cc7e3198717f16d979c9000c4e9e76e93ce10b5b0ce47c0f25e11e" Apr 02 14:35:56 crc kubenswrapper[4732]: I0402 14:35:56.398454 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rzj69" event={"ID":"12b8342a-3c31-430b-8199-35a66ff857e0","Type":"ContainerDied","Data":"4c3b92a6d698c5387f93dc005e498d9b0165c5167a7c1261abca0e43eb1f05db"} Apr 02 14:35:56 crc kubenswrapper[4732]: I0402 14:35:56.430951 4732 scope.go:117] "RemoveContainer" containerID="52f8d4405f0e221b5e3c5e0206c6f78c163626e1282860d9a768f7cd4e1918b8" Apr 02 14:35:56 crc kubenswrapper[4732]: I0402 14:35:56.434790 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rzj69"] Apr 02 14:35:56 crc kubenswrapper[4732]: I0402 14:35:56.443249 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rzj69"] Apr 02 14:35:56 crc kubenswrapper[4732]: I0402 14:35:56.453859 4732 scope.go:117] "RemoveContainer" containerID="0acaeb2ebff395b7dbe095de49a43ab311b60cb3316918e2f29c4ce5a79c66e0" Apr 02 14:35:56 crc kubenswrapper[4732]: I0402 14:35:56.495307 4732 scope.go:117] "RemoveContainer" containerID="87ccd57a46cc7e3198717f16d979c9000c4e9e76e93ce10b5b0ce47c0f25e11e" Apr 02 14:35:56 crc kubenswrapper[4732]: E0402 14:35:56.495842 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87ccd57a46cc7e3198717f16d979c9000c4e9e76e93ce10b5b0ce47c0f25e11e\": container with ID starting with 87ccd57a46cc7e3198717f16d979c9000c4e9e76e93ce10b5b0ce47c0f25e11e not found: ID does not exist" containerID="87ccd57a46cc7e3198717f16d979c9000c4e9e76e93ce10b5b0ce47c0f25e11e" Apr 02 14:35:56 crc kubenswrapper[4732]: I0402 14:35:56.495874 4732 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87ccd57a46cc7e3198717f16d979c9000c4e9e76e93ce10b5b0ce47c0f25e11e"} err="failed to get container status \"87ccd57a46cc7e3198717f16d979c9000c4e9e76e93ce10b5b0ce47c0f25e11e\": rpc error: code = NotFound desc = could not find container \"87ccd57a46cc7e3198717f16d979c9000c4e9e76e93ce10b5b0ce47c0f25e11e\": container with ID starting with 87ccd57a46cc7e3198717f16d979c9000c4e9e76e93ce10b5b0ce47c0f25e11e not found: ID does not exist" Apr 02 14:35:56 crc kubenswrapper[4732]: I0402 14:35:56.495895 4732 scope.go:117] "RemoveContainer" containerID="52f8d4405f0e221b5e3c5e0206c6f78c163626e1282860d9a768f7cd4e1918b8" Apr 02 14:35:56 crc kubenswrapper[4732]: E0402 14:35:56.496162 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52f8d4405f0e221b5e3c5e0206c6f78c163626e1282860d9a768f7cd4e1918b8\": container with ID starting with 52f8d4405f0e221b5e3c5e0206c6f78c163626e1282860d9a768f7cd4e1918b8 not found: ID does not exist" containerID="52f8d4405f0e221b5e3c5e0206c6f78c163626e1282860d9a768f7cd4e1918b8" Apr 02 14:35:56 crc kubenswrapper[4732]: I0402 14:35:56.496187 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52f8d4405f0e221b5e3c5e0206c6f78c163626e1282860d9a768f7cd4e1918b8"} err="failed to get container status \"52f8d4405f0e221b5e3c5e0206c6f78c163626e1282860d9a768f7cd4e1918b8\": rpc error: code = NotFound desc = could not find container \"52f8d4405f0e221b5e3c5e0206c6f78c163626e1282860d9a768f7cd4e1918b8\": container with ID starting with 52f8d4405f0e221b5e3c5e0206c6f78c163626e1282860d9a768f7cd4e1918b8 not found: ID does not exist" Apr 02 14:35:56 crc kubenswrapper[4732]: I0402 14:35:56.496203 4732 scope.go:117] "RemoveContainer" containerID="0acaeb2ebff395b7dbe095de49a43ab311b60cb3316918e2f29c4ce5a79c66e0" Apr 02 14:35:56 crc kubenswrapper[4732]: E0402 
14:35:56.497225 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0acaeb2ebff395b7dbe095de49a43ab311b60cb3316918e2f29c4ce5a79c66e0\": container with ID starting with 0acaeb2ebff395b7dbe095de49a43ab311b60cb3316918e2f29c4ce5a79c66e0 not found: ID does not exist" containerID="0acaeb2ebff395b7dbe095de49a43ab311b60cb3316918e2f29c4ce5a79c66e0" Apr 02 14:35:56 crc kubenswrapper[4732]: I0402 14:35:56.497247 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0acaeb2ebff395b7dbe095de49a43ab311b60cb3316918e2f29c4ce5a79c66e0"} err="failed to get container status \"0acaeb2ebff395b7dbe095de49a43ab311b60cb3316918e2f29c4ce5a79c66e0\": rpc error: code = NotFound desc = could not find container \"0acaeb2ebff395b7dbe095de49a43ab311b60cb3316918e2f29c4ce5a79c66e0\": container with ID starting with 0acaeb2ebff395b7dbe095de49a43ab311b60cb3316918e2f29c4ce5a79c66e0 not found: ID does not exist" Apr 02 14:35:56 crc kubenswrapper[4732]: I0402 14:35:56.703321 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12b8342a-3c31-430b-8199-35a66ff857e0" path="/var/lib/kubelet/pods/12b8342a-3c31-430b-8199-35a66ff857e0/volumes" Apr 02 14:35:57 crc kubenswrapper[4732]: I0402 14:35:57.680159 4732 scope.go:117] "RemoveContainer" containerID="709a640877bb61587cf466971cf30393cd3affce375c74cd38fe9de80c188cd2" Apr 02 14:35:57 crc kubenswrapper[4732]: E0402 14:35:57.680787 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" Apr 02 14:35:59 crc kubenswrapper[4732]: I0402 14:35:59.452709 
4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mjg8z/must-gather-jrdpk"] Apr 02 14:35:59 crc kubenswrapper[4732]: E0402 14:35:59.453184 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12b8342a-3c31-430b-8199-35a66ff857e0" containerName="registry-server" Apr 02 14:35:59 crc kubenswrapper[4732]: I0402 14:35:59.453199 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="12b8342a-3c31-430b-8199-35a66ff857e0" containerName="registry-server" Apr 02 14:35:59 crc kubenswrapper[4732]: E0402 14:35:59.453210 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12b8342a-3c31-430b-8199-35a66ff857e0" containerName="extract-content" Apr 02 14:35:59 crc kubenswrapper[4732]: I0402 14:35:59.453217 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="12b8342a-3c31-430b-8199-35a66ff857e0" containerName="extract-content" Apr 02 14:35:59 crc kubenswrapper[4732]: E0402 14:35:59.453244 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12b8342a-3c31-430b-8199-35a66ff857e0" containerName="extract-utilities" Apr 02 14:35:59 crc kubenswrapper[4732]: I0402 14:35:59.453250 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="12b8342a-3c31-430b-8199-35a66ff857e0" containerName="extract-utilities" Apr 02 14:35:59 crc kubenswrapper[4732]: I0402 14:35:59.453471 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="12b8342a-3c31-430b-8199-35a66ff857e0" containerName="registry-server" Apr 02 14:35:59 crc kubenswrapper[4732]: I0402 14:35:59.454695 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mjg8z/must-gather-jrdpk" Apr 02 14:35:59 crc kubenswrapper[4732]: I0402 14:35:59.468715 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-mjg8z"/"kube-root-ca.crt" Apr 02 14:35:59 crc kubenswrapper[4732]: I0402 14:35:59.469253 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-mjg8z"/"openshift-service-ca.crt" Apr 02 14:35:59 crc kubenswrapper[4732]: I0402 14:35:59.471412 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-mjg8z"/"default-dockercfg-mj5dh" Apr 02 14:35:59 crc kubenswrapper[4732]: I0402 14:35:59.510503 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mjg8z/must-gather-jrdpk"] Apr 02 14:35:59 crc kubenswrapper[4732]: I0402 14:35:59.592662 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx52g\" (UniqueName: \"kubernetes.io/projected/d3dcab4b-fbf1-451c-834e-c09b71ba0532-kube-api-access-bx52g\") pod \"must-gather-jrdpk\" (UID: \"d3dcab4b-fbf1-451c-834e-c09b71ba0532\") " pod="openshift-must-gather-mjg8z/must-gather-jrdpk" Apr 02 14:35:59 crc kubenswrapper[4732]: I0402 14:35:59.593183 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d3dcab4b-fbf1-451c-834e-c09b71ba0532-must-gather-output\") pod \"must-gather-jrdpk\" (UID: \"d3dcab4b-fbf1-451c-834e-c09b71ba0532\") " pod="openshift-must-gather-mjg8z/must-gather-jrdpk" Apr 02 14:35:59 crc kubenswrapper[4732]: I0402 14:35:59.694839 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bx52g\" (UniqueName: \"kubernetes.io/projected/d3dcab4b-fbf1-451c-834e-c09b71ba0532-kube-api-access-bx52g\") pod \"must-gather-jrdpk\" (UID: \"d3dcab4b-fbf1-451c-834e-c09b71ba0532\") " 
pod="openshift-must-gather-mjg8z/must-gather-jrdpk" Apr 02 14:35:59 crc kubenswrapper[4732]: I0402 14:35:59.695051 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d3dcab4b-fbf1-451c-834e-c09b71ba0532-must-gather-output\") pod \"must-gather-jrdpk\" (UID: \"d3dcab4b-fbf1-451c-834e-c09b71ba0532\") " pod="openshift-must-gather-mjg8z/must-gather-jrdpk" Apr 02 14:35:59 crc kubenswrapper[4732]: I0402 14:35:59.695679 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d3dcab4b-fbf1-451c-834e-c09b71ba0532-must-gather-output\") pod \"must-gather-jrdpk\" (UID: \"d3dcab4b-fbf1-451c-834e-c09b71ba0532\") " pod="openshift-must-gather-mjg8z/must-gather-jrdpk" Apr 02 14:35:59 crc kubenswrapper[4732]: I0402 14:35:59.717586 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bx52g\" (UniqueName: \"kubernetes.io/projected/d3dcab4b-fbf1-451c-834e-c09b71ba0532-kube-api-access-bx52g\") pod \"must-gather-jrdpk\" (UID: \"d3dcab4b-fbf1-451c-834e-c09b71ba0532\") " pod="openshift-must-gather-mjg8z/must-gather-jrdpk" Apr 02 14:35:59 crc kubenswrapper[4732]: I0402 14:35:59.797566 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mjg8z/must-gather-jrdpk" Apr 02 14:36:00 crc kubenswrapper[4732]: I0402 14:36:00.155800 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29585676-gp9tx"] Apr 02 14:36:00 crc kubenswrapper[4732]: I0402 14:36:00.157202 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29585676-gp9tx" Apr 02 14:36:00 crc kubenswrapper[4732]: I0402 14:36:00.159784 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 02 14:36:00 crc kubenswrapper[4732]: I0402 14:36:00.160038 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 02 14:36:00 crc kubenswrapper[4732]: I0402 14:36:00.160190 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-69g42" Apr 02 14:36:00 crc kubenswrapper[4732]: I0402 14:36:00.188552 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585676-gp9tx"] Apr 02 14:36:00 crc kubenswrapper[4732]: I0402 14:36:00.238503 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mjg8z/must-gather-jrdpk"] Apr 02 14:36:00 crc kubenswrapper[4732]: I0402 14:36:00.307915 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjcbb\" (UniqueName: \"kubernetes.io/projected/acc0658a-3ba9-4d19-85f4-6e2c1270efcf-kube-api-access-pjcbb\") pod \"auto-csr-approver-29585676-gp9tx\" (UID: \"acc0658a-3ba9-4d19-85f4-6e2c1270efcf\") " pod="openshift-infra/auto-csr-approver-29585676-gp9tx" Apr 02 14:36:00 crc kubenswrapper[4732]: I0402 14:36:00.410064 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjcbb\" (UniqueName: \"kubernetes.io/projected/acc0658a-3ba9-4d19-85f4-6e2c1270efcf-kube-api-access-pjcbb\") pod \"auto-csr-approver-29585676-gp9tx\" (UID: \"acc0658a-3ba9-4d19-85f4-6e2c1270efcf\") " pod="openshift-infra/auto-csr-approver-29585676-gp9tx" Apr 02 14:36:00 crc kubenswrapper[4732]: I0402 14:36:00.427628 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjcbb\" (UniqueName: 
\"kubernetes.io/projected/acc0658a-3ba9-4d19-85f4-6e2c1270efcf-kube-api-access-pjcbb\") pod \"auto-csr-approver-29585676-gp9tx\" (UID: \"acc0658a-3ba9-4d19-85f4-6e2c1270efcf\") " pod="openshift-infra/auto-csr-approver-29585676-gp9tx" Apr 02 14:36:00 crc kubenswrapper[4732]: I0402 14:36:00.465052 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mjg8z/must-gather-jrdpk" event={"ID":"d3dcab4b-fbf1-451c-834e-c09b71ba0532","Type":"ContainerStarted","Data":"27885fa4914a8fa56d73e5a31429e84684f3f3dd6d95545e024aa32608267709"} Apr 02 14:36:00 crc kubenswrapper[4732]: I0402 14:36:00.486633 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585676-gp9tx" Apr 02 14:36:00 crc kubenswrapper[4732]: I0402 14:36:00.760947 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585676-gp9tx"] Apr 02 14:36:00 crc kubenswrapper[4732]: W0402 14:36:00.764093 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podacc0658a_3ba9_4d19_85f4_6e2c1270efcf.slice/crio-ed22f14fe595c09c392845f211d797f66aeaae98ea8d536f6cd94cb225503859 WatchSource:0}: Error finding container ed22f14fe595c09c392845f211d797f66aeaae98ea8d536f6cd94cb225503859: Status 404 returned error can't find the container with id ed22f14fe595c09c392845f211d797f66aeaae98ea8d536f6cd94cb225503859 Apr 02 14:36:01 crc kubenswrapper[4732]: I0402 14:36:01.505816 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585676-gp9tx" event={"ID":"acc0658a-3ba9-4d19-85f4-6e2c1270efcf","Type":"ContainerStarted","Data":"ed22f14fe595c09c392845f211d797f66aeaae98ea8d536f6cd94cb225503859"} Apr 02 14:36:02 crc kubenswrapper[4732]: I0402 14:36:02.522799 4732 generic.go:334] "Generic (PLEG): container finished" podID="acc0658a-3ba9-4d19-85f4-6e2c1270efcf" 
containerID="29ddeebdf8d34aea2651d495ea74d95c814bc1a0dfd8b0291df6b463bd39abfc" exitCode=0 Apr 02 14:36:02 crc kubenswrapper[4732]: I0402 14:36:02.522888 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585676-gp9tx" event={"ID":"acc0658a-3ba9-4d19-85f4-6e2c1270efcf","Type":"ContainerDied","Data":"29ddeebdf8d34aea2651d495ea74d95c814bc1a0dfd8b0291df6b463bd39abfc"} Apr 02 14:36:06 crc kubenswrapper[4732]: I0402 14:36:06.043809 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585676-gp9tx" Apr 02 14:36:06 crc kubenswrapper[4732]: I0402 14:36:06.239864 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjcbb\" (UniqueName: \"kubernetes.io/projected/acc0658a-3ba9-4d19-85f4-6e2c1270efcf-kube-api-access-pjcbb\") pod \"acc0658a-3ba9-4d19-85f4-6e2c1270efcf\" (UID: \"acc0658a-3ba9-4d19-85f4-6e2c1270efcf\") " Apr 02 14:36:06 crc kubenswrapper[4732]: I0402 14:36:06.246625 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acc0658a-3ba9-4d19-85f4-6e2c1270efcf-kube-api-access-pjcbb" (OuterVolumeSpecName: "kube-api-access-pjcbb") pod "acc0658a-3ba9-4d19-85f4-6e2c1270efcf" (UID: "acc0658a-3ba9-4d19-85f4-6e2c1270efcf"). InnerVolumeSpecName "kube-api-access-pjcbb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:36:06 crc kubenswrapper[4732]: I0402 14:36:06.341952 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjcbb\" (UniqueName: \"kubernetes.io/projected/acc0658a-3ba9-4d19-85f4-6e2c1270efcf-kube-api-access-pjcbb\") on node \"crc\" DevicePath \"\"" Apr 02 14:36:06 crc kubenswrapper[4732]: I0402 14:36:06.563805 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mjg8z/must-gather-jrdpk" event={"ID":"d3dcab4b-fbf1-451c-834e-c09b71ba0532","Type":"ContainerStarted","Data":"48960aaf1ab4f6d3adaeed7e914fb355d25232b633c9de5e96cc72b7f29d1994"} Apr 02 14:36:06 crc kubenswrapper[4732]: I0402 14:36:06.566043 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585676-gp9tx" event={"ID":"acc0658a-3ba9-4d19-85f4-6e2c1270efcf","Type":"ContainerDied","Data":"ed22f14fe595c09c392845f211d797f66aeaae98ea8d536f6cd94cb225503859"} Apr 02 14:36:06 crc kubenswrapper[4732]: I0402 14:36:06.566090 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed22f14fe595c09c392845f211d797f66aeaae98ea8d536f6cd94cb225503859" Apr 02 14:36:06 crc kubenswrapper[4732]: I0402 14:36:06.566093 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29585676-gp9tx" Apr 02 14:36:07 crc kubenswrapper[4732]: I0402 14:36:07.123414 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29585670-xrnq5"] Apr 02 14:36:07 crc kubenswrapper[4732]: I0402 14:36:07.134416 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29585670-xrnq5"] Apr 02 14:36:07 crc kubenswrapper[4732]: I0402 14:36:07.580801 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mjg8z/must-gather-jrdpk" event={"ID":"d3dcab4b-fbf1-451c-834e-c09b71ba0532","Type":"ContainerStarted","Data":"abea6c7ce604b4382966cd050f23a9e9443e3fac1f49a575f219a6f335b008ce"} Apr 02 14:36:07 crc kubenswrapper[4732]: I0402 14:36:07.612313 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-mjg8z/must-gather-jrdpk" podStartSLOduration=2.828202346 podStartE2EDuration="8.612291079s" podCreationTimestamp="2026-04-02 14:35:59 +0000 UTC" firstStartedPulling="2026-04-02 14:36:00.241957025 +0000 UTC m=+3517.146364578" lastFinishedPulling="2026-04-02 14:36:06.026045758 +0000 UTC m=+3522.930453311" observedRunningTime="2026-04-02 14:36:07.605056183 +0000 UTC m=+3524.509463766" watchObservedRunningTime="2026-04-02 14:36:07.612291079 +0000 UTC m=+3524.516698632" Apr 02 14:36:08 crc kubenswrapper[4732]: I0402 14:36:08.690541 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17812719-d9ec-4abb-9a8b-d65e27c6ef30" path="/var/lib/kubelet/pods/17812719-d9ec-4abb-9a8b-d65e27c6ef30/volumes" Apr 02 14:36:09 crc kubenswrapper[4732]: I0402 14:36:09.947597 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mjg8z/crc-debug-kbslp"] Apr 02 14:36:09 crc kubenswrapper[4732]: E0402 14:36:09.948286 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acc0658a-3ba9-4d19-85f4-6e2c1270efcf" containerName="oc" Apr 02 
14:36:09 crc kubenswrapper[4732]: I0402 14:36:09.948299 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="acc0658a-3ba9-4d19-85f4-6e2c1270efcf" containerName="oc" Apr 02 14:36:09 crc kubenswrapper[4732]: I0402 14:36:09.948519 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="acc0658a-3ba9-4d19-85f4-6e2c1270efcf" containerName="oc" Apr 02 14:36:09 crc kubenswrapper[4732]: I0402 14:36:09.949149 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mjg8z/crc-debug-kbslp" Apr 02 14:36:10 crc kubenswrapper[4732]: I0402 14:36:10.113748 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbg57\" (UniqueName: \"kubernetes.io/projected/1382abcd-ac9c-4506-a2e3-dbeb3a709ee2-kube-api-access-fbg57\") pod \"crc-debug-kbslp\" (UID: \"1382abcd-ac9c-4506-a2e3-dbeb3a709ee2\") " pod="openshift-must-gather-mjg8z/crc-debug-kbslp" Apr 02 14:36:10 crc kubenswrapper[4732]: I0402 14:36:10.114169 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1382abcd-ac9c-4506-a2e3-dbeb3a709ee2-host\") pod \"crc-debug-kbslp\" (UID: \"1382abcd-ac9c-4506-a2e3-dbeb3a709ee2\") " pod="openshift-must-gather-mjg8z/crc-debug-kbslp" Apr 02 14:36:10 crc kubenswrapper[4732]: I0402 14:36:10.216536 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbg57\" (UniqueName: \"kubernetes.io/projected/1382abcd-ac9c-4506-a2e3-dbeb3a709ee2-kube-api-access-fbg57\") pod \"crc-debug-kbslp\" (UID: \"1382abcd-ac9c-4506-a2e3-dbeb3a709ee2\") " pod="openshift-must-gather-mjg8z/crc-debug-kbslp" Apr 02 14:36:10 crc kubenswrapper[4732]: I0402 14:36:10.216691 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1382abcd-ac9c-4506-a2e3-dbeb3a709ee2-host\") pod 
\"crc-debug-kbslp\" (UID: \"1382abcd-ac9c-4506-a2e3-dbeb3a709ee2\") " pod="openshift-must-gather-mjg8z/crc-debug-kbslp" Apr 02 14:36:10 crc kubenswrapper[4732]: I0402 14:36:10.216809 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1382abcd-ac9c-4506-a2e3-dbeb3a709ee2-host\") pod \"crc-debug-kbslp\" (UID: \"1382abcd-ac9c-4506-a2e3-dbeb3a709ee2\") " pod="openshift-must-gather-mjg8z/crc-debug-kbslp" Apr 02 14:36:10 crc kubenswrapper[4732]: I0402 14:36:10.237695 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbg57\" (UniqueName: \"kubernetes.io/projected/1382abcd-ac9c-4506-a2e3-dbeb3a709ee2-kube-api-access-fbg57\") pod \"crc-debug-kbslp\" (UID: \"1382abcd-ac9c-4506-a2e3-dbeb3a709ee2\") " pod="openshift-must-gather-mjg8z/crc-debug-kbslp" Apr 02 14:36:10 crc kubenswrapper[4732]: I0402 14:36:10.265002 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mjg8z/crc-debug-kbslp" Apr 02 14:36:10 crc kubenswrapper[4732]: W0402 14:36:10.303588 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1382abcd_ac9c_4506_a2e3_dbeb3a709ee2.slice/crio-0d8366463df362304e273a27d4da637c075eb0ceba8c1abcaf9204858ab92e34 WatchSource:0}: Error finding container 0d8366463df362304e273a27d4da637c075eb0ceba8c1abcaf9204858ab92e34: Status 404 returned error can't find the container with id 0d8366463df362304e273a27d4da637c075eb0ceba8c1abcaf9204858ab92e34 Apr 02 14:36:10 crc kubenswrapper[4732]: I0402 14:36:10.607872 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mjg8z/crc-debug-kbslp" event={"ID":"1382abcd-ac9c-4506-a2e3-dbeb3a709ee2","Type":"ContainerStarted","Data":"0d8366463df362304e273a27d4da637c075eb0ceba8c1abcaf9204858ab92e34"} Apr 02 14:36:11 crc kubenswrapper[4732]: I0402 14:36:11.680952 4732 scope.go:117] 
"RemoveContainer" containerID="709a640877bb61587cf466971cf30393cd3affce375c74cd38fe9de80c188cd2" Apr 02 14:36:11 crc kubenswrapper[4732]: E0402 14:36:11.681666 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" Apr 02 14:36:23 crc kubenswrapper[4732]: I0402 14:36:23.682695 4732 scope.go:117] "RemoveContainer" containerID="709a640877bb61587cf466971cf30393cd3affce375c74cd38fe9de80c188cd2" Apr 02 14:36:23 crc kubenswrapper[4732]: E0402 14:36:23.683329 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" Apr 02 14:36:23 crc kubenswrapper[4732]: I0402 14:36:23.737216 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mjg8z/crc-debug-kbslp" event={"ID":"1382abcd-ac9c-4506-a2e3-dbeb3a709ee2","Type":"ContainerStarted","Data":"6f66ec762eaa36ac21c4234d510ee069d198ab5a88a0e055d6f699d9fe50da9b"} Apr 02 14:36:23 crc kubenswrapper[4732]: I0402 14:36:23.753415 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-mjg8z/crc-debug-kbslp" podStartSLOduration=1.64165763 podStartE2EDuration="14.753397971s" podCreationTimestamp="2026-04-02 14:36:09 +0000 UTC" firstStartedPulling="2026-04-02 14:36:10.30570065 +0000 UTC m=+3527.210108203" lastFinishedPulling="2026-04-02 
14:36:23.417440991 +0000 UTC m=+3540.321848544" observedRunningTime="2026-04-02 14:36:23.752546688 +0000 UTC m=+3540.656954241" watchObservedRunningTime="2026-04-02 14:36:23.753397971 +0000 UTC m=+3540.657805524" Apr 02 14:36:36 crc kubenswrapper[4732]: I0402 14:36:36.680989 4732 scope.go:117] "RemoveContainer" containerID="709a640877bb61587cf466971cf30393cd3affce375c74cd38fe9de80c188cd2" Apr 02 14:36:37 crc kubenswrapper[4732]: I0402 14:36:37.870448 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" event={"ID":"38409e5e-4545-49da-8f6c-4bfb30582878","Type":"ContainerStarted","Data":"e40e82e6d4518aba30c53d5cb0838adda1128bec7118e470ccaa5d5e5e745a10"} Apr 02 14:36:52 crc kubenswrapper[4732]: I0402 14:36:52.669831 4732 scope.go:117] "RemoveContainer" containerID="8c490903641df2d47a2f5c37dea59bed6e7b91f01c5256efa117fed3111d2c00" Apr 02 14:37:08 crc kubenswrapper[4732]: I0402 14:37:08.152223 4732 generic.go:334] "Generic (PLEG): container finished" podID="1382abcd-ac9c-4506-a2e3-dbeb3a709ee2" containerID="6f66ec762eaa36ac21c4234d510ee069d198ab5a88a0e055d6f699d9fe50da9b" exitCode=0 Apr 02 14:37:08 crc kubenswrapper[4732]: I0402 14:37:08.152316 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mjg8z/crc-debug-kbslp" event={"ID":"1382abcd-ac9c-4506-a2e3-dbeb3a709ee2","Type":"ContainerDied","Data":"6f66ec762eaa36ac21c4234d510ee069d198ab5a88a0e055d6f699d9fe50da9b"} Apr 02 14:37:09 crc kubenswrapper[4732]: I0402 14:37:09.298481 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mjg8z/crc-debug-kbslp" Apr 02 14:37:09 crc kubenswrapper[4732]: I0402 14:37:09.332984 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-mjg8z/crc-debug-kbslp"] Apr 02 14:37:09 crc kubenswrapper[4732]: I0402 14:37:09.341222 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-mjg8z/crc-debug-kbslp"] Apr 02 14:37:09 crc kubenswrapper[4732]: I0402 14:37:09.419970 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1382abcd-ac9c-4506-a2e3-dbeb3a709ee2-host\") pod \"1382abcd-ac9c-4506-a2e3-dbeb3a709ee2\" (UID: \"1382abcd-ac9c-4506-a2e3-dbeb3a709ee2\") " Apr 02 14:37:09 crc kubenswrapper[4732]: I0402 14:37:09.420183 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbg57\" (UniqueName: \"kubernetes.io/projected/1382abcd-ac9c-4506-a2e3-dbeb3a709ee2-kube-api-access-fbg57\") pod \"1382abcd-ac9c-4506-a2e3-dbeb3a709ee2\" (UID: \"1382abcd-ac9c-4506-a2e3-dbeb3a709ee2\") " Apr 02 14:37:09 crc kubenswrapper[4732]: I0402 14:37:09.420230 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1382abcd-ac9c-4506-a2e3-dbeb3a709ee2-host" (OuterVolumeSpecName: "host") pod "1382abcd-ac9c-4506-a2e3-dbeb3a709ee2" (UID: "1382abcd-ac9c-4506-a2e3-dbeb3a709ee2"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 14:37:09 crc kubenswrapper[4732]: I0402 14:37:09.420726 4732 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1382abcd-ac9c-4506-a2e3-dbeb3a709ee2-host\") on node \"crc\" DevicePath \"\"" Apr 02 14:37:09 crc kubenswrapper[4732]: I0402 14:37:09.428677 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1382abcd-ac9c-4506-a2e3-dbeb3a709ee2-kube-api-access-fbg57" (OuterVolumeSpecName: "kube-api-access-fbg57") pod "1382abcd-ac9c-4506-a2e3-dbeb3a709ee2" (UID: "1382abcd-ac9c-4506-a2e3-dbeb3a709ee2"). InnerVolumeSpecName "kube-api-access-fbg57". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:37:09 crc kubenswrapper[4732]: I0402 14:37:09.522996 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbg57\" (UniqueName: \"kubernetes.io/projected/1382abcd-ac9c-4506-a2e3-dbeb3a709ee2-kube-api-access-fbg57\") on node \"crc\" DevicePath \"\"" Apr 02 14:37:10 crc kubenswrapper[4732]: I0402 14:37:10.174210 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d8366463df362304e273a27d4da637c075eb0ceba8c1abcaf9204858ab92e34" Apr 02 14:37:10 crc kubenswrapper[4732]: I0402 14:37:10.174255 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mjg8z/crc-debug-kbslp" Apr 02 14:37:10 crc kubenswrapper[4732]: I0402 14:37:10.534767 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mjg8z/crc-debug-8p9zv"] Apr 02 14:37:10 crc kubenswrapper[4732]: E0402 14:37:10.535178 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1382abcd-ac9c-4506-a2e3-dbeb3a709ee2" containerName="container-00" Apr 02 14:37:10 crc kubenswrapper[4732]: I0402 14:37:10.535194 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="1382abcd-ac9c-4506-a2e3-dbeb3a709ee2" containerName="container-00" Apr 02 14:37:10 crc kubenswrapper[4732]: I0402 14:37:10.535451 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="1382abcd-ac9c-4506-a2e3-dbeb3a709ee2" containerName="container-00" Apr 02 14:37:10 crc kubenswrapper[4732]: I0402 14:37:10.536160 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mjg8z/crc-debug-8p9zv" Apr 02 14:37:10 crc kubenswrapper[4732]: I0402 14:37:10.642092 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-768mj\" (UniqueName: \"kubernetes.io/projected/f7c45fef-3eb5-4ac3-b5d0-503a703baeaa-kube-api-access-768mj\") pod \"crc-debug-8p9zv\" (UID: \"f7c45fef-3eb5-4ac3-b5d0-503a703baeaa\") " pod="openshift-must-gather-mjg8z/crc-debug-8p9zv" Apr 02 14:37:10 crc kubenswrapper[4732]: I0402 14:37:10.642315 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f7c45fef-3eb5-4ac3-b5d0-503a703baeaa-host\") pod \"crc-debug-8p9zv\" (UID: \"f7c45fef-3eb5-4ac3-b5d0-503a703baeaa\") " pod="openshift-must-gather-mjg8z/crc-debug-8p9zv" Apr 02 14:37:10 crc kubenswrapper[4732]: I0402 14:37:10.692536 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1382abcd-ac9c-4506-a2e3-dbeb3a709ee2" 
path="/var/lib/kubelet/pods/1382abcd-ac9c-4506-a2e3-dbeb3a709ee2/volumes" Apr 02 14:37:10 crc kubenswrapper[4732]: I0402 14:37:10.743521 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-768mj\" (UniqueName: \"kubernetes.io/projected/f7c45fef-3eb5-4ac3-b5d0-503a703baeaa-kube-api-access-768mj\") pod \"crc-debug-8p9zv\" (UID: \"f7c45fef-3eb5-4ac3-b5d0-503a703baeaa\") " pod="openshift-must-gather-mjg8z/crc-debug-8p9zv" Apr 02 14:37:10 crc kubenswrapper[4732]: I0402 14:37:10.744544 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f7c45fef-3eb5-4ac3-b5d0-503a703baeaa-host\") pod \"crc-debug-8p9zv\" (UID: \"f7c45fef-3eb5-4ac3-b5d0-503a703baeaa\") " pod="openshift-must-gather-mjg8z/crc-debug-8p9zv" Apr 02 14:37:10 crc kubenswrapper[4732]: I0402 14:37:10.745563 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f7c45fef-3eb5-4ac3-b5d0-503a703baeaa-host\") pod \"crc-debug-8p9zv\" (UID: \"f7c45fef-3eb5-4ac3-b5d0-503a703baeaa\") " pod="openshift-must-gather-mjg8z/crc-debug-8p9zv" Apr 02 14:37:10 crc kubenswrapper[4732]: I0402 14:37:10.760967 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-768mj\" (UniqueName: \"kubernetes.io/projected/f7c45fef-3eb5-4ac3-b5d0-503a703baeaa-kube-api-access-768mj\") pod \"crc-debug-8p9zv\" (UID: \"f7c45fef-3eb5-4ac3-b5d0-503a703baeaa\") " pod="openshift-must-gather-mjg8z/crc-debug-8p9zv" Apr 02 14:37:10 crc kubenswrapper[4732]: I0402 14:37:10.864447 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mjg8z/crc-debug-8p9zv" Apr 02 14:37:11 crc kubenswrapper[4732]: I0402 14:37:11.188133 4732 generic.go:334] "Generic (PLEG): container finished" podID="f7c45fef-3eb5-4ac3-b5d0-503a703baeaa" containerID="d8431e0373cbebe6422494afbab22f0690fe2c0fa0bd7699840111a855e31845" exitCode=0 Apr 02 14:37:11 crc kubenswrapper[4732]: I0402 14:37:11.188403 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mjg8z/crc-debug-8p9zv" event={"ID":"f7c45fef-3eb5-4ac3-b5d0-503a703baeaa","Type":"ContainerDied","Data":"d8431e0373cbebe6422494afbab22f0690fe2c0fa0bd7699840111a855e31845"} Apr 02 14:37:11 crc kubenswrapper[4732]: I0402 14:37:11.188432 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mjg8z/crc-debug-8p9zv" event={"ID":"f7c45fef-3eb5-4ac3-b5d0-503a703baeaa","Type":"ContainerStarted","Data":"a5bf54c89ec55ec3788a3ec76b85262c8fe2bc06ea817e675a939f1c34e54fd0"} Apr 02 14:37:11 crc kubenswrapper[4732]: I0402 14:37:11.698434 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-mjg8z/crc-debug-8p9zv"] Apr 02 14:37:11 crc kubenswrapper[4732]: I0402 14:37:11.706342 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-mjg8z/crc-debug-8p9zv"] Apr 02 14:37:12 crc kubenswrapper[4732]: I0402 14:37:12.321138 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mjg8z/crc-debug-8p9zv" Apr 02 14:37:12 crc kubenswrapper[4732]: I0402 14:37:12.379411 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f7c45fef-3eb5-4ac3-b5d0-503a703baeaa-host\") pod \"f7c45fef-3eb5-4ac3-b5d0-503a703baeaa\" (UID: \"f7c45fef-3eb5-4ac3-b5d0-503a703baeaa\") " Apr 02 14:37:12 crc kubenswrapper[4732]: I0402 14:37:12.379505 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-768mj\" (UniqueName: \"kubernetes.io/projected/f7c45fef-3eb5-4ac3-b5d0-503a703baeaa-kube-api-access-768mj\") pod \"f7c45fef-3eb5-4ac3-b5d0-503a703baeaa\" (UID: \"f7c45fef-3eb5-4ac3-b5d0-503a703baeaa\") " Apr 02 14:37:12 crc kubenswrapper[4732]: I0402 14:37:12.379527 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f7c45fef-3eb5-4ac3-b5d0-503a703baeaa-host" (OuterVolumeSpecName: "host") pod "f7c45fef-3eb5-4ac3-b5d0-503a703baeaa" (UID: "f7c45fef-3eb5-4ac3-b5d0-503a703baeaa"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 14:37:12 crc kubenswrapper[4732]: I0402 14:37:12.379956 4732 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f7c45fef-3eb5-4ac3-b5d0-503a703baeaa-host\") on node \"crc\" DevicePath \"\"" Apr 02 14:37:12 crc kubenswrapper[4732]: I0402 14:37:12.385124 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7c45fef-3eb5-4ac3-b5d0-503a703baeaa-kube-api-access-768mj" (OuterVolumeSpecName: "kube-api-access-768mj") pod "f7c45fef-3eb5-4ac3-b5d0-503a703baeaa" (UID: "f7c45fef-3eb5-4ac3-b5d0-503a703baeaa"). InnerVolumeSpecName "kube-api-access-768mj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:37:12 crc kubenswrapper[4732]: I0402 14:37:12.481118 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-768mj\" (UniqueName: \"kubernetes.io/projected/f7c45fef-3eb5-4ac3-b5d0-503a703baeaa-kube-api-access-768mj\") on node \"crc\" DevicePath \"\"" Apr 02 14:37:12 crc kubenswrapper[4732]: I0402 14:37:12.700215 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7c45fef-3eb5-4ac3-b5d0-503a703baeaa" path="/var/lib/kubelet/pods/f7c45fef-3eb5-4ac3-b5d0-503a703baeaa/volumes" Apr 02 14:37:12 crc kubenswrapper[4732]: I0402 14:37:12.889681 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mjg8z/crc-debug-rqtbr"] Apr 02 14:37:12 crc kubenswrapper[4732]: E0402 14:37:12.890227 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7c45fef-3eb5-4ac3-b5d0-503a703baeaa" containerName="container-00" Apr 02 14:37:12 crc kubenswrapper[4732]: I0402 14:37:12.890249 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7c45fef-3eb5-4ac3-b5d0-503a703baeaa" containerName="container-00" Apr 02 14:37:12 crc kubenswrapper[4732]: I0402 14:37:12.890500 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7c45fef-3eb5-4ac3-b5d0-503a703baeaa" containerName="container-00" Apr 02 14:37:12 crc kubenswrapper[4732]: I0402 14:37:12.891235 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mjg8z/crc-debug-rqtbr" Apr 02 14:37:12 crc kubenswrapper[4732]: I0402 14:37:12.990120 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/177c463b-c25b-4c02-a9c1-db9d97ebbd00-host\") pod \"crc-debug-rqtbr\" (UID: \"177c463b-c25b-4c02-a9c1-db9d97ebbd00\") " pod="openshift-must-gather-mjg8z/crc-debug-rqtbr" Apr 02 14:37:12 crc kubenswrapper[4732]: I0402 14:37:12.990397 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdkbg\" (UniqueName: \"kubernetes.io/projected/177c463b-c25b-4c02-a9c1-db9d97ebbd00-kube-api-access-rdkbg\") pod \"crc-debug-rqtbr\" (UID: \"177c463b-c25b-4c02-a9c1-db9d97ebbd00\") " pod="openshift-must-gather-mjg8z/crc-debug-rqtbr" Apr 02 14:37:13 crc kubenswrapper[4732]: I0402 14:37:13.092318 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/177c463b-c25b-4c02-a9c1-db9d97ebbd00-host\") pod \"crc-debug-rqtbr\" (UID: \"177c463b-c25b-4c02-a9c1-db9d97ebbd00\") " pod="openshift-must-gather-mjg8z/crc-debug-rqtbr" Apr 02 14:37:13 crc kubenswrapper[4732]: I0402 14:37:13.092449 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/177c463b-c25b-4c02-a9c1-db9d97ebbd00-host\") pod \"crc-debug-rqtbr\" (UID: \"177c463b-c25b-4c02-a9c1-db9d97ebbd00\") " pod="openshift-must-gather-mjg8z/crc-debug-rqtbr" Apr 02 14:37:13 crc kubenswrapper[4732]: I0402 14:37:13.092538 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdkbg\" (UniqueName: \"kubernetes.io/projected/177c463b-c25b-4c02-a9c1-db9d97ebbd00-kube-api-access-rdkbg\") pod \"crc-debug-rqtbr\" (UID: \"177c463b-c25b-4c02-a9c1-db9d97ebbd00\") " pod="openshift-must-gather-mjg8z/crc-debug-rqtbr" Apr 02 14:37:13 crc 
kubenswrapper[4732]: I0402 14:37:13.113002 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdkbg\" (UniqueName: \"kubernetes.io/projected/177c463b-c25b-4c02-a9c1-db9d97ebbd00-kube-api-access-rdkbg\") pod \"crc-debug-rqtbr\" (UID: \"177c463b-c25b-4c02-a9c1-db9d97ebbd00\") " pod="openshift-must-gather-mjg8z/crc-debug-rqtbr" Apr 02 14:37:13 crc kubenswrapper[4732]: I0402 14:37:13.207162 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mjg8z/crc-debug-rqtbr" Apr 02 14:37:13 crc kubenswrapper[4732]: I0402 14:37:13.211207 4732 scope.go:117] "RemoveContainer" containerID="d8431e0373cbebe6422494afbab22f0690fe2c0fa0bd7699840111a855e31845" Apr 02 14:37:13 crc kubenswrapper[4732]: I0402 14:37:13.211256 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mjg8z/crc-debug-8p9zv" Apr 02 14:37:13 crc kubenswrapper[4732]: W0402 14:37:13.283450 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod177c463b_c25b_4c02_a9c1_db9d97ebbd00.slice/crio-e6d14081b225c5b0ebc3a5e5b6e75633f87f6b5a8693911c6e1fe066dcc35b62 WatchSource:0}: Error finding container e6d14081b225c5b0ebc3a5e5b6e75633f87f6b5a8693911c6e1fe066dcc35b62: Status 404 returned error can't find the container with id e6d14081b225c5b0ebc3a5e5b6e75633f87f6b5a8693911c6e1fe066dcc35b62 Apr 02 14:37:14 crc kubenswrapper[4732]: I0402 14:37:14.223934 4732 generic.go:334] "Generic (PLEG): container finished" podID="177c463b-c25b-4c02-a9c1-db9d97ebbd00" containerID="b7b5fbabd557f2cd3d4792012ff5390af6ad4c1c2e9dc3048f595acf7464a9bb" exitCode=0 Apr 02 14:37:14 crc kubenswrapper[4732]: I0402 14:37:14.224140 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mjg8z/crc-debug-rqtbr" 
event={"ID":"177c463b-c25b-4c02-a9c1-db9d97ebbd00","Type":"ContainerDied","Data":"b7b5fbabd557f2cd3d4792012ff5390af6ad4c1c2e9dc3048f595acf7464a9bb"} Apr 02 14:37:14 crc kubenswrapper[4732]: I0402 14:37:14.224347 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mjg8z/crc-debug-rqtbr" event={"ID":"177c463b-c25b-4c02-a9c1-db9d97ebbd00","Type":"ContainerStarted","Data":"e6d14081b225c5b0ebc3a5e5b6e75633f87f6b5a8693911c6e1fe066dcc35b62"} Apr 02 14:37:14 crc kubenswrapper[4732]: I0402 14:37:14.269596 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-mjg8z/crc-debug-rqtbr"] Apr 02 14:37:14 crc kubenswrapper[4732]: I0402 14:37:14.280202 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-mjg8z/crc-debug-rqtbr"] Apr 02 14:37:15 crc kubenswrapper[4732]: I0402 14:37:15.340468 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mjg8z/crc-debug-rqtbr" Apr 02 14:37:15 crc kubenswrapper[4732]: I0402 14:37:15.436223 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/177c463b-c25b-4c02-a9c1-db9d97ebbd00-host\") pod \"177c463b-c25b-4c02-a9c1-db9d97ebbd00\" (UID: \"177c463b-c25b-4c02-a9c1-db9d97ebbd00\") " Apr 02 14:37:15 crc kubenswrapper[4732]: I0402 14:37:15.436324 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/177c463b-c25b-4c02-a9c1-db9d97ebbd00-host" (OuterVolumeSpecName: "host") pod "177c463b-c25b-4c02-a9c1-db9d97ebbd00" (UID: "177c463b-c25b-4c02-a9c1-db9d97ebbd00"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 14:37:15 crc kubenswrapper[4732]: I0402 14:37:15.436659 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdkbg\" (UniqueName: \"kubernetes.io/projected/177c463b-c25b-4c02-a9c1-db9d97ebbd00-kube-api-access-rdkbg\") pod \"177c463b-c25b-4c02-a9c1-db9d97ebbd00\" (UID: \"177c463b-c25b-4c02-a9c1-db9d97ebbd00\") " Apr 02 14:37:15 crc kubenswrapper[4732]: I0402 14:37:15.438062 4732 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/177c463b-c25b-4c02-a9c1-db9d97ebbd00-host\") on node \"crc\" DevicePath \"\"" Apr 02 14:37:15 crc kubenswrapper[4732]: I0402 14:37:15.446935 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/177c463b-c25b-4c02-a9c1-db9d97ebbd00-kube-api-access-rdkbg" (OuterVolumeSpecName: "kube-api-access-rdkbg") pod "177c463b-c25b-4c02-a9c1-db9d97ebbd00" (UID: "177c463b-c25b-4c02-a9c1-db9d97ebbd00"). InnerVolumeSpecName "kube-api-access-rdkbg". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:37:15 crc kubenswrapper[4732]: I0402 14:37:15.539977 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdkbg\" (UniqueName: \"kubernetes.io/projected/177c463b-c25b-4c02-a9c1-db9d97ebbd00-kube-api-access-rdkbg\") on node \"crc\" DevicePath \"\"" Apr 02 14:37:16 crc kubenswrapper[4732]: I0402 14:37:16.251638 4732 scope.go:117] "RemoveContainer" containerID="b7b5fbabd557f2cd3d4792012ff5390af6ad4c1c2e9dc3048f595acf7464a9bb" Apr 02 14:37:16 crc kubenswrapper[4732]: I0402 14:37:16.251747 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mjg8z/crc-debug-rqtbr" Apr 02 14:37:16 crc kubenswrapper[4732]: I0402 14:37:16.695395 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="177c463b-c25b-4c02-a9c1-db9d97ebbd00" path="/var/lib/kubelet/pods/177c463b-c25b-4c02-a9c1-db9d97ebbd00/volumes" Apr 02 14:37:29 crc kubenswrapper[4732]: I0402 14:37:29.890222 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6c879c6666-5kls7_362f9e50-6f86-41ff-ae02-e0b8565fa55f/barbican-api/0.log" Apr 02 14:37:30 crc kubenswrapper[4732]: I0402 14:37:30.068559 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6c879c6666-5kls7_362f9e50-6f86-41ff-ae02-e0b8565fa55f/barbican-api-log/0.log" Apr 02 14:37:30 crc kubenswrapper[4732]: I0402 14:37:30.091640 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-f897cfb64-ql8wz_5e017590-845a-4f52-a6ae-258890dd6388/barbican-keystone-listener/0.log" Apr 02 14:37:30 crc kubenswrapper[4732]: I0402 14:37:30.136067 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-f897cfb64-ql8wz_5e017590-845a-4f52-a6ae-258890dd6388/barbican-keystone-listener-log/0.log" Apr 02 14:37:30 crc kubenswrapper[4732]: I0402 14:37:30.276920 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-98d7bf879-xkszz_8eed39a7-f437-403d-acab-246fa6d25c4b/barbican-worker/0.log" Apr 02 14:37:30 crc kubenswrapper[4732]: I0402 14:37:30.326308 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-98d7bf879-xkszz_8eed39a7-f437-403d-acab-246fa6d25c4b/barbican-worker-log/0.log" Apr 02 14:37:30 crc kubenswrapper[4732]: I0402 14:37:30.591408 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ad6cbd5d-4434-4885-bf56-8ee47171b897/ceilometer-notification-agent/0.log" Apr 02 14:37:30 crc 
kubenswrapper[4732]: I0402 14:37:30.592430 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ad6cbd5d-4434-4885-bf56-8ee47171b897/ceilometer-central-agent/0.log" Apr 02 14:37:30 crc kubenswrapper[4732]: I0402 14:37:30.667238 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-8g4nr_a67d60f0-3912-4fc4-96b7-f96831ff23d3/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Apr 02 14:37:30 crc kubenswrapper[4732]: I0402 14:37:30.760437 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ad6cbd5d-4434-4885-bf56-8ee47171b897/proxy-httpd/0.log" Apr 02 14:37:30 crc kubenswrapper[4732]: I0402 14:37:30.773887 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ad6cbd5d-4434-4885-bf56-8ee47171b897/sg-core/0.log" Apr 02 14:37:30 crc kubenswrapper[4732]: I0402 14:37:30.899095 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_9fc3b5a4-f5bf-44ea-aa53-d93a32900271/cinder-api/0.log" Apr 02 14:37:30 crc kubenswrapper[4732]: I0402 14:37:30.965961 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_9fc3b5a4-f5bf-44ea-aa53-d93a32900271/cinder-api-log/0.log" Apr 02 14:37:31 crc kubenswrapper[4732]: I0402 14:37:31.104726 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_d7613bfc-a605-4485-b771-242a65e30df8/cinder-scheduler/0.log" Apr 02 14:37:31 crc kubenswrapper[4732]: I0402 14:37:31.206689 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_d7613bfc-a605-4485-b771-242a65e30df8/probe/0.log" Apr 02 14:37:31 crc kubenswrapper[4732]: I0402 14:37:31.334027 4732 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-z2s8z_21ce48db-fb4b-4086-86cc-a32f30ebd002/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Apr 02 14:37:31 crc kubenswrapper[4732]: I0402 14:37:31.489688 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-bx2x5_d8bb9bae-9d09-42c5-a60a-134c907db6d5/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Apr 02 14:37:31 crc kubenswrapper[4732]: I0402 14:37:31.562712 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-2wr76_b22602a0-7545-4c2d-8b16-2233288ab360/init/0.log" Apr 02 14:37:31 crc kubenswrapper[4732]: I0402 14:37:31.738062 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-2wr76_b22602a0-7545-4c2d-8b16-2233288ab360/init/0.log" Apr 02 14:37:31 crc kubenswrapper[4732]: I0402 14:37:31.765161 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-2wr76_b22602a0-7545-4c2d-8b16-2233288ab360/dnsmasq-dns/0.log" Apr 02 14:37:31 crc kubenswrapper[4732]: I0402 14:37:31.834114 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-5lm4q_240ff67d-47d5-4b2e-b744-e0e2332a9496/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Apr 02 14:37:31 crc kubenswrapper[4732]: I0402 14:37:31.979110 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_2bbb407d-51c0-4cca-99c6-9436acda495d/glance-log/0.log" Apr 02 14:37:31 crc kubenswrapper[4732]: I0402 14:37:31.988276 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_2bbb407d-51c0-4cca-99c6-9436acda495d/glance-httpd/0.log" Apr 02 14:37:32 crc kubenswrapper[4732]: I0402 14:37:32.146754 4732 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_8724b48c-9ac7-43a2-8d27-7d16056387ca/glance-httpd/0.log" Apr 02 14:37:32 crc kubenswrapper[4732]: I0402 14:37:32.182793 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_8724b48c-9ac7-43a2-8d27-7d16056387ca/glance-log/0.log" Apr 02 14:37:32 crc kubenswrapper[4732]: I0402 14:37:32.376303 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-54f994999b-b88d7_97d6e519-a82f-4ce5-9199-4d7db769f86b/horizon/0.log" Apr 02 14:37:32 crc kubenswrapper[4732]: I0402 14:37:32.489880 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl_2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Apr 02 14:37:32 crc kubenswrapper[4732]: I0402 14:37:32.675141 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-54f994999b-b88d7_97d6e519-a82f-4ce5-9199-4d7db769f86b/horizon-log/0.log" Apr 02 14:37:32 crc kubenswrapper[4732]: I0402 14:37:32.878410 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29585641-zldph_61cd5173-b5d3-4cd7-a8ea-e4300054f364/keystone-cron/0.log" Apr 02 14:37:32 crc kubenswrapper[4732]: I0402 14:37:32.910733 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-49z8f_2072a722-772d-4379-a439-fdebfa6e219e/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Apr 02 14:37:33 crc kubenswrapper[4732]: I0402 14:37:33.108901 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_96308a9a-b137-4d84-a470-74395c7a5d60/kube-state-metrics/0.log" Apr 02 14:37:33 crc kubenswrapper[4732]: I0402 14:37:33.160331 4732 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_keystone-d4c8876f7-592x4_5acfdea3-28ba-47f3-860c-6e7af2fe3222/keystone-api/0.log" Apr 02 14:37:33 crc kubenswrapper[4732]: I0402 14:37:33.811737 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-58f8f59779-9rrsx_cd0e816f-6d8e-4ed8-884c-ee38cec72d94/neutron-httpd/0.log" Apr 02 14:37:33 crc kubenswrapper[4732]: I0402 14:37:33.879350 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-t9h4h_6d71fa88-324b-440b-aefd-492ac7ff7cd5/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Apr 02 14:37:33 crc kubenswrapper[4732]: I0402 14:37:33.896254 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-58f8f59779-9rrsx_cd0e816f-6d8e-4ed8-884c-ee38cec72d94/neutron-api/0.log" Apr 02 14:37:34 crc kubenswrapper[4732]: I0402 14:37:34.188019 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-sjrwt_9e89ed59-ef4b-44a7-b6de-d98b2319ee10/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Apr 02 14:37:34 crc kubenswrapper[4732]: I0402 14:37:34.731691 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_3ab608f9-1ada-4c9c-85a1-43fc8cf5cc16/nova-api-log/0.log" Apr 02 14:37:34 crc kubenswrapper[4732]: I0402 14:37:34.856107 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_b5f7b2b4-c3da-49e6-b873-c2937dc27bbf/nova-cell0-conductor-conductor/0.log" Apr 02 14:37:34 crc kubenswrapper[4732]: I0402 14:37:34.871019 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_3ab608f9-1ada-4c9c-85a1-43fc8cf5cc16/nova-api-api/0.log" Apr 02 14:37:35 crc kubenswrapper[4732]: I0402 14:37:35.182160 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_43834d16-35ed-4baa-8292-26a762220c9a/nova-cell1-conductor-conductor/0.log" 
Apr 02 14:37:35 crc kubenswrapper[4732]: I0402 14:37:35.456416 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_35b61eb9-52fd-4d29-8942-c1c18b2f4aff/nova-cell1-novncproxy-novncproxy/0.log" Apr 02 14:37:35 crc kubenswrapper[4732]: I0402 14:37:35.676494 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_c8e7f93a-58bc-4949-9fa4-7ee18fdaa7e4/nova-metadata-log/0.log" Apr 02 14:37:36 crc kubenswrapper[4732]: I0402 14:37:36.025842 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_c6916cdf-dcaa-4e17-b33c-3fc6684abb46/nova-scheduler-scheduler/0.log" Apr 02 14:37:36 crc kubenswrapper[4732]: I0402 14:37:36.050493 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-gmk6c_f0eca204-c72d-4909-89ba-03d2b1976e07/nova-edpm-deployment-openstack-edpm-ipam/0.log" Apr 02 14:37:36 crc kubenswrapper[4732]: I0402 14:37:36.059810 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_c8e7f93a-58bc-4949-9fa4-7ee18fdaa7e4/nova-metadata-metadata/0.log" Apr 02 14:37:36 crc kubenswrapper[4732]: I0402 14:37:36.264993 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_688bac91-aede-4c9f-a063-6469bb03db8c/mysql-bootstrap/0.log" Apr 02 14:37:36 crc kubenswrapper[4732]: I0402 14:37:36.448495 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_688bac91-aede-4c9f-a063-6469bb03db8c/mysql-bootstrap/0.log" Apr 02 14:37:36 crc kubenswrapper[4732]: I0402 14:37:36.490431 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_688bac91-aede-4c9f-a063-6469bb03db8c/galera/0.log" Apr 02 14:37:36 crc kubenswrapper[4732]: I0402 14:37:36.548362 4732 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_2fbe66fb-6f02-432d-8acf-50fec5339d96/mysql-bootstrap/0.log" Apr 02 14:37:36 crc kubenswrapper[4732]: I0402 14:37:36.791256 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_2fbe66fb-6f02-432d-8acf-50fec5339d96/mysql-bootstrap/0.log" Apr 02 14:37:36 crc kubenswrapper[4732]: I0402 14:37:36.823976 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_2fbe66fb-6f02-432d-8acf-50fec5339d96/galera/0.log" Apr 02 14:37:36 crc kubenswrapper[4732]: I0402 14:37:36.857645 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_66ae86e8-597c-4fdb-b0da-283cf37afba2/openstackclient/0.log" Apr 02 14:37:37 crc kubenswrapper[4732]: I0402 14:37:37.071879 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-5222s_8af6391f-4f8b-4473-8e7c-186c9c838527/ovn-controller/0.log" Apr 02 14:37:37 crc kubenswrapper[4732]: I0402 14:37:37.087420 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-7swjs_731d113e-365b-4d68-a0e9-402bb8a8e9b7/openstack-network-exporter/0.log" Apr 02 14:37:37 crc kubenswrapper[4732]: I0402 14:37:37.256884 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-l4ttl_5eba7503-ee7b-40ba-a0dc-e11fad40c2b7/ovsdb-server-init/0.log" Apr 02 14:37:37 crc kubenswrapper[4732]: I0402 14:37:37.454046 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-l4ttl_5eba7503-ee7b-40ba-a0dc-e11fad40c2b7/ovsdb-server-init/0.log" Apr 02 14:37:37 crc kubenswrapper[4732]: I0402 14:37:37.525413 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-l4ttl_5eba7503-ee7b-40ba-a0dc-e11fad40c2b7/ovs-vswitchd/0.log" Apr 02 14:37:37 crc kubenswrapper[4732]: I0402 14:37:37.536641 4732 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-l4ttl_5eba7503-ee7b-40ba-a0dc-e11fad40c2b7/ovsdb-server/0.log" Apr 02 14:37:37 crc kubenswrapper[4732]: I0402 14:37:37.753275 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_3da22737-6be3-4ffc-afff-b5d7fb20a283/openstack-network-exporter/0.log" Apr 02 14:37:37 crc kubenswrapper[4732]: I0402 14:37:37.795590 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-bc8s7_e6ca7706-9083-4555-b762-1d24315b85ea/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Apr 02 14:37:37 crc kubenswrapper[4732]: I0402 14:37:37.828888 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_3da22737-6be3-4ffc-afff-b5d7fb20a283/ovn-northd/0.log" Apr 02 14:37:38 crc kubenswrapper[4732]: I0402 14:37:38.019933 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_d8dff454-a625-4309-92b6-8ab92d4bd60a/openstack-network-exporter/0.log" Apr 02 14:37:38 crc kubenswrapper[4732]: I0402 14:37:38.035718 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_d8dff454-a625-4309-92b6-8ab92d4bd60a/ovsdbserver-nb/0.log" Apr 02 14:37:38 crc kubenswrapper[4732]: I0402 14:37:38.263830 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_f84d20f6-82ec-45d6-8487-4ed2ed90b286/ovsdbserver-sb/0.log" Apr 02 14:37:38 crc kubenswrapper[4732]: I0402 14:37:38.274844 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_f84d20f6-82ec-45d6-8487-4ed2ed90b286/openstack-network-exporter/0.log" Apr 02 14:37:38 crc kubenswrapper[4732]: I0402 14:37:38.357300 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5688fc477d-p59pf_c11a1fe8-1217-4e5b-b172-642b85527099/placement-api/0.log" Apr 02 14:37:38 crc kubenswrapper[4732]: I0402 14:37:38.565176 4732 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_placement-5688fc477d-p59pf_c11a1fe8-1217-4e5b-b172-642b85527099/placement-log/0.log" Apr 02 14:37:38 crc kubenswrapper[4732]: I0402 14:37:38.590822 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_29e95846-a0bc-4d8b-ad4d-457766418564/setup-container/0.log" Apr 02 14:37:38 crc kubenswrapper[4732]: I0402 14:37:38.797988 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_29e95846-a0bc-4d8b-ad4d-457766418564/setup-container/0.log" Apr 02 14:37:38 crc kubenswrapper[4732]: I0402 14:37:38.878279 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c9e1cd50-72d3-4ccc-9f49-c4c1619252fc/setup-container/0.log" Apr 02 14:37:38 crc kubenswrapper[4732]: I0402 14:37:38.900238 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_29e95846-a0bc-4d8b-ad4d-457766418564/rabbitmq/0.log" Apr 02 14:37:39 crc kubenswrapper[4732]: I0402 14:37:39.048590 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c9e1cd50-72d3-4ccc-9f49-c4c1619252fc/setup-container/0.log" Apr 02 14:37:39 crc kubenswrapper[4732]: I0402 14:37:39.098838 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c9e1cd50-72d3-4ccc-9f49-c4c1619252fc/rabbitmq/0.log" Apr 02 14:37:39 crc kubenswrapper[4732]: I0402 14:37:39.130765 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-9sgtg_160a19c0-4b2b-439a-9ea5-0f0ec2d4aede/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Apr 02 14:37:39 crc kubenswrapper[4732]: I0402 14:37:39.293750 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-szt5p_d2c0401e-94b6-46b0-84f5-59ffac42c2f7/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Apr 02 14:37:39 crc 
kubenswrapper[4732]: I0402 14:37:39.416230 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-kjz5w_28b7c53a-39ed-4eea-8697-50dc3eb09818/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Apr 02 14:37:39 crc kubenswrapper[4732]: I0402 14:37:39.572270 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-sdvqf_c5427d0c-bb3a-491e-8461-8e189da84bd9/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Apr 02 14:37:39 crc kubenswrapper[4732]: I0402 14:37:39.683794 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-8ct56_ede1fe7d-16b7-41be-af74-8933aa0a1e83/ssh-known-hosts-edpm-deployment/0.log" Apr 02 14:37:39 crc kubenswrapper[4732]: I0402 14:37:39.898803 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-9f57ff6c-7m8sr_7f6ffca1-ce91-4e20-8cbc-38a3eab1616e/proxy-httpd/0.log" Apr 02 14:37:39 crc kubenswrapper[4732]: I0402 14:37:39.924112 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-9f57ff6c-7m8sr_7f6ffca1-ce91-4e20-8cbc-38a3eab1616e/proxy-server/0.log" Apr 02 14:37:40 crc kubenswrapper[4732]: I0402 14:37:40.063011 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-wsskk_81138fff-8b7c-4cf3-8aa5-2582d80483e1/swift-ring-rebalance/0.log" Apr 02 14:37:40 crc kubenswrapper[4732]: I0402 14:37:40.139547 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_529a332e-d2c3-49c5-86d5-e672811d00cd/account-auditor/0.log" Apr 02 14:37:40 crc kubenswrapper[4732]: I0402 14:37:40.221117 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_529a332e-d2c3-49c5-86d5-e672811d00cd/account-reaper/0.log" Apr 02 14:37:40 crc kubenswrapper[4732]: I0402 14:37:40.269285 4732 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_529a332e-d2c3-49c5-86d5-e672811d00cd/account-replicator/0.log" Apr 02 14:37:40 crc kubenswrapper[4732]: I0402 14:37:40.384227 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_529a332e-d2c3-49c5-86d5-e672811d00cd/account-server/0.log" Apr 02 14:37:40 crc kubenswrapper[4732]: I0402 14:37:40.433606 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_529a332e-d2c3-49c5-86d5-e672811d00cd/container-auditor/0.log" Apr 02 14:37:40 crc kubenswrapper[4732]: I0402 14:37:40.492692 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_529a332e-d2c3-49c5-86d5-e672811d00cd/container-replicator/0.log" Apr 02 14:37:40 crc kubenswrapper[4732]: I0402 14:37:40.501080 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_529a332e-d2c3-49c5-86d5-e672811d00cd/container-server/0.log" Apr 02 14:37:40 crc kubenswrapper[4732]: I0402 14:37:40.615085 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_529a332e-d2c3-49c5-86d5-e672811d00cd/container-updater/0.log" Apr 02 14:37:40 crc kubenswrapper[4732]: I0402 14:37:40.683739 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_529a332e-d2c3-49c5-86d5-e672811d00cd/object-auditor/0.log" Apr 02 14:37:40 crc kubenswrapper[4732]: I0402 14:37:40.721307 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_529a332e-d2c3-49c5-86d5-e672811d00cd/object-expirer/0.log" Apr 02 14:37:40 crc kubenswrapper[4732]: I0402 14:37:40.785598 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_529a332e-d2c3-49c5-86d5-e672811d00cd/object-replicator/0.log" Apr 02 14:37:40 crc kubenswrapper[4732]: I0402 14:37:40.828718 4732 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_529a332e-d2c3-49c5-86d5-e672811d00cd/object-server/0.log" Apr 02 14:37:40 crc kubenswrapper[4732]: I0402 14:37:40.936012 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_529a332e-d2c3-49c5-86d5-e672811d00cd/object-updater/0.log" Apr 02 14:37:40 crc kubenswrapper[4732]: I0402 14:37:40.953187 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_529a332e-d2c3-49c5-86d5-e672811d00cd/rsync/0.log" Apr 02 14:37:40 crc kubenswrapper[4732]: I0402 14:37:40.982145 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_529a332e-d2c3-49c5-86d5-e672811d00cd/swift-recon-cron/0.log" Apr 02 14:37:41 crc kubenswrapper[4732]: I0402 14:37:41.307194 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_17645883-477c-437a-b87a-b412f9bbe29e/tempest-tests-tempest-tests-runner/0.log" Apr 02 14:37:41 crc kubenswrapper[4732]: I0402 14:37:41.460750 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_c4323eb5-6b45-4766-961f-eef53306dad0/test-operator-logs-container/0.log" Apr 02 14:37:41 crc kubenswrapper[4732]: I0402 14:37:41.607862 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-7z2sv_ce9af86e-92fb-4693-8af9-4d95af13b999/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Apr 02 14:37:41 crc kubenswrapper[4732]: I0402 14:37:41.630822 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-9frcg_0e3946af-2a00-4313-9a3b-79acd9152f58/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Apr 02 14:37:48 crc kubenswrapper[4732]: I0402 14:37:48.097965 4732 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_memcached-0_4f87d2b1-82d0-4126-aeae-46aa84ba3d1f/memcached/0.log" Apr 02 14:37:57 crc kubenswrapper[4732]: I0402 14:37:57.253219 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4tnxp"] Apr 02 14:37:57 crc kubenswrapper[4732]: E0402 14:37:57.254153 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="177c463b-c25b-4c02-a9c1-db9d97ebbd00" containerName="container-00" Apr 02 14:37:57 crc kubenswrapper[4732]: I0402 14:37:57.254166 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="177c463b-c25b-4c02-a9c1-db9d97ebbd00" containerName="container-00" Apr 02 14:37:57 crc kubenswrapper[4732]: I0402 14:37:57.254403 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="177c463b-c25b-4c02-a9c1-db9d97ebbd00" containerName="container-00" Apr 02 14:37:57 crc kubenswrapper[4732]: I0402 14:37:57.256090 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4tnxp" Apr 02 14:37:57 crc kubenswrapper[4732]: I0402 14:37:57.274652 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4tnxp"] Apr 02 14:37:57 crc kubenswrapper[4732]: I0402 14:37:57.413034 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cebdcb3-464d-42bd-b673-8cc2942d2e85-utilities\") pod \"community-operators-4tnxp\" (UID: \"3cebdcb3-464d-42bd-b673-8cc2942d2e85\") " pod="openshift-marketplace/community-operators-4tnxp" Apr 02 14:37:57 crc kubenswrapper[4732]: I0402 14:37:57.413202 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cebdcb3-464d-42bd-b673-8cc2942d2e85-catalog-content\") pod \"community-operators-4tnxp\" (UID: \"3cebdcb3-464d-42bd-b673-8cc2942d2e85\") " 
pod="openshift-marketplace/community-operators-4tnxp" Apr 02 14:37:57 crc kubenswrapper[4732]: I0402 14:37:57.413260 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqg98\" (UniqueName: \"kubernetes.io/projected/3cebdcb3-464d-42bd-b673-8cc2942d2e85-kube-api-access-dqg98\") pod \"community-operators-4tnxp\" (UID: \"3cebdcb3-464d-42bd-b673-8cc2942d2e85\") " pod="openshift-marketplace/community-operators-4tnxp" Apr 02 14:37:57 crc kubenswrapper[4732]: I0402 14:37:57.514860 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cebdcb3-464d-42bd-b673-8cc2942d2e85-utilities\") pod \"community-operators-4tnxp\" (UID: \"3cebdcb3-464d-42bd-b673-8cc2942d2e85\") " pod="openshift-marketplace/community-operators-4tnxp" Apr 02 14:37:57 crc kubenswrapper[4732]: I0402 14:37:57.515016 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cebdcb3-464d-42bd-b673-8cc2942d2e85-catalog-content\") pod \"community-operators-4tnxp\" (UID: \"3cebdcb3-464d-42bd-b673-8cc2942d2e85\") " pod="openshift-marketplace/community-operators-4tnxp" Apr 02 14:37:57 crc kubenswrapper[4732]: I0402 14:37:57.515080 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqg98\" (UniqueName: \"kubernetes.io/projected/3cebdcb3-464d-42bd-b673-8cc2942d2e85-kube-api-access-dqg98\") pod \"community-operators-4tnxp\" (UID: \"3cebdcb3-464d-42bd-b673-8cc2942d2e85\") " pod="openshift-marketplace/community-operators-4tnxp" Apr 02 14:37:57 crc kubenswrapper[4732]: I0402 14:37:57.515409 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cebdcb3-464d-42bd-b673-8cc2942d2e85-utilities\") pod \"community-operators-4tnxp\" (UID: \"3cebdcb3-464d-42bd-b673-8cc2942d2e85\") " 
pod="openshift-marketplace/community-operators-4tnxp" Apr 02 14:37:57 crc kubenswrapper[4732]: I0402 14:37:57.515502 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cebdcb3-464d-42bd-b673-8cc2942d2e85-catalog-content\") pod \"community-operators-4tnxp\" (UID: \"3cebdcb3-464d-42bd-b673-8cc2942d2e85\") " pod="openshift-marketplace/community-operators-4tnxp" Apr 02 14:37:57 crc kubenswrapper[4732]: I0402 14:37:57.543194 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqg98\" (UniqueName: \"kubernetes.io/projected/3cebdcb3-464d-42bd-b673-8cc2942d2e85-kube-api-access-dqg98\") pod \"community-operators-4tnxp\" (UID: \"3cebdcb3-464d-42bd-b673-8cc2942d2e85\") " pod="openshift-marketplace/community-operators-4tnxp" Apr 02 14:37:57 crc kubenswrapper[4732]: I0402 14:37:57.584227 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4tnxp" Apr 02 14:37:58 crc kubenswrapper[4732]: I0402 14:37:58.111788 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4tnxp"] Apr 02 14:37:58 crc kubenswrapper[4732]: I0402 14:37:58.686907 4732 generic.go:334] "Generic (PLEG): container finished" podID="3cebdcb3-464d-42bd-b673-8cc2942d2e85" containerID="abd975363dcc7e6494657e30d23def14f6c1876ad3a0bd416725589cf985322a" exitCode=0 Apr 02 14:37:58 crc kubenswrapper[4732]: I0402 14:37:58.700596 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4tnxp" event={"ID":"3cebdcb3-464d-42bd-b673-8cc2942d2e85","Type":"ContainerDied","Data":"abd975363dcc7e6494657e30d23def14f6c1876ad3a0bd416725589cf985322a"} Apr 02 14:37:58 crc kubenswrapper[4732]: I0402 14:37:58.700837 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4tnxp" 
event={"ID":"3cebdcb3-464d-42bd-b673-8cc2942d2e85","Type":"ContainerStarted","Data":"f65d9a5e52622ad142fb8ea86c42e873a1e1242925452d1b280fe080e30db28d"} Apr 02 14:37:59 crc kubenswrapper[4732]: I0402 14:37:59.698957 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4tnxp" event={"ID":"3cebdcb3-464d-42bd-b673-8cc2942d2e85","Type":"ContainerStarted","Data":"0a77d8419fed9728356517b9ba3438ced4bd0fb7cf77cb2c8b7e2f083e703fd0"} Apr 02 14:38:00 crc kubenswrapper[4732]: I0402 14:38:00.140796 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29585678-tc75x"] Apr 02 14:38:00 crc kubenswrapper[4732]: I0402 14:38:00.141980 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585678-tc75x" Apr 02 14:38:00 crc kubenswrapper[4732]: I0402 14:38:00.143874 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 02 14:38:00 crc kubenswrapper[4732]: I0402 14:38:00.145602 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-69g42" Apr 02 14:38:00 crc kubenswrapper[4732]: I0402 14:38:00.145629 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 02 14:38:00 crc kubenswrapper[4732]: I0402 14:38:00.153479 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585678-tc75x"] Apr 02 14:38:00 crc kubenswrapper[4732]: I0402 14:38:00.272299 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9xwt\" (UniqueName: \"kubernetes.io/projected/1c4b5ea7-e9e1-45d9-80b7-e3a61b5464cd-kube-api-access-c9xwt\") pod \"auto-csr-approver-29585678-tc75x\" (UID: \"1c4b5ea7-e9e1-45d9-80b7-e3a61b5464cd\") " pod="openshift-infra/auto-csr-approver-29585678-tc75x" Apr 02 14:38:00 
crc kubenswrapper[4732]: I0402 14:38:00.374495 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9xwt\" (UniqueName: \"kubernetes.io/projected/1c4b5ea7-e9e1-45d9-80b7-e3a61b5464cd-kube-api-access-c9xwt\") pod \"auto-csr-approver-29585678-tc75x\" (UID: \"1c4b5ea7-e9e1-45d9-80b7-e3a61b5464cd\") " pod="openshift-infra/auto-csr-approver-29585678-tc75x" Apr 02 14:38:00 crc kubenswrapper[4732]: I0402 14:38:00.400012 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9xwt\" (UniqueName: \"kubernetes.io/projected/1c4b5ea7-e9e1-45d9-80b7-e3a61b5464cd-kube-api-access-c9xwt\") pod \"auto-csr-approver-29585678-tc75x\" (UID: \"1c4b5ea7-e9e1-45d9-80b7-e3a61b5464cd\") " pod="openshift-infra/auto-csr-approver-29585678-tc75x" Apr 02 14:38:00 crc kubenswrapper[4732]: I0402 14:38:00.463675 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585678-tc75x" Apr 02 14:38:00 crc kubenswrapper[4732]: I0402 14:38:00.711339 4732 generic.go:334] "Generic (PLEG): container finished" podID="3cebdcb3-464d-42bd-b673-8cc2942d2e85" containerID="0a77d8419fed9728356517b9ba3438ced4bd0fb7cf77cb2c8b7e2f083e703fd0" exitCode=0 Apr 02 14:38:00 crc kubenswrapper[4732]: I0402 14:38:00.711449 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4tnxp" event={"ID":"3cebdcb3-464d-42bd-b673-8cc2942d2e85","Type":"ContainerDied","Data":"0a77d8419fed9728356517b9ba3438ced4bd0fb7cf77cb2c8b7e2f083e703fd0"} Apr 02 14:38:00 crc kubenswrapper[4732]: I0402 14:38:00.886666 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585678-tc75x"] Apr 02 14:38:00 crc kubenswrapper[4732]: W0402 14:38:00.888154 4732 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c4b5ea7_e9e1_45d9_80b7_e3a61b5464cd.slice/crio-ba4d960fae246dafb03827949007fe294196918446e930bde18011bd5466001d WatchSource:0}: Error finding container ba4d960fae246dafb03827949007fe294196918446e930bde18011bd5466001d: Status 404 returned error can't find the container with id ba4d960fae246dafb03827949007fe294196918446e930bde18011bd5466001d Apr 02 14:38:01 crc kubenswrapper[4732]: I0402 14:38:01.722361 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585678-tc75x" event={"ID":"1c4b5ea7-e9e1-45d9-80b7-e3a61b5464cd","Type":"ContainerStarted","Data":"ba4d960fae246dafb03827949007fe294196918446e930bde18011bd5466001d"} Apr 02 14:38:01 crc kubenswrapper[4732]: I0402 14:38:01.727035 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4tnxp" event={"ID":"3cebdcb3-464d-42bd-b673-8cc2942d2e85","Type":"ContainerStarted","Data":"af0a5bb7dd127027db680914d0b2c10a4fd7f7eafdc1cbe1a883b522c9c4daf1"} Apr 02 14:38:01 crc kubenswrapper[4732]: I0402 14:38:01.751726 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4tnxp" podStartSLOduration=2.390848387 podStartE2EDuration="4.751707021s" podCreationTimestamp="2026-04-02 14:37:57 +0000 UTC" firstStartedPulling="2026-04-02 14:37:58.691814173 +0000 UTC m=+3635.596221746" lastFinishedPulling="2026-04-02 14:38:01.052672817 +0000 UTC m=+3637.957080380" observedRunningTime="2026-04-02 14:38:01.748727061 +0000 UTC m=+3638.653134654" watchObservedRunningTime="2026-04-02 14:38:01.751707021 +0000 UTC m=+3638.656114584" Apr 02 14:38:02 crc kubenswrapper[4732]: I0402 14:38:02.751560 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585678-tc75x" 
event={"ID":"1c4b5ea7-e9e1-45d9-80b7-e3a61b5464cd","Type":"ContainerStarted","Data":"fb2895b298118bda51efe5d7a874c2e09106874e5e025a83b0573f2fac1fb3d8"} Apr 02 14:38:02 crc kubenswrapper[4732]: I0402 14:38:02.767367 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29585678-tc75x" podStartSLOduration=1.408522396 podStartE2EDuration="2.767351281s" podCreationTimestamp="2026-04-02 14:38:00 +0000 UTC" firstStartedPulling="2026-04-02 14:38:00.890373716 +0000 UTC m=+3637.794781269" lastFinishedPulling="2026-04-02 14:38:02.249202591 +0000 UTC m=+3639.153610154" observedRunningTime="2026-04-02 14:38:02.764528624 +0000 UTC m=+3639.668936187" watchObservedRunningTime="2026-04-02 14:38:02.767351281 +0000 UTC m=+3639.671758834" Apr 02 14:38:03 crc kubenswrapper[4732]: I0402 14:38:03.764484 4732 generic.go:334] "Generic (PLEG): container finished" podID="1c4b5ea7-e9e1-45d9-80b7-e3a61b5464cd" containerID="fb2895b298118bda51efe5d7a874c2e09106874e5e025a83b0573f2fac1fb3d8" exitCode=0 Apr 02 14:38:03 crc kubenswrapper[4732]: I0402 14:38:03.764529 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585678-tc75x" event={"ID":"1c4b5ea7-e9e1-45d9-80b7-e3a61b5464cd","Type":"ContainerDied","Data":"fb2895b298118bda51efe5d7a874c2e09106874e5e025a83b0573f2fac1fb3d8"} Apr 02 14:38:05 crc kubenswrapper[4732]: I0402 14:38:05.111666 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29585678-tc75x" Apr 02 14:38:05 crc kubenswrapper[4732]: I0402 14:38:05.287289 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9xwt\" (UniqueName: \"kubernetes.io/projected/1c4b5ea7-e9e1-45d9-80b7-e3a61b5464cd-kube-api-access-c9xwt\") pod \"1c4b5ea7-e9e1-45d9-80b7-e3a61b5464cd\" (UID: \"1c4b5ea7-e9e1-45d9-80b7-e3a61b5464cd\") " Apr 02 14:38:05 crc kubenswrapper[4732]: I0402 14:38:05.294926 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c4b5ea7-e9e1-45d9-80b7-e3a61b5464cd-kube-api-access-c9xwt" (OuterVolumeSpecName: "kube-api-access-c9xwt") pod "1c4b5ea7-e9e1-45d9-80b7-e3a61b5464cd" (UID: "1c4b5ea7-e9e1-45d9-80b7-e3a61b5464cd"). InnerVolumeSpecName "kube-api-access-c9xwt". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:38:05 crc kubenswrapper[4732]: I0402 14:38:05.390301 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9xwt\" (UniqueName: \"kubernetes.io/projected/1c4b5ea7-e9e1-45d9-80b7-e3a61b5464cd-kube-api-access-c9xwt\") on node \"crc\" DevicePath \"\"" Apr 02 14:38:05 crc kubenswrapper[4732]: I0402 14:38:05.765904 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_59b48476de953ff34716fd3dc2a51be4f696c7acf616b0748be8aa4a1db4vvf_85e10d53-2bd5-4a87-8ec8-89a9ef13f766/util/0.log" Apr 02 14:38:05 crc kubenswrapper[4732]: I0402 14:38:05.782451 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585678-tc75x" event={"ID":"1c4b5ea7-e9e1-45d9-80b7-e3a61b5464cd","Type":"ContainerDied","Data":"ba4d960fae246dafb03827949007fe294196918446e930bde18011bd5466001d"} Apr 02 14:38:05 crc kubenswrapper[4732]: I0402 14:38:05.782764 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba4d960fae246dafb03827949007fe294196918446e930bde18011bd5466001d" Apr 02 
14:38:05 crc kubenswrapper[4732]: I0402 14:38:05.782532 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585678-tc75x"
Apr 02 14:38:05 crc kubenswrapper[4732]: I0402 14:38:05.832278 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29585672-z2vvj"]
Apr 02 14:38:05 crc kubenswrapper[4732]: I0402 14:38:05.846563 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29585672-z2vvj"]
Apr 02 14:38:05 crc kubenswrapper[4732]: I0402 14:38:05.959540 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_59b48476de953ff34716fd3dc2a51be4f696c7acf616b0748be8aa4a1db4vvf_85e10d53-2bd5-4a87-8ec8-89a9ef13f766/util/0.log"
Apr 02 14:38:05 crc kubenswrapper[4732]: I0402 14:38:05.961887 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_59b48476de953ff34716fd3dc2a51be4f696c7acf616b0748be8aa4a1db4vvf_85e10d53-2bd5-4a87-8ec8-89a9ef13f766/pull/0.log"
Apr 02 14:38:05 crc kubenswrapper[4732]: I0402 14:38:05.982013 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_59b48476de953ff34716fd3dc2a51be4f696c7acf616b0748be8aa4a1db4vvf_85e10d53-2bd5-4a87-8ec8-89a9ef13f766/pull/0.log"
Apr 02 14:38:06 crc kubenswrapper[4732]: I0402 14:38:06.188789 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_59b48476de953ff34716fd3dc2a51be4f696c7acf616b0748be8aa4a1db4vvf_85e10d53-2bd5-4a87-8ec8-89a9ef13f766/util/0.log"
Apr 02 14:38:06 crc kubenswrapper[4732]: I0402 14:38:06.265146 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_59b48476de953ff34716fd3dc2a51be4f696c7acf616b0748be8aa4a1db4vvf_85e10d53-2bd5-4a87-8ec8-89a9ef13f766/extract/0.log"
Apr 02 14:38:06 crc kubenswrapper[4732]: I0402 14:38:06.295667 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_59b48476de953ff34716fd3dc2a51be4f696c7acf616b0748be8aa4a1db4vvf_85e10d53-2bd5-4a87-8ec8-89a9ef13f766/pull/0.log"
Apr 02 14:38:06 crc kubenswrapper[4732]: I0402 14:38:06.579903 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-5d46cccfb9-65vqg_d925f7c0-af6d-49d5-a09f-82afb7c58a15/manager/0.log"
Apr 02 14:38:06 crc kubenswrapper[4732]: I0402 14:38:06.614928 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-86644c9c9c-nhxqn_08d5eea8-7c67-4aa1-ad91-ab1c60214872/manager/0.log"
Apr 02 14:38:06 crc kubenswrapper[4732]: I0402 14:38:06.655767 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-58689c6fff-47xk7_4c76cc17-ab86-4c9c-9438-7e72e2ce895f/manager/0.log"
Apr 02 14:38:06 crc kubenswrapper[4732]: I0402 14:38:06.695860 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16ee98f5-a231-400e-89c4-6f663c21da1c" path="/var/lib/kubelet/pods/16ee98f5-a231-400e-89c4-6f663c21da1c/volumes"
Apr 02 14:38:06 crc kubenswrapper[4732]: I0402 14:38:06.868181 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-648bdc7f99-vt6x9_5e74dfe1-0e0f-4b70-8b9a-db645eb40e05/manager/0.log"
Apr 02 14:38:06 crc kubenswrapper[4732]: I0402 14:38:06.933597 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-8684f86954-xgncs_12296214-f552-4868-8884-66c241eb973b/manager/0.log"
Apr 02 14:38:07 crc kubenswrapper[4732]: I0402 14:38:07.138333 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6ccfd84cb4-hv8p6_879197e5-dc13-4c17-b8ac-7e51a97aa0f2/manager/0.log"
Apr 02 14:38:07 crc kubenswrapper[4732]: I0402 14:38:07.302869 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5f96574b5-nbm76_e46394c5-fd9e-4c0d-8e78-96723f5931d9/manager/0.log"
Apr 02 14:38:07 crc kubenswrapper[4732]: I0402 14:38:07.450292 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-58f79b884c-5q7cz_43f86830-d407-4dc4-9b09-388fb5db82c8/manager/0.log"
Apr 02 14:38:07 crc kubenswrapper[4732]: I0402 14:38:07.483429 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-dbf8bb784-4kz5n_6b75349c-23b4-4dc0-914f-f1dc82b12e18/manager/0.log"
Apr 02 14:38:07 crc kubenswrapper[4732]: I0402 14:38:07.585181 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4tnxp"
Apr 02 14:38:07 crc kubenswrapper[4732]: I0402 14:38:07.585253 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4tnxp"
Apr 02 14:38:07 crc kubenswrapper[4732]: I0402 14:38:07.664096 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4tnxp"
Apr 02 14:38:07 crc kubenswrapper[4732]: I0402 14:38:07.717820 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6b7497dc59-ph5hk_084daf4c-82c9-42e7-8eb9-3ae4658c1742/manager/0.log"
Apr 02 14:38:07 crc kubenswrapper[4732]: I0402 14:38:07.846748 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4tnxp"
Apr 02 14:38:07 crc kubenswrapper[4732]: I0402 14:38:07.858316 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6554749d88-4cwml_338e9bfc-709f-49f2-8456-9dbe8b815382/manager/0.log"
Apr 02 14:38:07 crc kubenswrapper[4732]: I0402 14:38:07.918113 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-767865f676-z49v9_1c68b230-3f85-41f9-a6ed-7da1d0738748/manager/0.log"
Apr 02 14:38:08 crc kubenswrapper[4732]: I0402 14:38:08.132852 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5d6f9fd68c-mvbqt_b49d6074-a4b1-4658-b6b8-95bfe63163b0/manager/0.log"
Apr 02 14:38:08 crc kubenswrapper[4732]: I0402 14:38:08.175381 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7594f57946-2rvck_5319722c-7913-4dcd-a03d-dc7a5040b434/manager/0.log"
Apr 02 14:38:08 crc kubenswrapper[4732]: I0402 14:38:08.299994 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-b7b49d78f-c6jbh_d5a07520-1380-45b1-a00a-7148b158711e/manager/0.log"
Apr 02 14:38:08 crc kubenswrapper[4732]: I0402 14:38:08.456257 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-f786688f5-tv7s7_1e51ae33-6c4c-4e7f-8309-ef0c8901b6ed/operator/0.log"
Apr 02 14:38:08 crc kubenswrapper[4732]: I0402 14:38:08.577333 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-r7tzm_7dea8805-bd1c-400b-bddf-d3ac2cd57617/registry-server/0.log"
Apr 02 14:38:08 crc kubenswrapper[4732]: I0402 14:38:08.858083 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-84464c7c78-tgrmk_bec355a9-c60e-4480-a32c-f1a43ef27131/manager/0.log"
Apr 02 14:38:08 crc kubenswrapper[4732]: I0402 14:38:08.950335 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-559d8fdb6b-mfzk4_6296d461-d333-4b9c-a082-e48db64bdd96/manager/0.log"
Apr 02 14:38:09 crc kubenswrapper[4732]: I0402 14:38:09.241162 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-fbdcf7f7b-bw2df_5b424dbc-80ac-46ae-90d2-c69fdf4c14d7/manager/0.log"
Apr 02 14:38:09 crc kubenswrapper[4732]: I0402 14:38:09.399157 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-d6f76d4c7-nbxg9_426c551b-e661-40e0-9aa3-a83897ce2814/manager/0.log"
Apr 02 14:38:09 crc kubenswrapper[4732]: I0402 14:38:09.501809 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-56ccc97cf5-ztlzz_bd54902c-4922-4c49-85c1-280af54370ba/manager/0.log"
Apr 02 14:38:09 crc kubenswrapper[4732]: I0402 14:38:09.656128 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5985877f6-hxnth_e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f/manager/0.log"
Apr 02 14:38:09 crc kubenswrapper[4732]: I0402 14:38:09.732207 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-989fbd45-w2zrf_64892a56-9180-4d1d-ad33-d87caa5f2002/manager/0.log"
Apr 02 14:38:11 crc kubenswrapper[4732]: I0402 14:38:11.244836 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4tnxp"]
Apr 02 14:38:11 crc kubenswrapper[4732]: I0402 14:38:11.245407 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4tnxp" podUID="3cebdcb3-464d-42bd-b673-8cc2942d2e85" containerName="registry-server" containerID="cri-o://af0a5bb7dd127027db680914d0b2c10a4fd7f7eafdc1cbe1a883b522c9c4daf1" gracePeriod=2
Apr 02 14:38:11 crc kubenswrapper[4732]: I0402 14:38:11.733078 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4tnxp"
Apr 02 14:38:11 crc kubenswrapper[4732]: I0402 14:38:11.806697 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqg98\" (UniqueName: \"kubernetes.io/projected/3cebdcb3-464d-42bd-b673-8cc2942d2e85-kube-api-access-dqg98\") pod \"3cebdcb3-464d-42bd-b673-8cc2942d2e85\" (UID: \"3cebdcb3-464d-42bd-b673-8cc2942d2e85\") "
Apr 02 14:38:11 crc kubenswrapper[4732]: I0402 14:38:11.806771 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cebdcb3-464d-42bd-b673-8cc2942d2e85-catalog-content\") pod \"3cebdcb3-464d-42bd-b673-8cc2942d2e85\" (UID: \"3cebdcb3-464d-42bd-b673-8cc2942d2e85\") "
Apr 02 14:38:11 crc kubenswrapper[4732]: I0402 14:38:11.806841 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cebdcb3-464d-42bd-b673-8cc2942d2e85-utilities\") pod \"3cebdcb3-464d-42bd-b673-8cc2942d2e85\" (UID: \"3cebdcb3-464d-42bd-b673-8cc2942d2e85\") "
Apr 02 14:38:11 crc kubenswrapper[4732]: I0402 14:38:11.810479 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3cebdcb3-464d-42bd-b673-8cc2942d2e85-utilities" (OuterVolumeSpecName: "utilities") pod "3cebdcb3-464d-42bd-b673-8cc2942d2e85" (UID: "3cebdcb3-464d-42bd-b673-8cc2942d2e85"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 02 14:38:11 crc kubenswrapper[4732]: I0402 14:38:11.815271 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cebdcb3-464d-42bd-b673-8cc2942d2e85-kube-api-access-dqg98" (OuterVolumeSpecName: "kube-api-access-dqg98") pod "3cebdcb3-464d-42bd-b673-8cc2942d2e85" (UID: "3cebdcb3-464d-42bd-b673-8cc2942d2e85"). InnerVolumeSpecName "kube-api-access-dqg98". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 14:38:11 crc kubenswrapper[4732]: I0402 14:38:11.851538 4732 generic.go:334] "Generic (PLEG): container finished" podID="3cebdcb3-464d-42bd-b673-8cc2942d2e85" containerID="af0a5bb7dd127027db680914d0b2c10a4fd7f7eafdc1cbe1a883b522c9c4daf1" exitCode=0
Apr 02 14:38:11 crc kubenswrapper[4732]: I0402 14:38:11.851554 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4tnxp"
Apr 02 14:38:11 crc kubenswrapper[4732]: I0402 14:38:11.851586 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4tnxp" event={"ID":"3cebdcb3-464d-42bd-b673-8cc2942d2e85","Type":"ContainerDied","Data":"af0a5bb7dd127027db680914d0b2c10a4fd7f7eafdc1cbe1a883b522c9c4daf1"}
Apr 02 14:38:11 crc kubenswrapper[4732]: I0402 14:38:11.851637 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4tnxp" event={"ID":"3cebdcb3-464d-42bd-b673-8cc2942d2e85","Type":"ContainerDied","Data":"f65d9a5e52622ad142fb8ea86c42e873a1e1242925452d1b280fe080e30db28d"}
Apr 02 14:38:11 crc kubenswrapper[4732]: I0402 14:38:11.851658 4732 scope.go:117] "RemoveContainer" containerID="af0a5bb7dd127027db680914d0b2c10a4fd7f7eafdc1cbe1a883b522c9c4daf1"
Apr 02 14:38:11 crc kubenswrapper[4732]: I0402 14:38:11.887998 4732 scope.go:117] "RemoveContainer" containerID="0a77d8419fed9728356517b9ba3438ced4bd0fb7cf77cb2c8b7e2f083e703fd0"
Apr 02 14:38:11 crc kubenswrapper[4732]: I0402 14:38:11.909776 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqg98\" (UniqueName: \"kubernetes.io/projected/3cebdcb3-464d-42bd-b673-8cc2942d2e85-kube-api-access-dqg98\") on node \"crc\" DevicePath \"\""
Apr 02 14:38:11 crc kubenswrapper[4732]: I0402 14:38:11.909811 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cebdcb3-464d-42bd-b673-8cc2942d2e85-utilities\") on node \"crc\" DevicePath \"\""
Apr 02 14:38:11 crc kubenswrapper[4732]: I0402 14:38:11.913642 4732 scope.go:117] "RemoveContainer" containerID="abd975363dcc7e6494657e30d23def14f6c1876ad3a0bd416725589cf985322a"
Apr 02 14:38:11 crc kubenswrapper[4732]: I0402 14:38:11.915121 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3cebdcb3-464d-42bd-b673-8cc2942d2e85-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3cebdcb3-464d-42bd-b673-8cc2942d2e85" (UID: "3cebdcb3-464d-42bd-b673-8cc2942d2e85"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 02 14:38:11 crc kubenswrapper[4732]: I0402 14:38:11.968301 4732 scope.go:117] "RemoveContainer" containerID="af0a5bb7dd127027db680914d0b2c10a4fd7f7eafdc1cbe1a883b522c9c4daf1"
Apr 02 14:38:11 crc kubenswrapper[4732]: E0402 14:38:11.968744 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af0a5bb7dd127027db680914d0b2c10a4fd7f7eafdc1cbe1a883b522c9c4daf1\": container with ID starting with af0a5bb7dd127027db680914d0b2c10a4fd7f7eafdc1cbe1a883b522c9c4daf1 not found: ID does not exist" containerID="af0a5bb7dd127027db680914d0b2c10a4fd7f7eafdc1cbe1a883b522c9c4daf1"
Apr 02 14:38:11 crc kubenswrapper[4732]: I0402 14:38:11.968802 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af0a5bb7dd127027db680914d0b2c10a4fd7f7eafdc1cbe1a883b522c9c4daf1"} err="failed to get container status \"af0a5bb7dd127027db680914d0b2c10a4fd7f7eafdc1cbe1a883b522c9c4daf1\": rpc error: code = NotFound desc = could not find container \"af0a5bb7dd127027db680914d0b2c10a4fd7f7eafdc1cbe1a883b522c9c4daf1\": container with ID starting with af0a5bb7dd127027db680914d0b2c10a4fd7f7eafdc1cbe1a883b522c9c4daf1 not found: ID does not exist"
Apr 02 14:38:11 crc kubenswrapper[4732]: I0402 14:38:11.968836 4732 scope.go:117] "RemoveContainer" containerID="0a77d8419fed9728356517b9ba3438ced4bd0fb7cf77cb2c8b7e2f083e703fd0"
Apr 02 14:38:11 crc kubenswrapper[4732]: E0402 14:38:11.969114 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a77d8419fed9728356517b9ba3438ced4bd0fb7cf77cb2c8b7e2f083e703fd0\": container with ID starting with 0a77d8419fed9728356517b9ba3438ced4bd0fb7cf77cb2c8b7e2f083e703fd0 not found: ID does not exist" containerID="0a77d8419fed9728356517b9ba3438ced4bd0fb7cf77cb2c8b7e2f083e703fd0"
Apr 02 14:38:11 crc kubenswrapper[4732]: I0402 14:38:11.969145 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a77d8419fed9728356517b9ba3438ced4bd0fb7cf77cb2c8b7e2f083e703fd0"} err="failed to get container status \"0a77d8419fed9728356517b9ba3438ced4bd0fb7cf77cb2c8b7e2f083e703fd0\": rpc error: code = NotFound desc = could not find container \"0a77d8419fed9728356517b9ba3438ced4bd0fb7cf77cb2c8b7e2f083e703fd0\": container with ID starting with 0a77d8419fed9728356517b9ba3438ced4bd0fb7cf77cb2c8b7e2f083e703fd0 not found: ID does not exist"
Apr 02 14:38:11 crc kubenswrapper[4732]: I0402 14:38:11.969165 4732 scope.go:117] "RemoveContainer" containerID="abd975363dcc7e6494657e30d23def14f6c1876ad3a0bd416725589cf985322a"
Apr 02 14:38:11 crc kubenswrapper[4732]: E0402 14:38:11.969846 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abd975363dcc7e6494657e30d23def14f6c1876ad3a0bd416725589cf985322a\": container with ID starting with abd975363dcc7e6494657e30d23def14f6c1876ad3a0bd416725589cf985322a not found: ID does not exist" containerID="abd975363dcc7e6494657e30d23def14f6c1876ad3a0bd416725589cf985322a"
Apr 02 14:38:11 crc kubenswrapper[4732]: I0402 14:38:11.969872 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abd975363dcc7e6494657e30d23def14f6c1876ad3a0bd416725589cf985322a"} err="failed to get container status \"abd975363dcc7e6494657e30d23def14f6c1876ad3a0bd416725589cf985322a\": rpc error: code = NotFound desc = could not find container \"abd975363dcc7e6494657e30d23def14f6c1876ad3a0bd416725589cf985322a\": container with ID starting with abd975363dcc7e6494657e30d23def14f6c1876ad3a0bd416725589cf985322a not found: ID does not exist"
Apr 02 14:38:12 crc kubenswrapper[4732]: I0402 14:38:12.011450 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cebdcb3-464d-42bd-b673-8cc2942d2e85-catalog-content\") on node \"crc\" DevicePath \"\""
Apr 02 14:38:12 crc kubenswrapper[4732]: I0402 14:38:12.183107 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4tnxp"]
Apr 02 14:38:12 crc kubenswrapper[4732]: I0402 14:38:12.194638 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4tnxp"]
Apr 02 14:38:12 crc kubenswrapper[4732]: I0402 14:38:12.695244 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cebdcb3-464d-42bd-b673-8cc2942d2e85" path="/var/lib/kubelet/pods/3cebdcb3-464d-42bd-b673-8cc2942d2e85/volumes"
Apr 02 14:38:27 crc kubenswrapper[4732]: I0402 14:38:27.869571 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-szr89_1eee8837-1a56-40df-b564-bb65ad94d593/control-plane-machine-set-operator/0.log"
Apr 02 14:38:28 crc kubenswrapper[4732]: I0402 14:38:28.056095 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-kpg9f_3568fcc7-10bd-4972-9782-b97aa3c9c8a0/kube-rbac-proxy/0.log"
Apr 02 14:38:28 crc kubenswrapper[4732]: I0402 14:38:28.085458 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-kpg9f_3568fcc7-10bd-4972-9782-b97aa3c9c8a0/machine-api-operator/0.log"
Apr 02 14:38:39 crc kubenswrapper[4732]: I0402 14:38:39.836711 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-pnknx_fc67973f-1e3f-4aea-bf48-5914e7c8ddbb/cert-manager-controller/0.log"
Apr 02 14:38:40 crc kubenswrapper[4732]: I0402 14:38:40.024576 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-fhv6z_4bbf6f84-13bb-4562-af35-10ab372d6580/cert-manager-webhook/0.log"
Apr 02 14:38:40 crc kubenswrapper[4732]: I0402 14:38:40.030075 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-v9s5t_11b5be71-32b7-43a3-bf27-d6b1c73844c6/cert-manager-cainjector/0.log"
Apr 02 14:38:51 crc kubenswrapper[4732]: I0402 14:38:51.931513 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7b5ddc4dc7-rpk7t_eae3a07b-45a3-4f0e-8fd6-ac653ab24deb/nmstate-console-plugin/0.log"
Apr 02 14:38:52 crc kubenswrapper[4732]: I0402 14:38:52.073405 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-6mg2l_55a456a8-9ff7-4d10-a126-d662f361b74d/nmstate-handler/0.log"
Apr 02 14:38:52 crc kubenswrapper[4732]: I0402 14:38:52.162707 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-cq88s_fb4efdc1-4ea6-4068-bea0-8f961de0328b/kube-rbac-proxy/0.log"
Apr 02 14:38:52 crc kubenswrapper[4732]: I0402 14:38:52.209259 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-cq88s_fb4efdc1-4ea6-4068-bea0-8f961de0328b/nmstate-metrics/0.log"
Apr 02 14:38:52 crc kubenswrapper[4732]: I0402 14:38:52.314169 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-6b8c6447b-p77cg_59b4decd-a99c-4637-bc8e-2a95d017696d/nmstate-operator/0.log"
Apr 02 14:38:52 crc kubenswrapper[4732]: I0402 14:38:52.399978 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-pqqzz_2669e31a-19bb-42df-a5dd-5886b22e7674/nmstate-webhook/0.log"
Apr 02 14:38:52 crc kubenswrapper[4732]: I0402 14:38:52.820918 4732 scope.go:117] "RemoveContainer" containerID="e991865fb2a57a5eb8462921eda79aaad441dba5696c26ba39b12cd683b2ecc4"
Apr 02 14:39:01 crc kubenswrapper[4732]: I0402 14:39:01.925102 4732 patch_prober.go:28] interesting pod/machine-config-daemon-6vtmw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Apr 02 14:39:01 crc kubenswrapper[4732]: I0402 14:39:01.925753 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Apr 02 14:39:19 crc kubenswrapper[4732]: I0402 14:39:19.285901 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5bb64cd5d7-5795k_15b5f14a-3755-4967-b789-555f8ac970a2/kube-rbac-proxy/0.log"
Apr 02 14:39:19 crc kubenswrapper[4732]: I0402 14:39:19.443779 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5bb64cd5d7-5795k_15b5f14a-3755-4967-b789-555f8ac970a2/controller/0.log"
Apr 02 14:39:19 crc kubenswrapper[4732]: I0402 14:39:19.586213 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjkgp_268c93ae-fdc1-424e-806c-a4272b6e6ba0/cp-frr-files/0.log"
Apr 02 14:39:19 crc kubenswrapper[4732]: I0402 14:39:19.689068 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjkgp_268c93ae-fdc1-424e-806c-a4272b6e6ba0/cp-metrics/0.log"
Apr 02 14:39:19 crc kubenswrapper[4732]: I0402 14:39:19.690705 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjkgp_268c93ae-fdc1-424e-806c-a4272b6e6ba0/cp-frr-files/0.log"
Apr 02 14:39:19 crc kubenswrapper[4732]: I0402 14:39:19.710794 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjkgp_268c93ae-fdc1-424e-806c-a4272b6e6ba0/cp-reloader/0.log"
Apr 02 14:39:19 crc kubenswrapper[4732]: I0402 14:39:19.783423 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjkgp_268c93ae-fdc1-424e-806c-a4272b6e6ba0/cp-reloader/0.log"
Apr 02 14:39:19 crc kubenswrapper[4732]: I0402 14:39:19.916795 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjkgp_268c93ae-fdc1-424e-806c-a4272b6e6ba0/cp-frr-files/0.log"
Apr 02 14:39:19 crc kubenswrapper[4732]: I0402 14:39:19.963775 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjkgp_268c93ae-fdc1-424e-806c-a4272b6e6ba0/cp-metrics/0.log"
Apr 02 14:39:19 crc kubenswrapper[4732]: I0402 14:39:19.977810 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjkgp_268c93ae-fdc1-424e-806c-a4272b6e6ba0/cp-reloader/0.log"
Apr 02 14:39:19 crc kubenswrapper[4732]: I0402 14:39:19.994377 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjkgp_268c93ae-fdc1-424e-806c-a4272b6e6ba0/cp-metrics/0.log"
Apr 02 14:39:20 crc kubenswrapper[4732]: I0402 14:39:20.168900 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjkgp_268c93ae-fdc1-424e-806c-a4272b6e6ba0/cp-frr-files/0.log"
Apr 02 14:39:20 crc kubenswrapper[4732]: I0402 14:39:20.173456 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjkgp_268c93ae-fdc1-424e-806c-a4272b6e6ba0/cp-reloader/0.log"
Apr 02 14:39:20 crc kubenswrapper[4732]: I0402 14:39:20.201512 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjkgp_268c93ae-fdc1-424e-806c-a4272b6e6ba0/cp-metrics/0.log"
Apr 02 14:39:20 crc kubenswrapper[4732]: I0402 14:39:20.218903 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjkgp_268c93ae-fdc1-424e-806c-a4272b6e6ba0/controller/0.log"
Apr 02 14:39:20 crc kubenswrapper[4732]: I0402 14:39:20.372239 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjkgp_268c93ae-fdc1-424e-806c-a4272b6e6ba0/frr-metrics/0.log"
Apr 02 14:39:20 crc kubenswrapper[4732]: I0402 14:39:20.397567 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjkgp_268c93ae-fdc1-424e-806c-a4272b6e6ba0/kube-rbac-proxy/0.log"
Apr 02 14:39:20 crc kubenswrapper[4732]: I0402 14:39:20.401521 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjkgp_268c93ae-fdc1-424e-806c-a4272b6e6ba0/kube-rbac-proxy-frr/0.log"
Apr 02 14:39:20 crc kubenswrapper[4732]: I0402 14:39:20.571256 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjkgp_268c93ae-fdc1-424e-806c-a4272b6e6ba0/reloader/0.log"
Apr 02 14:39:20 crc kubenswrapper[4732]: I0402 14:39:20.604716 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-rrzxj_777a90c8-0e68-4362-a696-c92e0a49253f/frr-k8s-webhook-server/0.log"
Apr 02 14:39:20 crc kubenswrapper[4732]: I0402 14:39:20.845106 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-86c87c56d7-qlfzr_bd386538-6696-4b2c-96e4-6f8e4b949364/manager/0.log"
Apr 02 14:39:20 crc kubenswrapper[4732]: I0402 14:39:20.973116 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-6948d8cf8d-vd8rt_329919c3-94d2-43c2-94a8-2ba9518b98fa/webhook-server/0.log"
Apr 02 14:39:21 crc kubenswrapper[4732]: I0402 14:39:21.116828 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-ng8gx_c89914a1-adc8-4baa-ae73-ec02091fca58/kube-rbac-proxy/0.log"
Apr 02 14:39:21 crc kubenswrapper[4732]: I0402 14:39:21.617065 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-ng8gx_c89914a1-adc8-4baa-ae73-ec02091fca58/speaker/0.log"
Apr 02 14:39:21 crc kubenswrapper[4732]: I0402 14:39:21.904351 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjkgp_268c93ae-fdc1-424e-806c-a4272b6e6ba0/frr/0.log"
Apr 02 14:39:31 crc kubenswrapper[4732]: I0402 14:39:31.924693 4732 patch_prober.go:28] interesting pod/machine-config-daemon-6vtmw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Apr 02 14:39:31 crc kubenswrapper[4732]: I0402 14:39:31.925197 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Apr 02 14:39:34 crc kubenswrapper[4732]: I0402 14:39:34.214374 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3eflcgrx_77a3925c-07b0-47ea-950f-524ce995edbf/util/0.log"
Apr 02 14:39:34 crc kubenswrapper[4732]: I0402 14:39:34.353665 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3eflcgrx_77a3925c-07b0-47ea-950f-524ce995edbf/util/0.log"
Apr 02 14:39:34 crc kubenswrapper[4732]: I0402 14:39:34.411767 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3eflcgrx_77a3925c-07b0-47ea-950f-524ce995edbf/pull/0.log"
Apr 02 14:39:34 crc kubenswrapper[4732]: I0402 14:39:34.470522 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3eflcgrx_77a3925c-07b0-47ea-950f-524ce995edbf/pull/0.log"
Apr 02 14:39:34 crc kubenswrapper[4732]: I0402 14:39:34.649509 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3eflcgrx_77a3925c-07b0-47ea-950f-524ce995edbf/util/0.log"
Apr 02 14:39:34 crc kubenswrapper[4732]: I0402 14:39:34.655293 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3eflcgrx_77a3925c-07b0-47ea-950f-524ce995edbf/pull/0.log"
Apr 02 14:39:34 crc kubenswrapper[4732]: I0402 14:39:34.664282 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3eflcgrx_77a3925c-07b0-47ea-950f-524ce995edbf/extract/0.log"
Apr 02 14:39:34 crc kubenswrapper[4732]: I0402 14:39:34.810016 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645jjqvg_6c632108-b88a-4b6c-9368-49aacc9c04ec/util/0.log"
Apr 02 14:39:35 crc kubenswrapper[4732]: I0402 14:39:35.124708 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645jjqvg_6c632108-b88a-4b6c-9368-49aacc9c04ec/pull/0.log"
Apr 02 14:39:35 crc kubenswrapper[4732]: I0402 14:39:35.141139 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645jjqvg_6c632108-b88a-4b6c-9368-49aacc9c04ec/pull/0.log"
Apr 02 14:39:35 crc kubenswrapper[4732]: I0402 14:39:35.146861 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645jjqvg_6c632108-b88a-4b6c-9368-49aacc9c04ec/util/0.log"
Apr 02 14:39:35 crc kubenswrapper[4732]: I0402 14:39:35.349852 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645jjqvg_6c632108-b88a-4b6c-9368-49aacc9c04ec/util/0.log"
Apr 02 14:39:35 crc kubenswrapper[4732]: I0402 14:39:35.352920 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645jjqvg_6c632108-b88a-4b6c-9368-49aacc9c04ec/extract/0.log"
Apr 02 14:39:35 crc kubenswrapper[4732]: I0402 14:39:35.353703 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645jjqvg_6c632108-b88a-4b6c-9368-49aacc9c04ec/pull/0.log"
Apr 02 14:39:35 crc kubenswrapper[4732]: I0402 14:39:35.526521 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6265v_5d2e3468-f731-45d7-bc5f-5c32a739a196/extract-utilities/0.log"
Apr 02 14:39:35 crc kubenswrapper[4732]: I0402 14:39:35.691869 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6265v_5d2e3468-f731-45d7-bc5f-5c32a739a196/extract-content/0.log"
Apr 02 14:39:35 crc kubenswrapper[4732]: I0402 14:39:35.726596 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6265v_5d2e3468-f731-45d7-bc5f-5c32a739a196/extract-content/0.log"
Apr 02 14:39:35 crc kubenswrapper[4732]: I0402 14:39:35.738330 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6265v_5d2e3468-f731-45d7-bc5f-5c32a739a196/extract-utilities/0.log"
Apr 02 14:39:35 crc kubenswrapper[4732]: I0402 14:39:35.915071 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6265v_5d2e3468-f731-45d7-bc5f-5c32a739a196/extract-utilities/0.log"
Apr 02 14:39:35 crc kubenswrapper[4732]: I0402 14:39:35.931247 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6265v_5d2e3468-f731-45d7-bc5f-5c32a739a196/extract-content/0.log"
Apr 02 14:39:36 crc kubenswrapper[4732]: I0402 14:39:36.123808 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2mvbv_7eb89e74-cc6f-4763-a2ed-e4a3e8c2f89b/extract-utilities/0.log"
Apr 02 14:39:36 crc kubenswrapper[4732]: I0402 14:39:36.381769 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2mvbv_7eb89e74-cc6f-4763-a2ed-e4a3e8c2f89b/extract-content/0.log"
Apr 02 14:39:36 crc kubenswrapper[4732]: I0402 14:39:36.437947 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2mvbv_7eb89e74-cc6f-4763-a2ed-e4a3e8c2f89b/extract-utilities/0.log"
Apr 02 14:39:36 crc kubenswrapper[4732]: I0402 14:39:36.440106 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2mvbv_7eb89e74-cc6f-4763-a2ed-e4a3e8c2f89b/extract-content/0.log"
Apr 02 14:39:36 crc kubenswrapper[4732]: I0402 14:39:36.550009 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6265v_5d2e3468-f731-45d7-bc5f-5c32a739a196/registry-server/0.log"
Apr 02 14:39:36 crc kubenswrapper[4732]: I0402 14:39:36.686917 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2mvbv_7eb89e74-cc6f-4763-a2ed-e4a3e8c2f89b/extract-utilities/0.log"
Apr 02 14:39:36 crc kubenswrapper[4732]: I0402 14:39:36.731590 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2mvbv_7eb89e74-cc6f-4763-a2ed-e4a3e8c2f89b/extract-content/0.log"
Apr 02 14:39:36 crc kubenswrapper[4732]: I0402 14:39:36.973814 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-w7mzq_c2c9f0ff-65e0-4d5f-8518-4461263be6c2/marketplace-operator/0.log"
Apr 02 14:39:37 crc kubenswrapper[4732]: I0402 14:39:37.116011 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8jlqh_76b4f705-b92d-4b11-8ddf-a4252a98c37d/extract-utilities/0.log"
Apr 02 14:39:37 crc kubenswrapper[4732]: I0402 14:39:37.289304 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8jlqh_76b4f705-b92d-4b11-8ddf-a4252a98c37d/extract-utilities/0.log"
Apr 02 14:39:37 crc kubenswrapper[4732]: I0402 14:39:37.324804 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2mvbv_7eb89e74-cc6f-4763-a2ed-e4a3e8c2f89b/registry-server/0.log"
Apr 02 14:39:37 crc kubenswrapper[4732]: I0402 14:39:37.407225 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8jlqh_76b4f705-b92d-4b11-8ddf-a4252a98c37d/extract-content/0.log"
Apr 02 14:39:37 crc kubenswrapper[4732]: I0402 14:39:37.414555 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8jlqh_76b4f705-b92d-4b11-8ddf-a4252a98c37d/extract-content/0.log"
Apr 02 14:39:37 crc kubenswrapper[4732]: I0402 14:39:37.620093 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8jlqh_76b4f705-b92d-4b11-8ddf-a4252a98c37d/extract-utilities/0.log"
Apr 02 14:39:37 crc kubenswrapper[4732]: I0402 14:39:37.626227 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8jlqh_76b4f705-b92d-4b11-8ddf-a4252a98c37d/extract-content/0.log"
Apr 02 14:39:37 crc kubenswrapper[4732]: I0402 14:39:37.794952 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8jlqh_76b4f705-b92d-4b11-8ddf-a4252a98c37d/registry-server/0.log"
Apr 02 14:39:37 crc kubenswrapper[4732]: I0402 14:39:37.828862 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6c8xq_a9edc76f-550a-4fb7-b8e5-24a34beb38f8/extract-utilities/0.log"
Apr 02 14:39:37 crc kubenswrapper[4732]: I0402 14:39:37.981515 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6c8xq_a9edc76f-550a-4fb7-b8e5-24a34beb38f8/extract-content/0.log"
Apr 02 14:39:37 crc kubenswrapper[4732]: I0402 14:39:37.990920 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6c8xq_a9edc76f-550a-4fb7-b8e5-24a34beb38f8/extract-content/0.log"
Apr 02 14:39:38 crc kubenswrapper[4732]: I0402 14:39:38.031424 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6c8xq_a9edc76f-550a-4fb7-b8e5-24a34beb38f8/extract-utilities/0.log"
Apr 02 14:39:38 crc kubenswrapper[4732]: I0402 14:39:38.221269 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6c8xq_a9edc76f-550a-4fb7-b8e5-24a34beb38f8/extract-content/0.log"
Apr 
02 14:39:38 crc kubenswrapper[4732]: I0402 14:39:38.234467 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6c8xq_a9edc76f-550a-4fb7-b8e5-24a34beb38f8/extract-utilities/0.log" Apr 02 14:39:38 crc kubenswrapper[4732]: I0402 14:39:38.706783 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6c8xq_a9edc76f-550a-4fb7-b8e5-24a34beb38f8/registry-server/0.log" Apr 02 14:39:54 crc kubenswrapper[4732]: E0402 14:39:54.959847 4732 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.138:53070->38.102.83.138:34999: read tcp 38.102.83.138:53070->38.102.83.138:34999: read: connection reset by peer Apr 02 14:40:00 crc kubenswrapper[4732]: I0402 14:40:00.178999 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29585680-mnl44"] Apr 02 14:40:00 crc kubenswrapper[4732]: E0402 14:40:00.180769 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c4b5ea7-e9e1-45d9-80b7-e3a61b5464cd" containerName="oc" Apr 02 14:40:00 crc kubenswrapper[4732]: I0402 14:40:00.180816 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c4b5ea7-e9e1-45d9-80b7-e3a61b5464cd" containerName="oc" Apr 02 14:40:00 crc kubenswrapper[4732]: E0402 14:40:00.180840 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cebdcb3-464d-42bd-b673-8cc2942d2e85" containerName="extract-content" Apr 02 14:40:00 crc kubenswrapper[4732]: I0402 14:40:00.180849 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cebdcb3-464d-42bd-b673-8cc2942d2e85" containerName="extract-content" Apr 02 14:40:00 crc kubenswrapper[4732]: E0402 14:40:00.180883 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cebdcb3-464d-42bd-b673-8cc2942d2e85" containerName="registry-server" Apr 02 14:40:00 crc kubenswrapper[4732]: I0402 14:40:00.180926 4732 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="3cebdcb3-464d-42bd-b673-8cc2942d2e85" containerName="registry-server" Apr 02 14:40:00 crc kubenswrapper[4732]: E0402 14:40:00.180976 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cebdcb3-464d-42bd-b673-8cc2942d2e85" containerName="extract-utilities" Apr 02 14:40:00 crc kubenswrapper[4732]: I0402 14:40:00.180988 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cebdcb3-464d-42bd-b673-8cc2942d2e85" containerName="extract-utilities" Apr 02 14:40:00 crc kubenswrapper[4732]: I0402 14:40:00.182823 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cebdcb3-464d-42bd-b673-8cc2942d2e85" containerName="registry-server" Apr 02 14:40:00 crc kubenswrapper[4732]: I0402 14:40:00.182873 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c4b5ea7-e9e1-45d9-80b7-e3a61b5464cd" containerName="oc" Apr 02 14:40:00 crc kubenswrapper[4732]: I0402 14:40:00.186045 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585680-mnl44" Apr 02 14:40:00 crc kubenswrapper[4732]: I0402 14:40:00.189558 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-69g42" Apr 02 14:40:00 crc kubenswrapper[4732]: I0402 14:40:00.190170 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 02 14:40:00 crc kubenswrapper[4732]: I0402 14:40:00.199209 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 02 14:40:00 crc kubenswrapper[4732]: I0402 14:40:00.217228 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585680-mnl44"] Apr 02 14:40:00 crc kubenswrapper[4732]: I0402 14:40:00.238987 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cg2hm\" (UniqueName: 
\"kubernetes.io/projected/429f68bb-47d9-4185-8dcb-26fb3ad3a20c-kube-api-access-cg2hm\") pod \"auto-csr-approver-29585680-mnl44\" (UID: \"429f68bb-47d9-4185-8dcb-26fb3ad3a20c\") " pod="openshift-infra/auto-csr-approver-29585680-mnl44" Apr 02 14:40:00 crc kubenswrapper[4732]: I0402 14:40:00.343203 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cg2hm\" (UniqueName: \"kubernetes.io/projected/429f68bb-47d9-4185-8dcb-26fb3ad3a20c-kube-api-access-cg2hm\") pod \"auto-csr-approver-29585680-mnl44\" (UID: \"429f68bb-47d9-4185-8dcb-26fb3ad3a20c\") " pod="openshift-infra/auto-csr-approver-29585680-mnl44" Apr 02 14:40:00 crc kubenswrapper[4732]: I0402 14:40:00.364187 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cg2hm\" (UniqueName: \"kubernetes.io/projected/429f68bb-47d9-4185-8dcb-26fb3ad3a20c-kube-api-access-cg2hm\") pod \"auto-csr-approver-29585680-mnl44\" (UID: \"429f68bb-47d9-4185-8dcb-26fb3ad3a20c\") " pod="openshift-infra/auto-csr-approver-29585680-mnl44" Apr 02 14:40:00 crc kubenswrapper[4732]: I0402 14:40:00.549593 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29585680-mnl44" Apr 02 14:40:00 crc kubenswrapper[4732]: I0402 14:40:00.865073 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585680-mnl44"] Apr 02 14:40:00 crc kubenswrapper[4732]: I0402 14:40:00.895103 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585680-mnl44" event={"ID":"429f68bb-47d9-4185-8dcb-26fb3ad3a20c","Type":"ContainerStarted","Data":"47bcbf53d0a75d499c4dbc1eecbcffbdda9c5a198d671c8066753beef19db399"} Apr 02 14:40:01 crc kubenswrapper[4732]: I0402 14:40:01.924420 4732 patch_prober.go:28] interesting pod/machine-config-daemon-6vtmw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 02 14:40:01 crc kubenswrapper[4732]: I0402 14:40:01.924815 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 02 14:40:01 crc kubenswrapper[4732]: I0402 14:40:01.924872 4732 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" Apr 02 14:40:01 crc kubenswrapper[4732]: I0402 14:40:01.925737 4732 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e40e82e6d4518aba30c53d5cb0838adda1128bec7118e470ccaa5d5e5e745a10"} pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Apr 02 14:40:01 crc 
kubenswrapper[4732]: I0402 14:40:01.925807 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" containerName="machine-config-daemon" containerID="cri-o://e40e82e6d4518aba30c53d5cb0838adda1128bec7118e470ccaa5d5e5e745a10" gracePeriod=600 Apr 02 14:40:01 crc kubenswrapper[4732]: E0402 14:40:01.946983 4732 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.138:38856->38.102.83.138:34999: write tcp 38.102.83.138:38856->38.102.83.138:34999: write: broken pipe Apr 02 14:40:02 crc kubenswrapper[4732]: I0402 14:40:02.916709 4732 generic.go:334] "Generic (PLEG): container finished" podID="429f68bb-47d9-4185-8dcb-26fb3ad3a20c" containerID="c7e7e57d13c9db90f242c9b4a24badf33e93df956cf2c6df7fde03bab03e2b9d" exitCode=0 Apr 02 14:40:02 crc kubenswrapper[4732]: I0402 14:40:02.917238 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585680-mnl44" event={"ID":"429f68bb-47d9-4185-8dcb-26fb3ad3a20c","Type":"ContainerDied","Data":"c7e7e57d13c9db90f242c9b4a24badf33e93df956cf2c6df7fde03bab03e2b9d"} Apr 02 14:40:02 crc kubenswrapper[4732]: I0402 14:40:02.920228 4732 generic.go:334] "Generic (PLEG): container finished" podID="38409e5e-4545-49da-8f6c-4bfb30582878" containerID="e40e82e6d4518aba30c53d5cb0838adda1128bec7118e470ccaa5d5e5e745a10" exitCode=0 Apr 02 14:40:02 crc kubenswrapper[4732]: I0402 14:40:02.920258 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" event={"ID":"38409e5e-4545-49da-8f6c-4bfb30582878","Type":"ContainerDied","Data":"e40e82e6d4518aba30c53d5cb0838adda1128bec7118e470ccaa5d5e5e745a10"} Apr 02 14:40:02 crc kubenswrapper[4732]: I0402 14:40:02.920277 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" 
event={"ID":"38409e5e-4545-49da-8f6c-4bfb30582878","Type":"ContainerStarted","Data":"4cc9425b6dde0d264f8b72482c5821ad9fabf57354100ab51583dde35da85ea0"} Apr 02 14:40:02 crc kubenswrapper[4732]: I0402 14:40:02.920294 4732 scope.go:117] "RemoveContainer" containerID="709a640877bb61587cf466971cf30393cd3affce375c74cd38fe9de80c188cd2" Apr 02 14:40:04 crc kubenswrapper[4732]: I0402 14:40:04.324481 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585680-mnl44" Apr 02 14:40:04 crc kubenswrapper[4732]: I0402 14:40:04.422048 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cg2hm\" (UniqueName: \"kubernetes.io/projected/429f68bb-47d9-4185-8dcb-26fb3ad3a20c-kube-api-access-cg2hm\") pod \"429f68bb-47d9-4185-8dcb-26fb3ad3a20c\" (UID: \"429f68bb-47d9-4185-8dcb-26fb3ad3a20c\") " Apr 02 14:40:04 crc kubenswrapper[4732]: I0402 14:40:04.430397 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/429f68bb-47d9-4185-8dcb-26fb3ad3a20c-kube-api-access-cg2hm" (OuterVolumeSpecName: "kube-api-access-cg2hm") pod "429f68bb-47d9-4185-8dcb-26fb3ad3a20c" (UID: "429f68bb-47d9-4185-8dcb-26fb3ad3a20c"). InnerVolumeSpecName "kube-api-access-cg2hm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:40:04 crc kubenswrapper[4732]: I0402 14:40:04.524481 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cg2hm\" (UniqueName: \"kubernetes.io/projected/429f68bb-47d9-4185-8dcb-26fb3ad3a20c-kube-api-access-cg2hm\") on node \"crc\" DevicePath \"\"" Apr 02 14:40:04 crc kubenswrapper[4732]: I0402 14:40:04.941601 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585680-mnl44" event={"ID":"429f68bb-47d9-4185-8dcb-26fb3ad3a20c","Type":"ContainerDied","Data":"47bcbf53d0a75d499c4dbc1eecbcffbdda9c5a198d671c8066753beef19db399"} Apr 02 14:40:04 crc kubenswrapper[4732]: I0402 14:40:04.941836 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="47bcbf53d0a75d499c4dbc1eecbcffbdda9c5a198d671c8066753beef19db399" Apr 02 14:40:04 crc kubenswrapper[4732]: I0402 14:40:04.941889 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585680-mnl44" Apr 02 14:40:05 crc kubenswrapper[4732]: I0402 14:40:05.401650 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29585674-jn5d6"] Apr 02 14:40:05 crc kubenswrapper[4732]: I0402 14:40:05.411936 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29585674-jn5d6"] Apr 02 14:40:06 crc kubenswrapper[4732]: I0402 14:40:06.695002 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cfcb44-41f6-400e-b7b5-a881187eb46f" path="/var/lib/kubelet/pods/b6cfcb44-41f6-400e-b7b5-a881187eb46f/volumes" Apr 02 14:40:35 crc kubenswrapper[4732]: I0402 14:40:35.095308 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-glbf9"] Apr 02 14:40:35 crc kubenswrapper[4732]: E0402 14:40:35.096411 4732 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="429f68bb-47d9-4185-8dcb-26fb3ad3a20c" containerName="oc" Apr 02 14:40:35 crc kubenswrapper[4732]: I0402 14:40:35.096426 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="429f68bb-47d9-4185-8dcb-26fb3ad3a20c" containerName="oc" Apr 02 14:40:35 crc kubenswrapper[4732]: I0402 14:40:35.096701 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="429f68bb-47d9-4185-8dcb-26fb3ad3a20c" containerName="oc" Apr 02 14:40:35 crc kubenswrapper[4732]: I0402 14:40:35.098304 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-glbf9" Apr 02 14:40:35 crc kubenswrapper[4732]: I0402 14:40:35.118380 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-glbf9"] Apr 02 14:40:35 crc kubenswrapper[4732]: I0402 14:40:35.215560 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cq82q\" (UniqueName: \"kubernetes.io/projected/2a02b91f-2c9f-40ce-abbb-9bfb800cf9cd-kube-api-access-cq82q\") pod \"certified-operators-glbf9\" (UID: \"2a02b91f-2c9f-40ce-abbb-9bfb800cf9cd\") " pod="openshift-marketplace/certified-operators-glbf9" Apr 02 14:40:35 crc kubenswrapper[4732]: I0402 14:40:35.215633 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a02b91f-2c9f-40ce-abbb-9bfb800cf9cd-catalog-content\") pod \"certified-operators-glbf9\" (UID: \"2a02b91f-2c9f-40ce-abbb-9bfb800cf9cd\") " pod="openshift-marketplace/certified-operators-glbf9" Apr 02 14:40:35 crc kubenswrapper[4732]: I0402 14:40:35.215664 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a02b91f-2c9f-40ce-abbb-9bfb800cf9cd-utilities\") pod \"certified-operators-glbf9\" (UID: \"2a02b91f-2c9f-40ce-abbb-9bfb800cf9cd\") " 
pod="openshift-marketplace/certified-operators-glbf9" Apr 02 14:40:35 crc kubenswrapper[4732]: I0402 14:40:35.317552 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cq82q\" (UniqueName: \"kubernetes.io/projected/2a02b91f-2c9f-40ce-abbb-9bfb800cf9cd-kube-api-access-cq82q\") pod \"certified-operators-glbf9\" (UID: \"2a02b91f-2c9f-40ce-abbb-9bfb800cf9cd\") " pod="openshift-marketplace/certified-operators-glbf9" Apr 02 14:40:35 crc kubenswrapper[4732]: I0402 14:40:35.317599 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a02b91f-2c9f-40ce-abbb-9bfb800cf9cd-catalog-content\") pod \"certified-operators-glbf9\" (UID: \"2a02b91f-2c9f-40ce-abbb-9bfb800cf9cd\") " pod="openshift-marketplace/certified-operators-glbf9" Apr 02 14:40:35 crc kubenswrapper[4732]: I0402 14:40:35.317634 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a02b91f-2c9f-40ce-abbb-9bfb800cf9cd-utilities\") pod \"certified-operators-glbf9\" (UID: \"2a02b91f-2c9f-40ce-abbb-9bfb800cf9cd\") " pod="openshift-marketplace/certified-operators-glbf9" Apr 02 14:40:35 crc kubenswrapper[4732]: I0402 14:40:35.318144 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a02b91f-2c9f-40ce-abbb-9bfb800cf9cd-catalog-content\") pod \"certified-operators-glbf9\" (UID: \"2a02b91f-2c9f-40ce-abbb-9bfb800cf9cd\") " pod="openshift-marketplace/certified-operators-glbf9" Apr 02 14:40:35 crc kubenswrapper[4732]: I0402 14:40:35.318163 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a02b91f-2c9f-40ce-abbb-9bfb800cf9cd-utilities\") pod \"certified-operators-glbf9\" (UID: \"2a02b91f-2c9f-40ce-abbb-9bfb800cf9cd\") " 
pod="openshift-marketplace/certified-operators-glbf9" Apr 02 14:40:35 crc kubenswrapper[4732]: I0402 14:40:35.337437 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cq82q\" (UniqueName: \"kubernetes.io/projected/2a02b91f-2c9f-40ce-abbb-9bfb800cf9cd-kube-api-access-cq82q\") pod \"certified-operators-glbf9\" (UID: \"2a02b91f-2c9f-40ce-abbb-9bfb800cf9cd\") " pod="openshift-marketplace/certified-operators-glbf9" Apr 02 14:40:35 crc kubenswrapper[4732]: I0402 14:40:35.418221 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-glbf9" Apr 02 14:40:35 crc kubenswrapper[4732]: I0402 14:40:35.778528 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-glbf9"] Apr 02 14:40:35 crc kubenswrapper[4732]: W0402 14:40:35.778924 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a02b91f_2c9f_40ce_abbb_9bfb800cf9cd.slice/crio-40f30817aa8e038be419a553266f8b269c1be45d7b2eb1e382af3f318cfe6d6a WatchSource:0}: Error finding container 40f30817aa8e038be419a553266f8b269c1be45d7b2eb1e382af3f318cfe6d6a: Status 404 returned error can't find the container with id 40f30817aa8e038be419a553266f8b269c1be45d7b2eb1e382af3f318cfe6d6a Apr 02 14:40:36 crc kubenswrapper[4732]: I0402 14:40:36.272400 4732 generic.go:334] "Generic (PLEG): container finished" podID="2a02b91f-2c9f-40ce-abbb-9bfb800cf9cd" containerID="ebe55c9bbfbdeb6e79321ad0521df475e61ab2de742502939c316039d2d055d6" exitCode=0 Apr 02 14:40:36 crc kubenswrapper[4732]: I0402 14:40:36.272438 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-glbf9" event={"ID":"2a02b91f-2c9f-40ce-abbb-9bfb800cf9cd","Type":"ContainerDied","Data":"ebe55c9bbfbdeb6e79321ad0521df475e61ab2de742502939c316039d2d055d6"} Apr 02 14:40:36 crc kubenswrapper[4732]: I0402 14:40:36.272464 
4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-glbf9" event={"ID":"2a02b91f-2c9f-40ce-abbb-9bfb800cf9cd","Type":"ContainerStarted","Data":"40f30817aa8e038be419a553266f8b269c1be45d7b2eb1e382af3f318cfe6d6a"} Apr 02 14:40:36 crc kubenswrapper[4732]: I0402 14:40:36.274890 4732 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 02 14:40:37 crc kubenswrapper[4732]: I0402 14:40:37.291417 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-glbf9" event={"ID":"2a02b91f-2c9f-40ce-abbb-9bfb800cf9cd","Type":"ContainerStarted","Data":"0c27cce8963c020dd9f59f0fdb19a7928a5b7faa43fa8a012cc94ffafcbafc76"} Apr 02 14:40:38 crc kubenswrapper[4732]: I0402 14:40:38.301076 4732 generic.go:334] "Generic (PLEG): container finished" podID="2a02b91f-2c9f-40ce-abbb-9bfb800cf9cd" containerID="0c27cce8963c020dd9f59f0fdb19a7928a5b7faa43fa8a012cc94ffafcbafc76" exitCode=0 Apr 02 14:40:38 crc kubenswrapper[4732]: I0402 14:40:38.301751 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-glbf9" event={"ID":"2a02b91f-2c9f-40ce-abbb-9bfb800cf9cd","Type":"ContainerDied","Data":"0c27cce8963c020dd9f59f0fdb19a7928a5b7faa43fa8a012cc94ffafcbafc76"} Apr 02 14:40:39 crc kubenswrapper[4732]: I0402 14:40:39.314811 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-glbf9" event={"ID":"2a02b91f-2c9f-40ce-abbb-9bfb800cf9cd","Type":"ContainerStarted","Data":"690d6f777ef42493bddd3fcf2e8bf0614e2769de411c4851b383021844f7fcc8"} Apr 02 14:40:39 crc kubenswrapper[4732]: I0402 14:40:39.340450 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-glbf9" podStartSLOduration=1.9411787459999998 podStartE2EDuration="4.340427171s" podCreationTimestamp="2026-04-02 14:40:35 +0000 UTC" firstStartedPulling="2026-04-02 
14:40:36.274572351 +0000 UTC m=+3793.178979914" lastFinishedPulling="2026-04-02 14:40:38.673820786 +0000 UTC m=+3795.578228339" observedRunningTime="2026-04-02 14:40:39.333850552 +0000 UTC m=+3796.238258135" watchObservedRunningTime="2026-04-02 14:40:39.340427171 +0000 UTC m=+3796.244834734" Apr 02 14:40:45 crc kubenswrapper[4732]: I0402 14:40:45.418844 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-glbf9" Apr 02 14:40:45 crc kubenswrapper[4732]: I0402 14:40:45.419216 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-glbf9" Apr 02 14:40:45 crc kubenswrapper[4732]: I0402 14:40:45.500297 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-glbf9" Apr 02 14:40:46 crc kubenswrapper[4732]: I0402 14:40:46.439257 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-glbf9" Apr 02 14:40:46 crc kubenswrapper[4732]: I0402 14:40:46.507474 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-glbf9"] Apr 02 14:40:48 crc kubenswrapper[4732]: I0402 14:40:48.393705 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-glbf9" podUID="2a02b91f-2c9f-40ce-abbb-9bfb800cf9cd" containerName="registry-server" containerID="cri-o://690d6f777ef42493bddd3fcf2e8bf0614e2769de411c4851b383021844f7fcc8" gracePeriod=2 Apr 02 14:40:48 crc kubenswrapper[4732]: I0402 14:40:48.968645 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-glbf9" Apr 02 14:40:49 crc kubenswrapper[4732]: I0402 14:40:49.087696 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cq82q\" (UniqueName: \"kubernetes.io/projected/2a02b91f-2c9f-40ce-abbb-9bfb800cf9cd-kube-api-access-cq82q\") pod \"2a02b91f-2c9f-40ce-abbb-9bfb800cf9cd\" (UID: \"2a02b91f-2c9f-40ce-abbb-9bfb800cf9cd\") " Apr 02 14:40:49 crc kubenswrapper[4732]: I0402 14:40:49.087791 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a02b91f-2c9f-40ce-abbb-9bfb800cf9cd-utilities\") pod \"2a02b91f-2c9f-40ce-abbb-9bfb800cf9cd\" (UID: \"2a02b91f-2c9f-40ce-abbb-9bfb800cf9cd\") " Apr 02 14:40:49 crc kubenswrapper[4732]: I0402 14:40:49.088005 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a02b91f-2c9f-40ce-abbb-9bfb800cf9cd-catalog-content\") pod \"2a02b91f-2c9f-40ce-abbb-9bfb800cf9cd\" (UID: \"2a02b91f-2c9f-40ce-abbb-9bfb800cf9cd\") " Apr 02 14:40:49 crc kubenswrapper[4732]: I0402 14:40:49.089749 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a02b91f-2c9f-40ce-abbb-9bfb800cf9cd-utilities" (OuterVolumeSpecName: "utilities") pod "2a02b91f-2c9f-40ce-abbb-9bfb800cf9cd" (UID: "2a02b91f-2c9f-40ce-abbb-9bfb800cf9cd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 14:40:49 crc kubenswrapper[4732]: I0402 14:40:49.095807 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a02b91f-2c9f-40ce-abbb-9bfb800cf9cd-kube-api-access-cq82q" (OuterVolumeSpecName: "kube-api-access-cq82q") pod "2a02b91f-2c9f-40ce-abbb-9bfb800cf9cd" (UID: "2a02b91f-2c9f-40ce-abbb-9bfb800cf9cd"). InnerVolumeSpecName "kube-api-access-cq82q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:40:49 crc kubenswrapper[4732]: I0402 14:40:49.152592 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a02b91f-2c9f-40ce-abbb-9bfb800cf9cd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2a02b91f-2c9f-40ce-abbb-9bfb800cf9cd" (UID: "2a02b91f-2c9f-40ce-abbb-9bfb800cf9cd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 14:40:49 crc kubenswrapper[4732]: I0402 14:40:49.191399 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cq82q\" (UniqueName: \"kubernetes.io/projected/2a02b91f-2c9f-40ce-abbb-9bfb800cf9cd-kube-api-access-cq82q\") on node \"crc\" DevicePath \"\"" Apr 02 14:40:49 crc kubenswrapper[4732]: I0402 14:40:49.191460 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a02b91f-2c9f-40ce-abbb-9bfb800cf9cd-utilities\") on node \"crc\" DevicePath \"\"" Apr 02 14:40:49 crc kubenswrapper[4732]: I0402 14:40:49.191472 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a02b91f-2c9f-40ce-abbb-9bfb800cf9cd-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 02 14:40:49 crc kubenswrapper[4732]: I0402 14:40:49.405112 4732 generic.go:334] "Generic (PLEG): container finished" podID="2a02b91f-2c9f-40ce-abbb-9bfb800cf9cd" containerID="690d6f777ef42493bddd3fcf2e8bf0614e2769de411c4851b383021844f7fcc8" exitCode=0 Apr 02 14:40:49 crc kubenswrapper[4732]: I0402 14:40:49.405196 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-glbf9" Apr 02 14:40:49 crc kubenswrapper[4732]: I0402 14:40:49.405249 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-glbf9" event={"ID":"2a02b91f-2c9f-40ce-abbb-9bfb800cf9cd","Type":"ContainerDied","Data":"690d6f777ef42493bddd3fcf2e8bf0614e2769de411c4851b383021844f7fcc8"} Apr 02 14:40:49 crc kubenswrapper[4732]: I0402 14:40:49.405568 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-glbf9" event={"ID":"2a02b91f-2c9f-40ce-abbb-9bfb800cf9cd","Type":"ContainerDied","Data":"40f30817aa8e038be419a553266f8b269c1be45d7b2eb1e382af3f318cfe6d6a"} Apr 02 14:40:49 crc kubenswrapper[4732]: I0402 14:40:49.405588 4732 scope.go:117] "RemoveContainer" containerID="690d6f777ef42493bddd3fcf2e8bf0614e2769de411c4851b383021844f7fcc8" Apr 02 14:40:49 crc kubenswrapper[4732]: I0402 14:40:49.430579 4732 scope.go:117] "RemoveContainer" containerID="0c27cce8963c020dd9f59f0fdb19a7928a5b7faa43fa8a012cc94ffafcbafc76" Apr 02 14:40:49 crc kubenswrapper[4732]: I0402 14:40:49.444133 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-glbf9"] Apr 02 14:40:49 crc kubenswrapper[4732]: I0402 14:40:49.457085 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-glbf9"] Apr 02 14:40:49 crc kubenswrapper[4732]: I0402 14:40:49.472509 4732 scope.go:117] "RemoveContainer" containerID="ebe55c9bbfbdeb6e79321ad0521df475e61ab2de742502939c316039d2d055d6" Apr 02 14:40:49 crc kubenswrapper[4732]: I0402 14:40:49.524299 4732 scope.go:117] "RemoveContainer" containerID="690d6f777ef42493bddd3fcf2e8bf0614e2769de411c4851b383021844f7fcc8" Apr 02 14:40:49 crc kubenswrapper[4732]: E0402 14:40:49.524775 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"690d6f777ef42493bddd3fcf2e8bf0614e2769de411c4851b383021844f7fcc8\": container with ID starting with 690d6f777ef42493bddd3fcf2e8bf0614e2769de411c4851b383021844f7fcc8 not found: ID does not exist" containerID="690d6f777ef42493bddd3fcf2e8bf0614e2769de411c4851b383021844f7fcc8" Apr 02 14:40:49 crc kubenswrapper[4732]: I0402 14:40:49.524814 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"690d6f777ef42493bddd3fcf2e8bf0614e2769de411c4851b383021844f7fcc8"} err="failed to get container status \"690d6f777ef42493bddd3fcf2e8bf0614e2769de411c4851b383021844f7fcc8\": rpc error: code = NotFound desc = could not find container \"690d6f777ef42493bddd3fcf2e8bf0614e2769de411c4851b383021844f7fcc8\": container with ID starting with 690d6f777ef42493bddd3fcf2e8bf0614e2769de411c4851b383021844f7fcc8 not found: ID does not exist" Apr 02 14:40:49 crc kubenswrapper[4732]: I0402 14:40:49.524838 4732 scope.go:117] "RemoveContainer" containerID="0c27cce8963c020dd9f59f0fdb19a7928a5b7faa43fa8a012cc94ffafcbafc76" Apr 02 14:40:49 crc kubenswrapper[4732]: E0402 14:40:49.525164 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c27cce8963c020dd9f59f0fdb19a7928a5b7faa43fa8a012cc94ffafcbafc76\": container with ID starting with 0c27cce8963c020dd9f59f0fdb19a7928a5b7faa43fa8a012cc94ffafcbafc76 not found: ID does not exist" containerID="0c27cce8963c020dd9f59f0fdb19a7928a5b7faa43fa8a012cc94ffafcbafc76" Apr 02 14:40:49 crc kubenswrapper[4732]: I0402 14:40:49.525192 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c27cce8963c020dd9f59f0fdb19a7928a5b7faa43fa8a012cc94ffafcbafc76"} err="failed to get container status \"0c27cce8963c020dd9f59f0fdb19a7928a5b7faa43fa8a012cc94ffafcbafc76\": rpc error: code = NotFound desc = could not find container \"0c27cce8963c020dd9f59f0fdb19a7928a5b7faa43fa8a012cc94ffafcbafc76\": container with ID 
starting with 0c27cce8963c020dd9f59f0fdb19a7928a5b7faa43fa8a012cc94ffafcbafc76 not found: ID does not exist" Apr 02 14:40:49 crc kubenswrapper[4732]: I0402 14:40:49.525208 4732 scope.go:117] "RemoveContainer" containerID="ebe55c9bbfbdeb6e79321ad0521df475e61ab2de742502939c316039d2d055d6" Apr 02 14:40:49 crc kubenswrapper[4732]: E0402 14:40:49.525442 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebe55c9bbfbdeb6e79321ad0521df475e61ab2de742502939c316039d2d055d6\": container with ID starting with ebe55c9bbfbdeb6e79321ad0521df475e61ab2de742502939c316039d2d055d6 not found: ID does not exist" containerID="ebe55c9bbfbdeb6e79321ad0521df475e61ab2de742502939c316039d2d055d6" Apr 02 14:40:49 crc kubenswrapper[4732]: I0402 14:40:49.525470 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebe55c9bbfbdeb6e79321ad0521df475e61ab2de742502939c316039d2d055d6"} err="failed to get container status \"ebe55c9bbfbdeb6e79321ad0521df475e61ab2de742502939c316039d2d055d6\": rpc error: code = NotFound desc = could not find container \"ebe55c9bbfbdeb6e79321ad0521df475e61ab2de742502939c316039d2d055d6\": container with ID starting with ebe55c9bbfbdeb6e79321ad0521df475e61ab2de742502939c316039d2d055d6 not found: ID does not exist" Apr 02 14:40:50 crc kubenswrapper[4732]: I0402 14:40:50.698468 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a02b91f-2c9f-40ce-abbb-9bfb800cf9cd" path="/var/lib/kubelet/pods/2a02b91f-2c9f-40ce-abbb-9bfb800cf9cd/volumes" Apr 02 14:40:52 crc kubenswrapper[4732]: I0402 14:40:52.933403 4732 scope.go:117] "RemoveContainer" containerID="644e12a213d36dbaf61adacd3d46a4851e3728404d3bfd3873c17c62266ec012" Apr 02 14:41:31 crc kubenswrapper[4732]: I0402 14:41:31.866452 4732 generic.go:334] "Generic (PLEG): container finished" podID="d3dcab4b-fbf1-451c-834e-c09b71ba0532" 
containerID="48960aaf1ab4f6d3adaeed7e914fb355d25232b633c9de5e96cc72b7f29d1994" exitCode=0 Apr 02 14:41:31 crc kubenswrapper[4732]: I0402 14:41:31.866536 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mjg8z/must-gather-jrdpk" event={"ID":"d3dcab4b-fbf1-451c-834e-c09b71ba0532","Type":"ContainerDied","Data":"48960aaf1ab4f6d3adaeed7e914fb355d25232b633c9de5e96cc72b7f29d1994"} Apr 02 14:41:31 crc kubenswrapper[4732]: I0402 14:41:31.867664 4732 scope.go:117] "RemoveContainer" containerID="48960aaf1ab4f6d3adaeed7e914fb355d25232b633c9de5e96cc72b7f29d1994" Apr 02 14:41:32 crc kubenswrapper[4732]: I0402 14:41:32.351348 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-mjg8z_must-gather-jrdpk_d3dcab4b-fbf1-451c-834e-c09b71ba0532/gather/0.log" Apr 02 14:41:40 crc kubenswrapper[4732]: I0402 14:41:40.737269 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-mjg8z/must-gather-jrdpk"] Apr 02 14:41:40 crc kubenswrapper[4732]: I0402 14:41:40.738292 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-mjg8z/must-gather-jrdpk" podUID="d3dcab4b-fbf1-451c-834e-c09b71ba0532" containerName="copy" containerID="cri-o://abea6c7ce604b4382966cd050f23a9e9443e3fac1f49a575f219a6f335b008ce" gracePeriod=2 Apr 02 14:41:40 crc kubenswrapper[4732]: I0402 14:41:40.749297 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-mjg8z/must-gather-jrdpk"] Apr 02 14:41:40 crc kubenswrapper[4732]: I0402 14:41:40.982734 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-mjg8z_must-gather-jrdpk_d3dcab4b-fbf1-451c-834e-c09b71ba0532/copy/0.log" Apr 02 14:41:41 crc kubenswrapper[4732]: I0402 14:41:41.006165 4732 generic.go:334] "Generic (PLEG): container finished" podID="d3dcab4b-fbf1-451c-834e-c09b71ba0532" containerID="abea6c7ce604b4382966cd050f23a9e9443e3fac1f49a575f219a6f335b008ce" exitCode=143 Apr 02 
14:41:41 crc kubenswrapper[4732]: I0402 14:41:41.183021 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-mjg8z_must-gather-jrdpk_d3dcab4b-fbf1-451c-834e-c09b71ba0532/copy/0.log" Apr 02 14:41:41 crc kubenswrapper[4732]: I0402 14:41:41.185091 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mjg8z/must-gather-jrdpk" Apr 02 14:41:41 crc kubenswrapper[4732]: I0402 14:41:41.281128 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d3dcab4b-fbf1-451c-834e-c09b71ba0532-must-gather-output\") pod \"d3dcab4b-fbf1-451c-834e-c09b71ba0532\" (UID: \"d3dcab4b-fbf1-451c-834e-c09b71ba0532\") " Apr 02 14:41:41 crc kubenswrapper[4732]: I0402 14:41:41.281647 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bx52g\" (UniqueName: \"kubernetes.io/projected/d3dcab4b-fbf1-451c-834e-c09b71ba0532-kube-api-access-bx52g\") pod \"d3dcab4b-fbf1-451c-834e-c09b71ba0532\" (UID: \"d3dcab4b-fbf1-451c-834e-c09b71ba0532\") " Apr 02 14:41:41 crc kubenswrapper[4732]: I0402 14:41:41.287308 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3dcab4b-fbf1-451c-834e-c09b71ba0532-kube-api-access-bx52g" (OuterVolumeSpecName: "kube-api-access-bx52g") pod "d3dcab4b-fbf1-451c-834e-c09b71ba0532" (UID: "d3dcab4b-fbf1-451c-834e-c09b71ba0532"). InnerVolumeSpecName "kube-api-access-bx52g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:41:41 crc kubenswrapper[4732]: I0402 14:41:41.385902 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bx52g\" (UniqueName: \"kubernetes.io/projected/d3dcab4b-fbf1-451c-834e-c09b71ba0532-kube-api-access-bx52g\") on node \"crc\" DevicePath \"\"" Apr 02 14:41:41 crc kubenswrapper[4732]: I0402 14:41:41.447726 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3dcab4b-fbf1-451c-834e-c09b71ba0532-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "d3dcab4b-fbf1-451c-834e-c09b71ba0532" (UID: "d3dcab4b-fbf1-451c-834e-c09b71ba0532"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 02 14:41:41 crc kubenswrapper[4732]: I0402 14:41:41.487325 4732 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d3dcab4b-fbf1-451c-834e-c09b71ba0532-must-gather-output\") on node \"crc\" DevicePath \"\"" Apr 02 14:41:42 crc kubenswrapper[4732]: I0402 14:41:42.016810 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-mjg8z_must-gather-jrdpk_d3dcab4b-fbf1-451c-834e-c09b71ba0532/copy/0.log" Apr 02 14:41:42 crc kubenswrapper[4732]: I0402 14:41:42.017151 4732 scope.go:117] "RemoveContainer" containerID="abea6c7ce604b4382966cd050f23a9e9443e3fac1f49a575f219a6f335b008ce" Apr 02 14:41:42 crc kubenswrapper[4732]: I0402 14:41:42.017198 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mjg8z/must-gather-jrdpk" Apr 02 14:41:42 crc kubenswrapper[4732]: I0402 14:41:42.037592 4732 scope.go:117] "RemoveContainer" containerID="48960aaf1ab4f6d3adaeed7e914fb355d25232b633c9de5e96cc72b7f29d1994" Apr 02 14:41:42 crc kubenswrapper[4732]: I0402 14:41:42.694817 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3dcab4b-fbf1-451c-834e-c09b71ba0532" path="/var/lib/kubelet/pods/d3dcab4b-fbf1-451c-834e-c09b71ba0532/volumes" Apr 02 14:42:00 crc kubenswrapper[4732]: I0402 14:42:00.182985 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29585682-pz4km"] Apr 02 14:42:00 crc kubenswrapper[4732]: E0402 14:42:00.184517 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a02b91f-2c9f-40ce-abbb-9bfb800cf9cd" containerName="extract-utilities" Apr 02 14:42:00 crc kubenswrapper[4732]: I0402 14:42:00.184539 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a02b91f-2c9f-40ce-abbb-9bfb800cf9cd" containerName="extract-utilities" Apr 02 14:42:00 crc kubenswrapper[4732]: E0402 14:42:00.184576 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a02b91f-2c9f-40ce-abbb-9bfb800cf9cd" containerName="registry-server" Apr 02 14:42:00 crc kubenswrapper[4732]: I0402 14:42:00.184585 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a02b91f-2c9f-40ce-abbb-9bfb800cf9cd" containerName="registry-server" Apr 02 14:42:00 crc kubenswrapper[4732]: E0402 14:42:00.184688 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3dcab4b-fbf1-451c-834e-c09b71ba0532" containerName="gather" Apr 02 14:42:00 crc kubenswrapper[4732]: I0402 14:42:00.184700 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3dcab4b-fbf1-451c-834e-c09b71ba0532" containerName="gather" Apr 02 14:42:00 crc kubenswrapper[4732]: E0402 14:42:00.184737 4732 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2a02b91f-2c9f-40ce-abbb-9bfb800cf9cd" containerName="extract-content" Apr 02 14:42:00 crc kubenswrapper[4732]: I0402 14:42:00.184745 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a02b91f-2c9f-40ce-abbb-9bfb800cf9cd" containerName="extract-content" Apr 02 14:42:00 crc kubenswrapper[4732]: E0402 14:42:00.184763 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3dcab4b-fbf1-451c-834e-c09b71ba0532" containerName="copy" Apr 02 14:42:00 crc kubenswrapper[4732]: I0402 14:42:00.184778 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3dcab4b-fbf1-451c-834e-c09b71ba0532" containerName="copy" Apr 02 14:42:00 crc kubenswrapper[4732]: I0402 14:42:00.185244 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3dcab4b-fbf1-451c-834e-c09b71ba0532" containerName="copy" Apr 02 14:42:00 crc kubenswrapper[4732]: I0402 14:42:00.185298 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3dcab4b-fbf1-451c-834e-c09b71ba0532" containerName="gather" Apr 02 14:42:00 crc kubenswrapper[4732]: I0402 14:42:00.185317 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a02b91f-2c9f-40ce-abbb-9bfb800cf9cd" containerName="registry-server" Apr 02 14:42:00 crc kubenswrapper[4732]: I0402 14:42:00.186388 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29585682-pz4km" Apr 02 14:42:00 crc kubenswrapper[4732]: I0402 14:42:00.190154 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 02 14:42:00 crc kubenswrapper[4732]: I0402 14:42:00.190358 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-69g42" Apr 02 14:42:00 crc kubenswrapper[4732]: I0402 14:42:00.190456 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 02 14:42:00 crc kubenswrapper[4732]: I0402 14:42:00.217011 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585682-pz4km"] Apr 02 14:42:00 crc kubenswrapper[4732]: I0402 14:42:00.258439 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctgzp\" (UniqueName: \"kubernetes.io/projected/d9b1b986-3261-47c8-8483-6951c031f154-kube-api-access-ctgzp\") pod \"auto-csr-approver-29585682-pz4km\" (UID: \"d9b1b986-3261-47c8-8483-6951c031f154\") " pod="openshift-infra/auto-csr-approver-29585682-pz4km" Apr 02 14:42:00 crc kubenswrapper[4732]: I0402 14:42:00.360779 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctgzp\" (UniqueName: \"kubernetes.io/projected/d9b1b986-3261-47c8-8483-6951c031f154-kube-api-access-ctgzp\") pod \"auto-csr-approver-29585682-pz4km\" (UID: \"d9b1b986-3261-47c8-8483-6951c031f154\") " pod="openshift-infra/auto-csr-approver-29585682-pz4km" Apr 02 14:42:00 crc kubenswrapper[4732]: I0402 14:42:00.396000 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctgzp\" (UniqueName: \"kubernetes.io/projected/d9b1b986-3261-47c8-8483-6951c031f154-kube-api-access-ctgzp\") pod \"auto-csr-approver-29585682-pz4km\" (UID: \"d9b1b986-3261-47c8-8483-6951c031f154\") " 
pod="openshift-infra/auto-csr-approver-29585682-pz4km" Apr 02 14:42:00 crc kubenswrapper[4732]: I0402 14:42:00.517681 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585682-pz4km" Apr 02 14:42:00 crc kubenswrapper[4732]: I0402 14:42:00.982539 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585682-pz4km"] Apr 02 14:42:01 crc kubenswrapper[4732]: I0402 14:42:01.223009 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585682-pz4km" event={"ID":"d9b1b986-3261-47c8-8483-6951c031f154","Type":"ContainerStarted","Data":"efb382cf21d2a39a9895b923b03bf5780c26dca18a29540da01338b4a903b7f4"} Apr 02 14:42:03 crc kubenswrapper[4732]: I0402 14:42:03.253179 4732 generic.go:334] "Generic (PLEG): container finished" podID="d9b1b986-3261-47c8-8483-6951c031f154" containerID="6208c1665c72ff244438793fdbe555f38aad7bda2dcdc0b338432085591a69fd" exitCode=0 Apr 02 14:42:03 crc kubenswrapper[4732]: I0402 14:42:03.253877 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585682-pz4km" event={"ID":"d9b1b986-3261-47c8-8483-6951c031f154","Type":"ContainerDied","Data":"6208c1665c72ff244438793fdbe555f38aad7bda2dcdc0b338432085591a69fd"} Apr 02 14:42:04 crc kubenswrapper[4732]: I0402 14:42:04.587214 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29585682-pz4km" Apr 02 14:42:04 crc kubenswrapper[4732]: I0402 14:42:04.643599 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctgzp\" (UniqueName: \"kubernetes.io/projected/d9b1b986-3261-47c8-8483-6951c031f154-kube-api-access-ctgzp\") pod \"d9b1b986-3261-47c8-8483-6951c031f154\" (UID: \"d9b1b986-3261-47c8-8483-6951c031f154\") " Apr 02 14:42:04 crc kubenswrapper[4732]: I0402 14:42:04.648726 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9b1b986-3261-47c8-8483-6951c031f154-kube-api-access-ctgzp" (OuterVolumeSpecName: "kube-api-access-ctgzp") pod "d9b1b986-3261-47c8-8483-6951c031f154" (UID: "d9b1b986-3261-47c8-8483-6951c031f154"). InnerVolumeSpecName "kube-api-access-ctgzp". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:42:04 crc kubenswrapper[4732]: I0402 14:42:04.745721 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctgzp\" (UniqueName: \"kubernetes.io/projected/d9b1b986-3261-47c8-8483-6951c031f154-kube-api-access-ctgzp\") on node \"crc\" DevicePath \"\"" Apr 02 14:42:05 crc kubenswrapper[4732]: I0402 14:42:05.282468 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585682-pz4km" event={"ID":"d9b1b986-3261-47c8-8483-6951c031f154","Type":"ContainerDied","Data":"efb382cf21d2a39a9895b923b03bf5780c26dca18a29540da01338b4a903b7f4"} Apr 02 14:42:05 crc kubenswrapper[4732]: I0402 14:42:05.282531 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="efb382cf21d2a39a9895b923b03bf5780c26dca18a29540da01338b4a903b7f4" Apr 02 14:42:05 crc kubenswrapper[4732]: I0402 14:42:05.282564 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29585682-pz4km" Apr 02 14:42:05 crc kubenswrapper[4732]: I0402 14:42:05.685153 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29585676-gp9tx"] Apr 02 14:42:05 crc kubenswrapper[4732]: I0402 14:42:05.696862 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29585676-gp9tx"] Apr 02 14:42:06 crc kubenswrapper[4732]: I0402 14:42:06.692943 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acc0658a-3ba9-4d19-85f4-6e2c1270efcf" path="/var/lib/kubelet/pods/acc0658a-3ba9-4d19-85f4-6e2c1270efcf/volumes" Apr 02 14:42:31 crc kubenswrapper[4732]: I0402 14:42:31.924736 4732 patch_prober.go:28] interesting pod/machine-config-daemon-6vtmw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 02 14:42:31 crc kubenswrapper[4732]: I0402 14:42:31.925548 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 02 14:42:53 crc kubenswrapper[4732]: I0402 14:42:53.066727 4732 scope.go:117] "RemoveContainer" containerID="6f66ec762eaa36ac21c4234d510ee069d198ab5a88a0e055d6f699d9fe50da9b" Apr 02 14:42:53 crc kubenswrapper[4732]: I0402 14:42:53.111257 4732 scope.go:117] "RemoveContainer" containerID="29ddeebdf8d34aea2651d495ea74d95c814bc1a0dfd8b0291df6b463bd39abfc" Apr 02 14:43:01 crc kubenswrapper[4732]: I0402 14:43:01.924450 4732 patch_prober.go:28] interesting pod/machine-config-daemon-6vtmw container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 02 14:43:01 crc kubenswrapper[4732]: I0402 14:43:01.925306 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 02 14:43:31 crc kubenswrapper[4732]: I0402 14:43:31.924761 4732 patch_prober.go:28] interesting pod/machine-config-daemon-6vtmw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 02 14:43:31 crc kubenswrapper[4732]: I0402 14:43:31.925372 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 02 14:43:31 crc kubenswrapper[4732]: I0402 14:43:31.925419 4732 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" Apr 02 14:43:31 crc kubenswrapper[4732]: I0402 14:43:31.926243 4732 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4cc9425b6dde0d264f8b72482c5821ad9fabf57354100ab51583dde35da85ea0"} pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Apr 02 14:43:31 crc kubenswrapper[4732]: I0402 14:43:31.926310 4732 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" containerName="machine-config-daemon" containerID="cri-o://4cc9425b6dde0d264f8b72482c5821ad9fabf57354100ab51583dde35da85ea0" gracePeriod=600 Apr 02 14:43:32 crc kubenswrapper[4732]: E0402 14:43:32.046094 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" Apr 02 14:43:32 crc kubenswrapper[4732]: I0402 14:43:32.241428 4732 generic.go:334] "Generic (PLEG): container finished" podID="38409e5e-4545-49da-8f6c-4bfb30582878" containerID="4cc9425b6dde0d264f8b72482c5821ad9fabf57354100ab51583dde35da85ea0" exitCode=0 Apr 02 14:43:32 crc kubenswrapper[4732]: I0402 14:43:32.241478 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" event={"ID":"38409e5e-4545-49da-8f6c-4bfb30582878","Type":"ContainerDied","Data":"4cc9425b6dde0d264f8b72482c5821ad9fabf57354100ab51583dde35da85ea0"} Apr 02 14:43:32 crc kubenswrapper[4732]: I0402 14:43:32.241514 4732 scope.go:117] "RemoveContainer" containerID="e40e82e6d4518aba30c53d5cb0838adda1128bec7118e470ccaa5d5e5e745a10" Apr 02 14:43:32 crc kubenswrapper[4732]: I0402 14:43:32.242235 4732 scope.go:117] "RemoveContainer" containerID="4cc9425b6dde0d264f8b72482c5821ad9fabf57354100ab51583dde35da85ea0" Apr 02 14:43:32 crc kubenswrapper[4732]: E0402 14:43:32.242732 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" Apr 02 14:43:44 crc kubenswrapper[4732]: I0402 14:43:44.689714 4732 scope.go:117] "RemoveContainer" containerID="4cc9425b6dde0d264f8b72482c5821ad9fabf57354100ab51583dde35da85ea0" Apr 02 14:43:44 crc kubenswrapper[4732]: E0402 14:43:44.690715 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" Apr 02 14:43:57 crc kubenswrapper[4732]: I0402 14:43:57.681083 4732 scope.go:117] "RemoveContainer" containerID="4cc9425b6dde0d264f8b72482c5821ad9fabf57354100ab51583dde35da85ea0" Apr 02 14:43:57 crc kubenswrapper[4732]: E0402 14:43:57.681952 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" Apr 02 14:44:00 crc kubenswrapper[4732]: I0402 14:44:00.166736 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29585684-bb29b"] Apr 02 14:44:00 crc kubenswrapper[4732]: E0402 14:44:00.167752 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9b1b986-3261-47c8-8483-6951c031f154" containerName="oc" Apr 02 14:44:00 crc 
kubenswrapper[4732]: I0402 14:44:00.167774 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9b1b986-3261-47c8-8483-6951c031f154" containerName="oc" Apr 02 14:44:00 crc kubenswrapper[4732]: I0402 14:44:00.168149 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9b1b986-3261-47c8-8483-6951c031f154" containerName="oc" Apr 02 14:44:00 crc kubenswrapper[4732]: I0402 14:44:00.169260 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585684-bb29b" Apr 02 14:44:00 crc kubenswrapper[4732]: I0402 14:44:00.173663 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 02 14:44:00 crc kubenswrapper[4732]: I0402 14:44:00.174045 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-69g42" Apr 02 14:44:00 crc kubenswrapper[4732]: I0402 14:44:00.175037 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 02 14:44:00 crc kubenswrapper[4732]: I0402 14:44:00.176675 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585684-bb29b"] Apr 02 14:44:00 crc kubenswrapper[4732]: I0402 14:44:00.259631 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvsp5\" (UniqueName: \"kubernetes.io/projected/6a6a3e51-c56e-43ec-8508-45e7abb53983-kube-api-access-hvsp5\") pod \"auto-csr-approver-29585684-bb29b\" (UID: \"6a6a3e51-c56e-43ec-8508-45e7abb53983\") " pod="openshift-infra/auto-csr-approver-29585684-bb29b" Apr 02 14:44:00 crc kubenswrapper[4732]: I0402 14:44:00.362240 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvsp5\" (UniqueName: \"kubernetes.io/projected/6a6a3e51-c56e-43ec-8508-45e7abb53983-kube-api-access-hvsp5\") pod \"auto-csr-approver-29585684-bb29b\" 
(UID: \"6a6a3e51-c56e-43ec-8508-45e7abb53983\") " pod="openshift-infra/auto-csr-approver-29585684-bb29b" Apr 02 14:44:00 crc kubenswrapper[4732]: I0402 14:44:00.380576 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvsp5\" (UniqueName: \"kubernetes.io/projected/6a6a3e51-c56e-43ec-8508-45e7abb53983-kube-api-access-hvsp5\") pod \"auto-csr-approver-29585684-bb29b\" (UID: \"6a6a3e51-c56e-43ec-8508-45e7abb53983\") " pod="openshift-infra/auto-csr-approver-29585684-bb29b" Apr 02 14:44:00 crc kubenswrapper[4732]: I0402 14:44:00.562247 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585684-bb29b" Apr 02 14:44:01 crc kubenswrapper[4732]: I0402 14:44:01.016950 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585684-bb29b"] Apr 02 14:44:01 crc kubenswrapper[4732]: I0402 14:44:01.549330 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585684-bb29b" event={"ID":"6a6a3e51-c56e-43ec-8508-45e7abb53983","Type":"ContainerStarted","Data":"a4dc3c972d2f69310344d9a9a00a4e9eec201d79433aa91d6d09fa682aa7fafd"} Apr 02 14:44:03 crc kubenswrapper[4732]: I0402 14:44:03.574840 4732 generic.go:334] "Generic (PLEG): container finished" podID="6a6a3e51-c56e-43ec-8508-45e7abb53983" containerID="c94186b4ff4b3c12330241a6b42c436be8cd92cf4158eea1ea75938bfed5598c" exitCode=0 Apr 02 14:44:03 crc kubenswrapper[4732]: I0402 14:44:03.574979 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585684-bb29b" event={"ID":"6a6a3e51-c56e-43ec-8508-45e7abb53983","Type":"ContainerDied","Data":"c94186b4ff4b3c12330241a6b42c436be8cd92cf4158eea1ea75938bfed5598c"} Apr 02 14:44:05 crc kubenswrapper[4732]: I0402 14:44:05.016656 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29585684-bb29b" Apr 02 14:44:05 crc kubenswrapper[4732]: I0402 14:44:05.155389 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvsp5\" (UniqueName: \"kubernetes.io/projected/6a6a3e51-c56e-43ec-8508-45e7abb53983-kube-api-access-hvsp5\") pod \"6a6a3e51-c56e-43ec-8508-45e7abb53983\" (UID: \"6a6a3e51-c56e-43ec-8508-45e7abb53983\") " Apr 02 14:44:05 crc kubenswrapper[4732]: I0402 14:44:05.160807 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a6a3e51-c56e-43ec-8508-45e7abb53983-kube-api-access-hvsp5" (OuterVolumeSpecName: "kube-api-access-hvsp5") pod "6a6a3e51-c56e-43ec-8508-45e7abb53983" (UID: "6a6a3e51-c56e-43ec-8508-45e7abb53983"). InnerVolumeSpecName "kube-api-access-hvsp5". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:44:05 crc kubenswrapper[4732]: I0402 14:44:05.258395 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvsp5\" (UniqueName: \"kubernetes.io/projected/6a6a3e51-c56e-43ec-8508-45e7abb53983-kube-api-access-hvsp5\") on node \"crc\" DevicePath \"\"" Apr 02 14:44:05 crc kubenswrapper[4732]: I0402 14:44:05.593032 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585684-bb29b" event={"ID":"6a6a3e51-c56e-43ec-8508-45e7abb53983","Type":"ContainerDied","Data":"a4dc3c972d2f69310344d9a9a00a4e9eec201d79433aa91d6d09fa682aa7fafd"} Apr 02 14:44:05 crc kubenswrapper[4732]: I0402 14:44:05.593442 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4dc3c972d2f69310344d9a9a00a4e9eec201d79433aa91d6d09fa682aa7fafd" Apr 02 14:44:05 crc kubenswrapper[4732]: I0402 14:44:05.593065 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29585684-bb29b" Apr 02 14:44:06 crc kubenswrapper[4732]: I0402 14:44:06.097281 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29585678-tc75x"] Apr 02 14:44:06 crc kubenswrapper[4732]: I0402 14:44:06.107034 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29585678-tc75x"] Apr 02 14:44:06 crc kubenswrapper[4732]: I0402 14:44:06.700303 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c4b5ea7-e9e1-45d9-80b7-e3a61b5464cd" path="/var/lib/kubelet/pods/1c4b5ea7-e9e1-45d9-80b7-e3a61b5464cd/volumes" Apr 02 14:44:08 crc kubenswrapper[4732]: I0402 14:44:08.680949 4732 scope.go:117] "RemoveContainer" containerID="4cc9425b6dde0d264f8b72482c5821ad9fabf57354100ab51583dde35da85ea0" Apr 02 14:44:08 crc kubenswrapper[4732]: E0402 14:44:08.681788 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" Apr 02 14:44:23 crc kubenswrapper[4732]: I0402 14:44:23.681560 4732 scope.go:117] "RemoveContainer" containerID="4cc9425b6dde0d264f8b72482c5821ad9fabf57354100ab51583dde35da85ea0" Apr 02 14:44:23 crc kubenswrapper[4732]: E0402 14:44:23.683079 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" 
podUID="38409e5e-4545-49da-8f6c-4bfb30582878"
Apr 02 14:44:34 crc kubenswrapper[4732]: I0402 14:44:34.696461 4732 scope.go:117] "RemoveContainer" containerID="4cc9425b6dde0d264f8b72482c5821ad9fabf57354100ab51583dde35da85ea0"
Apr 02 14:44:34 crc kubenswrapper[4732]: E0402 14:44:34.697852 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878"
Apr 02 14:44:42 crc kubenswrapper[4732]: I0402 14:44:42.556363 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6jxwm/must-gather-gwhm7"]
Apr 02 14:44:42 crc kubenswrapper[4732]: E0402 14:44:42.557476 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a6a3e51-c56e-43ec-8508-45e7abb53983" containerName="oc"
Apr 02 14:44:42 crc kubenswrapper[4732]: I0402 14:44:42.557491 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a6a3e51-c56e-43ec-8508-45e7abb53983" containerName="oc"
Apr 02 14:44:42 crc kubenswrapper[4732]: I0402 14:44:42.557751 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a6a3e51-c56e-43ec-8508-45e7abb53983" containerName="oc"
Apr 02 14:44:42 crc kubenswrapper[4732]: I0402 14:44:42.559019 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6jxwm/must-gather-gwhm7"
Apr 02 14:44:42 crc kubenswrapper[4732]: I0402 14:44:42.566041 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-6jxwm"/"openshift-service-ca.crt"
Apr 02 14:44:42 crc kubenswrapper[4732]: I0402 14:44:42.566243 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6jxwm/must-gather-gwhm7"]
Apr 02 14:44:42 crc kubenswrapper[4732]: I0402 14:44:42.566268 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-6jxwm"/"kube-root-ca.crt"
Apr 02 14:44:42 crc kubenswrapper[4732]: I0402 14:44:42.566442 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-6jxwm"/"default-dockercfg-rn7h4"
Apr 02 14:44:42 crc kubenswrapper[4732]: I0402 14:44:42.633595 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbclr\" (UniqueName: \"kubernetes.io/projected/2033c214-17d0-4695-8655-f142b35e0518-kube-api-access-tbclr\") pod \"must-gather-gwhm7\" (UID: \"2033c214-17d0-4695-8655-f142b35e0518\") " pod="openshift-must-gather-6jxwm/must-gather-gwhm7"
Apr 02 14:44:42 crc kubenswrapper[4732]: I0402 14:44:42.633694 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2033c214-17d0-4695-8655-f142b35e0518-must-gather-output\") pod \"must-gather-gwhm7\" (UID: \"2033c214-17d0-4695-8655-f142b35e0518\") " pod="openshift-must-gather-6jxwm/must-gather-gwhm7"
Apr 02 14:44:42 crc kubenswrapper[4732]: I0402 14:44:42.735948 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbclr\" (UniqueName: \"kubernetes.io/projected/2033c214-17d0-4695-8655-f142b35e0518-kube-api-access-tbclr\") pod \"must-gather-gwhm7\" (UID: \"2033c214-17d0-4695-8655-f142b35e0518\") " pod="openshift-must-gather-6jxwm/must-gather-gwhm7"
Apr 02 14:44:42 crc kubenswrapper[4732]: I0402 14:44:42.736285 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2033c214-17d0-4695-8655-f142b35e0518-must-gather-output\") pod \"must-gather-gwhm7\" (UID: \"2033c214-17d0-4695-8655-f142b35e0518\") " pod="openshift-must-gather-6jxwm/must-gather-gwhm7"
Apr 02 14:44:42 crc kubenswrapper[4732]: I0402 14:44:42.736642 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2033c214-17d0-4695-8655-f142b35e0518-must-gather-output\") pod \"must-gather-gwhm7\" (UID: \"2033c214-17d0-4695-8655-f142b35e0518\") " pod="openshift-must-gather-6jxwm/must-gather-gwhm7"
Apr 02 14:44:42 crc kubenswrapper[4732]: I0402 14:44:42.752848 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbclr\" (UniqueName: \"kubernetes.io/projected/2033c214-17d0-4695-8655-f142b35e0518-kube-api-access-tbclr\") pod \"must-gather-gwhm7\" (UID: \"2033c214-17d0-4695-8655-f142b35e0518\") " pod="openshift-must-gather-6jxwm/must-gather-gwhm7"
Apr 02 14:44:42 crc kubenswrapper[4732]: I0402 14:44:42.882401 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6jxwm/must-gather-gwhm7"
Apr 02 14:44:43 crc kubenswrapper[4732]: I0402 14:44:43.359945 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6jxwm/must-gather-gwhm7"]
Apr 02 14:44:44 crc kubenswrapper[4732]: I0402 14:44:44.032367 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6jxwm/must-gather-gwhm7" event={"ID":"2033c214-17d0-4695-8655-f142b35e0518","Type":"ContainerStarted","Data":"141655c1dc953019c7e69575f337c44e388c4da53c4a597b0d55f3f76ecc204e"}
Apr 02 14:44:44 crc kubenswrapper[4732]: I0402 14:44:44.032751 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6jxwm/must-gather-gwhm7" event={"ID":"2033c214-17d0-4695-8655-f142b35e0518","Type":"ContainerStarted","Data":"4fdc5ea3309b18ddf66f90cafd00459635ffbc886f6ca2b56d49e846bfc0355c"}
Apr 02 14:44:44 crc kubenswrapper[4732]: I0402 14:44:44.032761 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6jxwm/must-gather-gwhm7" event={"ID":"2033c214-17d0-4695-8655-f142b35e0518","Type":"ContainerStarted","Data":"823efdc180f07d987be7e60cfaefb250c957a29676f83ac74678852189b21cbd"}
Apr 02 14:44:45 crc kubenswrapper[4732]: I0402 14:44:45.089058 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6jxwm/must-gather-gwhm7" podStartSLOduration=3.089026933 podStartE2EDuration="3.089026933s" podCreationTimestamp="2026-04-02 14:44:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 14:44:45.079420041 +0000 UTC m=+4041.983827674" watchObservedRunningTime="2026-04-02 14:44:45.089026933 +0000 UTC m=+4041.993434536"
Apr 02 14:44:47 crc kubenswrapper[4732]: I0402 14:44:47.121978 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6jxwm/crc-debug-mxx62"]
Apr 02 14:44:47 crc kubenswrapper[4732]: I0402 14:44:47.124215 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6jxwm/crc-debug-mxx62"
Apr 02 14:44:47 crc kubenswrapper[4732]: I0402 14:44:47.233244 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ec84acf5-e0df-4f95-a1b5-082a2a6c69ae-host\") pod \"crc-debug-mxx62\" (UID: \"ec84acf5-e0df-4f95-a1b5-082a2a6c69ae\") " pod="openshift-must-gather-6jxwm/crc-debug-mxx62"
Apr 02 14:44:47 crc kubenswrapper[4732]: I0402 14:44:47.233388 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tggv5\" (UniqueName: \"kubernetes.io/projected/ec84acf5-e0df-4f95-a1b5-082a2a6c69ae-kube-api-access-tggv5\") pod \"crc-debug-mxx62\" (UID: \"ec84acf5-e0df-4f95-a1b5-082a2a6c69ae\") " pod="openshift-must-gather-6jxwm/crc-debug-mxx62"
Apr 02 14:44:47 crc kubenswrapper[4732]: I0402 14:44:47.335148 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ec84acf5-e0df-4f95-a1b5-082a2a6c69ae-host\") pod \"crc-debug-mxx62\" (UID: \"ec84acf5-e0df-4f95-a1b5-082a2a6c69ae\") " pod="openshift-must-gather-6jxwm/crc-debug-mxx62"
Apr 02 14:44:47 crc kubenswrapper[4732]: I0402 14:44:47.335245 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ec84acf5-e0df-4f95-a1b5-082a2a6c69ae-host\") pod \"crc-debug-mxx62\" (UID: \"ec84acf5-e0df-4f95-a1b5-082a2a6c69ae\") " pod="openshift-must-gather-6jxwm/crc-debug-mxx62"
Apr 02 14:44:47 crc kubenswrapper[4732]: I0402 14:44:47.335257 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tggv5\" (UniqueName: \"kubernetes.io/projected/ec84acf5-e0df-4f95-a1b5-082a2a6c69ae-kube-api-access-tggv5\") pod \"crc-debug-mxx62\" (UID: \"ec84acf5-e0df-4f95-a1b5-082a2a6c69ae\") " pod="openshift-must-gather-6jxwm/crc-debug-mxx62"
Apr 02 14:44:47 crc kubenswrapper[4732]: I0402 14:44:47.361564 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tggv5\" (UniqueName: \"kubernetes.io/projected/ec84acf5-e0df-4f95-a1b5-082a2a6c69ae-kube-api-access-tggv5\") pod \"crc-debug-mxx62\" (UID: \"ec84acf5-e0df-4f95-a1b5-082a2a6c69ae\") " pod="openshift-must-gather-6jxwm/crc-debug-mxx62"
Apr 02 14:44:47 crc kubenswrapper[4732]: I0402 14:44:47.443669 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6jxwm/crc-debug-mxx62"
Apr 02 14:44:47 crc kubenswrapper[4732]: W0402 14:44:47.479379 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec84acf5_e0df_4f95_a1b5_082a2a6c69ae.slice/crio-17700ff731b0ac57a71f1d0733a4ef30ba704a8155f0936b0350fd68a0a12ca3 WatchSource:0}: Error finding container 17700ff731b0ac57a71f1d0733a4ef30ba704a8155f0936b0350fd68a0a12ca3: Status 404 returned error can't find the container with id 17700ff731b0ac57a71f1d0733a4ef30ba704a8155f0936b0350fd68a0a12ca3
Apr 02 14:44:47 crc kubenswrapper[4732]: I0402 14:44:47.680116 4732 scope.go:117] "RemoveContainer" containerID="4cc9425b6dde0d264f8b72482c5821ad9fabf57354100ab51583dde35da85ea0"
Apr 02 14:44:47 crc kubenswrapper[4732]: E0402 14:44:47.680424 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878"
Apr 02 14:44:48 crc kubenswrapper[4732]: I0402 14:44:48.081414 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6jxwm/crc-debug-mxx62" event={"ID":"ec84acf5-e0df-4f95-a1b5-082a2a6c69ae","Type":"ContainerStarted","Data":"1364bdd53c1e58011b0eb407960d1ef555f311e2787030d306fb507fcf225f62"}
Apr 02 14:44:48 crc kubenswrapper[4732]: I0402 14:44:48.081938 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6jxwm/crc-debug-mxx62" event={"ID":"ec84acf5-e0df-4f95-a1b5-082a2a6c69ae","Type":"ContainerStarted","Data":"17700ff731b0ac57a71f1d0733a4ef30ba704a8155f0936b0350fd68a0a12ca3"}
Apr 02 14:44:53 crc kubenswrapper[4732]: I0402 14:44:53.271071 4732 scope.go:117] "RemoveContainer" containerID="fb2895b298118bda51efe5d7a874c2e09106874e5e025a83b0573f2fac1fb3d8"
Apr 02 14:45:00 crc kubenswrapper[4732]: I0402 14:45:00.143257 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6jxwm/crc-debug-mxx62" podStartSLOduration=13.143237194 podStartE2EDuration="13.143237194s" podCreationTimestamp="2026-04-02 14:44:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 14:44:48.107747708 +0000 UTC m=+4045.012155261" watchObservedRunningTime="2026-04-02 14:45:00.143237194 +0000 UTC m=+4057.047644767"
Apr 02 14:45:00 crc kubenswrapper[4732]: I0402 14:45:00.153985 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29585685-k6kxt"]
Apr 02 14:45:00 crc kubenswrapper[4732]: I0402 14:45:00.155293 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29585685-k6kxt"
Apr 02 14:45:00 crc kubenswrapper[4732]: I0402 14:45:00.157851 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Apr 02 14:45:00 crc kubenswrapper[4732]: I0402 14:45:00.158185 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Apr 02 14:45:00 crc kubenswrapper[4732]: I0402 14:45:00.180943 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29585685-k6kxt"]
Apr 02 14:45:00 crc kubenswrapper[4732]: I0402 14:45:00.200156 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8bf17bfd-927c-4ece-9e32-47cb9e2486db-secret-volume\") pod \"collect-profiles-29585685-k6kxt\" (UID: \"8bf17bfd-927c-4ece-9e32-47cb9e2486db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29585685-k6kxt"
Apr 02 14:45:00 crc kubenswrapper[4732]: I0402 14:45:00.200343 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfq2l\" (UniqueName: \"kubernetes.io/projected/8bf17bfd-927c-4ece-9e32-47cb9e2486db-kube-api-access-jfq2l\") pod \"collect-profiles-29585685-k6kxt\" (UID: \"8bf17bfd-927c-4ece-9e32-47cb9e2486db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29585685-k6kxt"
Apr 02 14:45:00 crc kubenswrapper[4732]: I0402 14:45:00.200381 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8bf17bfd-927c-4ece-9e32-47cb9e2486db-config-volume\") pod \"collect-profiles-29585685-k6kxt\" (UID: \"8bf17bfd-927c-4ece-9e32-47cb9e2486db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29585685-k6kxt"
Apr 02 14:45:00 crc kubenswrapper[4732]: I0402 14:45:00.302134 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfq2l\" (UniqueName: \"kubernetes.io/projected/8bf17bfd-927c-4ece-9e32-47cb9e2486db-kube-api-access-jfq2l\") pod \"collect-profiles-29585685-k6kxt\" (UID: \"8bf17bfd-927c-4ece-9e32-47cb9e2486db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29585685-k6kxt"
Apr 02 14:45:00 crc kubenswrapper[4732]: I0402 14:45:00.302449 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8bf17bfd-927c-4ece-9e32-47cb9e2486db-config-volume\") pod \"collect-profiles-29585685-k6kxt\" (UID: \"8bf17bfd-927c-4ece-9e32-47cb9e2486db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29585685-k6kxt"
Apr 02 14:45:00 crc kubenswrapper[4732]: I0402 14:45:00.302565 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8bf17bfd-927c-4ece-9e32-47cb9e2486db-secret-volume\") pod \"collect-profiles-29585685-k6kxt\" (UID: \"8bf17bfd-927c-4ece-9e32-47cb9e2486db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29585685-k6kxt"
Apr 02 14:45:00 crc kubenswrapper[4732]: I0402 14:45:00.303384 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8bf17bfd-927c-4ece-9e32-47cb9e2486db-config-volume\") pod \"collect-profiles-29585685-k6kxt\" (UID: \"8bf17bfd-927c-4ece-9e32-47cb9e2486db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29585685-k6kxt"
Apr 02 14:45:00 crc kubenswrapper[4732]: I0402 14:45:00.316333 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8bf17bfd-927c-4ece-9e32-47cb9e2486db-secret-volume\") pod \"collect-profiles-29585685-k6kxt\" (UID: \"8bf17bfd-927c-4ece-9e32-47cb9e2486db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29585685-k6kxt"
Apr 02 14:45:00 crc kubenswrapper[4732]: I0402 14:45:00.332677 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfq2l\" (UniqueName: \"kubernetes.io/projected/8bf17bfd-927c-4ece-9e32-47cb9e2486db-kube-api-access-jfq2l\") pod \"collect-profiles-29585685-k6kxt\" (UID: \"8bf17bfd-927c-4ece-9e32-47cb9e2486db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29585685-k6kxt"
Apr 02 14:45:00 crc kubenswrapper[4732]: I0402 14:45:00.475592 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29585685-k6kxt"
Apr 02 14:45:00 crc kubenswrapper[4732]: I0402 14:45:00.956232 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29585685-k6kxt"]
Apr 02 14:45:00 crc kubenswrapper[4732]: W0402 14:45:00.957588 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8bf17bfd_927c_4ece_9e32_47cb9e2486db.slice/crio-a94a58bab2964c88017b86c94329cb45778c5ff5afad4d618b6a3c085b7e169c WatchSource:0}: Error finding container a94a58bab2964c88017b86c94329cb45778c5ff5afad4d618b6a3c085b7e169c: Status 404 returned error can't find the container with id a94a58bab2964c88017b86c94329cb45778c5ff5afad4d618b6a3c085b7e169c
Apr 02 14:45:01 crc kubenswrapper[4732]: I0402 14:45:01.199494 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29585685-k6kxt" event={"ID":"8bf17bfd-927c-4ece-9e32-47cb9e2486db","Type":"ContainerStarted","Data":"0abca38d5113131421ddbdc0a0405b22a378cabf71584acfe4ad3d24ae2a2203"}
Apr 02 14:45:01 crc kubenswrapper[4732]: I0402 14:45:01.199554 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29585685-k6kxt" event={"ID":"8bf17bfd-927c-4ece-9e32-47cb9e2486db","Type":"ContainerStarted","Data":"a94a58bab2964c88017b86c94329cb45778c5ff5afad4d618b6a3c085b7e169c"}
Apr 02 14:45:01 crc kubenswrapper[4732]: I0402 14:45:01.217529 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29585685-k6kxt" podStartSLOduration=1.217507716 podStartE2EDuration="1.217507716s" podCreationTimestamp="2026-04-02 14:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-02 14:45:01.215882572 +0000 UTC m=+4058.120290155" watchObservedRunningTime="2026-04-02 14:45:01.217507716 +0000 UTC m=+4058.121915279"
Apr 02 14:45:02 crc kubenswrapper[4732]: I0402 14:45:02.209104 4732 generic.go:334] "Generic (PLEG): container finished" podID="8bf17bfd-927c-4ece-9e32-47cb9e2486db" containerID="0abca38d5113131421ddbdc0a0405b22a378cabf71584acfe4ad3d24ae2a2203" exitCode=0
Apr 02 14:45:02 crc kubenswrapper[4732]: I0402 14:45:02.209157 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29585685-k6kxt" event={"ID":"8bf17bfd-927c-4ece-9e32-47cb9e2486db","Type":"ContainerDied","Data":"0abca38d5113131421ddbdc0a0405b22a378cabf71584acfe4ad3d24ae2a2203"}
Apr 02 14:45:02 crc kubenswrapper[4732]: I0402 14:45:02.680731 4732 scope.go:117] "RemoveContainer" containerID="4cc9425b6dde0d264f8b72482c5821ad9fabf57354100ab51583dde35da85ea0"
Apr 02 14:45:02 crc kubenswrapper[4732]: E0402 14:45:02.681310 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878"
Apr 02 14:45:04 crc kubenswrapper[4732]: I0402 14:45:04.240702 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29585685-k6kxt" event={"ID":"8bf17bfd-927c-4ece-9e32-47cb9e2486db","Type":"ContainerDied","Data":"a94a58bab2964c88017b86c94329cb45778c5ff5afad4d618b6a3c085b7e169c"}
Apr 02 14:45:04 crc kubenswrapper[4732]: I0402 14:45:04.241145 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a94a58bab2964c88017b86c94329cb45778c5ff5afad4d618b6a3c085b7e169c"
Apr 02 14:45:04 crc kubenswrapper[4732]: I0402 14:45:04.256387 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29585685-k6kxt"
Apr 02 14:45:04 crc kubenswrapper[4732]: I0402 14:45:04.315014 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8bf17bfd-927c-4ece-9e32-47cb9e2486db-secret-volume\") pod \"8bf17bfd-927c-4ece-9e32-47cb9e2486db\" (UID: \"8bf17bfd-927c-4ece-9e32-47cb9e2486db\") "
Apr 02 14:45:04 crc kubenswrapper[4732]: I0402 14:45:04.315327 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfq2l\" (UniqueName: \"kubernetes.io/projected/8bf17bfd-927c-4ece-9e32-47cb9e2486db-kube-api-access-jfq2l\") pod \"8bf17bfd-927c-4ece-9e32-47cb9e2486db\" (UID: \"8bf17bfd-927c-4ece-9e32-47cb9e2486db\") "
Apr 02 14:45:04 crc kubenswrapper[4732]: I0402 14:45:04.315412 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8bf17bfd-927c-4ece-9e32-47cb9e2486db-config-volume\") pod \"8bf17bfd-927c-4ece-9e32-47cb9e2486db\" (UID: \"8bf17bfd-927c-4ece-9e32-47cb9e2486db\") "
Apr 02 14:45:04 crc kubenswrapper[4732]: I0402 14:45:04.316760 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bf17bfd-927c-4ece-9e32-47cb9e2486db-config-volume" (OuterVolumeSpecName: "config-volume") pod "8bf17bfd-927c-4ece-9e32-47cb9e2486db" (UID: "8bf17bfd-927c-4ece-9e32-47cb9e2486db"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 02 14:45:04 crc kubenswrapper[4732]: I0402 14:45:04.322483 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bf17bfd-927c-4ece-9e32-47cb9e2486db-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8bf17bfd-927c-4ece-9e32-47cb9e2486db" (UID: "8bf17bfd-927c-4ece-9e32-47cb9e2486db"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 02 14:45:04 crc kubenswrapper[4732]: I0402 14:45:04.323177 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bf17bfd-927c-4ece-9e32-47cb9e2486db-kube-api-access-jfq2l" (OuterVolumeSpecName: "kube-api-access-jfq2l") pod "8bf17bfd-927c-4ece-9e32-47cb9e2486db" (UID: "8bf17bfd-927c-4ece-9e32-47cb9e2486db"). InnerVolumeSpecName "kube-api-access-jfq2l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 14:45:04 crc kubenswrapper[4732]: I0402 14:45:04.420171 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfq2l\" (UniqueName: \"kubernetes.io/projected/8bf17bfd-927c-4ece-9e32-47cb9e2486db-kube-api-access-jfq2l\") on node \"crc\" DevicePath \"\""
Apr 02 14:45:04 crc kubenswrapper[4732]: I0402 14:45:04.420380 4732 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8bf17bfd-927c-4ece-9e32-47cb9e2486db-config-volume\") on node \"crc\" DevicePath \"\""
Apr 02 14:45:04 crc kubenswrapper[4732]: I0402 14:45:04.420481 4732 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8bf17bfd-927c-4ece-9e32-47cb9e2486db-secret-volume\") on node \"crc\" DevicePath \"\""
Apr 02 14:45:05 crc kubenswrapper[4732]: I0402 14:45:05.250264 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29585685-k6kxt"
Apr 02 14:45:05 crc kubenswrapper[4732]: I0402 14:45:05.326262 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29585640-n2cb8"]
Apr 02 14:45:05 crc kubenswrapper[4732]: I0402 14:45:05.334273 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29585640-n2cb8"]
Apr 02 14:45:06 crc kubenswrapper[4732]: I0402 14:45:06.694117 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d43ed34-cc78-4cfa-b5bd-b0c892cc6d09" path="/var/lib/kubelet/pods/8d43ed34-cc78-4cfa-b5bd-b0c892cc6d09/volumes"
Apr 02 14:45:06 crc kubenswrapper[4732]: I0402 14:45:06.746505 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-shnqf"]
Apr 02 14:45:06 crc kubenswrapper[4732]: E0402 14:45:06.746979 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bf17bfd-927c-4ece-9e32-47cb9e2486db" containerName="collect-profiles"
Apr 02 14:45:06 crc kubenswrapper[4732]: I0402 14:45:06.746999 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bf17bfd-927c-4ece-9e32-47cb9e2486db" containerName="collect-profiles"
Apr 02 14:45:06 crc kubenswrapper[4732]: I0402 14:45:06.747179 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bf17bfd-927c-4ece-9e32-47cb9e2486db" containerName="collect-profiles"
Apr 02 14:45:06 crc kubenswrapper[4732]: I0402 14:45:06.748469 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-shnqf"
Apr 02 14:45:06 crc kubenswrapper[4732]: I0402 14:45:06.762792 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-shnqf"]
Apr 02 14:45:06 crc kubenswrapper[4732]: I0402 14:45:06.856370 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grbtl\" (UniqueName: \"kubernetes.io/projected/96c289fd-5f8e-48fd-b94b-bbef6f896d4f-kube-api-access-grbtl\") pod \"redhat-marketplace-shnqf\" (UID: \"96c289fd-5f8e-48fd-b94b-bbef6f896d4f\") " pod="openshift-marketplace/redhat-marketplace-shnqf"
Apr 02 14:45:06 crc kubenswrapper[4732]: I0402 14:45:06.856708 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96c289fd-5f8e-48fd-b94b-bbef6f896d4f-utilities\") pod \"redhat-marketplace-shnqf\" (UID: \"96c289fd-5f8e-48fd-b94b-bbef6f896d4f\") " pod="openshift-marketplace/redhat-marketplace-shnqf"
Apr 02 14:45:06 crc kubenswrapper[4732]: I0402 14:45:06.856859 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96c289fd-5f8e-48fd-b94b-bbef6f896d4f-catalog-content\") pod \"redhat-marketplace-shnqf\" (UID: \"96c289fd-5f8e-48fd-b94b-bbef6f896d4f\") " pod="openshift-marketplace/redhat-marketplace-shnqf"
Apr 02 14:45:06 crc kubenswrapper[4732]: I0402 14:45:06.957946 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96c289fd-5f8e-48fd-b94b-bbef6f896d4f-utilities\") pod \"redhat-marketplace-shnqf\" (UID: \"96c289fd-5f8e-48fd-b94b-bbef6f896d4f\") " pod="openshift-marketplace/redhat-marketplace-shnqf"
Apr 02 14:45:06 crc kubenswrapper[4732]: I0402 14:45:06.958051 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96c289fd-5f8e-48fd-b94b-bbef6f896d4f-catalog-content\") pod \"redhat-marketplace-shnqf\" (UID: \"96c289fd-5f8e-48fd-b94b-bbef6f896d4f\") " pod="openshift-marketplace/redhat-marketplace-shnqf"
Apr 02 14:45:06 crc kubenswrapper[4732]: I0402 14:45:06.958166 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grbtl\" (UniqueName: \"kubernetes.io/projected/96c289fd-5f8e-48fd-b94b-bbef6f896d4f-kube-api-access-grbtl\") pod \"redhat-marketplace-shnqf\" (UID: \"96c289fd-5f8e-48fd-b94b-bbef6f896d4f\") " pod="openshift-marketplace/redhat-marketplace-shnqf"
Apr 02 14:45:06 crc kubenswrapper[4732]: I0402 14:45:06.958602 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96c289fd-5f8e-48fd-b94b-bbef6f896d4f-utilities\") pod \"redhat-marketplace-shnqf\" (UID: \"96c289fd-5f8e-48fd-b94b-bbef6f896d4f\") " pod="openshift-marketplace/redhat-marketplace-shnqf"
Apr 02 14:45:06 crc kubenswrapper[4732]: I0402 14:45:06.958863 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96c289fd-5f8e-48fd-b94b-bbef6f896d4f-catalog-content\") pod \"redhat-marketplace-shnqf\" (UID: \"96c289fd-5f8e-48fd-b94b-bbef6f896d4f\") " pod="openshift-marketplace/redhat-marketplace-shnqf"
Apr 02 14:45:07 crc kubenswrapper[4732]: I0402 14:45:07.000717 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grbtl\" (UniqueName: \"kubernetes.io/projected/96c289fd-5f8e-48fd-b94b-bbef6f896d4f-kube-api-access-grbtl\") pod \"redhat-marketplace-shnqf\" (UID: \"96c289fd-5f8e-48fd-b94b-bbef6f896d4f\") " pod="openshift-marketplace/redhat-marketplace-shnqf"
Apr 02 14:45:07 crc kubenswrapper[4732]: I0402 14:45:07.070333 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-shnqf"
Apr 02 14:45:07 crc kubenswrapper[4732]: I0402 14:45:07.563492 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-shnqf"]
Apr 02 14:45:08 crc kubenswrapper[4732]: I0402 14:45:08.275825 4732 generic.go:334] "Generic (PLEG): container finished" podID="96c289fd-5f8e-48fd-b94b-bbef6f896d4f" containerID="800c4426a847af9dd1fac622baef99a9bd71ddaa6870af0ca73d28e38fc66d4b" exitCode=0
Apr 02 14:45:08 crc kubenswrapper[4732]: I0402 14:45:08.275927 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-shnqf" event={"ID":"96c289fd-5f8e-48fd-b94b-bbef6f896d4f","Type":"ContainerDied","Data":"800c4426a847af9dd1fac622baef99a9bd71ddaa6870af0ca73d28e38fc66d4b"}
Apr 02 14:45:08 crc kubenswrapper[4732]: I0402 14:45:08.276061 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-shnqf" event={"ID":"96c289fd-5f8e-48fd-b94b-bbef6f896d4f","Type":"ContainerStarted","Data":"6d58130d5b0374e91e2055976b2149755275a8bfddf352b1a1ef30b47f23df6f"}
Apr 02 14:45:09 crc kubenswrapper[4732]: I0402 14:45:09.286308 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-shnqf" event={"ID":"96c289fd-5f8e-48fd-b94b-bbef6f896d4f","Type":"ContainerStarted","Data":"4b7a90bc4e585221ae52dc89941a24be5edb21dbca866725f3d025894e38c4a9"}
Apr 02 14:45:10 crc kubenswrapper[4732]: I0402 14:45:10.298550 4732 generic.go:334] "Generic (PLEG): container finished" podID="96c289fd-5f8e-48fd-b94b-bbef6f896d4f" containerID="4b7a90bc4e585221ae52dc89941a24be5edb21dbca866725f3d025894e38c4a9" exitCode=0
Apr 02 14:45:10 crc kubenswrapper[4732]: I0402 14:45:10.298587 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-shnqf" event={"ID":"96c289fd-5f8e-48fd-b94b-bbef6f896d4f","Type":"ContainerDied","Data":"4b7a90bc4e585221ae52dc89941a24be5edb21dbca866725f3d025894e38c4a9"}
Apr 02 14:45:11 crc kubenswrapper[4732]: I0402 14:45:11.310114 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-shnqf" event={"ID":"96c289fd-5f8e-48fd-b94b-bbef6f896d4f","Type":"ContainerStarted","Data":"f8347ce3dc419a3521e60a5afb7edc70d81ce44d7d9c62b8a5625a60b90cdc09"}
Apr 02 14:45:11 crc kubenswrapper[4732]: I0402 14:45:11.334649 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-shnqf" podStartSLOduration=2.865336353 podStartE2EDuration="5.334631101s" podCreationTimestamp="2026-04-02 14:45:06 +0000 UTC" firstStartedPulling="2026-04-02 14:45:08.277370814 +0000 UTC m=+4065.181778367" lastFinishedPulling="2026-04-02 14:45:10.746665542 +0000 UTC m=+4067.651073115" observedRunningTime="2026-04-02 14:45:11.327881966 +0000 UTC m=+4068.232289539" watchObservedRunningTime="2026-04-02 14:45:11.334631101 +0000 UTC m=+4068.239038654"
Apr 02 14:45:15 crc kubenswrapper[4732]: I0402 14:45:15.680505 4732 scope.go:117] "RemoveContainer" containerID="4cc9425b6dde0d264f8b72482c5821ad9fabf57354100ab51583dde35da85ea0"
Apr 02 14:45:15 crc kubenswrapper[4732]: E0402 14:45:15.682644 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878"
Apr 02 14:45:17 crc kubenswrapper[4732]: I0402 14:45:17.070840 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-shnqf"
Apr 02 14:45:17 crc kubenswrapper[4732]: I0402 14:45:17.071243 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-shnqf"
Apr 02 14:45:17 crc kubenswrapper[4732]: I0402 14:45:17.127161 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-shnqf"
Apr 02 14:45:17 crc kubenswrapper[4732]: I0402 14:45:17.435275 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-shnqf"
Apr 02 14:45:17 crc kubenswrapper[4732]: I0402 14:45:17.480970 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-shnqf"]
Apr 02 14:45:19 crc kubenswrapper[4732]: I0402 14:45:19.402108 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-shnqf" podUID="96c289fd-5f8e-48fd-b94b-bbef6f896d4f" containerName="registry-server" containerID="cri-o://f8347ce3dc419a3521e60a5afb7edc70d81ce44d7d9c62b8a5625a60b90cdc09" gracePeriod=2
Apr 02 14:45:19 crc kubenswrapper[4732]: I0402 14:45:19.869722 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-shnqf"
Apr 02 14:45:20 crc kubenswrapper[4732]: I0402 14:45:20.020403 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grbtl\" (UniqueName: \"kubernetes.io/projected/96c289fd-5f8e-48fd-b94b-bbef6f896d4f-kube-api-access-grbtl\") pod \"96c289fd-5f8e-48fd-b94b-bbef6f896d4f\" (UID: \"96c289fd-5f8e-48fd-b94b-bbef6f896d4f\") "
Apr 02 14:45:20 crc kubenswrapper[4732]: I0402 14:45:20.020553 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96c289fd-5f8e-48fd-b94b-bbef6f896d4f-utilities\") pod \"96c289fd-5f8e-48fd-b94b-bbef6f896d4f\" (UID: \"96c289fd-5f8e-48fd-b94b-bbef6f896d4f\") "
Apr 02 14:45:20 crc kubenswrapper[4732]: I0402 14:45:20.020605 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96c289fd-5f8e-48fd-b94b-bbef6f896d4f-catalog-content\") pod \"96c289fd-5f8e-48fd-b94b-bbef6f896d4f\" (UID: \"96c289fd-5f8e-48fd-b94b-bbef6f896d4f\") "
Apr 02 14:45:20 crc kubenswrapper[4732]: I0402 14:45:20.023490 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96c289fd-5f8e-48fd-b94b-bbef6f896d4f-utilities" (OuterVolumeSpecName: "utilities") pod "96c289fd-5f8e-48fd-b94b-bbef6f896d4f" (UID: "96c289fd-5f8e-48fd-b94b-bbef6f896d4f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 02 14:45:20 crc kubenswrapper[4732]: I0402 14:45:20.044388 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96c289fd-5f8e-48fd-b94b-bbef6f896d4f-kube-api-access-grbtl" (OuterVolumeSpecName: "kube-api-access-grbtl") pod "96c289fd-5f8e-48fd-b94b-bbef6f896d4f" (UID: "96c289fd-5f8e-48fd-b94b-bbef6f896d4f"). InnerVolumeSpecName "kube-api-access-grbtl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 14:45:20 crc kubenswrapper[4732]: I0402 14:45:20.066026 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96c289fd-5f8e-48fd-b94b-bbef6f896d4f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "96c289fd-5f8e-48fd-b94b-bbef6f896d4f" (UID: "96c289fd-5f8e-48fd-b94b-bbef6f896d4f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 02 14:45:20 crc kubenswrapper[4732]: I0402 14:45:20.122445 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96c289fd-5f8e-48fd-b94b-bbef6f896d4f-utilities\") on node \"crc\" DevicePath \"\""
Apr 02 14:45:20 crc kubenswrapper[4732]: I0402 14:45:20.122482 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96c289fd-5f8e-48fd-b94b-bbef6f896d4f-catalog-content\") on node \"crc\" DevicePath \"\""
Apr 02 14:45:20 crc kubenswrapper[4732]: I0402 14:45:20.122492 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grbtl\" (UniqueName: \"kubernetes.io/projected/96c289fd-5f8e-48fd-b94b-bbef6f896d4f-kube-api-access-grbtl\") on node \"crc\" DevicePath \"\""
Apr 02 14:45:20 crc kubenswrapper[4732]: I0402 14:45:20.413960 4732 generic.go:334] "Generic (PLEG): container finished" podID="96c289fd-5f8e-48fd-b94b-bbef6f896d4f" containerID="f8347ce3dc419a3521e60a5afb7edc70d81ce44d7d9c62b8a5625a60b90cdc09" exitCode=0
Apr 02 14:45:20 crc kubenswrapper[4732]: I0402 14:45:20.414009 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-shnqf" event={"ID":"96c289fd-5f8e-48fd-b94b-bbef6f896d4f","Type":"ContainerDied","Data":"f8347ce3dc419a3521e60a5afb7edc70d81ce44d7d9c62b8a5625a60b90cdc09"}
Apr 02 14:45:20 crc kubenswrapper[4732]: I0402 14:45:20.414042 4732 kubelet.go:2453] "SyncLoop (PLEG): event for
pod" pod="openshift-marketplace/redhat-marketplace-shnqf" event={"ID":"96c289fd-5f8e-48fd-b94b-bbef6f896d4f","Type":"ContainerDied","Data":"6d58130d5b0374e91e2055976b2149755275a8bfddf352b1a1ef30b47f23df6f"} Apr 02 14:45:20 crc kubenswrapper[4732]: I0402 14:45:20.414053 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-shnqf" Apr 02 14:45:20 crc kubenswrapper[4732]: I0402 14:45:20.414066 4732 scope.go:117] "RemoveContainer" containerID="f8347ce3dc419a3521e60a5afb7edc70d81ce44d7d9c62b8a5625a60b90cdc09" Apr 02 14:45:20 crc kubenswrapper[4732]: I0402 14:45:20.448410 4732 scope.go:117] "RemoveContainer" containerID="4b7a90bc4e585221ae52dc89941a24be5edb21dbca866725f3d025894e38c4a9" Apr 02 14:45:20 crc kubenswrapper[4732]: I0402 14:45:20.460342 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-shnqf"] Apr 02 14:45:20 crc kubenswrapper[4732]: I0402 14:45:20.475629 4732 scope.go:117] "RemoveContainer" containerID="800c4426a847af9dd1fac622baef99a9bd71ddaa6870af0ca73d28e38fc66d4b" Apr 02 14:45:20 crc kubenswrapper[4732]: I0402 14:45:20.476751 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-shnqf"] Apr 02 14:45:20 crc kubenswrapper[4732]: I0402 14:45:20.515810 4732 scope.go:117] "RemoveContainer" containerID="f8347ce3dc419a3521e60a5afb7edc70d81ce44d7d9c62b8a5625a60b90cdc09" Apr 02 14:45:20 crc kubenswrapper[4732]: E0402 14:45:20.517184 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8347ce3dc419a3521e60a5afb7edc70d81ce44d7d9c62b8a5625a60b90cdc09\": container with ID starting with f8347ce3dc419a3521e60a5afb7edc70d81ce44d7d9c62b8a5625a60b90cdc09 not found: ID does not exist" containerID="f8347ce3dc419a3521e60a5afb7edc70d81ce44d7d9c62b8a5625a60b90cdc09" Apr 02 14:45:20 crc kubenswrapper[4732]: I0402 14:45:20.517233 4732 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8347ce3dc419a3521e60a5afb7edc70d81ce44d7d9c62b8a5625a60b90cdc09"} err="failed to get container status \"f8347ce3dc419a3521e60a5afb7edc70d81ce44d7d9c62b8a5625a60b90cdc09\": rpc error: code = NotFound desc = could not find container \"f8347ce3dc419a3521e60a5afb7edc70d81ce44d7d9c62b8a5625a60b90cdc09\": container with ID starting with f8347ce3dc419a3521e60a5afb7edc70d81ce44d7d9c62b8a5625a60b90cdc09 not found: ID does not exist" Apr 02 14:45:20 crc kubenswrapper[4732]: I0402 14:45:20.517262 4732 scope.go:117] "RemoveContainer" containerID="4b7a90bc4e585221ae52dc89941a24be5edb21dbca866725f3d025894e38c4a9" Apr 02 14:45:20 crc kubenswrapper[4732]: E0402 14:45:20.517768 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b7a90bc4e585221ae52dc89941a24be5edb21dbca866725f3d025894e38c4a9\": container with ID starting with 4b7a90bc4e585221ae52dc89941a24be5edb21dbca866725f3d025894e38c4a9 not found: ID does not exist" containerID="4b7a90bc4e585221ae52dc89941a24be5edb21dbca866725f3d025894e38c4a9" Apr 02 14:45:20 crc kubenswrapper[4732]: I0402 14:45:20.517791 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b7a90bc4e585221ae52dc89941a24be5edb21dbca866725f3d025894e38c4a9"} err="failed to get container status \"4b7a90bc4e585221ae52dc89941a24be5edb21dbca866725f3d025894e38c4a9\": rpc error: code = NotFound desc = could not find container \"4b7a90bc4e585221ae52dc89941a24be5edb21dbca866725f3d025894e38c4a9\": container with ID starting with 4b7a90bc4e585221ae52dc89941a24be5edb21dbca866725f3d025894e38c4a9 not found: ID does not exist" Apr 02 14:45:20 crc kubenswrapper[4732]: I0402 14:45:20.517811 4732 scope.go:117] "RemoveContainer" containerID="800c4426a847af9dd1fac622baef99a9bd71ddaa6870af0ca73d28e38fc66d4b" Apr 02 14:45:20 crc kubenswrapper[4732]: E0402 
14:45:20.518342 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"800c4426a847af9dd1fac622baef99a9bd71ddaa6870af0ca73d28e38fc66d4b\": container with ID starting with 800c4426a847af9dd1fac622baef99a9bd71ddaa6870af0ca73d28e38fc66d4b not found: ID does not exist" containerID="800c4426a847af9dd1fac622baef99a9bd71ddaa6870af0ca73d28e38fc66d4b" Apr 02 14:45:20 crc kubenswrapper[4732]: I0402 14:45:20.518370 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"800c4426a847af9dd1fac622baef99a9bd71ddaa6870af0ca73d28e38fc66d4b"} err="failed to get container status \"800c4426a847af9dd1fac622baef99a9bd71ddaa6870af0ca73d28e38fc66d4b\": rpc error: code = NotFound desc = could not find container \"800c4426a847af9dd1fac622baef99a9bd71ddaa6870af0ca73d28e38fc66d4b\": container with ID starting with 800c4426a847af9dd1fac622baef99a9bd71ddaa6870af0ca73d28e38fc66d4b not found: ID does not exist" Apr 02 14:45:20 crc kubenswrapper[4732]: I0402 14:45:20.691368 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96c289fd-5f8e-48fd-b94b-bbef6f896d4f" path="/var/lib/kubelet/pods/96c289fd-5f8e-48fd-b94b-bbef6f896d4f/volumes" Apr 02 14:45:24 crc kubenswrapper[4732]: I0402 14:45:24.459903 4732 generic.go:334] "Generic (PLEG): container finished" podID="ec84acf5-e0df-4f95-a1b5-082a2a6c69ae" containerID="1364bdd53c1e58011b0eb407960d1ef555f311e2787030d306fb507fcf225f62" exitCode=0 Apr 02 14:45:24 crc kubenswrapper[4732]: I0402 14:45:24.459952 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6jxwm/crc-debug-mxx62" event={"ID":"ec84acf5-e0df-4f95-a1b5-082a2a6c69ae","Type":"ContainerDied","Data":"1364bdd53c1e58011b0eb407960d1ef555f311e2787030d306fb507fcf225f62"} Apr 02 14:45:25 crc kubenswrapper[4732]: I0402 14:45:25.775343 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6jxwm/crc-debug-mxx62" Apr 02 14:45:25 crc kubenswrapper[4732]: I0402 14:45:25.808758 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-6jxwm/crc-debug-mxx62"] Apr 02 14:45:25 crc kubenswrapper[4732]: I0402 14:45:25.831600 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-6jxwm/crc-debug-mxx62"] Apr 02 14:45:25 crc kubenswrapper[4732]: I0402 14:45:25.927257 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tggv5\" (UniqueName: \"kubernetes.io/projected/ec84acf5-e0df-4f95-a1b5-082a2a6c69ae-kube-api-access-tggv5\") pod \"ec84acf5-e0df-4f95-a1b5-082a2a6c69ae\" (UID: \"ec84acf5-e0df-4f95-a1b5-082a2a6c69ae\") " Apr 02 14:45:25 crc kubenswrapper[4732]: I0402 14:45:25.927477 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ec84acf5-e0df-4f95-a1b5-082a2a6c69ae-host\") pod \"ec84acf5-e0df-4f95-a1b5-082a2a6c69ae\" (UID: \"ec84acf5-e0df-4f95-a1b5-082a2a6c69ae\") " Apr 02 14:45:25 crc kubenswrapper[4732]: I0402 14:45:25.927683 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ec84acf5-e0df-4f95-a1b5-082a2a6c69ae-host" (OuterVolumeSpecName: "host") pod "ec84acf5-e0df-4f95-a1b5-082a2a6c69ae" (UID: "ec84acf5-e0df-4f95-a1b5-082a2a6c69ae"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 14:45:25 crc kubenswrapper[4732]: I0402 14:45:25.928324 4732 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ec84acf5-e0df-4f95-a1b5-082a2a6c69ae-host\") on node \"crc\" DevicePath \"\"" Apr 02 14:45:25 crc kubenswrapper[4732]: I0402 14:45:25.984529 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec84acf5-e0df-4f95-a1b5-082a2a6c69ae-kube-api-access-tggv5" (OuterVolumeSpecName: "kube-api-access-tggv5") pod "ec84acf5-e0df-4f95-a1b5-082a2a6c69ae" (UID: "ec84acf5-e0df-4f95-a1b5-082a2a6c69ae"). InnerVolumeSpecName "kube-api-access-tggv5". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:45:26 crc kubenswrapper[4732]: I0402 14:45:26.030396 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tggv5\" (UniqueName: \"kubernetes.io/projected/ec84acf5-e0df-4f95-a1b5-082a2a6c69ae-kube-api-access-tggv5\") on node \"crc\" DevicePath \"\"" Apr 02 14:45:26 crc kubenswrapper[4732]: I0402 14:45:26.478818 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17700ff731b0ac57a71f1d0733a4ef30ba704a8155f0936b0350fd68a0a12ca3" Apr 02 14:45:26 crc kubenswrapper[4732]: I0402 14:45:26.478889 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6jxwm/crc-debug-mxx62" Apr 02 14:45:26 crc kubenswrapper[4732]: I0402 14:45:26.701717 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec84acf5-e0df-4f95-a1b5-082a2a6c69ae" path="/var/lib/kubelet/pods/ec84acf5-e0df-4f95-a1b5-082a2a6c69ae/volumes" Apr 02 14:45:27 crc kubenswrapper[4732]: I0402 14:45:27.051873 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6jxwm/crc-debug-g7df5"] Apr 02 14:45:27 crc kubenswrapper[4732]: E0402 14:45:27.052501 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96c289fd-5f8e-48fd-b94b-bbef6f896d4f" containerName="extract-content" Apr 02 14:45:27 crc kubenswrapper[4732]: I0402 14:45:27.052513 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="96c289fd-5f8e-48fd-b94b-bbef6f896d4f" containerName="extract-content" Apr 02 14:45:27 crc kubenswrapper[4732]: E0402 14:45:27.052525 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96c289fd-5f8e-48fd-b94b-bbef6f896d4f" containerName="extract-utilities" Apr 02 14:45:27 crc kubenswrapper[4732]: I0402 14:45:27.052531 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="96c289fd-5f8e-48fd-b94b-bbef6f896d4f" containerName="extract-utilities" Apr 02 14:45:27 crc kubenswrapper[4732]: E0402 14:45:27.052551 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec84acf5-e0df-4f95-a1b5-082a2a6c69ae" containerName="container-00" Apr 02 14:45:27 crc kubenswrapper[4732]: I0402 14:45:27.052557 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec84acf5-e0df-4f95-a1b5-082a2a6c69ae" containerName="container-00" Apr 02 14:45:27 crc kubenswrapper[4732]: E0402 14:45:27.052586 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96c289fd-5f8e-48fd-b94b-bbef6f896d4f" containerName="registry-server" Apr 02 14:45:27 crc kubenswrapper[4732]: I0402 14:45:27.052592 4732 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="96c289fd-5f8e-48fd-b94b-bbef6f896d4f" containerName="registry-server" Apr 02 14:45:27 crc kubenswrapper[4732]: I0402 14:45:27.052785 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec84acf5-e0df-4f95-a1b5-082a2a6c69ae" containerName="container-00" Apr 02 14:45:27 crc kubenswrapper[4732]: I0402 14:45:27.052806 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="96c289fd-5f8e-48fd-b94b-bbef6f896d4f" containerName="registry-server" Apr 02 14:45:27 crc kubenswrapper[4732]: I0402 14:45:27.053355 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6jxwm/crc-debug-g7df5" Apr 02 14:45:27 crc kubenswrapper[4732]: I0402 14:45:27.150259 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6a008576-ba6b-48bf-857b-d76cdc35c07c-host\") pod \"crc-debug-g7df5\" (UID: \"6a008576-ba6b-48bf-857b-d76cdc35c07c\") " pod="openshift-must-gather-6jxwm/crc-debug-g7df5" Apr 02 14:45:27 crc kubenswrapper[4732]: I0402 14:45:27.150423 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7wvl\" (UniqueName: \"kubernetes.io/projected/6a008576-ba6b-48bf-857b-d76cdc35c07c-kube-api-access-b7wvl\") pod \"crc-debug-g7df5\" (UID: \"6a008576-ba6b-48bf-857b-d76cdc35c07c\") " pod="openshift-must-gather-6jxwm/crc-debug-g7df5" Apr 02 14:45:27 crc kubenswrapper[4732]: I0402 14:45:27.251908 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7wvl\" (UniqueName: \"kubernetes.io/projected/6a008576-ba6b-48bf-857b-d76cdc35c07c-kube-api-access-b7wvl\") pod \"crc-debug-g7df5\" (UID: \"6a008576-ba6b-48bf-857b-d76cdc35c07c\") " pod="openshift-must-gather-6jxwm/crc-debug-g7df5" Apr 02 14:45:27 crc kubenswrapper[4732]: I0402 14:45:27.252101 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"host\" (UniqueName: \"kubernetes.io/host-path/6a008576-ba6b-48bf-857b-d76cdc35c07c-host\") pod \"crc-debug-g7df5\" (UID: \"6a008576-ba6b-48bf-857b-d76cdc35c07c\") " pod="openshift-must-gather-6jxwm/crc-debug-g7df5" Apr 02 14:45:27 crc kubenswrapper[4732]: I0402 14:45:27.252256 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6a008576-ba6b-48bf-857b-d76cdc35c07c-host\") pod \"crc-debug-g7df5\" (UID: \"6a008576-ba6b-48bf-857b-d76cdc35c07c\") " pod="openshift-must-gather-6jxwm/crc-debug-g7df5" Apr 02 14:45:27 crc kubenswrapper[4732]: I0402 14:45:27.386297 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7wvl\" (UniqueName: \"kubernetes.io/projected/6a008576-ba6b-48bf-857b-d76cdc35c07c-kube-api-access-b7wvl\") pod \"crc-debug-g7df5\" (UID: \"6a008576-ba6b-48bf-857b-d76cdc35c07c\") " pod="openshift-must-gather-6jxwm/crc-debug-g7df5" Apr 02 14:45:27 crc kubenswrapper[4732]: I0402 14:45:27.669174 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6jxwm/crc-debug-g7df5" Apr 02 14:45:28 crc kubenswrapper[4732]: I0402 14:45:28.499072 4732 generic.go:334] "Generic (PLEG): container finished" podID="6a008576-ba6b-48bf-857b-d76cdc35c07c" containerID="f41c5a3ef3d9fb359dd941a2a370b29a00694a7b11300f9f2cdc6823fee02107" exitCode=0 Apr 02 14:45:28 crc kubenswrapper[4732]: I0402 14:45:28.499177 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6jxwm/crc-debug-g7df5" event={"ID":"6a008576-ba6b-48bf-857b-d76cdc35c07c","Type":"ContainerDied","Data":"f41c5a3ef3d9fb359dd941a2a370b29a00694a7b11300f9f2cdc6823fee02107"} Apr 02 14:45:28 crc kubenswrapper[4732]: I0402 14:45:28.499450 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6jxwm/crc-debug-g7df5" event={"ID":"6a008576-ba6b-48bf-857b-d76cdc35c07c","Type":"ContainerStarted","Data":"130d9a751d0930156ecb701bfa549015f6a5887910feb7efaaa51f567c06f63b"} Apr 02 14:45:28 crc kubenswrapper[4732]: I0402 14:45:28.944326 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-6jxwm/crc-debug-g7df5"] Apr 02 14:45:28 crc kubenswrapper[4732]: I0402 14:45:28.956056 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-6jxwm/crc-debug-g7df5"] Apr 02 14:45:29 crc kubenswrapper[4732]: I0402 14:45:29.618175 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6jxwm/crc-debug-g7df5" Apr 02 14:45:29 crc kubenswrapper[4732]: I0402 14:45:29.681031 4732 scope.go:117] "RemoveContainer" containerID="4cc9425b6dde0d264f8b72482c5821ad9fabf57354100ab51583dde35da85ea0" Apr 02 14:45:29 crc kubenswrapper[4732]: E0402 14:45:29.681226 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" Apr 02 14:45:29 crc kubenswrapper[4732]: I0402 14:45:29.805722 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6a008576-ba6b-48bf-857b-d76cdc35c07c-host\") pod \"6a008576-ba6b-48bf-857b-d76cdc35c07c\" (UID: \"6a008576-ba6b-48bf-857b-d76cdc35c07c\") " Apr 02 14:45:29 crc kubenswrapper[4732]: I0402 14:45:29.805840 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6a008576-ba6b-48bf-857b-d76cdc35c07c-host" (OuterVolumeSpecName: "host") pod "6a008576-ba6b-48bf-857b-d76cdc35c07c" (UID: "6a008576-ba6b-48bf-857b-d76cdc35c07c"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 14:45:29 crc kubenswrapper[4732]: I0402 14:45:29.806275 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7wvl\" (UniqueName: \"kubernetes.io/projected/6a008576-ba6b-48bf-857b-d76cdc35c07c-kube-api-access-b7wvl\") pod \"6a008576-ba6b-48bf-857b-d76cdc35c07c\" (UID: \"6a008576-ba6b-48bf-857b-d76cdc35c07c\") " Apr 02 14:45:29 crc kubenswrapper[4732]: I0402 14:45:29.806911 4732 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6a008576-ba6b-48bf-857b-d76cdc35c07c-host\") on node \"crc\" DevicePath \"\"" Apr 02 14:45:29 crc kubenswrapper[4732]: I0402 14:45:29.812345 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a008576-ba6b-48bf-857b-d76cdc35c07c-kube-api-access-b7wvl" (OuterVolumeSpecName: "kube-api-access-b7wvl") pod "6a008576-ba6b-48bf-857b-d76cdc35c07c" (UID: "6a008576-ba6b-48bf-857b-d76cdc35c07c"). InnerVolumeSpecName "kube-api-access-b7wvl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:45:29 crc kubenswrapper[4732]: I0402 14:45:29.908451 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7wvl\" (UniqueName: \"kubernetes.io/projected/6a008576-ba6b-48bf-857b-d76cdc35c07c-kube-api-access-b7wvl\") on node \"crc\" DevicePath \"\"" Apr 02 14:45:30 crc kubenswrapper[4732]: I0402 14:45:30.191014 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6jxwm/crc-debug-b9sqn"] Apr 02 14:45:30 crc kubenswrapper[4732]: E0402 14:45:30.191457 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a008576-ba6b-48bf-857b-d76cdc35c07c" containerName="container-00" Apr 02 14:45:30 crc kubenswrapper[4732]: I0402 14:45:30.191476 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a008576-ba6b-48bf-857b-d76cdc35c07c" containerName="container-00" Apr 02 14:45:30 crc kubenswrapper[4732]: I0402 14:45:30.191690 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a008576-ba6b-48bf-857b-d76cdc35c07c" containerName="container-00" Apr 02 14:45:30 crc kubenswrapper[4732]: I0402 14:45:30.192298 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6jxwm/crc-debug-b9sqn" Apr 02 14:45:30 crc kubenswrapper[4732]: I0402 14:45:30.316042 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc6rh\" (UniqueName: \"kubernetes.io/projected/b7fb6002-79d8-4dc1-8ca5-078791862b04-kube-api-access-xc6rh\") pod \"crc-debug-b9sqn\" (UID: \"b7fb6002-79d8-4dc1-8ca5-078791862b04\") " pod="openshift-must-gather-6jxwm/crc-debug-b9sqn" Apr 02 14:45:30 crc kubenswrapper[4732]: I0402 14:45:30.316477 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b7fb6002-79d8-4dc1-8ca5-078791862b04-host\") pod \"crc-debug-b9sqn\" (UID: \"b7fb6002-79d8-4dc1-8ca5-078791862b04\") " pod="openshift-must-gather-6jxwm/crc-debug-b9sqn" Apr 02 14:45:30 crc kubenswrapper[4732]: I0402 14:45:30.418045 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b7fb6002-79d8-4dc1-8ca5-078791862b04-host\") pod \"crc-debug-b9sqn\" (UID: \"b7fb6002-79d8-4dc1-8ca5-078791862b04\") " pod="openshift-must-gather-6jxwm/crc-debug-b9sqn" Apr 02 14:45:30 crc kubenswrapper[4732]: I0402 14:45:30.418183 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b7fb6002-79d8-4dc1-8ca5-078791862b04-host\") pod \"crc-debug-b9sqn\" (UID: \"b7fb6002-79d8-4dc1-8ca5-078791862b04\") " pod="openshift-must-gather-6jxwm/crc-debug-b9sqn" Apr 02 14:45:30 crc kubenswrapper[4732]: I0402 14:45:30.418732 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xc6rh\" (UniqueName: \"kubernetes.io/projected/b7fb6002-79d8-4dc1-8ca5-078791862b04-kube-api-access-xc6rh\") pod \"crc-debug-b9sqn\" (UID: \"b7fb6002-79d8-4dc1-8ca5-078791862b04\") " pod="openshift-must-gather-6jxwm/crc-debug-b9sqn" Apr 02 14:45:30 crc 
kubenswrapper[4732]: I0402 14:45:30.448455 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc6rh\" (UniqueName: \"kubernetes.io/projected/b7fb6002-79d8-4dc1-8ca5-078791862b04-kube-api-access-xc6rh\") pod \"crc-debug-b9sqn\" (UID: \"b7fb6002-79d8-4dc1-8ca5-078791862b04\") " pod="openshift-must-gather-6jxwm/crc-debug-b9sqn" Apr 02 14:45:30 crc kubenswrapper[4732]: I0402 14:45:30.510385 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6jxwm/crc-debug-b9sqn" Apr 02 14:45:30 crc kubenswrapper[4732]: I0402 14:45:30.519027 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="130d9a751d0930156ecb701bfa549015f6a5887910feb7efaaa51f567c06f63b" Apr 02 14:45:30 crc kubenswrapper[4732]: I0402 14:45:30.519090 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6jxwm/crc-debug-g7df5" Apr 02 14:45:30 crc kubenswrapper[4732]: I0402 14:45:30.735312 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a008576-ba6b-48bf-857b-d76cdc35c07c" path="/var/lib/kubelet/pods/6a008576-ba6b-48bf-857b-d76cdc35c07c/volumes" Apr 02 14:45:31 crc kubenswrapper[4732]: I0402 14:45:31.529489 4732 generic.go:334] "Generic (PLEG): container finished" podID="b7fb6002-79d8-4dc1-8ca5-078791862b04" containerID="cce79c739620774d4fa2d34bd65d96f08e93effbed8d3f605b23554020f97694" exitCode=0 Apr 02 14:45:31 crc kubenswrapper[4732]: I0402 14:45:31.529569 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6jxwm/crc-debug-b9sqn" event={"ID":"b7fb6002-79d8-4dc1-8ca5-078791862b04","Type":"ContainerDied","Data":"cce79c739620774d4fa2d34bd65d96f08e93effbed8d3f605b23554020f97694"} Apr 02 14:45:31 crc kubenswrapper[4732]: I0402 14:45:31.529871 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6jxwm/crc-debug-b9sqn" 
event={"ID":"b7fb6002-79d8-4dc1-8ca5-078791862b04","Type":"ContainerStarted","Data":"5cb918684b3c79725ab88f43f54a466bfdd7ffc472004eac2f25ff83a467faf2"} Apr 02 14:45:31 crc kubenswrapper[4732]: I0402 14:45:31.567935 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-6jxwm/crc-debug-b9sqn"] Apr 02 14:45:31 crc kubenswrapper[4732]: I0402 14:45:31.577257 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-6jxwm/crc-debug-b9sqn"] Apr 02 14:45:32 crc kubenswrapper[4732]: I0402 14:45:32.650657 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6jxwm/crc-debug-b9sqn" Apr 02 14:45:32 crc kubenswrapper[4732]: I0402 14:45:32.761095 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xc6rh\" (UniqueName: \"kubernetes.io/projected/b7fb6002-79d8-4dc1-8ca5-078791862b04-kube-api-access-xc6rh\") pod \"b7fb6002-79d8-4dc1-8ca5-078791862b04\" (UID: \"b7fb6002-79d8-4dc1-8ca5-078791862b04\") " Apr 02 14:45:32 crc kubenswrapper[4732]: I0402 14:45:32.761157 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b7fb6002-79d8-4dc1-8ca5-078791862b04-host\") pod \"b7fb6002-79d8-4dc1-8ca5-078791862b04\" (UID: \"b7fb6002-79d8-4dc1-8ca5-078791862b04\") " Apr 02 14:45:32 crc kubenswrapper[4732]: I0402 14:45:32.761390 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b7fb6002-79d8-4dc1-8ca5-078791862b04-host" (OuterVolumeSpecName: "host") pod "b7fb6002-79d8-4dc1-8ca5-078791862b04" (UID: "b7fb6002-79d8-4dc1-8ca5-078791862b04"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 02 14:45:32 crc kubenswrapper[4732]: I0402 14:45:32.762344 4732 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b7fb6002-79d8-4dc1-8ca5-078791862b04-host\") on node \"crc\" DevicePath \"\"" Apr 02 14:45:32 crc kubenswrapper[4732]: I0402 14:45:32.766396 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7fb6002-79d8-4dc1-8ca5-078791862b04-kube-api-access-xc6rh" (OuterVolumeSpecName: "kube-api-access-xc6rh") pod "b7fb6002-79d8-4dc1-8ca5-078791862b04" (UID: "b7fb6002-79d8-4dc1-8ca5-078791862b04"). InnerVolumeSpecName "kube-api-access-xc6rh". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:45:32 crc kubenswrapper[4732]: I0402 14:45:32.864745 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xc6rh\" (UniqueName: \"kubernetes.io/projected/b7fb6002-79d8-4dc1-8ca5-078791862b04-kube-api-access-xc6rh\") on node \"crc\" DevicePath \"\"" Apr 02 14:45:33 crc kubenswrapper[4732]: I0402 14:45:33.556918 4732 scope.go:117] "RemoveContainer" containerID="cce79c739620774d4fa2d34bd65d96f08e93effbed8d3f605b23554020f97694" Apr 02 14:45:33 crc kubenswrapper[4732]: I0402 14:45:33.557180 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6jxwm/crc-debug-b9sqn" Apr 02 14:45:34 crc kubenswrapper[4732]: I0402 14:45:34.690779 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7fb6002-79d8-4dc1-8ca5-078791862b04" path="/var/lib/kubelet/pods/b7fb6002-79d8-4dc1-8ca5-078791862b04/volumes" Apr 02 14:45:41 crc kubenswrapper[4732]: I0402 14:45:41.680263 4732 scope.go:117] "RemoveContainer" containerID="4cc9425b6dde0d264f8b72482c5821ad9fabf57354100ab51583dde35da85ea0" Apr 02 14:45:41 crc kubenswrapper[4732]: E0402 14:45:41.681466 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" Apr 02 14:45:53 crc kubenswrapper[4732]: I0402 14:45:53.371469 4732 scope.go:117] "RemoveContainer" containerID="4347607c418aa9f67017b66ba5ecfbe94038cd13bb2b1b6bc83411912a4e5471" Apr 02 14:45:55 crc kubenswrapper[4732]: I0402 14:45:55.680495 4732 scope.go:117] "RemoveContainer" containerID="4cc9425b6dde0d264f8b72482c5821ad9fabf57354100ab51583dde35da85ea0" Apr 02 14:45:55 crc kubenswrapper[4732]: E0402 14:45:55.681476 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" Apr 02 14:46:00 crc kubenswrapper[4732]: I0402 14:46:00.158035 4732 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29585686-fdtl8"] Apr 02 14:46:00 crc kubenswrapper[4732]: E0402 14:46:00.159040 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7fb6002-79d8-4dc1-8ca5-078791862b04" containerName="container-00" Apr 02 14:46:00 crc kubenswrapper[4732]: I0402 14:46:00.159056 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7fb6002-79d8-4dc1-8ca5-078791862b04" containerName="container-00" Apr 02 14:46:00 crc kubenswrapper[4732]: I0402 14:46:00.159467 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7fb6002-79d8-4dc1-8ca5-078791862b04" containerName="container-00" Apr 02 14:46:00 crc kubenswrapper[4732]: I0402 14:46:00.160201 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585686-fdtl8" Apr 02 14:46:00 crc kubenswrapper[4732]: I0402 14:46:00.162926 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 02 14:46:00 crc kubenswrapper[4732]: I0402 14:46:00.162997 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 02 14:46:00 crc kubenswrapper[4732]: I0402 14:46:00.163295 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-69g42" Apr 02 14:46:00 crc kubenswrapper[4732]: I0402 14:46:00.170996 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585686-fdtl8"] Apr 02 14:46:00 crc kubenswrapper[4732]: I0402 14:46:00.294399 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhvlr\" (UniqueName: \"kubernetes.io/projected/a0d13f1b-e44f-4e95-8f60-9da27a2a247a-kube-api-access-dhvlr\") pod \"auto-csr-approver-29585686-fdtl8\" (UID: \"a0d13f1b-e44f-4e95-8f60-9da27a2a247a\") " pod="openshift-infra/auto-csr-approver-29585686-fdtl8" Apr 02 14:46:00 
crc kubenswrapper[4732]: I0402 14:46:00.395724 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhvlr\" (UniqueName: \"kubernetes.io/projected/a0d13f1b-e44f-4e95-8f60-9da27a2a247a-kube-api-access-dhvlr\") pod \"auto-csr-approver-29585686-fdtl8\" (UID: \"a0d13f1b-e44f-4e95-8f60-9da27a2a247a\") " pod="openshift-infra/auto-csr-approver-29585686-fdtl8" Apr 02 14:46:00 crc kubenswrapper[4732]: I0402 14:46:00.414115 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhvlr\" (UniqueName: \"kubernetes.io/projected/a0d13f1b-e44f-4e95-8f60-9da27a2a247a-kube-api-access-dhvlr\") pod \"auto-csr-approver-29585686-fdtl8\" (UID: \"a0d13f1b-e44f-4e95-8f60-9da27a2a247a\") " pod="openshift-infra/auto-csr-approver-29585686-fdtl8" Apr 02 14:46:00 crc kubenswrapper[4732]: I0402 14:46:00.478041 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585686-fdtl8" Apr 02 14:46:01 crc kubenswrapper[4732]: I0402 14:46:01.031973 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585686-fdtl8"] Apr 02 14:46:01 crc kubenswrapper[4732]: I0402 14:46:01.048971 4732 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 02 14:46:01 crc kubenswrapper[4732]: I0402 14:46:01.819337 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585686-fdtl8" event={"ID":"a0d13f1b-e44f-4e95-8f60-9da27a2a247a","Type":"ContainerStarted","Data":"38992442a5943b40dad350794e0214c9d47a625f137a6caf19b6befdd00912fe"} Apr 02 14:46:02 crc kubenswrapper[4732]: I0402 14:46:02.145066 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6c879c6666-5kls7_362f9e50-6f86-41ff-ae02-e0b8565fa55f/barbican-api/0.log" Apr 02 14:46:02 crc kubenswrapper[4732]: I0402 14:46:02.322200 4732 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-api-6c879c6666-5kls7_362f9e50-6f86-41ff-ae02-e0b8565fa55f/barbican-api-log/0.log" Apr 02 14:46:02 crc kubenswrapper[4732]: I0402 14:46:02.346186 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-f897cfb64-ql8wz_5e017590-845a-4f52-a6ae-258890dd6388/barbican-keystone-listener/0.log" Apr 02 14:46:02 crc kubenswrapper[4732]: I0402 14:46:02.379650 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-f897cfb64-ql8wz_5e017590-845a-4f52-a6ae-258890dd6388/barbican-keystone-listener-log/0.log" Apr 02 14:46:02 crc kubenswrapper[4732]: I0402 14:46:02.572146 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-98d7bf879-xkszz_8eed39a7-f437-403d-acab-246fa6d25c4b/barbican-worker/0.log" Apr 02 14:46:02 crc kubenswrapper[4732]: I0402 14:46:02.587015 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-98d7bf879-xkszz_8eed39a7-f437-403d-acab-246fa6d25c4b/barbican-worker-log/0.log" Apr 02 14:46:02 crc kubenswrapper[4732]: I0402 14:46:02.829437 4732 generic.go:334] "Generic (PLEG): container finished" podID="a0d13f1b-e44f-4e95-8f60-9da27a2a247a" containerID="503d568eb55b92d13a248a42a5ed973002e5488ea17bb8b75bf7db78eb07e54a" exitCode=0 Apr 02 14:46:02 crc kubenswrapper[4732]: I0402 14:46:02.829491 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585686-fdtl8" event={"ID":"a0d13f1b-e44f-4e95-8f60-9da27a2a247a","Type":"ContainerDied","Data":"503d568eb55b92d13a248a42a5ed973002e5488ea17bb8b75bf7db78eb07e54a"} Apr 02 14:46:02 crc kubenswrapper[4732]: I0402 14:46:02.874394 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-8g4nr_a67d60f0-3912-4fc4-96b7-f96831ff23d3/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Apr 02 14:46:04 crc kubenswrapper[4732]: I0402 
14:46:04.024059 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ad6cbd5d-4434-4885-bf56-8ee47171b897/ceilometer-central-agent/0.log" Apr 02 14:46:04 crc kubenswrapper[4732]: I0402 14:46:04.142713 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ad6cbd5d-4434-4885-bf56-8ee47171b897/ceilometer-notification-agent/0.log" Apr 02 14:46:04 crc kubenswrapper[4732]: I0402 14:46:04.376440 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ad6cbd5d-4434-4885-bf56-8ee47171b897/sg-core/0.log" Apr 02 14:46:04 crc kubenswrapper[4732]: I0402 14:46:04.388231 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ad6cbd5d-4434-4885-bf56-8ee47171b897/proxy-httpd/0.log" Apr 02 14:46:04 crc kubenswrapper[4732]: I0402 14:46:04.509104 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_9fc3b5a4-f5bf-44ea-aa53-d93a32900271/cinder-api/0.log" Apr 02 14:46:04 crc kubenswrapper[4732]: I0402 14:46:04.546703 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_9fc3b5a4-f5bf-44ea-aa53-d93a32900271/cinder-api-log/0.log" Apr 02 14:46:04 crc kubenswrapper[4732]: I0402 14:46:04.563756 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29585686-fdtl8" Apr 02 14:46:04 crc kubenswrapper[4732]: I0402 14:46:04.727264 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_d7613bfc-a605-4485-b771-242a65e30df8/cinder-scheduler/0.log" Apr 02 14:46:04 crc kubenswrapper[4732]: I0402 14:46:04.732663 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhvlr\" (UniqueName: \"kubernetes.io/projected/a0d13f1b-e44f-4e95-8f60-9da27a2a247a-kube-api-access-dhvlr\") pod \"a0d13f1b-e44f-4e95-8f60-9da27a2a247a\" (UID: \"a0d13f1b-e44f-4e95-8f60-9da27a2a247a\") " Apr 02 14:46:04 crc kubenswrapper[4732]: I0402 14:46:04.754329 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0d13f1b-e44f-4e95-8f60-9da27a2a247a-kube-api-access-dhvlr" (OuterVolumeSpecName: "kube-api-access-dhvlr") pod "a0d13f1b-e44f-4e95-8f60-9da27a2a247a" (UID: "a0d13f1b-e44f-4e95-8f60-9da27a2a247a"). InnerVolumeSpecName "kube-api-access-dhvlr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:46:04 crc kubenswrapper[4732]: I0402 14:46:04.834805 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhvlr\" (UniqueName: \"kubernetes.io/projected/a0d13f1b-e44f-4e95-8f60-9da27a2a247a-kube-api-access-dhvlr\") on node \"crc\" DevicePath \"\"" Apr 02 14:46:04 crc kubenswrapper[4732]: I0402 14:46:04.835791 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_d7613bfc-a605-4485-b771-242a65e30df8/probe/0.log" Apr 02 14:46:05 crc kubenswrapper[4732]: I0402 14:46:05.011558 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-z2s8z_21ce48db-fb4b-4086-86cc-a32f30ebd002/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Apr 02 14:46:05 crc kubenswrapper[4732]: I0402 14:46:05.051331 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585686-fdtl8" event={"ID":"a0d13f1b-e44f-4e95-8f60-9da27a2a247a","Type":"ContainerDied","Data":"38992442a5943b40dad350794e0214c9d47a625f137a6caf19b6befdd00912fe"} Apr 02 14:46:05 crc kubenswrapper[4732]: I0402 14:46:05.051379 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38992442a5943b40dad350794e0214c9d47a625f137a6caf19b6befdd00912fe" Apr 02 14:46:05 crc kubenswrapper[4732]: I0402 14:46:05.051445 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29585686-fdtl8" Apr 02 14:46:05 crc kubenswrapper[4732]: I0402 14:46:05.143804 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-bx2x5_d8bb9bae-9d09-42c5-a60a-134c907db6d5/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Apr 02 14:46:05 crc kubenswrapper[4732]: I0402 14:46:05.338433 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-2wr76_b22602a0-7545-4c2d-8b16-2233288ab360/init/0.log" Apr 02 14:46:05 crc kubenswrapper[4732]: I0402 14:46:05.645870 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29585680-mnl44"] Apr 02 14:46:05 crc kubenswrapper[4732]: I0402 14:46:05.657579 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29585680-mnl44"] Apr 02 14:46:06 crc kubenswrapper[4732]: I0402 14:46:06.067654 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-2wr76_b22602a0-7545-4c2d-8b16-2233288ab360/init/0.log" Apr 02 14:46:06 crc kubenswrapper[4732]: I0402 14:46:06.215177 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-2wr76_b22602a0-7545-4c2d-8b16-2233288ab360/dnsmasq-dns/0.log" Apr 02 14:46:06 crc kubenswrapper[4732]: I0402 14:46:06.293552 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-5lm4q_240ff67d-47d5-4b2e-b744-e0e2332a9496/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Apr 02 14:46:06 crc kubenswrapper[4732]: I0402 14:46:06.364235 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_2bbb407d-51c0-4cca-99c6-9436acda495d/glance-httpd/0.log" Apr 02 14:46:06 crc kubenswrapper[4732]: I0402 14:46:06.409090 4732 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-external-api-0_2bbb407d-51c0-4cca-99c6-9436acda495d/glance-log/0.log" Apr 02 14:46:06 crc kubenswrapper[4732]: I0402 14:46:06.509835 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_8724b48c-9ac7-43a2-8d27-7d16056387ca/glance-httpd/0.log" Apr 02 14:46:06 crc kubenswrapper[4732]: I0402 14:46:06.530390 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_8724b48c-9ac7-43a2-8d27-7d16056387ca/glance-log/0.log" Apr 02 14:46:06 crc kubenswrapper[4732]: I0402 14:46:06.691347 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="429f68bb-47d9-4185-8dcb-26fb3ad3a20c" path="/var/lib/kubelet/pods/429f68bb-47d9-4185-8dcb-26fb3ad3a20c/volumes" Apr 02 14:46:06 crc kubenswrapper[4732]: I0402 14:46:06.710928 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-54f994999b-b88d7_97d6e519-a82f-4ce5-9199-4d7db769f86b/horizon/0.log" Apr 02 14:46:06 crc kubenswrapper[4732]: I0402 14:46:06.912309 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-k8mpl_2c668a5c-f7f6-4aa9-b177-b21e16bf4cfd/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Apr 02 14:46:07 crc kubenswrapper[4732]: I0402 14:46:07.160805 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29585641-zldph_61cd5173-b5d3-4cd7-a8ea-e4300054f364/keystone-cron/0.log" Apr 02 14:46:07 crc kubenswrapper[4732]: I0402 14:46:07.270579 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-54f994999b-b88d7_97d6e519-a82f-4ce5-9199-4d7db769f86b/horizon-log/0.log" Apr 02 14:46:07 crc kubenswrapper[4732]: I0402 14:46:07.379891 4732 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-49z8f_2072a722-772d-4379-a439-fdebfa6e219e/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Apr 02 14:46:07 crc kubenswrapper[4732]: I0402 14:46:07.462213 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_96308a9a-b137-4d84-a470-74395c7a5d60/kube-state-metrics/0.log" Apr 02 14:46:07 crc kubenswrapper[4732]: I0402 14:46:07.511989 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-d4c8876f7-592x4_5acfdea3-28ba-47f3-860c-6e7af2fe3222/keystone-api/0.log" Apr 02 14:46:07 crc kubenswrapper[4732]: I0402 14:46:07.967543 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-58f8f59779-9rrsx_cd0e816f-6d8e-4ed8-884c-ee38cec72d94/neutron-api/0.log" Apr 02 14:46:07 crc kubenswrapper[4732]: I0402 14:46:07.978531 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-58f8f59779-9rrsx_cd0e816f-6d8e-4ed8-884c-ee38cec72d94/neutron-httpd/0.log" Apr 02 14:46:08 crc kubenswrapper[4732]: I0402 14:46:08.157251 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-t9h4h_6d71fa88-324b-440b-aefd-492ac7ff7cd5/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Apr 02 14:46:08 crc kubenswrapper[4732]: I0402 14:46:08.258880 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-sjrwt_9e89ed59-ef4b-44a7-b6de-d98b2319ee10/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Apr 02 14:46:08 crc kubenswrapper[4732]: I0402 14:46:08.690065 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_3ab608f9-1ada-4c9c-85a1-43fc8cf5cc16/nova-api-log/0.log" Apr 02 14:46:08 crc kubenswrapper[4732]: I0402 14:46:08.765533 4732 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell0-conductor-0_b5f7b2b4-c3da-49e6-b873-c2937dc27bbf/nova-cell0-conductor-conductor/0.log" Apr 02 14:46:09 crc kubenswrapper[4732]: I0402 14:46:09.071313 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_43834d16-35ed-4baa-8292-26a762220c9a/nova-cell1-conductor-conductor/0.log" Apr 02 14:46:09 crc kubenswrapper[4732]: I0402 14:46:09.164540 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_35b61eb9-52fd-4d29-8942-c1c18b2f4aff/nova-cell1-novncproxy-novncproxy/0.log" Apr 02 14:46:09 crc kubenswrapper[4732]: I0402 14:46:09.223873 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_3ab608f9-1ada-4c9c-85a1-43fc8cf5cc16/nova-api-api/0.log" Apr 02 14:46:09 crc kubenswrapper[4732]: I0402 14:46:09.602938 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_c8e7f93a-58bc-4949-9fa4-7ee18fdaa7e4/nova-metadata-log/0.log" Apr 02 14:46:09 crc kubenswrapper[4732]: I0402 14:46:09.682222 4732 scope.go:117] "RemoveContainer" containerID="4cc9425b6dde0d264f8b72482c5821ad9fabf57354100ab51583dde35da85ea0" Apr 02 14:46:09 crc kubenswrapper[4732]: E0402 14:46:09.682467 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" Apr 02 14:46:10 crc kubenswrapper[4732]: I0402 14:46:10.037937 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_c6916cdf-dcaa-4e17-b33c-3fc6684abb46/nova-scheduler-scheduler/0.log" Apr 02 14:46:10 crc kubenswrapper[4732]: I0402 14:46:10.125695 4732 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_688bac91-aede-4c9f-a063-6469bb03db8c/mysql-bootstrap/0.log" Apr 02 14:46:10 crc kubenswrapper[4732]: I0402 14:46:10.144492 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_c8e7f93a-58bc-4949-9fa4-7ee18fdaa7e4/nova-metadata-metadata/0.log" Apr 02 14:46:10 crc kubenswrapper[4732]: I0402 14:46:10.154820 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-gmk6c_f0eca204-c72d-4909-89ba-03d2b1976e07/nova-edpm-deployment-openstack-edpm-ipam/0.log" Apr 02 14:46:10 crc kubenswrapper[4732]: I0402 14:46:10.323799 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_688bac91-aede-4c9f-a063-6469bb03db8c/mysql-bootstrap/0.log" Apr 02 14:46:10 crc kubenswrapper[4732]: I0402 14:46:10.396256 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_2fbe66fb-6f02-432d-8acf-50fec5339d96/mysql-bootstrap/0.log" Apr 02 14:46:10 crc kubenswrapper[4732]: I0402 14:46:10.435307 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_688bac91-aede-4c9f-a063-6469bb03db8c/galera/0.log" Apr 02 14:46:10 crc kubenswrapper[4732]: I0402 14:46:10.637843 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_2fbe66fb-6f02-432d-8acf-50fec5339d96/mysql-bootstrap/0.log" Apr 02 14:46:10 crc kubenswrapper[4732]: I0402 14:46:10.657413 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_2fbe66fb-6f02-432d-8acf-50fec5339d96/galera/0.log" Apr 02 14:46:10 crc kubenswrapper[4732]: I0402 14:46:10.686201 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_66ae86e8-597c-4fdb-b0da-283cf37afba2/openstackclient/0.log" Apr 02 14:46:10 crc kubenswrapper[4732]: I0402 14:46:10.917516 4732 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-5222s_8af6391f-4f8b-4473-8e7c-186c9c838527/ovn-controller/0.log" Apr 02 14:46:10 crc kubenswrapper[4732]: I0402 14:46:10.954975 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-7swjs_731d113e-365b-4d68-a0e9-402bb8a8e9b7/openstack-network-exporter/0.log" Apr 02 14:46:11 crc kubenswrapper[4732]: I0402 14:46:11.143387 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-l4ttl_5eba7503-ee7b-40ba-a0dc-e11fad40c2b7/ovsdb-server-init/0.log" Apr 02 14:46:11 crc kubenswrapper[4732]: I0402 14:46:11.326997 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-l4ttl_5eba7503-ee7b-40ba-a0dc-e11fad40c2b7/ovsdb-server-init/0.log" Apr 02 14:46:11 crc kubenswrapper[4732]: I0402 14:46:11.340410 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-l4ttl_5eba7503-ee7b-40ba-a0dc-e11fad40c2b7/ovs-vswitchd/0.log" Apr 02 14:46:11 crc kubenswrapper[4732]: I0402 14:46:11.370958 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-l4ttl_5eba7503-ee7b-40ba-a0dc-e11fad40c2b7/ovsdb-server/0.log" Apr 02 14:46:11 crc kubenswrapper[4732]: I0402 14:46:11.625595 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_3da22737-6be3-4ffc-afff-b5d7fb20a283/openstack-network-exporter/0.log" Apr 02 14:46:11 crc kubenswrapper[4732]: I0402 14:46:11.696426 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-bc8s7_e6ca7706-9083-4555-b762-1d24315b85ea/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Apr 02 14:46:11 crc kubenswrapper[4732]: I0402 14:46:11.701538 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_3da22737-6be3-4ffc-afff-b5d7fb20a283/ovn-northd/0.log" Apr 02 14:46:11 crc kubenswrapper[4732]: 
I0402 14:46:11.839363 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_d8dff454-a625-4309-92b6-8ab92d4bd60a/openstack-network-exporter/0.log" Apr 02 14:46:11 crc kubenswrapper[4732]: I0402 14:46:11.901764 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_d8dff454-a625-4309-92b6-8ab92d4bd60a/ovsdbserver-nb/0.log" Apr 02 14:46:12 crc kubenswrapper[4732]: I0402 14:46:12.042310 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_f84d20f6-82ec-45d6-8487-4ed2ed90b286/openstack-network-exporter/0.log" Apr 02 14:46:12 crc kubenswrapper[4732]: I0402 14:46:12.163788 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_f84d20f6-82ec-45d6-8487-4ed2ed90b286/ovsdbserver-sb/0.log" Apr 02 14:46:12 crc kubenswrapper[4732]: I0402 14:46:12.255170 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5688fc477d-p59pf_c11a1fe8-1217-4e5b-b172-642b85527099/placement-api/0.log" Apr 02 14:46:12 crc kubenswrapper[4732]: I0402 14:46:12.431926 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_29e95846-a0bc-4d8b-ad4d-457766418564/setup-container/0.log" Apr 02 14:46:12 crc kubenswrapper[4732]: I0402 14:46:12.462553 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5688fc477d-p59pf_c11a1fe8-1217-4e5b-b172-642b85527099/placement-log/0.log" Apr 02 14:46:12 crc kubenswrapper[4732]: I0402 14:46:12.685195 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c9e1cd50-72d3-4ccc-9f49-c4c1619252fc/setup-container/0.log" Apr 02 14:46:12 crc kubenswrapper[4732]: I0402 14:46:12.699519 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_29e95846-a0bc-4d8b-ad4d-457766418564/setup-container/0.log" Apr 02 14:46:12 crc kubenswrapper[4732]: I0402 14:46:12.714826 
4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_29e95846-a0bc-4d8b-ad4d-457766418564/rabbitmq/0.log" Apr 02 14:46:12 crc kubenswrapper[4732]: I0402 14:46:12.880844 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c9e1cd50-72d3-4ccc-9f49-c4c1619252fc/setup-container/0.log" Apr 02 14:46:12 crc kubenswrapper[4732]: I0402 14:46:12.977186 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-9sgtg_160a19c0-4b2b-439a-9ea5-0f0ec2d4aede/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Apr 02 14:46:13 crc kubenswrapper[4732]: I0402 14:46:13.016657 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c9e1cd50-72d3-4ccc-9f49-c4c1619252fc/rabbitmq/0.log" Apr 02 14:46:13 crc kubenswrapper[4732]: I0402 14:46:13.562900 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-szt5p_d2c0401e-94b6-46b0-84f5-59ffac42c2f7/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Apr 02 14:46:13 crc kubenswrapper[4732]: I0402 14:46:13.746976 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-kjz5w_28b7c53a-39ed-4eea-8697-50dc3eb09818/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Apr 02 14:46:13 crc kubenswrapper[4732]: I0402 14:46:13.804791 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-sdvqf_c5427d0c-bb3a-491e-8461-8e189da84bd9/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Apr 02 14:46:14 crc kubenswrapper[4732]: I0402 14:46:14.011952 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-8ct56_ede1fe7d-16b7-41be-af74-8933aa0a1e83/ssh-known-hosts-edpm-deployment/0.log" Apr 02 14:46:14 crc kubenswrapper[4732]: I0402 14:46:14.171709 4732 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-9f57ff6c-7m8sr_7f6ffca1-ce91-4e20-8cbc-38a3eab1616e/proxy-httpd/0.log" Apr 02 14:46:14 crc kubenswrapper[4732]: I0402 14:46:14.210657 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-9f57ff6c-7m8sr_7f6ffca1-ce91-4e20-8cbc-38a3eab1616e/proxy-server/0.log" Apr 02 14:46:14 crc kubenswrapper[4732]: I0402 14:46:14.331640 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-wsskk_81138fff-8b7c-4cf3-8aa5-2582d80483e1/swift-ring-rebalance/0.log" Apr 02 14:46:14 crc kubenswrapper[4732]: I0402 14:46:14.399414 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_529a332e-d2c3-49c5-86d5-e672811d00cd/account-auditor/0.log" Apr 02 14:46:14 crc kubenswrapper[4732]: I0402 14:46:14.532159 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_529a332e-d2c3-49c5-86d5-e672811d00cd/account-reaper/0.log" Apr 02 14:46:14 crc kubenswrapper[4732]: I0402 14:46:14.573836 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_529a332e-d2c3-49c5-86d5-e672811d00cd/account-replicator/0.log" Apr 02 14:46:14 crc kubenswrapper[4732]: I0402 14:46:14.590580 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_529a332e-d2c3-49c5-86d5-e672811d00cd/account-server/0.log" Apr 02 14:46:14 crc kubenswrapper[4732]: I0402 14:46:14.672122 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_529a332e-d2c3-49c5-86d5-e672811d00cd/container-auditor/0.log" Apr 02 14:46:14 crc kubenswrapper[4732]: I0402 14:46:14.754961 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_529a332e-d2c3-49c5-86d5-e672811d00cd/container-server/0.log" Apr 02 14:46:14 crc kubenswrapper[4732]: I0402 14:46:14.807317 4732 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_529a332e-d2c3-49c5-86d5-e672811d00cd/container-replicator/0.log" Apr 02 14:46:14 crc kubenswrapper[4732]: I0402 14:46:14.850929 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_529a332e-d2c3-49c5-86d5-e672811d00cd/container-updater/0.log" Apr 02 14:46:14 crc kubenswrapper[4732]: I0402 14:46:14.895448 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_529a332e-d2c3-49c5-86d5-e672811d00cd/object-auditor/0.log" Apr 02 14:46:14 crc kubenswrapper[4732]: I0402 14:46:14.975820 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_529a332e-d2c3-49c5-86d5-e672811d00cd/object-expirer/0.log" Apr 02 14:46:15 crc kubenswrapper[4732]: I0402 14:46:15.387548 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_529a332e-d2c3-49c5-86d5-e672811d00cd/object-updater/0.log" Apr 02 14:46:15 crc kubenswrapper[4732]: I0402 14:46:15.396352 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_529a332e-d2c3-49c5-86d5-e672811d00cd/object-server/0.log" Apr 02 14:46:15 crc kubenswrapper[4732]: I0402 14:46:15.398302 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_529a332e-d2c3-49c5-86d5-e672811d00cd/rsync/0.log" Apr 02 14:46:15 crc kubenswrapper[4732]: I0402 14:46:15.439108 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_529a332e-d2c3-49c5-86d5-e672811d00cd/object-replicator/0.log" Apr 02 14:46:15 crc kubenswrapper[4732]: I0402 14:46:15.621424 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_529a332e-d2c3-49c5-86d5-e672811d00cd/swift-recon-cron/0.log" Apr 02 14:46:15 crc kubenswrapper[4732]: I0402 14:46:15.847385 4732 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_tempest-tests-tempest_17645883-477c-437a-b87a-b412f9bbe29e/tempest-tests-tempest-tests-runner/0.log" Apr 02 14:46:15 crc kubenswrapper[4732]: I0402 14:46:15.962691 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_c4323eb5-6b45-4766-961f-eef53306dad0/test-operator-logs-container/0.log" Apr 02 14:46:16 crc kubenswrapper[4732]: I0402 14:46:16.010689 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-9frcg_0e3946af-2a00-4313-9a3b-79acd9152f58/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Apr 02 14:46:16 crc kubenswrapper[4732]: I0402 14:46:16.101150 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-7z2sv_ce9af86e-92fb-4693-8af9-4d95af13b999/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Apr 02 14:46:23 crc kubenswrapper[4732]: I0402 14:46:23.774932 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_4f87d2b1-82d0-4126-aeae-46aa84ba3d1f/memcached/0.log" Apr 02 14:46:24 crc kubenswrapper[4732]: I0402 14:46:24.687081 4732 scope.go:117] "RemoveContainer" containerID="4cc9425b6dde0d264f8b72482c5821ad9fabf57354100ab51583dde35da85ea0" Apr 02 14:46:24 crc kubenswrapper[4732]: E0402 14:46:24.687294 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" Apr 02 14:46:39 crc kubenswrapper[4732]: I0402 14:46:39.680807 4732 scope.go:117] "RemoveContainer" 
containerID="4cc9425b6dde0d264f8b72482c5821ad9fabf57354100ab51583dde35da85ea0"
Apr 02 14:46:39 crc kubenswrapper[4732]: E0402 14:46:39.681625 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878"
Apr 02 14:46:43 crc kubenswrapper[4732]: I0402 14:46:43.168491 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_59b48476de953ff34716fd3dc2a51be4f696c7acf616b0748be8aa4a1db4vvf_85e10d53-2bd5-4a87-8ec8-89a9ef13f766/util/0.log"
Apr 02 14:46:43 crc kubenswrapper[4732]: I0402 14:46:43.341266 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_59b48476de953ff34716fd3dc2a51be4f696c7acf616b0748be8aa4a1db4vvf_85e10d53-2bd5-4a87-8ec8-89a9ef13f766/pull/0.log"
Apr 02 14:46:43 crc kubenswrapper[4732]: I0402 14:46:43.363599 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_59b48476de953ff34716fd3dc2a51be4f696c7acf616b0748be8aa4a1db4vvf_85e10d53-2bd5-4a87-8ec8-89a9ef13f766/pull/0.log"
Apr 02 14:46:43 crc kubenswrapper[4732]: I0402 14:46:43.373643 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_59b48476de953ff34716fd3dc2a51be4f696c7acf616b0748be8aa4a1db4vvf_85e10d53-2bd5-4a87-8ec8-89a9ef13f766/util/0.log"
Apr 02 14:46:43 crc kubenswrapper[4732]: I0402 14:46:43.521281 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_59b48476de953ff34716fd3dc2a51be4f696c7acf616b0748be8aa4a1db4vvf_85e10d53-2bd5-4a87-8ec8-89a9ef13f766/util/0.log"
Apr 02 14:46:43 crc kubenswrapper[4732]: I0402 14:46:43.531180 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_59b48476de953ff34716fd3dc2a51be4f696c7acf616b0748be8aa4a1db4vvf_85e10d53-2bd5-4a87-8ec8-89a9ef13f766/extract/0.log"
Apr 02 14:46:43 crc kubenswrapper[4732]: I0402 14:46:43.552913 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_59b48476de953ff34716fd3dc2a51be4f696c7acf616b0748be8aa4a1db4vvf_85e10d53-2bd5-4a87-8ec8-89a9ef13f766/pull/0.log"
Apr 02 14:46:43 crc kubenswrapper[4732]: I0402 14:46:43.785252 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-86644c9c9c-nhxqn_08d5eea8-7c67-4aa1-ad91-ab1c60214872/manager/0.log"
Apr 02 14:46:43 crc kubenswrapper[4732]: I0402 14:46:43.786732 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-5d46cccfb9-65vqg_d925f7c0-af6d-49d5-a09f-82afb7c58a15/manager/0.log"
Apr 02 14:46:43 crc kubenswrapper[4732]: I0402 14:46:43.930769 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-58689c6fff-47xk7_4c76cc17-ab86-4c9c-9438-7e72e2ce895f/manager/0.log"
Apr 02 14:46:44 crc kubenswrapper[4732]: I0402 14:46:44.061095 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-648bdc7f99-vt6x9_5e74dfe1-0e0f-4b70-8b9a-db645eb40e05/manager/0.log"
Apr 02 14:46:44 crc kubenswrapper[4732]: I0402 14:46:44.191416 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-8684f86954-xgncs_12296214-f552-4868-8884-66c241eb973b/manager/0.log"
Apr 02 14:46:44 crc kubenswrapper[4732]: I0402 14:46:44.223243 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6ccfd84cb4-hv8p6_879197e5-dc13-4c17-b8ac-7e51a97aa0f2/manager/0.log"
Apr 02 14:46:45 crc kubenswrapper[4732]: I0402 14:46:45.099768 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5f96574b5-nbm76_e46394c5-fd9e-4c0d-8e78-96723f5931d9/manager/0.log"
Apr 02 14:46:45 crc kubenswrapper[4732]: I0402 14:46:45.289485 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-58f79b884c-5q7cz_43f86830-d407-4dc4-9b09-388fb5db82c8/manager/0.log"
Apr 02 14:46:45 crc kubenswrapper[4732]: I0402 14:46:45.329010 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6b7497dc59-ph5hk_084daf4c-82c9-42e7-8eb9-3ae4658c1742/manager/0.log"
Apr 02 14:46:45 crc kubenswrapper[4732]: I0402 14:46:45.348802 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-dbf8bb784-4kz5n_6b75349c-23b4-4dc0-914f-f1dc82b12e18/manager/0.log"
Apr 02 14:46:45 crc kubenswrapper[4732]: I0402 14:46:45.515325 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6554749d88-4cwml_338e9bfc-709f-49f2-8456-9dbe8b815382/manager/0.log"
Apr 02 14:46:45 crc kubenswrapper[4732]: I0402 14:46:45.611960 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-767865f676-z49v9_1c68b230-3f85-41f9-a6ed-7da1d0738748/manager/0.log"
Apr 02 14:46:45 crc kubenswrapper[4732]: I0402 14:46:45.805598 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7594f57946-2rvck_5319722c-7913-4dcd-a03d-dc7a5040b434/manager/0.log"
Apr 02 14:46:45 crc kubenswrapper[4732]: I0402 14:46:45.807050 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5d6f9fd68c-mvbqt_b49d6074-a4b1-4658-b6b8-95bfe63163b0/manager/0.log"
Apr 02 14:46:45 crc kubenswrapper[4732]: I0402 14:46:45.960039 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-b7b49d78f-c6jbh_d5a07520-1380-45b1-a00a-7148b158711e/manager/0.log"
Apr 02 14:46:46 crc kubenswrapper[4732]: I0402 14:46:46.122399 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-f786688f5-tv7s7_1e51ae33-6c4c-4e7f-8309-ef0c8901b6ed/operator/0.log"
Apr 02 14:46:46 crc kubenswrapper[4732]: I0402 14:46:46.344007 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-r7tzm_7dea8805-bd1c-400b-bddf-d3ac2cd57617/registry-server/0.log"
Apr 02 14:46:46 crc kubenswrapper[4732]: I0402 14:46:46.567690 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-84464c7c78-tgrmk_bec355a9-c60e-4480-a32c-f1a43ef27131/manager/0.log"
Apr 02 14:46:46 crc kubenswrapper[4732]: I0402 14:46:46.700149 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-559d8fdb6b-mfzk4_6296d461-d333-4b9c-a082-e48db64bdd96/manager/0.log"
Apr 02 14:46:46 crc kubenswrapper[4732]: I0402 14:46:46.856644 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-fbdcf7f7b-bw2df_5b424dbc-80ac-46ae-90d2-c69fdf4c14d7/manager/0.log"
Apr 02 14:46:47 crc kubenswrapper[4732]: I0402 14:46:47.155456 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-d6f76d4c7-nbxg9_426c551b-e661-40e0-9aa3-a83897ce2814/manager/0.log"
Apr 02 14:46:47 crc kubenswrapper[4732]: I0402 14:46:47.212135 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-56ccc97cf5-ztlzz_bd54902c-4922-4c49-85c1-280af54370ba/manager/0.log"
Apr 02 14:46:47 crc kubenswrapper[4732]: I0402 14:46:47.272512 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5985877f6-hxnth_e37f12bd-d8bf-4e9d-86ab-4da2dfbeff5f/manager/0.log"
Apr 02 14:46:47 crc kubenswrapper[4732]: I0402 14:46:47.417206 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-989fbd45-w2zrf_64892a56-9180-4d1d-ad33-d87caa5f2002/manager/0.log"
Apr 02 14:46:51 crc kubenswrapper[4732]: I0402 14:46:51.681864 4732 scope.go:117] "RemoveContainer" containerID="4cc9425b6dde0d264f8b72482c5821ad9fabf57354100ab51583dde35da85ea0"
Apr 02 14:46:51 crc kubenswrapper[4732]: E0402 14:46:51.682451 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878"
Apr 02 14:46:53 crc kubenswrapper[4732]: I0402 14:46:53.769331 4732 scope.go:117] "RemoveContainer" containerID="c7e7e57d13c9db90f242c9b4a24badf33e93df956cf2c6df7fde03bab03e2b9d"
Apr 02 14:46:57 crc kubenswrapper[4732]: I0402 14:46:57.393299 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bn5br"]
Apr 02 14:46:57 crc kubenswrapper[4732]: E0402 14:46:57.394491 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0d13f1b-e44f-4e95-8f60-9da27a2a247a" containerName="oc"
Apr 02 14:46:57 crc kubenswrapper[4732]: I0402 14:46:57.394510 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0d13f1b-e44f-4e95-8f60-9da27a2a247a" containerName="oc"
Apr 02 14:46:57 crc kubenswrapper[4732]: I0402 14:46:57.394777 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0d13f1b-e44f-4e95-8f60-9da27a2a247a" containerName="oc"
Apr 02 14:46:57 crc kubenswrapper[4732]: I0402 14:46:57.397127 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bn5br"
Apr 02 14:46:57 crc kubenswrapper[4732]: I0402 14:46:57.408429 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bn5br"]
Apr 02 14:46:57 crc kubenswrapper[4732]: I0402 14:46:57.417905 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28a06597-0ccd-4f06-b2d7-ff634ff90d24-catalog-content\") pod \"redhat-operators-bn5br\" (UID: \"28a06597-0ccd-4f06-b2d7-ff634ff90d24\") " pod="openshift-marketplace/redhat-operators-bn5br"
Apr 02 14:46:57 crc kubenswrapper[4732]: I0402 14:46:57.417996 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28a06597-0ccd-4f06-b2d7-ff634ff90d24-utilities\") pod \"redhat-operators-bn5br\" (UID: \"28a06597-0ccd-4f06-b2d7-ff634ff90d24\") " pod="openshift-marketplace/redhat-operators-bn5br"
Apr 02 14:46:57 crc kubenswrapper[4732]: I0402 14:46:57.418066 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqqcx\" (UniqueName: \"kubernetes.io/projected/28a06597-0ccd-4f06-b2d7-ff634ff90d24-kube-api-access-dqqcx\") pod \"redhat-operators-bn5br\" (UID: \"28a06597-0ccd-4f06-b2d7-ff634ff90d24\") " pod="openshift-marketplace/redhat-operators-bn5br"
Apr 02 14:46:57 crc kubenswrapper[4732]: I0402 14:46:57.520533 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqqcx\" (UniqueName: \"kubernetes.io/projected/28a06597-0ccd-4f06-b2d7-ff634ff90d24-kube-api-access-dqqcx\") pod \"redhat-operators-bn5br\" (UID: \"28a06597-0ccd-4f06-b2d7-ff634ff90d24\") " pod="openshift-marketplace/redhat-operators-bn5br"
Apr 02 14:46:57 crc kubenswrapper[4732]: I0402 14:46:57.520654 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28a06597-0ccd-4f06-b2d7-ff634ff90d24-catalog-content\") pod \"redhat-operators-bn5br\" (UID: \"28a06597-0ccd-4f06-b2d7-ff634ff90d24\") " pod="openshift-marketplace/redhat-operators-bn5br"
Apr 02 14:46:57 crc kubenswrapper[4732]: I0402 14:46:57.520718 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28a06597-0ccd-4f06-b2d7-ff634ff90d24-utilities\") pod \"redhat-operators-bn5br\" (UID: \"28a06597-0ccd-4f06-b2d7-ff634ff90d24\") " pod="openshift-marketplace/redhat-operators-bn5br"
Apr 02 14:46:57 crc kubenswrapper[4732]: I0402 14:46:57.521141 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28a06597-0ccd-4f06-b2d7-ff634ff90d24-catalog-content\") pod \"redhat-operators-bn5br\" (UID: \"28a06597-0ccd-4f06-b2d7-ff634ff90d24\") " pod="openshift-marketplace/redhat-operators-bn5br"
Apr 02 14:46:57 crc kubenswrapper[4732]: I0402 14:46:57.521209 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28a06597-0ccd-4f06-b2d7-ff634ff90d24-utilities\") pod \"redhat-operators-bn5br\" (UID: \"28a06597-0ccd-4f06-b2d7-ff634ff90d24\") " pod="openshift-marketplace/redhat-operators-bn5br"
Apr 02 14:46:57 crc kubenswrapper[4732]: I0402 14:46:57.543740 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqqcx\" (UniqueName: \"kubernetes.io/projected/28a06597-0ccd-4f06-b2d7-ff634ff90d24-kube-api-access-dqqcx\") pod \"redhat-operators-bn5br\" (UID: \"28a06597-0ccd-4f06-b2d7-ff634ff90d24\") " pod="openshift-marketplace/redhat-operators-bn5br"
Apr 02 14:46:57 crc kubenswrapper[4732]: I0402 14:46:57.732286 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bn5br"
Apr 02 14:46:58 crc kubenswrapper[4732]: I0402 14:46:58.255728 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bn5br"]
Apr 02 14:46:58 crc kubenswrapper[4732]: I0402 14:46:58.633445 4732 generic.go:334] "Generic (PLEG): container finished" podID="28a06597-0ccd-4f06-b2d7-ff634ff90d24" containerID="823f98e8b3ece0140cacb7c0f243a1838b47a97dd9eb0d08f0c8e81948d454d8" exitCode=0
Apr 02 14:46:58 crc kubenswrapper[4732]: I0402 14:46:58.633542 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bn5br" event={"ID":"28a06597-0ccd-4f06-b2d7-ff634ff90d24","Type":"ContainerDied","Data":"823f98e8b3ece0140cacb7c0f243a1838b47a97dd9eb0d08f0c8e81948d454d8"}
Apr 02 14:46:58 crc kubenswrapper[4732]: I0402 14:46:58.633840 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bn5br" event={"ID":"28a06597-0ccd-4f06-b2d7-ff634ff90d24","Type":"ContainerStarted","Data":"151c3020b423c4f8bcf65f4b1e2f5026ebbc3daa6d572b7809c0eb0271a375d9"}
Apr 02 14:47:06 crc kubenswrapper[4732]: I0402 14:47:06.681267 4732 scope.go:117] "RemoveContainer" containerID="4cc9425b6dde0d264f8b72482c5821ad9fabf57354100ab51583dde35da85ea0"
Apr 02 14:47:06 crc kubenswrapper[4732]: E0402 14:47:06.681960 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878"
Apr 02 14:47:06 crc kubenswrapper[4732]: I0402 14:47:06.950470 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-szr89_1eee8837-1a56-40df-b564-bb65ad94d593/control-plane-machine-set-operator/0.log"
Apr 02 14:47:07 crc kubenswrapper[4732]: I0402 14:47:07.154553 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-kpg9f_3568fcc7-10bd-4972-9782-b97aa3c9c8a0/kube-rbac-proxy/0.log"
Apr 02 14:47:07 crc kubenswrapper[4732]: I0402 14:47:07.186573 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-kpg9f_3568fcc7-10bd-4972-9782-b97aa3c9c8a0/machine-api-operator/0.log"
Apr 02 14:47:10 crc kubenswrapper[4732]: I0402 14:47:10.758754 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bn5br" event={"ID":"28a06597-0ccd-4f06-b2d7-ff634ff90d24","Type":"ContainerStarted","Data":"8597f18c64adf31ceae839d57eaaa098c9784296db90a5436d75f4cdeeb162cb"}
Apr 02 14:47:11 crc kubenswrapper[4732]: I0402 14:47:11.771806 4732 generic.go:334] "Generic (PLEG): container finished" podID="28a06597-0ccd-4f06-b2d7-ff634ff90d24" containerID="8597f18c64adf31ceae839d57eaaa098c9784296db90a5436d75f4cdeeb162cb" exitCode=0
Apr 02 14:47:11 crc kubenswrapper[4732]: I0402 14:47:11.771908 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bn5br" event={"ID":"28a06597-0ccd-4f06-b2d7-ff634ff90d24","Type":"ContainerDied","Data":"8597f18c64adf31ceae839d57eaaa098c9784296db90a5436d75f4cdeeb162cb"}
Apr 02 14:47:12 crc kubenswrapper[4732]: I0402 14:47:12.784784 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bn5br" event={"ID":"28a06597-0ccd-4f06-b2d7-ff634ff90d24","Type":"ContainerStarted","Data":"e3c1e4efb1e916d36ce3ffcb99516dbb10c1e8bb7306c9b1e9a71a0fee17633c"}
Apr 02 14:47:12 crc kubenswrapper[4732]: I0402 14:47:12.808568 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bn5br" podStartSLOduration=1.9402307859999999 podStartE2EDuration="15.808542397s" podCreationTimestamp="2026-04-02 14:46:57 +0000 UTC" firstStartedPulling="2026-04-02 14:46:58.636371242 +0000 UTC m=+4175.540778795" lastFinishedPulling="2026-04-02 14:47:12.504682853 +0000 UTC m=+4189.409090406" observedRunningTime="2026-04-02 14:47:12.800745924 +0000 UTC m=+4189.705153517" watchObservedRunningTime="2026-04-02 14:47:12.808542397 +0000 UTC m=+4189.712949980"
Apr 02 14:47:17 crc kubenswrapper[4732]: I0402 14:47:17.732780 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bn5br"
Apr 02 14:47:17 crc kubenswrapper[4732]: I0402 14:47:17.733668 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bn5br"
Apr 02 14:47:18 crc kubenswrapper[4732]: I0402 14:47:18.789561 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bn5br" podUID="28a06597-0ccd-4f06-b2d7-ff634ff90d24" containerName="registry-server" probeResult="failure" output=<
Apr 02 14:47:18 crc kubenswrapper[4732]: timeout: failed to connect service ":50051" within 1s
Apr 02 14:47:18 crc kubenswrapper[4732]: >
Apr 02 14:47:20 crc kubenswrapper[4732]: I0402 14:47:20.674243 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-pnknx_fc67973f-1e3f-4aea-bf48-5914e7c8ddbb/cert-manager-controller/0.log"
Apr 02 14:47:20 crc kubenswrapper[4732]: I0402 14:47:20.821995 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-v9s5t_11b5be71-32b7-43a3-bf27-d6b1c73844c6/cert-manager-cainjector/0.log"
Apr 02 14:47:20 crc kubenswrapper[4732]: I0402 14:47:20.856981 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-fhv6z_4bbf6f84-13bb-4562-af35-10ab372d6580/cert-manager-webhook/0.log"
Apr 02 14:47:21 crc kubenswrapper[4732]: I0402 14:47:21.680402 4732 scope.go:117] "RemoveContainer" containerID="4cc9425b6dde0d264f8b72482c5821ad9fabf57354100ab51583dde35da85ea0"
Apr 02 14:47:21 crc kubenswrapper[4732]: E0402 14:47:21.680780 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878"
Apr 02 14:47:27 crc kubenswrapper[4732]: I0402 14:47:27.785006 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bn5br"
Apr 02 14:47:27 crc kubenswrapper[4732]: I0402 14:47:27.837664 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bn5br"
Apr 02 14:47:28 crc kubenswrapper[4732]: I0402 14:47:28.435663 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bn5br"]
Apr 02 14:47:28 crc kubenswrapper[4732]: I0402 14:47:28.594312 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6c8xq"]
Apr 02 14:47:28 crc kubenswrapper[4732]: I0402 14:47:28.594540 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6c8xq" podUID="a9edc76f-550a-4fb7-b8e5-24a34beb38f8" containerName="registry-server" containerID="cri-o://76848bb4992e39db8b30cb769c61916b185a374bc7e93a6edf4e62d2b5eace04" gracePeriod=2
Apr 02 14:47:28 crc kubenswrapper[4732]: I0402 14:47:28.955968 4732 generic.go:334] "Generic (PLEG): container finished" podID="a9edc76f-550a-4fb7-b8e5-24a34beb38f8" containerID="76848bb4992e39db8b30cb769c61916b185a374bc7e93a6edf4e62d2b5eace04" exitCode=0
Apr 02 14:47:28 crc kubenswrapper[4732]: I0402 14:47:28.956085 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6c8xq" event={"ID":"a9edc76f-550a-4fb7-b8e5-24a34beb38f8","Type":"ContainerDied","Data":"76848bb4992e39db8b30cb769c61916b185a374bc7e93a6edf4e62d2b5eace04"}
Apr 02 14:47:29 crc kubenswrapper[4732]: I0402 14:47:29.228308 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6c8xq"
Apr 02 14:47:29 crc kubenswrapper[4732]: I0402 14:47:29.339924 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdbj7\" (UniqueName: \"kubernetes.io/projected/a9edc76f-550a-4fb7-b8e5-24a34beb38f8-kube-api-access-zdbj7\") pod \"a9edc76f-550a-4fb7-b8e5-24a34beb38f8\" (UID: \"a9edc76f-550a-4fb7-b8e5-24a34beb38f8\") "
Apr 02 14:47:29 crc kubenswrapper[4732]: I0402 14:47:29.340004 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9edc76f-550a-4fb7-b8e5-24a34beb38f8-utilities\") pod \"a9edc76f-550a-4fb7-b8e5-24a34beb38f8\" (UID: \"a9edc76f-550a-4fb7-b8e5-24a34beb38f8\") "
Apr 02 14:47:29 crc kubenswrapper[4732]: I0402 14:47:29.340118 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9edc76f-550a-4fb7-b8e5-24a34beb38f8-catalog-content\") pod \"a9edc76f-550a-4fb7-b8e5-24a34beb38f8\" (UID: \"a9edc76f-550a-4fb7-b8e5-24a34beb38f8\") "
Apr 02 14:47:29 crc kubenswrapper[4732]: I0402 14:47:29.347414 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9edc76f-550a-4fb7-b8e5-24a34beb38f8-utilities" (OuterVolumeSpecName: "utilities") pod "a9edc76f-550a-4fb7-b8e5-24a34beb38f8" (UID: "a9edc76f-550a-4fb7-b8e5-24a34beb38f8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 02 14:47:29 crc kubenswrapper[4732]: I0402 14:47:29.350892 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9edc76f-550a-4fb7-b8e5-24a34beb38f8-utilities\") on node \"crc\" DevicePath \"\""
Apr 02 14:47:29 crc kubenswrapper[4732]: I0402 14:47:29.376278 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9edc76f-550a-4fb7-b8e5-24a34beb38f8-kube-api-access-zdbj7" (OuterVolumeSpecName: "kube-api-access-zdbj7") pod "a9edc76f-550a-4fb7-b8e5-24a34beb38f8" (UID: "a9edc76f-550a-4fb7-b8e5-24a34beb38f8"). InnerVolumeSpecName "kube-api-access-zdbj7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 14:47:29 crc kubenswrapper[4732]: I0402 14:47:29.454924 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdbj7\" (UniqueName: \"kubernetes.io/projected/a9edc76f-550a-4fb7-b8e5-24a34beb38f8-kube-api-access-zdbj7\") on node \"crc\" DevicePath \"\""
Apr 02 14:47:29 crc kubenswrapper[4732]: I0402 14:47:29.537257 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9edc76f-550a-4fb7-b8e5-24a34beb38f8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a9edc76f-550a-4fb7-b8e5-24a34beb38f8" (UID: "a9edc76f-550a-4fb7-b8e5-24a34beb38f8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 02 14:47:29 crc kubenswrapper[4732]: I0402 14:47:29.557502 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9edc76f-550a-4fb7-b8e5-24a34beb38f8-catalog-content\") on node \"crc\" DevicePath \"\""
Apr 02 14:47:29 crc kubenswrapper[4732]: I0402 14:47:29.965869 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6c8xq" event={"ID":"a9edc76f-550a-4fb7-b8e5-24a34beb38f8","Type":"ContainerDied","Data":"95ce32db34c67e396cac46f125255e354efa091ef0c7ef82382af09b8513658c"}
Apr 02 14:47:29 crc kubenswrapper[4732]: I0402 14:47:29.966130 4732 scope.go:117] "RemoveContainer" containerID="76848bb4992e39db8b30cb769c61916b185a374bc7e93a6edf4e62d2b5eace04"
Apr 02 14:47:29 crc kubenswrapper[4732]: I0402 14:47:29.966242 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6c8xq"
Apr 02 14:47:29 crc kubenswrapper[4732]: I0402 14:47:29.998746 4732 scope.go:117] "RemoveContainer" containerID="3e705a21eb2565c81efc11c83d8e0787a49f9f96bffeade869087511629c5b6b"
Apr 02 14:47:30 crc kubenswrapper[4732]: I0402 14:47:30.000940 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6c8xq"]
Apr 02 14:47:30 crc kubenswrapper[4732]: I0402 14:47:30.016402 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6c8xq"]
Apr 02 14:47:30 crc kubenswrapper[4732]: I0402 14:47:30.022518 4732 scope.go:117] "RemoveContainer" containerID="0c0709b8ab176553c17093675d38ceb634c67c3962dbca740cd1444380dd16a8"
Apr 02 14:47:30 crc kubenswrapper[4732]: I0402 14:47:30.691566 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9edc76f-550a-4fb7-b8e5-24a34beb38f8" path="/var/lib/kubelet/pods/a9edc76f-550a-4fb7-b8e5-24a34beb38f8/volumes"
Apr 02 14:47:34 crc kubenswrapper[4732]: I0402 14:47:34.694801 4732 scope.go:117] "RemoveContainer" containerID="4cc9425b6dde0d264f8b72482c5821ad9fabf57354100ab51583dde35da85ea0"
Apr 02 14:47:34 crc kubenswrapper[4732]: E0402 14:47:34.695659 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878"
Apr 02 14:47:35 crc kubenswrapper[4732]: I0402 14:47:35.494580 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7b5ddc4dc7-rpk7t_eae3a07b-45a3-4f0e-8fd6-ac653ab24deb/nmstate-console-plugin/0.log"
Apr 02 14:47:35 crc kubenswrapper[4732]: I0402 14:47:35.648810 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-6mg2l_55a456a8-9ff7-4d10-a126-d662f361b74d/nmstate-handler/0.log"
Apr 02 14:47:35 crc kubenswrapper[4732]: I0402 14:47:35.693216 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-cq88s_fb4efdc1-4ea6-4068-bea0-8f961de0328b/kube-rbac-proxy/0.log"
Apr 02 14:47:35 crc kubenswrapper[4732]: I0402 14:47:35.749795 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-cq88s_fb4efdc1-4ea6-4068-bea0-8f961de0328b/nmstate-metrics/0.log"
Apr 02 14:47:35 crc kubenswrapper[4732]: I0402 14:47:35.871671 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-6b8c6447b-p77cg_59b4decd-a99c-4637-bc8e-2a95d017696d/nmstate-operator/0.log"
Apr 02 14:47:35 crc kubenswrapper[4732]: I0402 14:47:35.928113 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-pqqzz_2669e31a-19bb-42df-a5dd-5886b22e7674/nmstate-webhook/0.log"
Apr 02 14:47:47 crc kubenswrapper[4732]: I0402 14:47:47.681459 4732 scope.go:117] "RemoveContainer" containerID="4cc9425b6dde0d264f8b72482c5821ad9fabf57354100ab51583dde35da85ea0"
Apr 02 14:47:47 crc kubenswrapper[4732]: E0402 14:47:47.682689 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878"
Apr 02 14:47:59 crc kubenswrapper[4732]: I0402 14:47:59.681429 4732 scope.go:117] "RemoveContainer" containerID="4cc9425b6dde0d264f8b72482c5821ad9fabf57354100ab51583dde35da85ea0"
Apr 02 14:47:59 crc kubenswrapper[4732]: E0402 14:47:59.682554 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878"
Apr 02 14:48:00 crc kubenswrapper[4732]: I0402 14:48:00.181132 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29585688-xw4zs"]
Apr 02 14:48:00 crc kubenswrapper[4732]: E0402 14:48:00.181737 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9edc76f-550a-4fb7-b8e5-24a34beb38f8" containerName="extract-content"
Apr 02 14:48:00 crc kubenswrapper[4732]: I0402 14:48:00.181767 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9edc76f-550a-4fb7-b8e5-24a34beb38f8" containerName="extract-content"
Apr 02 14:48:00 crc kubenswrapper[4732]: E0402 14:48:00.181805 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9edc76f-550a-4fb7-b8e5-24a34beb38f8" containerName="registry-server"
Apr 02 14:48:00 crc kubenswrapper[4732]: I0402 14:48:00.181817 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9edc76f-550a-4fb7-b8e5-24a34beb38f8" containerName="registry-server"
Apr 02 14:48:00 crc kubenswrapper[4732]: E0402 14:48:00.181849 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9edc76f-550a-4fb7-b8e5-24a34beb38f8" containerName="extract-utilities"
Apr 02 14:48:00 crc kubenswrapper[4732]: I0402 14:48:00.181864 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9edc76f-550a-4fb7-b8e5-24a34beb38f8" containerName="extract-utilities"
Apr 02 14:48:00 crc kubenswrapper[4732]: I0402 14:48:00.182207 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9edc76f-550a-4fb7-b8e5-24a34beb38f8" containerName="registry-server"
Apr 02 14:48:00 crc kubenswrapper[4732]: I0402 14:48:00.183245 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585688-xw4zs"
Apr 02 14:48:00 crc kubenswrapper[4732]: I0402 14:48:00.188918 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585688-xw4zs"]
Apr 02 14:48:00 crc kubenswrapper[4732]: I0402 14:48:00.207397 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-69g42"
Apr 02 14:48:00 crc kubenswrapper[4732]: I0402 14:48:00.208167 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Apr 02 14:48:00 crc kubenswrapper[4732]: I0402 14:48:00.208165 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Apr 02 14:48:00 crc kubenswrapper[4732]: I0402 14:48:00.243760 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vx6q\" (UniqueName: \"kubernetes.io/projected/48c33b2c-08b9-49be-9275-14660b4de57d-kube-api-access-4vx6q\") pod \"auto-csr-approver-29585688-xw4zs\" (UID: \"48c33b2c-08b9-49be-9275-14660b4de57d\") " pod="openshift-infra/auto-csr-approver-29585688-xw4zs"
Apr 02 14:48:00 crc kubenswrapper[4732]: I0402 14:48:00.345554 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vx6q\" (UniqueName: \"kubernetes.io/projected/48c33b2c-08b9-49be-9275-14660b4de57d-kube-api-access-4vx6q\") pod \"auto-csr-approver-29585688-xw4zs\" (UID: \"48c33b2c-08b9-49be-9275-14660b4de57d\") " pod="openshift-infra/auto-csr-approver-29585688-xw4zs"
Apr 02 14:48:00 crc kubenswrapper[4732]: I0402 14:48:00.363570 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vx6q\" (UniqueName: \"kubernetes.io/projected/48c33b2c-08b9-49be-9275-14660b4de57d-kube-api-access-4vx6q\") pod \"auto-csr-approver-29585688-xw4zs\" (UID: \"48c33b2c-08b9-49be-9275-14660b4de57d\") " pod="openshift-infra/auto-csr-approver-29585688-xw4zs"
Apr 02 14:48:00 crc kubenswrapper[4732]: I0402 14:48:00.532084 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585688-xw4zs"
Apr 02 14:48:01 crc kubenswrapper[4732]: I0402 14:48:01.003500 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585688-xw4zs"]
Apr 02 14:48:01 crc kubenswrapper[4732]: I0402 14:48:01.267351 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585688-xw4zs" event={"ID":"48c33b2c-08b9-49be-9275-14660b4de57d","Type":"ContainerStarted","Data":"815ef7bceca384d80fb64092df5f5265502295e03bc86be86c454aaee145273a"}
Apr 02 14:48:02 crc kubenswrapper[4732]: I0402 14:48:02.322855 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5bb64cd5d7-5795k_15b5f14a-3755-4967-b789-555f8ac970a2/kube-rbac-proxy/0.log"
Apr 02 14:48:02 crc kubenswrapper[4732]: I0402 14:48:02.430245 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5bb64cd5d7-5795k_15b5f14a-3755-4967-b789-555f8ac970a2/controller/0.log"
Apr 02 14:48:03 crc kubenswrapper[4732]: I0402 14:48:03.081980 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjkgp_268c93ae-fdc1-424e-806c-a4272b6e6ba0/cp-frr-files/0.log"
Apr 02 14:48:03 crc kubenswrapper[4732]: I0402 14:48:03.253930 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjkgp_268c93ae-fdc1-424e-806c-a4272b6e6ba0/cp-frr-files/0.log"
Apr 02 14:48:03 crc kubenswrapper[4732]: I0402 14:48:03.276984 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjkgp_268c93ae-fdc1-424e-806c-a4272b6e6ba0/cp-reloader/0.log"
Apr 02 14:48:03 crc kubenswrapper[4732]: I0402 14:48:03.284489 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585688-xw4zs" event={"ID":"48c33b2c-08b9-49be-9275-14660b4de57d","Type":"ContainerStarted","Data":"46f44336eb7395e1a1e82ff8cacc5899bdd2e483c6ad0652ef728f274a4d9ec9"}
Apr 02 14:48:03 crc kubenswrapper[4732]: I0402 14:48:03.318497 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjkgp_268c93ae-fdc1-424e-806c-a4272b6e6ba0/cp-metrics/0.log"
Apr 02 14:48:03 crc kubenswrapper[4732]: I0402 14:48:03.325431 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29585688-xw4zs" podStartSLOduration=2.032165462 podStartE2EDuration="3.325409711s" podCreationTimestamp="2026-04-02 14:48:00 +0000 UTC" firstStartedPulling="2026-04-02 14:48:01.009863488 +0000 UTC m=+4237.914271041" lastFinishedPulling="2026-04-02 14:48:02.303107737 +0000 UTC m=+4239.207515290" observedRunningTime="2026-04-02 14:48:03.296598044 +0000 UTC m=+4240.201005607" watchObservedRunningTime="2026-04-02 14:48:03.325409711 +0000 UTC m=+4240.229817264"
Apr 02 14:48:03 crc kubenswrapper[4732]: I0402 14:48:03.325847 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjkgp_268c93ae-fdc1-424e-806c-a4272b6e6ba0/cp-reloader/0.log"
Apr 02 14:48:03 crc kubenswrapper[4732]: I0402 14:48:03.493338 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjkgp_268c93ae-fdc1-424e-806c-a4272b6e6ba0/cp-reloader/0.log"
Apr 02 14:48:03 crc kubenswrapper[4732]: I0402 14:48:03.499248 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjkgp_268c93ae-fdc1-424e-806c-a4272b6e6ba0/cp-frr-files/0.log"
Apr 02 14:48:03 crc kubenswrapper[4732]: I0402 14:48:03.513885 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjkgp_268c93ae-fdc1-424e-806c-a4272b6e6ba0/cp-metrics/0.log"
Apr 02 14:48:03 crc kubenswrapper[4732]: I0402 14:48:03.532796 4732 log.go:25] "Finished parsing
log file" path="/var/log/pods/metallb-system_frr-k8s-hjkgp_268c93ae-fdc1-424e-806c-a4272b6e6ba0/cp-metrics/0.log" Apr 02 14:48:03 crc kubenswrapper[4732]: I0402 14:48:03.702380 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjkgp_268c93ae-fdc1-424e-806c-a4272b6e6ba0/cp-frr-files/0.log" Apr 02 14:48:03 crc kubenswrapper[4732]: I0402 14:48:03.712868 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjkgp_268c93ae-fdc1-424e-806c-a4272b6e6ba0/cp-reloader/0.log" Apr 02 14:48:03 crc kubenswrapper[4732]: I0402 14:48:03.737680 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjkgp_268c93ae-fdc1-424e-806c-a4272b6e6ba0/cp-metrics/0.log" Apr 02 14:48:03 crc kubenswrapper[4732]: I0402 14:48:03.764813 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjkgp_268c93ae-fdc1-424e-806c-a4272b6e6ba0/controller/0.log" Apr 02 14:48:03 crc kubenswrapper[4732]: I0402 14:48:03.894262 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjkgp_268c93ae-fdc1-424e-806c-a4272b6e6ba0/frr-metrics/0.log" Apr 02 14:48:03 crc kubenswrapper[4732]: I0402 14:48:03.976876 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjkgp_268c93ae-fdc1-424e-806c-a4272b6e6ba0/kube-rbac-proxy/0.log" Apr 02 14:48:04 crc kubenswrapper[4732]: I0402 14:48:04.001786 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjkgp_268c93ae-fdc1-424e-806c-a4272b6e6ba0/kube-rbac-proxy-frr/0.log" Apr 02 14:48:04 crc kubenswrapper[4732]: I0402 14:48:04.105022 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjkgp_268c93ae-fdc1-424e-806c-a4272b6e6ba0/reloader/0.log" Apr 02 14:48:04 crc kubenswrapper[4732]: I0402 14:48:04.301632 4732 generic.go:334] "Generic (PLEG): container finished" podID="48c33b2c-08b9-49be-9275-14660b4de57d" 
containerID="46f44336eb7395e1a1e82ff8cacc5899bdd2e483c6ad0652ef728f274a4d9ec9" exitCode=0 Apr 02 14:48:04 crc kubenswrapper[4732]: I0402 14:48:04.301888 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585688-xw4zs" event={"ID":"48c33b2c-08b9-49be-9275-14660b4de57d","Type":"ContainerDied","Data":"46f44336eb7395e1a1e82ff8cacc5899bdd2e483c6ad0652ef728f274a4d9ec9"} Apr 02 14:48:04 crc kubenswrapper[4732]: I0402 14:48:04.701949 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-rrzxj_777a90c8-0e68-4362-a696-c92e0a49253f/frr-k8s-webhook-server/0.log" Apr 02 14:48:04 crc kubenswrapper[4732]: I0402 14:48:04.916568 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-86c87c56d7-qlfzr_bd386538-6696-4b2c-96e4-6f8e4b949364/manager/0.log" Apr 02 14:48:04 crc kubenswrapper[4732]: I0402 14:48:04.990209 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-6948d8cf8d-vd8rt_329919c3-94d2-43c2-94a8-2ba9518b98fa/webhook-server/0.log" Apr 02 14:48:05 crc kubenswrapper[4732]: I0402 14:48:05.129288 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-ng8gx_c89914a1-adc8-4baa-ae73-ec02091fca58/kube-rbac-proxy/0.log" Apr 02 14:48:05 crc kubenswrapper[4732]: I0402 14:48:05.710114 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29585688-xw4zs" Apr 02 14:48:05 crc kubenswrapper[4732]: I0402 14:48:05.791492 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjkgp_268c93ae-fdc1-424e-806c-a4272b6e6ba0/frr/0.log" Apr 02 14:48:05 crc kubenswrapper[4732]: I0402 14:48:05.792362 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-ng8gx_c89914a1-adc8-4baa-ae73-ec02091fca58/speaker/0.log" Apr 02 14:48:05 crc kubenswrapper[4732]: I0402 14:48:05.835812 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vx6q\" (UniqueName: \"kubernetes.io/projected/48c33b2c-08b9-49be-9275-14660b4de57d-kube-api-access-4vx6q\") pod \"48c33b2c-08b9-49be-9275-14660b4de57d\" (UID: \"48c33b2c-08b9-49be-9275-14660b4de57d\") " Apr 02 14:48:05 crc kubenswrapper[4732]: I0402 14:48:05.842713 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48c33b2c-08b9-49be-9275-14660b4de57d-kube-api-access-4vx6q" (OuterVolumeSpecName: "kube-api-access-4vx6q") pod "48c33b2c-08b9-49be-9275-14660b4de57d" (UID: "48c33b2c-08b9-49be-9275-14660b4de57d"). InnerVolumeSpecName "kube-api-access-4vx6q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:48:05 crc kubenswrapper[4732]: I0402 14:48:05.940036 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vx6q\" (UniqueName: \"kubernetes.io/projected/48c33b2c-08b9-49be-9275-14660b4de57d-kube-api-access-4vx6q\") on node \"crc\" DevicePath \"\"" Apr 02 14:48:06 crc kubenswrapper[4732]: I0402 14:48:06.322361 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585688-xw4zs" event={"ID":"48c33b2c-08b9-49be-9275-14660b4de57d","Type":"ContainerDied","Data":"815ef7bceca384d80fb64092df5f5265502295e03bc86be86c454aaee145273a"} Apr 02 14:48:06 crc kubenswrapper[4732]: I0402 14:48:06.322399 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="815ef7bceca384d80fb64092df5f5265502295e03bc86be86c454aaee145273a" Apr 02 14:48:06 crc kubenswrapper[4732]: I0402 14:48:06.322433 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585688-xw4zs" Apr 02 14:48:06 crc kubenswrapper[4732]: I0402 14:48:06.372708 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29585682-pz4km"] Apr 02 14:48:06 crc kubenswrapper[4732]: I0402 14:48:06.381307 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29585682-pz4km"] Apr 02 14:48:06 crc kubenswrapper[4732]: I0402 14:48:06.700907 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9b1b986-3261-47c8-8483-6951c031f154" path="/var/lib/kubelet/pods/d9b1b986-3261-47c8-8483-6951c031f154/volumes" Apr 02 14:48:10 crc kubenswrapper[4732]: I0402 14:48:10.680992 4732 scope.go:117] "RemoveContainer" containerID="4cc9425b6dde0d264f8b72482c5821ad9fabf57354100ab51583dde35da85ea0" Apr 02 14:48:10 crc kubenswrapper[4732]: E0402 14:48:10.682121 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" Apr 02 14:48:18 crc kubenswrapper[4732]: I0402 14:48:18.113288 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3eflcgrx_77a3925c-07b0-47ea-950f-524ce995edbf/util/0.log" Apr 02 14:48:18 crc kubenswrapper[4732]: I0402 14:48:18.312175 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3eflcgrx_77a3925c-07b0-47ea-950f-524ce995edbf/util/0.log" Apr 02 14:48:18 crc kubenswrapper[4732]: I0402 14:48:18.313438 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3eflcgrx_77a3925c-07b0-47ea-950f-524ce995edbf/pull/0.log" Apr 02 14:48:18 crc kubenswrapper[4732]: I0402 14:48:18.319446 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3eflcgrx_77a3925c-07b0-47ea-950f-524ce995edbf/pull/0.log" Apr 02 14:48:18 crc kubenswrapper[4732]: I0402 14:48:18.465938 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3eflcgrx_77a3925c-07b0-47ea-950f-524ce995edbf/util/0.log" Apr 02 14:48:18 crc kubenswrapper[4732]: I0402 14:48:18.485337 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3eflcgrx_77a3925c-07b0-47ea-950f-524ce995edbf/extract/0.log" Apr 02 14:48:18 crc kubenswrapper[4732]: I0402 14:48:18.535119 
4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3eflcgrx_77a3925c-07b0-47ea-950f-524ce995edbf/pull/0.log" Apr 02 14:48:18 crc kubenswrapper[4732]: I0402 14:48:18.639179 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645jjqvg_6c632108-b88a-4b6c-9368-49aacc9c04ec/util/0.log" Apr 02 14:48:18 crc kubenswrapper[4732]: I0402 14:48:18.782322 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645jjqvg_6c632108-b88a-4b6c-9368-49aacc9c04ec/util/0.log" Apr 02 14:48:18 crc kubenswrapper[4732]: I0402 14:48:18.815404 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645jjqvg_6c632108-b88a-4b6c-9368-49aacc9c04ec/pull/0.log" Apr 02 14:48:18 crc kubenswrapper[4732]: I0402 14:48:18.818361 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645jjqvg_6c632108-b88a-4b6c-9368-49aacc9c04ec/pull/0.log" Apr 02 14:48:19 crc kubenswrapper[4732]: I0402 14:48:19.003002 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645jjqvg_6c632108-b88a-4b6c-9368-49aacc9c04ec/pull/0.log" Apr 02 14:48:19 crc kubenswrapper[4732]: I0402 14:48:19.018801 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645jjqvg_6c632108-b88a-4b6c-9368-49aacc9c04ec/extract/0.log" Apr 02 14:48:19 crc kubenswrapper[4732]: I0402 14:48:19.041987 4732 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645jjqvg_6c632108-b88a-4b6c-9368-49aacc9c04ec/util/0.log" Apr 02 14:48:19 crc kubenswrapper[4732]: I0402 14:48:19.168399 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6265v_5d2e3468-f731-45d7-bc5f-5c32a739a196/extract-utilities/0.log" Apr 02 14:48:19 crc kubenswrapper[4732]: I0402 14:48:19.324238 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6265v_5d2e3468-f731-45d7-bc5f-5c32a739a196/extract-utilities/0.log" Apr 02 14:48:19 crc kubenswrapper[4732]: I0402 14:48:19.325897 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6265v_5d2e3468-f731-45d7-bc5f-5c32a739a196/extract-content/0.log" Apr 02 14:48:19 crc kubenswrapper[4732]: I0402 14:48:19.330551 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6265v_5d2e3468-f731-45d7-bc5f-5c32a739a196/extract-content/0.log" Apr 02 14:48:19 crc kubenswrapper[4732]: I0402 14:48:19.505410 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6265v_5d2e3468-f731-45d7-bc5f-5c32a739a196/extract-content/0.log" Apr 02 14:48:19 crc kubenswrapper[4732]: I0402 14:48:19.536941 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6265v_5d2e3468-f731-45d7-bc5f-5c32a739a196/extract-utilities/0.log" Apr 02 14:48:19 crc kubenswrapper[4732]: I0402 14:48:19.673800 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2mvbv_7eb89e74-cc6f-4763-a2ed-e4a3e8c2f89b/extract-utilities/0.log" Apr 02 14:48:19 crc kubenswrapper[4732]: I0402 14:48:19.941724 4732 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-2mvbv_7eb89e74-cc6f-4763-a2ed-e4a3e8c2f89b/extract-utilities/0.log" Apr 02 14:48:20 crc kubenswrapper[4732]: I0402 14:48:20.000082 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2mvbv_7eb89e74-cc6f-4763-a2ed-e4a3e8c2f89b/extract-content/0.log" Apr 02 14:48:20 crc kubenswrapper[4732]: I0402 14:48:20.038422 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2mvbv_7eb89e74-cc6f-4763-a2ed-e4a3e8c2f89b/extract-content/0.log" Apr 02 14:48:20 crc kubenswrapper[4732]: I0402 14:48:20.117532 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6265v_5d2e3468-f731-45d7-bc5f-5c32a739a196/registry-server/0.log" Apr 02 14:48:20 crc kubenswrapper[4732]: I0402 14:48:20.219434 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2mvbv_7eb89e74-cc6f-4763-a2ed-e4a3e8c2f89b/extract-content/0.log" Apr 02 14:48:20 crc kubenswrapper[4732]: I0402 14:48:20.219459 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2mvbv_7eb89e74-cc6f-4763-a2ed-e4a3e8c2f89b/extract-utilities/0.log" Apr 02 14:48:20 crc kubenswrapper[4732]: I0402 14:48:20.472312 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-w7mzq_c2c9f0ff-65e0-4d5f-8518-4461263be6c2/marketplace-operator/0.log" Apr 02 14:48:20 crc kubenswrapper[4732]: I0402 14:48:20.641339 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8jlqh_76b4f705-b92d-4b11-8ddf-a4252a98c37d/extract-utilities/0.log" Apr 02 14:48:20 crc kubenswrapper[4732]: I0402 14:48:20.801362 4732 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-2mvbv_7eb89e74-cc6f-4763-a2ed-e4a3e8c2f89b/registry-server/0.log" Apr 02 14:48:20 crc kubenswrapper[4732]: I0402 14:48:20.856232 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8jlqh_76b4f705-b92d-4b11-8ddf-a4252a98c37d/extract-utilities/0.log" Apr 02 14:48:20 crc kubenswrapper[4732]: I0402 14:48:20.868160 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8jlqh_76b4f705-b92d-4b11-8ddf-a4252a98c37d/extract-content/0.log" Apr 02 14:48:20 crc kubenswrapper[4732]: I0402 14:48:20.896299 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8jlqh_76b4f705-b92d-4b11-8ddf-a4252a98c37d/extract-content/0.log" Apr 02 14:48:21 crc kubenswrapper[4732]: I0402 14:48:21.065563 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8jlqh_76b4f705-b92d-4b11-8ddf-a4252a98c37d/extract-content/0.log" Apr 02 14:48:21 crc kubenswrapper[4732]: I0402 14:48:21.067439 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8jlqh_76b4f705-b92d-4b11-8ddf-a4252a98c37d/extract-utilities/0.log" Apr 02 14:48:21 crc kubenswrapper[4732]: I0402 14:48:21.202671 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8jlqh_76b4f705-b92d-4b11-8ddf-a4252a98c37d/registry-server/0.log" Apr 02 14:48:21 crc kubenswrapper[4732]: I0402 14:48:21.324418 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bn5br_28a06597-0ccd-4f06-b2d7-ff634ff90d24/extract-utilities/0.log" Apr 02 14:48:21 crc kubenswrapper[4732]: I0402 14:48:21.469577 4732 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-bn5br_28a06597-0ccd-4f06-b2d7-ff634ff90d24/extract-content/0.log" Apr 02 14:48:21 crc kubenswrapper[4732]: I0402 14:48:21.479953 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bn5br_28a06597-0ccd-4f06-b2d7-ff634ff90d24/extract-content/0.log" Apr 02 14:48:21 crc kubenswrapper[4732]: I0402 14:48:21.527189 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bn5br_28a06597-0ccd-4f06-b2d7-ff634ff90d24/extract-utilities/0.log" Apr 02 14:48:21 crc kubenswrapper[4732]: I0402 14:48:21.647079 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bn5br_28a06597-0ccd-4f06-b2d7-ff634ff90d24/extract-utilities/0.log" Apr 02 14:48:21 crc kubenswrapper[4732]: I0402 14:48:21.659267 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bn5br_28a06597-0ccd-4f06-b2d7-ff634ff90d24/extract-content/0.log" Apr 02 14:48:21 crc kubenswrapper[4732]: I0402 14:48:21.842863 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bn5br_28a06597-0ccd-4f06-b2d7-ff634ff90d24/registry-server/0.log" Apr 02 14:48:23 crc kubenswrapper[4732]: I0402 14:48:23.680783 4732 scope.go:117] "RemoveContainer" containerID="4cc9425b6dde0d264f8b72482c5821ad9fabf57354100ab51583dde35da85ea0" Apr 02 14:48:23 crc kubenswrapper[4732]: E0402 14:48:23.682265 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6vtmw_openshift-machine-config-operator(38409e5e-4545-49da-8f6c-4bfb30582878)\"" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" Apr 02 14:48:25 crc 
kubenswrapper[4732]: I0402 14:48:25.664629 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sr2l6"] Apr 02 14:48:25 crc kubenswrapper[4732]: E0402 14:48:25.665565 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48c33b2c-08b9-49be-9275-14660b4de57d" containerName="oc" Apr 02 14:48:25 crc kubenswrapper[4732]: I0402 14:48:25.665579 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="48c33b2c-08b9-49be-9275-14660b4de57d" containerName="oc" Apr 02 14:48:25 crc kubenswrapper[4732]: I0402 14:48:25.665772 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="48c33b2c-08b9-49be-9275-14660b4de57d" containerName="oc" Apr 02 14:48:25 crc kubenswrapper[4732]: I0402 14:48:25.667232 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sr2l6" Apr 02 14:48:25 crc kubenswrapper[4732]: I0402 14:48:25.695195 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sr2l6"] Apr 02 14:48:25 crc kubenswrapper[4732]: I0402 14:48:25.825949 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e147561-99b2-4a06-9ec6-772dcf98f37c-utilities\") pod \"community-operators-sr2l6\" (UID: \"4e147561-99b2-4a06-9ec6-772dcf98f37c\") " pod="openshift-marketplace/community-operators-sr2l6" Apr 02 14:48:25 crc kubenswrapper[4732]: I0402 14:48:25.826072 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e147561-99b2-4a06-9ec6-772dcf98f37c-catalog-content\") pod \"community-operators-sr2l6\" (UID: \"4e147561-99b2-4a06-9ec6-772dcf98f37c\") " pod="openshift-marketplace/community-operators-sr2l6" Apr 02 14:48:25 crc kubenswrapper[4732]: I0402 14:48:25.826269 4732 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4c2gk\" (UniqueName: \"kubernetes.io/projected/4e147561-99b2-4a06-9ec6-772dcf98f37c-kube-api-access-4c2gk\") pod \"community-operators-sr2l6\" (UID: \"4e147561-99b2-4a06-9ec6-772dcf98f37c\") " pod="openshift-marketplace/community-operators-sr2l6" Apr 02 14:48:25 crc kubenswrapper[4732]: I0402 14:48:25.927936 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e147561-99b2-4a06-9ec6-772dcf98f37c-utilities\") pod \"community-operators-sr2l6\" (UID: \"4e147561-99b2-4a06-9ec6-772dcf98f37c\") " pod="openshift-marketplace/community-operators-sr2l6" Apr 02 14:48:25 crc kubenswrapper[4732]: I0402 14:48:25.928027 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e147561-99b2-4a06-9ec6-772dcf98f37c-catalog-content\") pod \"community-operators-sr2l6\" (UID: \"4e147561-99b2-4a06-9ec6-772dcf98f37c\") " pod="openshift-marketplace/community-operators-sr2l6" Apr 02 14:48:25 crc kubenswrapper[4732]: I0402 14:48:25.928103 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4c2gk\" (UniqueName: \"kubernetes.io/projected/4e147561-99b2-4a06-9ec6-772dcf98f37c-kube-api-access-4c2gk\") pod \"community-operators-sr2l6\" (UID: \"4e147561-99b2-4a06-9ec6-772dcf98f37c\") " pod="openshift-marketplace/community-operators-sr2l6" Apr 02 14:48:25 crc kubenswrapper[4732]: I0402 14:48:25.928431 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e147561-99b2-4a06-9ec6-772dcf98f37c-utilities\") pod \"community-operators-sr2l6\" (UID: \"4e147561-99b2-4a06-9ec6-772dcf98f37c\") " pod="openshift-marketplace/community-operators-sr2l6" Apr 02 14:48:25 crc kubenswrapper[4732]: I0402 14:48:25.928554 4732 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e147561-99b2-4a06-9ec6-772dcf98f37c-catalog-content\") pod \"community-operators-sr2l6\" (UID: \"4e147561-99b2-4a06-9ec6-772dcf98f37c\") " pod="openshift-marketplace/community-operators-sr2l6" Apr 02 14:48:25 crc kubenswrapper[4732]: I0402 14:48:25.952457 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4c2gk\" (UniqueName: \"kubernetes.io/projected/4e147561-99b2-4a06-9ec6-772dcf98f37c-kube-api-access-4c2gk\") pod \"community-operators-sr2l6\" (UID: \"4e147561-99b2-4a06-9ec6-772dcf98f37c\") " pod="openshift-marketplace/community-operators-sr2l6" Apr 02 14:48:26 crc kubenswrapper[4732]: I0402 14:48:26.001222 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sr2l6" Apr 02 14:48:26 crc kubenswrapper[4732]: I0402 14:48:26.577008 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sr2l6"] Apr 02 14:48:27 crc kubenswrapper[4732]: I0402 14:48:27.505209 4732 generic.go:334] "Generic (PLEG): container finished" podID="4e147561-99b2-4a06-9ec6-772dcf98f37c" containerID="685d7b0aa96d3bb00a6d2a39043edefda6e32cecb577e3a393b2322eee132431" exitCode=0 Apr 02 14:48:27 crc kubenswrapper[4732]: I0402 14:48:27.505279 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sr2l6" event={"ID":"4e147561-99b2-4a06-9ec6-772dcf98f37c","Type":"ContainerDied","Data":"685d7b0aa96d3bb00a6d2a39043edefda6e32cecb577e3a393b2322eee132431"} Apr 02 14:48:27 crc kubenswrapper[4732]: I0402 14:48:27.506463 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sr2l6" event={"ID":"4e147561-99b2-4a06-9ec6-772dcf98f37c","Type":"ContainerStarted","Data":"47579b8664de9b5ba076b6a3191f1a0d19b29b32a2dd3fa9ea86116284001d48"} Apr 02 14:48:28 crc 
kubenswrapper[4732]: I0402 14:48:28.517844 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sr2l6" event={"ID":"4e147561-99b2-4a06-9ec6-772dcf98f37c","Type":"ContainerStarted","Data":"e99b75698ccbc623d55d9d8f5ebf6f64446cee11baf5dfe4a6a1fc4bdb0c6bf5"} Apr 02 14:48:29 crc kubenswrapper[4732]: I0402 14:48:29.530442 4732 generic.go:334] "Generic (PLEG): container finished" podID="4e147561-99b2-4a06-9ec6-772dcf98f37c" containerID="e99b75698ccbc623d55d9d8f5ebf6f64446cee11baf5dfe4a6a1fc4bdb0c6bf5" exitCode=0 Apr 02 14:48:29 crc kubenswrapper[4732]: I0402 14:48:29.530498 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sr2l6" event={"ID":"4e147561-99b2-4a06-9ec6-772dcf98f37c","Type":"ContainerDied","Data":"e99b75698ccbc623d55d9d8f5ebf6f64446cee11baf5dfe4a6a1fc4bdb0c6bf5"} Apr 02 14:48:30 crc kubenswrapper[4732]: I0402 14:48:30.544554 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sr2l6" event={"ID":"4e147561-99b2-4a06-9ec6-772dcf98f37c","Type":"ContainerStarted","Data":"45dbde2b341d205d3059ed9d906a8ef180bd7865c61041eb5974ca01730d1dea"} Apr 02 14:48:30 crc kubenswrapper[4732]: I0402 14:48:30.574165 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-sr2l6" podStartSLOduration=3.144260114 podStartE2EDuration="5.574144477s" podCreationTimestamp="2026-04-02 14:48:25 +0000 UTC" firstStartedPulling="2026-04-02 14:48:27.508246384 +0000 UTC m=+4264.412653937" lastFinishedPulling="2026-04-02 14:48:29.938130747 +0000 UTC m=+4266.842538300" observedRunningTime="2026-04-02 14:48:30.568295647 +0000 UTC m=+4267.472703220" watchObservedRunningTime="2026-04-02 14:48:30.574144477 +0000 UTC m=+4267.478552030" Apr 02 14:48:36 crc kubenswrapper[4732]: I0402 14:48:36.002136 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-sr2l6"
Apr 02 14:48:36 crc kubenswrapper[4732]: I0402 14:48:36.002698 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-sr2l6"
Apr 02 14:48:36 crc kubenswrapper[4732]: I0402 14:48:36.055689 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-sr2l6"
Apr 02 14:48:36 crc kubenswrapper[4732]: I0402 14:48:36.640157 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sr2l6"
Apr 02 14:48:36 crc kubenswrapper[4732]: I0402 14:48:36.697460 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sr2l6"]
Apr 02 14:48:37 crc kubenswrapper[4732]: I0402 14:48:37.683977 4732 scope.go:117] "RemoveContainer" containerID="4cc9425b6dde0d264f8b72482c5821ad9fabf57354100ab51583dde35da85ea0"
Apr 02 14:48:38 crc kubenswrapper[4732]: I0402 14:48:38.611205 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" event={"ID":"38409e5e-4545-49da-8f6c-4bfb30582878","Type":"ContainerStarted","Data":"c6ec8830c9c22d8d796c164088d1dd9b788e6c5f062d2aea3a3dd2f0912e8bde"}
Apr 02 14:48:38 crc kubenswrapper[4732]: I0402 14:48:38.611426 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-sr2l6" podUID="4e147561-99b2-4a06-9ec6-772dcf98f37c" containerName="registry-server" containerID="cri-o://45dbde2b341d205d3059ed9d906a8ef180bd7865c61041eb5974ca01730d1dea" gracePeriod=2
Apr 02 14:48:39 crc kubenswrapper[4732]: I0402 14:48:39.641693 4732 generic.go:334] "Generic (PLEG): container finished" podID="4e147561-99b2-4a06-9ec6-772dcf98f37c" containerID="45dbde2b341d205d3059ed9d906a8ef180bd7865c61041eb5974ca01730d1dea" exitCode=0
Apr 02 14:48:39 crc kubenswrapper[4732]: I0402 14:48:39.642174 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sr2l6" event={"ID":"4e147561-99b2-4a06-9ec6-772dcf98f37c","Type":"ContainerDied","Data":"45dbde2b341d205d3059ed9d906a8ef180bd7865c61041eb5974ca01730d1dea"}
Apr 02 14:48:39 crc kubenswrapper[4732]: I0402 14:48:39.818526 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sr2l6"
Apr 02 14:48:39 crc kubenswrapper[4732]: I0402 14:48:39.930013 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e147561-99b2-4a06-9ec6-772dcf98f37c-catalog-content\") pod \"4e147561-99b2-4a06-9ec6-772dcf98f37c\" (UID: \"4e147561-99b2-4a06-9ec6-772dcf98f37c\") "
Apr 02 14:48:39 crc kubenswrapper[4732]: I0402 14:48:39.930064 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4c2gk\" (UniqueName: \"kubernetes.io/projected/4e147561-99b2-4a06-9ec6-772dcf98f37c-kube-api-access-4c2gk\") pod \"4e147561-99b2-4a06-9ec6-772dcf98f37c\" (UID: \"4e147561-99b2-4a06-9ec6-772dcf98f37c\") "
Apr 02 14:48:39 crc kubenswrapper[4732]: I0402 14:48:39.930148 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e147561-99b2-4a06-9ec6-772dcf98f37c-utilities\") pod \"4e147561-99b2-4a06-9ec6-772dcf98f37c\" (UID: \"4e147561-99b2-4a06-9ec6-772dcf98f37c\") "
Apr 02 14:48:39 crc kubenswrapper[4732]: I0402 14:48:39.930901 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e147561-99b2-4a06-9ec6-772dcf98f37c-utilities" (OuterVolumeSpecName: "utilities") pod "4e147561-99b2-4a06-9ec6-772dcf98f37c" (UID: "4e147561-99b2-4a06-9ec6-772dcf98f37c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 02 14:48:40 crc kubenswrapper[4732]: I0402 14:48:40.034762 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e147561-99b2-4a06-9ec6-772dcf98f37c-utilities\") on node \"crc\" DevicePath \"\""
Apr 02 14:48:40 crc kubenswrapper[4732]: I0402 14:48:40.051293 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e147561-99b2-4a06-9ec6-772dcf98f37c-kube-api-access-4c2gk" (OuterVolumeSpecName: "kube-api-access-4c2gk") pod "4e147561-99b2-4a06-9ec6-772dcf98f37c" (UID: "4e147561-99b2-4a06-9ec6-772dcf98f37c"). InnerVolumeSpecName "kube-api-access-4c2gk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 14:48:40 crc kubenswrapper[4732]: I0402 14:48:40.132412 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e147561-99b2-4a06-9ec6-772dcf98f37c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4e147561-99b2-4a06-9ec6-772dcf98f37c" (UID: "4e147561-99b2-4a06-9ec6-772dcf98f37c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 02 14:48:40 crc kubenswrapper[4732]: I0402 14:48:40.136622 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e147561-99b2-4a06-9ec6-772dcf98f37c-catalog-content\") on node \"crc\" DevicePath \"\""
Apr 02 14:48:40 crc kubenswrapper[4732]: I0402 14:48:40.136649 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4c2gk\" (UniqueName: \"kubernetes.io/projected/4e147561-99b2-4a06-9ec6-772dcf98f37c-kube-api-access-4c2gk\") on node \"crc\" DevicePath \"\""
Apr 02 14:48:40 crc kubenswrapper[4732]: I0402 14:48:40.658355 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sr2l6" event={"ID":"4e147561-99b2-4a06-9ec6-772dcf98f37c","Type":"ContainerDied","Data":"47579b8664de9b5ba076b6a3191f1a0d19b29b32a2dd3fa9ea86116284001d48"}
Apr 02 14:48:40 crc kubenswrapper[4732]: I0402 14:48:40.658715 4732 scope.go:117] "RemoveContainer" containerID="45dbde2b341d205d3059ed9d906a8ef180bd7865c61041eb5974ca01730d1dea"
Apr 02 14:48:40 crc kubenswrapper[4732]: I0402 14:48:40.658901 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sr2l6"
Apr 02 14:48:40 crc kubenswrapper[4732]: I0402 14:48:40.718870 4732 scope.go:117] "RemoveContainer" containerID="e99b75698ccbc623d55d9d8f5ebf6f64446cee11baf5dfe4a6a1fc4bdb0c6bf5"
Apr 02 14:48:40 crc kubenswrapper[4732]: I0402 14:48:40.772453 4732 scope.go:117] "RemoveContainer" containerID="685d7b0aa96d3bb00a6d2a39043edefda6e32cecb577e3a393b2322eee132431"
Apr 02 14:48:40 crc kubenswrapper[4732]: I0402 14:48:40.782268 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sr2l6"]
Apr 02 14:48:40 crc kubenswrapper[4732]: I0402 14:48:40.799284 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-sr2l6"]
Apr 02 14:48:42 crc kubenswrapper[4732]: I0402 14:48:42.692395 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e147561-99b2-4a06-9ec6-772dcf98f37c" path="/var/lib/kubelet/pods/4e147561-99b2-4a06-9ec6-772dcf98f37c/volumes"
Apr 02 14:48:53 crc kubenswrapper[4732]: I0402 14:48:53.893131 4732 scope.go:117] "RemoveContainer" containerID="6208c1665c72ff244438793fdbe555f38aad7bda2dcdc0b338432085591a69fd"
Apr 02 14:50:00 crc kubenswrapper[4732]: I0402 14:50:00.145537 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29585690-w726x"]
Apr 02 14:50:00 crc kubenswrapper[4732]: E0402 14:50:00.146462 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e147561-99b2-4a06-9ec6-772dcf98f37c" containerName="registry-server"
Apr 02 14:50:00 crc kubenswrapper[4732]: I0402 14:50:00.146477 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e147561-99b2-4a06-9ec6-772dcf98f37c" containerName="registry-server"
Apr 02 14:50:00 crc kubenswrapper[4732]: E0402 14:50:00.146494 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e147561-99b2-4a06-9ec6-772dcf98f37c" containerName="extract-utilities"
Apr 02 14:50:00 crc kubenswrapper[4732]: I0402 14:50:00.146500 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e147561-99b2-4a06-9ec6-772dcf98f37c" containerName="extract-utilities"
Apr 02 14:50:00 crc kubenswrapper[4732]: E0402 14:50:00.146517 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e147561-99b2-4a06-9ec6-772dcf98f37c" containerName="extract-content"
Apr 02 14:50:00 crc kubenswrapper[4732]: I0402 14:50:00.146523 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e147561-99b2-4a06-9ec6-772dcf98f37c" containerName="extract-content"
Apr 02 14:50:00 crc kubenswrapper[4732]: I0402 14:50:00.146729 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e147561-99b2-4a06-9ec6-772dcf98f37c" containerName="registry-server"
Apr 02 14:50:00 crc kubenswrapper[4732]: I0402 14:50:00.147349 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585690-w726x"
Apr 02 14:50:00 crc kubenswrapper[4732]: I0402 14:50:00.149081 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-69g42"
Apr 02 14:50:00 crc kubenswrapper[4732]: I0402 14:50:00.149986 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Apr 02 14:50:00 crc kubenswrapper[4732]: I0402 14:50:00.150116 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Apr 02 14:50:00 crc kubenswrapper[4732]: I0402 14:50:00.157214 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585690-w726x"]
Apr 02 14:50:00 crc kubenswrapper[4732]: I0402 14:50:00.218384 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twtb8\" (UniqueName: \"kubernetes.io/projected/4a24efd3-0bf8-4d67-a529-5c8174ca09fd-kube-api-access-twtb8\") pod \"auto-csr-approver-29585690-w726x\" (UID: \"4a24efd3-0bf8-4d67-a529-5c8174ca09fd\") " pod="openshift-infra/auto-csr-approver-29585690-w726x"
Apr 02 14:50:00 crc kubenswrapper[4732]: I0402 14:50:00.319917 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twtb8\" (UniqueName: \"kubernetes.io/projected/4a24efd3-0bf8-4d67-a529-5c8174ca09fd-kube-api-access-twtb8\") pod \"auto-csr-approver-29585690-w726x\" (UID: \"4a24efd3-0bf8-4d67-a529-5c8174ca09fd\") " pod="openshift-infra/auto-csr-approver-29585690-w726x"
Apr 02 14:50:00 crc kubenswrapper[4732]: I0402 14:50:00.356412 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twtb8\" (UniqueName: \"kubernetes.io/projected/4a24efd3-0bf8-4d67-a529-5c8174ca09fd-kube-api-access-twtb8\") pod \"auto-csr-approver-29585690-w726x\" (UID: \"4a24efd3-0bf8-4d67-a529-5c8174ca09fd\") " pod="openshift-infra/auto-csr-approver-29585690-w726x"
Apr 02 14:50:00 crc kubenswrapper[4732]: I0402 14:50:00.465000 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585690-w726x"
Apr 02 14:50:00 crc kubenswrapper[4732]: I0402 14:50:00.954342 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585690-w726x"]
Apr 02 14:50:01 crc kubenswrapper[4732]: I0402 14:50:01.438702 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585690-w726x" event={"ID":"4a24efd3-0bf8-4d67-a529-5c8174ca09fd","Type":"ContainerStarted","Data":"ab8fdf27171970d29687555e5b32de74785f0f94661c4376cede1d5d0efa657d"}
Apr 02 14:50:03 crc kubenswrapper[4732]: I0402 14:50:03.458473 4732 generic.go:334] "Generic (PLEG): container finished" podID="4a24efd3-0bf8-4d67-a529-5c8174ca09fd" containerID="6b666539d7bb36a926ab5c3434ce751b6e526881b7b1439dee3ec215907f3620" exitCode=0
Apr 02 14:50:03 crc kubenswrapper[4732]: I0402 14:50:03.458544 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585690-w726x" event={"ID":"4a24efd3-0bf8-4d67-a529-5c8174ca09fd","Type":"ContainerDied","Data":"6b666539d7bb36a926ab5c3434ce751b6e526881b7b1439dee3ec215907f3620"}
Apr 02 14:50:04 crc kubenswrapper[4732]: I0402 14:50:04.814426 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585690-w726x"
Apr 02 14:50:04 crc kubenswrapper[4732]: I0402 14:50:04.917003 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twtb8\" (UniqueName: \"kubernetes.io/projected/4a24efd3-0bf8-4d67-a529-5c8174ca09fd-kube-api-access-twtb8\") pod \"4a24efd3-0bf8-4d67-a529-5c8174ca09fd\" (UID: \"4a24efd3-0bf8-4d67-a529-5c8174ca09fd\") "
Apr 02 14:50:04 crc kubenswrapper[4732]: I0402 14:50:04.922161 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a24efd3-0bf8-4d67-a529-5c8174ca09fd-kube-api-access-twtb8" (OuterVolumeSpecName: "kube-api-access-twtb8") pod "4a24efd3-0bf8-4d67-a529-5c8174ca09fd" (UID: "4a24efd3-0bf8-4d67-a529-5c8174ca09fd"). InnerVolumeSpecName "kube-api-access-twtb8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 14:50:05 crc kubenswrapper[4732]: I0402 14:50:05.019062 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twtb8\" (UniqueName: \"kubernetes.io/projected/4a24efd3-0bf8-4d67-a529-5c8174ca09fd-kube-api-access-twtb8\") on node \"crc\" DevicePath \"\""
Apr 02 14:50:05 crc kubenswrapper[4732]: I0402 14:50:05.495465 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585690-w726x" event={"ID":"4a24efd3-0bf8-4d67-a529-5c8174ca09fd","Type":"ContainerDied","Data":"ab8fdf27171970d29687555e5b32de74785f0f94661c4376cede1d5d0efa657d"}
Apr 02 14:50:05 crc kubenswrapper[4732]: I0402 14:50:05.495516 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab8fdf27171970d29687555e5b32de74785f0f94661c4376cede1d5d0efa657d"
Apr 02 14:50:05 crc kubenswrapper[4732]: I0402 14:50:05.495543 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585690-w726x"
Apr 02 14:50:05 crc kubenswrapper[4732]: I0402 14:50:05.894337 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29585684-bb29b"]
Apr 02 14:50:05 crc kubenswrapper[4732]: I0402 14:50:05.903424 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29585684-bb29b"]
Apr 02 14:50:06 crc kubenswrapper[4732]: I0402 14:50:06.700714 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a6a3e51-c56e-43ec-8508-45e7abb53983" path="/var/lib/kubelet/pods/6a6a3e51-c56e-43ec-8508-45e7abb53983/volumes"
Apr 02 14:50:14 crc kubenswrapper[4732]: I0402 14:50:14.587347 4732 generic.go:334] "Generic (PLEG): container finished" podID="2033c214-17d0-4695-8655-f142b35e0518" containerID="4fdc5ea3309b18ddf66f90cafd00459635ffbc886f6ca2b56d49e846bfc0355c" exitCode=0
Apr 02 14:50:14 crc kubenswrapper[4732]: I0402 14:50:14.587443 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6jxwm/must-gather-gwhm7" event={"ID":"2033c214-17d0-4695-8655-f142b35e0518","Type":"ContainerDied","Data":"4fdc5ea3309b18ddf66f90cafd00459635ffbc886f6ca2b56d49e846bfc0355c"}
Apr 02 14:50:14 crc kubenswrapper[4732]: I0402 14:50:14.588525 4732 scope.go:117] "RemoveContainer" containerID="4fdc5ea3309b18ddf66f90cafd00459635ffbc886f6ca2b56d49e846bfc0355c"
Apr 02 14:50:15 crc kubenswrapper[4732]: I0402 14:50:15.452097 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6jxwm_must-gather-gwhm7_2033c214-17d0-4695-8655-f142b35e0518/gather/0.log"
Apr 02 14:50:27 crc kubenswrapper[4732]: I0402 14:50:27.162811 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-6jxwm/must-gather-gwhm7"]
Apr 02 14:50:27 crc kubenswrapper[4732]: I0402 14:50:27.163751 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-6jxwm/must-gather-gwhm7" podUID="2033c214-17d0-4695-8655-f142b35e0518" containerName="copy" containerID="cri-o://141655c1dc953019c7e69575f337c44e388c4da53c4a597b0d55f3f76ecc204e" gracePeriod=2
Apr 02 14:50:27 crc kubenswrapper[4732]: I0402 14:50:27.175607 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-6jxwm/must-gather-gwhm7"]
Apr 02 14:50:27 crc kubenswrapper[4732]: I0402 14:50:27.594523 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6jxwm_must-gather-gwhm7_2033c214-17d0-4695-8655-f142b35e0518/copy/0.log"
Apr 02 14:50:27 crc kubenswrapper[4732]: I0402 14:50:27.595914 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6jxwm/must-gather-gwhm7"
Apr 02 14:50:27 crc kubenswrapper[4732]: I0402 14:50:27.714430 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2033c214-17d0-4695-8655-f142b35e0518-must-gather-output\") pod \"2033c214-17d0-4695-8655-f142b35e0518\" (UID: \"2033c214-17d0-4695-8655-f142b35e0518\") "
Apr 02 14:50:27 crc kubenswrapper[4732]: I0402 14:50:27.714508 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbclr\" (UniqueName: \"kubernetes.io/projected/2033c214-17d0-4695-8655-f142b35e0518-kube-api-access-tbclr\") pod \"2033c214-17d0-4695-8655-f142b35e0518\" (UID: \"2033c214-17d0-4695-8655-f142b35e0518\") "
Apr 02 14:50:27 crc kubenswrapper[4732]: I0402 14:50:27.715309 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6jxwm_must-gather-gwhm7_2033c214-17d0-4695-8655-f142b35e0518/copy/0.log"
Apr 02 14:50:27 crc kubenswrapper[4732]: I0402 14:50:27.715686 4732 generic.go:334] "Generic (PLEG): container finished" podID="2033c214-17d0-4695-8655-f142b35e0518" containerID="141655c1dc953019c7e69575f337c44e388c4da53c4a597b0d55f3f76ecc204e" exitCode=143
Apr 02 14:50:27 crc kubenswrapper[4732]: I0402 14:50:27.715738 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6jxwm/must-gather-gwhm7"
Apr 02 14:50:27 crc kubenswrapper[4732]: I0402 14:50:27.715745 4732 scope.go:117] "RemoveContainer" containerID="141655c1dc953019c7e69575f337c44e388c4da53c4a597b0d55f3f76ecc204e"
Apr 02 14:50:27 crc kubenswrapper[4732]: I0402 14:50:27.721133 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2033c214-17d0-4695-8655-f142b35e0518-kube-api-access-tbclr" (OuterVolumeSpecName: "kube-api-access-tbclr") pod "2033c214-17d0-4695-8655-f142b35e0518" (UID: "2033c214-17d0-4695-8655-f142b35e0518"). InnerVolumeSpecName "kube-api-access-tbclr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 14:50:27 crc kubenswrapper[4732]: I0402 14:50:27.793298 4732 scope.go:117] "RemoveContainer" containerID="4fdc5ea3309b18ddf66f90cafd00459635ffbc886f6ca2b56d49e846bfc0355c"
Apr 02 14:50:27 crc kubenswrapper[4732]: I0402 14:50:27.817941 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tbclr\" (UniqueName: \"kubernetes.io/projected/2033c214-17d0-4695-8655-f142b35e0518-kube-api-access-tbclr\") on node \"crc\" DevicePath \"\""
Apr 02 14:50:27 crc kubenswrapper[4732]: I0402 14:50:27.835550 4732 scope.go:117] "RemoveContainer" containerID="141655c1dc953019c7e69575f337c44e388c4da53c4a597b0d55f3f76ecc204e"
Apr 02 14:50:27 crc kubenswrapper[4732]: E0402 14:50:27.835996 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"141655c1dc953019c7e69575f337c44e388c4da53c4a597b0d55f3f76ecc204e\": container with ID starting with 141655c1dc953019c7e69575f337c44e388c4da53c4a597b0d55f3f76ecc204e not found: ID does not exist" containerID="141655c1dc953019c7e69575f337c44e388c4da53c4a597b0d55f3f76ecc204e"
Apr 02 14:50:27 crc kubenswrapper[4732]: I0402 14:50:27.836031 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"141655c1dc953019c7e69575f337c44e388c4da53c4a597b0d55f3f76ecc204e"} err="failed to get container status \"141655c1dc953019c7e69575f337c44e388c4da53c4a597b0d55f3f76ecc204e\": rpc error: code = NotFound desc = could not find container \"141655c1dc953019c7e69575f337c44e388c4da53c4a597b0d55f3f76ecc204e\": container with ID starting with 141655c1dc953019c7e69575f337c44e388c4da53c4a597b0d55f3f76ecc204e not found: ID does not exist"
Apr 02 14:50:27 crc kubenswrapper[4732]: I0402 14:50:27.836056 4732 scope.go:117] "RemoveContainer" containerID="4fdc5ea3309b18ddf66f90cafd00459635ffbc886f6ca2b56d49e846bfc0355c"
Apr 02 14:50:27 crc kubenswrapper[4732]: E0402 14:50:27.836682 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fdc5ea3309b18ddf66f90cafd00459635ffbc886f6ca2b56d49e846bfc0355c\": container with ID starting with 4fdc5ea3309b18ddf66f90cafd00459635ffbc886f6ca2b56d49e846bfc0355c not found: ID does not exist" containerID="4fdc5ea3309b18ddf66f90cafd00459635ffbc886f6ca2b56d49e846bfc0355c"
Apr 02 14:50:27 crc kubenswrapper[4732]: I0402 14:50:27.836734 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fdc5ea3309b18ddf66f90cafd00459635ffbc886f6ca2b56d49e846bfc0355c"} err="failed to get container status \"4fdc5ea3309b18ddf66f90cafd00459635ffbc886f6ca2b56d49e846bfc0355c\": rpc error: code = NotFound desc = could not find container \"4fdc5ea3309b18ddf66f90cafd00459635ffbc886f6ca2b56d49e846bfc0355c\": container with ID starting with 4fdc5ea3309b18ddf66f90cafd00459635ffbc886f6ca2b56d49e846bfc0355c not found: ID does not exist"
Apr 02 14:50:27 crc kubenswrapper[4732]: I0402 14:50:27.874510 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2033c214-17d0-4695-8655-f142b35e0518-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "2033c214-17d0-4695-8655-f142b35e0518" (UID: "2033c214-17d0-4695-8655-f142b35e0518"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 02 14:50:27 crc kubenswrapper[4732]: I0402 14:50:27.919418 4732 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2033c214-17d0-4695-8655-f142b35e0518-must-gather-output\") on node \"crc\" DevicePath \"\""
Apr 02 14:50:28 crc kubenswrapper[4732]: I0402 14:50:28.714089 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2033c214-17d0-4695-8655-f142b35e0518" path="/var/lib/kubelet/pods/2033c214-17d0-4695-8655-f142b35e0518/volumes"
Apr 02 14:50:54 crc kubenswrapper[4732]: I0402 14:50:54.321339 4732 scope.go:117] "RemoveContainer" containerID="1364bdd53c1e58011b0eb407960d1ef555f311e2787030d306fb507fcf225f62"
Apr 02 14:50:54 crc kubenswrapper[4732]: I0402 14:50:54.343282 4732 scope.go:117] "RemoveContainer" containerID="c94186b4ff4b3c12330241a6b42c436be8cd92cf4158eea1ea75938bfed5598c"
Apr 02 14:51:01 crc kubenswrapper[4732]: I0402 14:51:01.925082 4732 patch_prober.go:28] interesting pod/machine-config-daemon-6vtmw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Apr 02 14:51:01 crc kubenswrapper[4732]: I0402 14:51:01.925664 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Apr 02 14:51:31 crc kubenswrapper[4732]: I0402 14:51:31.924668 4732 patch_prober.go:28] interesting pod/machine-config-daemon-6vtmw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Apr 02 14:51:31 crc kubenswrapper[4732]: I0402 14:51:31.926442 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Apr 02 14:51:41 crc kubenswrapper[4732]: I0402 14:51:41.334655 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gwjw2"]
Apr 02 14:51:41 crc kubenswrapper[4732]: E0402 14:51:41.335919 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2033c214-17d0-4695-8655-f142b35e0518" containerName="gather"
Apr 02 14:51:41 crc kubenswrapper[4732]: I0402 14:51:41.335943 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="2033c214-17d0-4695-8655-f142b35e0518" containerName="gather"
Apr 02 14:51:41 crc kubenswrapper[4732]: E0402 14:51:41.335991 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2033c214-17d0-4695-8655-f142b35e0518" containerName="copy"
Apr 02 14:51:41 crc kubenswrapper[4732]: I0402 14:51:41.336002 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="2033c214-17d0-4695-8655-f142b35e0518" containerName="copy"
Apr 02 14:51:41 crc kubenswrapper[4732]: E0402 14:51:41.336017 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a24efd3-0bf8-4d67-a529-5c8174ca09fd" containerName="oc"
Apr 02 14:51:41 crc kubenswrapper[4732]: I0402 14:51:41.336030 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a24efd3-0bf8-4d67-a529-5c8174ca09fd" containerName="oc"
Apr 02 14:51:41 crc kubenswrapper[4732]: I0402 14:51:41.336320 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a24efd3-0bf8-4d67-a529-5c8174ca09fd" containerName="oc"
Apr 02 14:51:41 crc kubenswrapper[4732]: I0402 14:51:41.336365 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="2033c214-17d0-4695-8655-f142b35e0518" containerName="gather"
Apr 02 14:51:41 crc kubenswrapper[4732]: I0402 14:51:41.336384 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="2033c214-17d0-4695-8655-f142b35e0518" containerName="copy"
Apr 02 14:51:41 crc kubenswrapper[4732]: I0402 14:51:41.338742 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gwjw2"
Apr 02 14:51:41 crc kubenswrapper[4732]: I0402 14:51:41.352085 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gwjw2"]
Apr 02 14:51:41 crc kubenswrapper[4732]: I0402 14:51:41.529432 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2cd0fc0-9018-4137-83c3-279f35bad40f-catalog-content\") pod \"certified-operators-gwjw2\" (UID: \"f2cd0fc0-9018-4137-83c3-279f35bad40f\") " pod="openshift-marketplace/certified-operators-gwjw2"
Apr 02 14:51:41 crc kubenswrapper[4732]: I0402 14:51:41.529521 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2cd0fc0-9018-4137-83c3-279f35bad40f-utilities\") pod \"certified-operators-gwjw2\" (UID: \"f2cd0fc0-9018-4137-83c3-279f35bad40f\") " pod="openshift-marketplace/certified-operators-gwjw2"
Apr 02 14:51:41 crc kubenswrapper[4732]: I0402 14:51:41.529564 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjlnq\" (UniqueName: \"kubernetes.io/projected/f2cd0fc0-9018-4137-83c3-279f35bad40f-kube-api-access-pjlnq\") pod \"certified-operators-gwjw2\" (UID: \"f2cd0fc0-9018-4137-83c3-279f35bad40f\") " pod="openshift-marketplace/certified-operators-gwjw2"
Apr 02 14:51:41 crc kubenswrapper[4732]: I0402 14:51:41.631811 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2cd0fc0-9018-4137-83c3-279f35bad40f-catalog-content\") pod \"certified-operators-gwjw2\" (UID: \"f2cd0fc0-9018-4137-83c3-279f35bad40f\") " pod="openshift-marketplace/certified-operators-gwjw2"
Apr 02 14:51:41 crc kubenswrapper[4732]: I0402 14:51:41.631896 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2cd0fc0-9018-4137-83c3-279f35bad40f-utilities\") pod \"certified-operators-gwjw2\" (UID: \"f2cd0fc0-9018-4137-83c3-279f35bad40f\") " pod="openshift-marketplace/certified-operators-gwjw2"
Apr 02 14:51:41 crc kubenswrapper[4732]: I0402 14:51:41.631942 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjlnq\" (UniqueName: \"kubernetes.io/projected/f2cd0fc0-9018-4137-83c3-279f35bad40f-kube-api-access-pjlnq\") pod \"certified-operators-gwjw2\" (UID: \"f2cd0fc0-9018-4137-83c3-279f35bad40f\") " pod="openshift-marketplace/certified-operators-gwjw2"
Apr 02 14:51:41 crc kubenswrapper[4732]: I0402 14:51:41.632422 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2cd0fc0-9018-4137-83c3-279f35bad40f-catalog-content\") pod \"certified-operators-gwjw2\" (UID: \"f2cd0fc0-9018-4137-83c3-279f35bad40f\") " pod="openshift-marketplace/certified-operators-gwjw2"
Apr 02 14:51:41 crc kubenswrapper[4732]: I0402 14:51:41.632869 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2cd0fc0-9018-4137-83c3-279f35bad40f-utilities\") pod \"certified-operators-gwjw2\" (UID: \"f2cd0fc0-9018-4137-83c3-279f35bad40f\") " pod="openshift-marketplace/certified-operators-gwjw2"
Apr 02 14:51:41 crc kubenswrapper[4732]: I0402 14:51:41.659469 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjlnq\" (UniqueName: \"kubernetes.io/projected/f2cd0fc0-9018-4137-83c3-279f35bad40f-kube-api-access-pjlnq\") pod \"certified-operators-gwjw2\" (UID: \"f2cd0fc0-9018-4137-83c3-279f35bad40f\") " pod="openshift-marketplace/certified-operators-gwjw2"
Apr 02 14:51:41 crc kubenswrapper[4732]: I0402 14:51:41.662316 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gwjw2"
Apr 02 14:51:42 crc kubenswrapper[4732]: I0402 14:51:41.999912 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gwjw2"]
Apr 02 14:51:42 crc kubenswrapper[4732]: I0402 14:51:42.451885 4732 generic.go:334] "Generic (PLEG): container finished" podID="f2cd0fc0-9018-4137-83c3-279f35bad40f" containerID="4716b2c7663b4ed395c28f1554b4f3f32fc9549da124336abfab89089bb514cb" exitCode=0
Apr 02 14:51:42 crc kubenswrapper[4732]: I0402 14:51:42.451941 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gwjw2" event={"ID":"f2cd0fc0-9018-4137-83c3-279f35bad40f","Type":"ContainerDied","Data":"4716b2c7663b4ed395c28f1554b4f3f32fc9549da124336abfab89089bb514cb"}
Apr 02 14:51:42 crc kubenswrapper[4732]: I0402 14:51:42.451970 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gwjw2" event={"ID":"f2cd0fc0-9018-4137-83c3-279f35bad40f","Type":"ContainerStarted","Data":"d57df54e8326243c24fdca52db7c624e2a2cac8fa93cce93b23cefd3b2eff603"}
Apr 02 14:51:42 crc kubenswrapper[4732]: I0402 14:51:42.454304 4732 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 02 14:51:44 crc kubenswrapper[4732]: I0402 14:51:44.475724 4732 generic.go:334] "Generic (PLEG): container finished" podID="f2cd0fc0-9018-4137-83c3-279f35bad40f" containerID="5c021472ccc9144db356d10ccd85fcb252602f5a0857c7a8c4f368db7ac0ae89" exitCode=0
Apr 02 14:51:44 crc kubenswrapper[4732]: I0402 14:51:44.475783 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gwjw2" event={"ID":"f2cd0fc0-9018-4137-83c3-279f35bad40f","Type":"ContainerDied","Data":"5c021472ccc9144db356d10ccd85fcb252602f5a0857c7a8c4f368db7ac0ae89"}
Apr 02 14:51:45 crc kubenswrapper[4732]: I0402 14:51:45.486301 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gwjw2" event={"ID":"f2cd0fc0-9018-4137-83c3-279f35bad40f","Type":"ContainerStarted","Data":"3f818fb28539d9bf38e32f2ca4a07edfc76a62e069624376ec53101686875d44"}
Apr 02 14:51:45 crc kubenswrapper[4732]: I0402 14:51:45.503778 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gwjw2" podStartSLOduration=2.102140075 podStartE2EDuration="4.503760376s" podCreationTimestamp="2026-04-02 14:51:41 +0000 UTC" firstStartedPulling="2026-04-02 14:51:42.453879901 +0000 UTC m=+4459.358287494" lastFinishedPulling="2026-04-02 14:51:44.855500242 +0000 UTC m=+4461.759907795" observedRunningTime="2026-04-02 14:51:45.49951748 +0000 UTC m=+4462.403925063" watchObservedRunningTime="2026-04-02 14:51:45.503760376 +0000 UTC m=+4462.408167929"
Apr 02 14:51:51 crc kubenswrapper[4732]: I0402 14:51:51.662511 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gwjw2"
Apr 02 14:51:51 crc kubenswrapper[4732]: I0402 14:51:51.663889 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gwjw2"
Apr 02 14:51:51 crc kubenswrapper[4732]: I0402 14:51:51.726426 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gwjw2"
Apr 02 14:51:52 crc kubenswrapper[4732]: I0402 14:51:52.629244 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gwjw2"
Apr 02 14:51:52 crc kubenswrapper[4732]: I0402 14:51:52.694272 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gwjw2"]
Apr 02 14:51:54 crc kubenswrapper[4732]: I0402 14:51:54.500366 4732 scope.go:117] "RemoveContainer" containerID="f41c5a3ef3d9fb359dd941a2a370b29a00694a7b11300f9f2cdc6823fee02107"
Apr 02 14:51:54 crc kubenswrapper[4732]: I0402 14:51:54.575261 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gwjw2" podUID="f2cd0fc0-9018-4137-83c3-279f35bad40f" containerName="registry-server" containerID="cri-o://3f818fb28539d9bf38e32f2ca4a07edfc76a62e069624376ec53101686875d44" gracePeriod=2
Apr 02 14:51:55 crc kubenswrapper[4732]: I0402 14:51:55.587889 4732 generic.go:334] "Generic (PLEG): container finished" podID="f2cd0fc0-9018-4137-83c3-279f35bad40f" containerID="3f818fb28539d9bf38e32f2ca4a07edfc76a62e069624376ec53101686875d44" exitCode=0
Apr 02 14:51:55 crc kubenswrapper[4732]: I0402 14:51:55.587969 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gwjw2" event={"ID":"f2cd0fc0-9018-4137-83c3-279f35bad40f","Type":"ContainerDied","Data":"3f818fb28539d9bf38e32f2ca4a07edfc76a62e069624376ec53101686875d44"}
Apr 02 14:51:55 crc kubenswrapper[4732]: I0402 14:51:55.588245 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gwjw2" event={"ID":"f2cd0fc0-9018-4137-83c3-279f35bad40f","Type":"ContainerDied","Data":"d57df54e8326243c24fdca52db7c624e2a2cac8fa93cce93b23cefd3b2eff603"}
Apr 02 14:51:55 crc kubenswrapper[4732]: I0402 14:51:55.588265 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d57df54e8326243c24fdca52db7c624e2a2cac8fa93cce93b23cefd3b2eff603"
Apr 02 14:51:55 crc kubenswrapper[4732]: I0402 14:51:55.618196 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gwjw2"
Apr 02 14:51:55 crc kubenswrapper[4732]: I0402 14:51:55.712854 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2cd0fc0-9018-4137-83c3-279f35bad40f-utilities\") pod \"f2cd0fc0-9018-4137-83c3-279f35bad40f\" (UID: \"f2cd0fc0-9018-4137-83c3-279f35bad40f\") "
Apr 02 14:51:55 crc kubenswrapper[4732]: I0402 14:51:55.712924 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjlnq\" (UniqueName: \"kubernetes.io/projected/f2cd0fc0-9018-4137-83c3-279f35bad40f-kube-api-access-pjlnq\") pod \"f2cd0fc0-9018-4137-83c3-279f35bad40f\" (UID: \"f2cd0fc0-9018-4137-83c3-279f35bad40f\") "
Apr 02 14:51:55 crc kubenswrapper[4732]: I0402 14:51:55.713029 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2cd0fc0-9018-4137-83c3-279f35bad40f-catalog-content\") pod \"f2cd0fc0-9018-4137-83c3-279f35bad40f\" (UID: \"f2cd0fc0-9018-4137-83c3-279f35bad40f\") "
Apr 02 14:51:55 crc kubenswrapper[4732]: I0402 14:51:55.713939 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2cd0fc0-9018-4137-83c3-279f35bad40f-utilities" (OuterVolumeSpecName: "utilities") pod "f2cd0fc0-9018-4137-83c3-279f35bad40f" (UID: "f2cd0fc0-9018-4137-83c3-279f35bad40f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 02 14:51:55 crc kubenswrapper[4732]: I0402 14:51:55.719209 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2cd0fc0-9018-4137-83c3-279f35bad40f-kube-api-access-pjlnq" (OuterVolumeSpecName: "kube-api-access-pjlnq") pod "f2cd0fc0-9018-4137-83c3-279f35bad40f" (UID: "f2cd0fc0-9018-4137-83c3-279f35bad40f"). InnerVolumeSpecName "kube-api-access-pjlnq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 02 14:51:55 crc kubenswrapper[4732]: I0402 14:51:55.771840 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2cd0fc0-9018-4137-83c3-279f35bad40f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f2cd0fc0-9018-4137-83c3-279f35bad40f" (UID: "f2cd0fc0-9018-4137-83c3-279f35bad40f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 02 14:51:55 crc kubenswrapper[4732]: I0402 14:51:55.815223 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjlnq\" (UniqueName: \"kubernetes.io/projected/f2cd0fc0-9018-4137-83c3-279f35bad40f-kube-api-access-pjlnq\") on node \"crc\" DevicePath \"\""
Apr 02 14:51:55 crc kubenswrapper[4732]: I0402 14:51:55.815546 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2cd0fc0-9018-4137-83c3-279f35bad40f-catalog-content\") on node \"crc\" DevicePath \"\""
Apr 02 14:51:55 crc kubenswrapper[4732]: I0402 14:51:55.815676 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2cd0fc0-9018-4137-83c3-279f35bad40f-utilities\") on node \"crc\" DevicePath \"\""
Apr 02 14:51:56 crc kubenswrapper[4732]: I0402 14:51:56.598107 4732 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/certified-operators-gwjw2" Apr 02 14:51:56 crc kubenswrapper[4732]: I0402 14:51:56.628377 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gwjw2"] Apr 02 14:51:56 crc kubenswrapper[4732]: I0402 14:51:56.637024 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gwjw2"] Apr 02 14:51:56 crc kubenswrapper[4732]: I0402 14:51:56.690155 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2cd0fc0-9018-4137-83c3-279f35bad40f" path="/var/lib/kubelet/pods/f2cd0fc0-9018-4137-83c3-279f35bad40f/volumes" Apr 02 14:52:00 crc kubenswrapper[4732]: I0402 14:52:00.152407 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29585692-8rh26"] Apr 02 14:52:00 crc kubenswrapper[4732]: E0402 14:52:00.153644 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2cd0fc0-9018-4137-83c3-279f35bad40f" containerName="extract-content" Apr 02 14:52:00 crc kubenswrapper[4732]: I0402 14:52:00.153665 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2cd0fc0-9018-4137-83c3-279f35bad40f" containerName="extract-content" Apr 02 14:52:00 crc kubenswrapper[4732]: E0402 14:52:00.153678 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2cd0fc0-9018-4137-83c3-279f35bad40f" containerName="extract-utilities" Apr 02 14:52:00 crc kubenswrapper[4732]: I0402 14:52:00.153687 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2cd0fc0-9018-4137-83c3-279f35bad40f" containerName="extract-utilities" Apr 02 14:52:00 crc kubenswrapper[4732]: E0402 14:52:00.153705 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2cd0fc0-9018-4137-83c3-279f35bad40f" containerName="registry-server" Apr 02 14:52:00 crc kubenswrapper[4732]: I0402 14:52:00.153713 4732 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f2cd0fc0-9018-4137-83c3-279f35bad40f" containerName="registry-server" Apr 02 14:52:00 crc kubenswrapper[4732]: I0402 14:52:00.153962 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2cd0fc0-9018-4137-83c3-279f35bad40f" containerName="registry-server" Apr 02 14:52:00 crc kubenswrapper[4732]: I0402 14:52:00.154650 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585692-8rh26" Apr 02 14:52:00 crc kubenswrapper[4732]: I0402 14:52:00.156952 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-69g42" Apr 02 14:52:00 crc kubenswrapper[4732]: I0402 14:52:00.157013 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 02 14:52:00 crc kubenswrapper[4732]: I0402 14:52:00.157203 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 02 14:52:00 crc kubenswrapper[4732]: I0402 14:52:00.168352 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585692-8rh26"] Apr 02 14:52:00 crc kubenswrapper[4732]: I0402 14:52:00.199975 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99x7m\" (UniqueName: \"kubernetes.io/projected/fde6080d-581a-430d-a3c7-73b32d2fb094-kube-api-access-99x7m\") pod \"auto-csr-approver-29585692-8rh26\" (UID: \"fde6080d-581a-430d-a3c7-73b32d2fb094\") " pod="openshift-infra/auto-csr-approver-29585692-8rh26" Apr 02 14:52:00 crc kubenswrapper[4732]: I0402 14:52:00.302449 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99x7m\" (UniqueName: \"kubernetes.io/projected/fde6080d-581a-430d-a3c7-73b32d2fb094-kube-api-access-99x7m\") pod \"auto-csr-approver-29585692-8rh26\" (UID: \"fde6080d-581a-430d-a3c7-73b32d2fb094\") " 
pod="openshift-infra/auto-csr-approver-29585692-8rh26" Apr 02 14:52:00 crc kubenswrapper[4732]: I0402 14:52:00.325094 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99x7m\" (UniqueName: \"kubernetes.io/projected/fde6080d-581a-430d-a3c7-73b32d2fb094-kube-api-access-99x7m\") pod \"auto-csr-approver-29585692-8rh26\" (UID: \"fde6080d-581a-430d-a3c7-73b32d2fb094\") " pod="openshift-infra/auto-csr-approver-29585692-8rh26" Apr 02 14:52:00 crc kubenswrapper[4732]: I0402 14:52:00.475296 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585692-8rh26" Apr 02 14:52:00 crc kubenswrapper[4732]: I0402 14:52:00.913755 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585692-8rh26"] Apr 02 14:52:01 crc kubenswrapper[4732]: I0402 14:52:01.653711 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585692-8rh26" event={"ID":"fde6080d-581a-430d-a3c7-73b32d2fb094","Type":"ContainerStarted","Data":"17895212a3e8378e5679820d6d08110392ea2715541715a454ecd598eab1c5a6"} Apr 02 14:52:01 crc kubenswrapper[4732]: I0402 14:52:01.924419 4732 patch_prober.go:28] interesting pod/machine-config-daemon-6vtmw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 02 14:52:01 crc kubenswrapper[4732]: I0402 14:52:01.924768 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 02 14:52:01 crc kubenswrapper[4732]: I0402 14:52:01.924822 4732 kubelet.go:2542] "SyncLoop 
(probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" Apr 02 14:52:01 crc kubenswrapper[4732]: I0402 14:52:01.925664 4732 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c6ec8830c9c22d8d796c164088d1dd9b788e6c5f062d2aea3a3dd2f0912e8bde"} pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Apr 02 14:52:01 crc kubenswrapper[4732]: I0402 14:52:01.925743 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" containerName="machine-config-daemon" containerID="cri-o://c6ec8830c9c22d8d796c164088d1dd9b788e6c5f062d2aea3a3dd2f0912e8bde" gracePeriod=600 Apr 02 14:52:02 crc kubenswrapper[4732]: I0402 14:52:02.664483 4732 generic.go:334] "Generic (PLEG): container finished" podID="fde6080d-581a-430d-a3c7-73b32d2fb094" containerID="4df99bee95e861e842641c51f68ee60bd45e8568fc89a41941b0c278f30db119" exitCode=0 Apr 02 14:52:02 crc kubenswrapper[4732]: I0402 14:52:02.664533 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585692-8rh26" event={"ID":"fde6080d-581a-430d-a3c7-73b32d2fb094","Type":"ContainerDied","Data":"4df99bee95e861e842641c51f68ee60bd45e8568fc89a41941b0c278f30db119"} Apr 02 14:52:02 crc kubenswrapper[4732]: I0402 14:52:02.667443 4732 generic.go:334] "Generic (PLEG): container finished" podID="38409e5e-4545-49da-8f6c-4bfb30582878" containerID="c6ec8830c9c22d8d796c164088d1dd9b788e6c5f062d2aea3a3dd2f0912e8bde" exitCode=0 Apr 02 14:52:02 crc kubenswrapper[4732]: I0402 14:52:02.667526 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" 
event={"ID":"38409e5e-4545-49da-8f6c-4bfb30582878","Type":"ContainerDied","Data":"c6ec8830c9c22d8d796c164088d1dd9b788e6c5f062d2aea3a3dd2f0912e8bde"} Apr 02 14:52:02 crc kubenswrapper[4732]: I0402 14:52:02.667707 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" event={"ID":"38409e5e-4545-49da-8f6c-4bfb30582878","Type":"ContainerStarted","Data":"9cd0ab474e404be5d2d542f4d4d368708d9fe4a0edb815835f24212829ce96af"} Apr 02 14:52:02 crc kubenswrapper[4732]: I0402 14:52:02.667736 4732 scope.go:117] "RemoveContainer" containerID="4cc9425b6dde0d264f8b72482c5821ad9fabf57354100ab51583dde35da85ea0" Apr 02 14:52:04 crc kubenswrapper[4732]: I0402 14:52:04.091474 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585692-8rh26" Apr 02 14:52:04 crc kubenswrapper[4732]: I0402 14:52:04.184677 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99x7m\" (UniqueName: \"kubernetes.io/projected/fde6080d-581a-430d-a3c7-73b32d2fb094-kube-api-access-99x7m\") pod \"fde6080d-581a-430d-a3c7-73b32d2fb094\" (UID: \"fde6080d-581a-430d-a3c7-73b32d2fb094\") " Apr 02 14:52:04 crc kubenswrapper[4732]: I0402 14:52:04.195149 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fde6080d-581a-430d-a3c7-73b32d2fb094-kube-api-access-99x7m" (OuterVolumeSpecName: "kube-api-access-99x7m") pod "fde6080d-581a-430d-a3c7-73b32d2fb094" (UID: "fde6080d-581a-430d-a3c7-73b32d2fb094"). InnerVolumeSpecName "kube-api-access-99x7m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:52:04 crc kubenswrapper[4732]: I0402 14:52:04.287445 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99x7m\" (UniqueName: \"kubernetes.io/projected/fde6080d-581a-430d-a3c7-73b32d2fb094-kube-api-access-99x7m\") on node \"crc\" DevicePath \"\"" Apr 02 14:52:04 crc kubenswrapper[4732]: I0402 14:52:04.699066 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585692-8rh26" event={"ID":"fde6080d-581a-430d-a3c7-73b32d2fb094","Type":"ContainerDied","Data":"17895212a3e8378e5679820d6d08110392ea2715541715a454ecd598eab1c5a6"} Apr 02 14:52:04 crc kubenswrapper[4732]: I0402 14:52:04.699358 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17895212a3e8378e5679820d6d08110392ea2715541715a454ecd598eab1c5a6" Apr 02 14:52:04 crc kubenswrapper[4732]: I0402 14:52:04.699406 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585692-8rh26" Apr 02 14:52:05 crc kubenswrapper[4732]: I0402 14:52:05.159422 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29585686-fdtl8"] Apr 02 14:52:05 crc kubenswrapper[4732]: I0402 14:52:05.169177 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29585686-fdtl8"] Apr 02 14:52:06 crc kubenswrapper[4732]: I0402 14:52:06.691308 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0d13f1b-e44f-4e95-8f60-9da27a2a247a" path="/var/lib/kubelet/pods/a0d13f1b-e44f-4e95-8f60-9da27a2a247a/volumes" Apr 02 14:52:54 crc kubenswrapper[4732]: I0402 14:52:54.568471 4732 scope.go:117] "RemoveContainer" containerID="503d568eb55b92d13a248a42a5ed973002e5488ea17bb8b75bf7db78eb07e54a" Apr 02 14:54:00 crc kubenswrapper[4732]: I0402 14:54:00.155592 4732 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29585694-f9bvn"] Apr 02 14:54:00 crc kubenswrapper[4732]: E0402 14:54:00.156535 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fde6080d-581a-430d-a3c7-73b32d2fb094" containerName="oc" Apr 02 14:54:00 crc kubenswrapper[4732]: I0402 14:54:00.156547 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="fde6080d-581a-430d-a3c7-73b32d2fb094" containerName="oc" Apr 02 14:54:00 crc kubenswrapper[4732]: I0402 14:54:00.156730 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="fde6080d-581a-430d-a3c7-73b32d2fb094" containerName="oc" Apr 02 14:54:00 crc kubenswrapper[4732]: I0402 14:54:00.157548 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585694-f9bvn" Apr 02 14:54:00 crc kubenswrapper[4732]: I0402 14:54:00.160001 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 02 14:54:00 crc kubenswrapper[4732]: I0402 14:54:00.164038 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 02 14:54:00 crc kubenswrapper[4732]: I0402 14:54:00.164327 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-69g42" Apr 02 14:54:00 crc kubenswrapper[4732]: I0402 14:54:00.178718 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585694-f9bvn"] Apr 02 14:54:00 crc kubenswrapper[4732]: I0402 14:54:00.251139 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v58q5\" (UniqueName: \"kubernetes.io/projected/532fdf5b-c81e-4909-ad36-3b74e705bd65-kube-api-access-v58q5\") pod \"auto-csr-approver-29585694-f9bvn\" (UID: \"532fdf5b-c81e-4909-ad36-3b74e705bd65\") " pod="openshift-infra/auto-csr-approver-29585694-f9bvn" Apr 02 14:54:00 crc kubenswrapper[4732]: I0402 
14:54:00.353243 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v58q5\" (UniqueName: \"kubernetes.io/projected/532fdf5b-c81e-4909-ad36-3b74e705bd65-kube-api-access-v58q5\") pod \"auto-csr-approver-29585694-f9bvn\" (UID: \"532fdf5b-c81e-4909-ad36-3b74e705bd65\") " pod="openshift-infra/auto-csr-approver-29585694-f9bvn" Apr 02 14:54:00 crc kubenswrapper[4732]: I0402 14:54:00.487439 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v58q5\" (UniqueName: \"kubernetes.io/projected/532fdf5b-c81e-4909-ad36-3b74e705bd65-kube-api-access-v58q5\") pod \"auto-csr-approver-29585694-f9bvn\" (UID: \"532fdf5b-c81e-4909-ad36-3b74e705bd65\") " pod="openshift-infra/auto-csr-approver-29585694-f9bvn" Apr 02 14:54:00 crc kubenswrapper[4732]: I0402 14:54:00.493877 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585694-f9bvn" Apr 02 14:54:01 crc kubenswrapper[4732]: I0402 14:54:01.088277 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29585694-f9bvn"] Apr 02 14:54:01 crc kubenswrapper[4732]: I0402 14:54:01.923380 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585694-f9bvn" event={"ID":"532fdf5b-c81e-4909-ad36-3b74e705bd65","Type":"ContainerStarted","Data":"70bff4591857ed7a144d9c565f4986da9f27fff498c800f83d12e61b4e7e772c"} Apr 02 14:54:02 crc kubenswrapper[4732]: I0402 14:54:02.934090 4732 generic.go:334] "Generic (PLEG): container finished" podID="532fdf5b-c81e-4909-ad36-3b74e705bd65" containerID="7d8de1209f0fa7d08b065722b74b2cb1c6438c220cd8e7265271187eff3fd534" exitCode=0 Apr 02 14:54:02 crc kubenswrapper[4732]: I0402 14:54:02.934254 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585694-f9bvn" 
event={"ID":"532fdf5b-c81e-4909-ad36-3b74e705bd65","Type":"ContainerDied","Data":"7d8de1209f0fa7d08b065722b74b2cb1c6438c220cd8e7265271187eff3fd534"} Apr 02 14:54:04 crc kubenswrapper[4732]: I0402 14:54:04.363770 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585694-f9bvn" Apr 02 14:54:04 crc kubenswrapper[4732]: I0402 14:54:04.440489 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v58q5\" (UniqueName: \"kubernetes.io/projected/532fdf5b-c81e-4909-ad36-3b74e705bd65-kube-api-access-v58q5\") pod \"532fdf5b-c81e-4909-ad36-3b74e705bd65\" (UID: \"532fdf5b-c81e-4909-ad36-3b74e705bd65\") " Apr 02 14:54:04 crc kubenswrapper[4732]: I0402 14:54:04.449455 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/532fdf5b-c81e-4909-ad36-3b74e705bd65-kube-api-access-v58q5" (OuterVolumeSpecName: "kube-api-access-v58q5") pod "532fdf5b-c81e-4909-ad36-3b74e705bd65" (UID: "532fdf5b-c81e-4909-ad36-3b74e705bd65"). InnerVolumeSpecName "kube-api-access-v58q5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 02 14:54:04 crc kubenswrapper[4732]: I0402 14:54:04.543382 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v58q5\" (UniqueName: \"kubernetes.io/projected/532fdf5b-c81e-4909-ad36-3b74e705bd65-kube-api-access-v58q5\") on node \"crc\" DevicePath \"\"" Apr 02 14:54:04 crc kubenswrapper[4732]: I0402 14:54:04.957284 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29585694-f9bvn" event={"ID":"532fdf5b-c81e-4909-ad36-3b74e705bd65","Type":"ContainerDied","Data":"70bff4591857ed7a144d9c565f4986da9f27fff498c800f83d12e61b4e7e772c"} Apr 02 14:54:04 crc kubenswrapper[4732]: I0402 14:54:04.957792 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70bff4591857ed7a144d9c565f4986da9f27fff498c800f83d12e61b4e7e772c" Apr 02 14:54:04 crc kubenswrapper[4732]: I0402 14:54:04.957889 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29585694-f9bvn" Apr 02 14:54:05 crc kubenswrapper[4732]: I0402 14:54:05.450480 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29585688-xw4zs"] Apr 02 14:54:05 crc kubenswrapper[4732]: I0402 14:54:05.459102 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29585688-xw4zs"] Apr 02 14:54:06 crc kubenswrapper[4732]: I0402 14:54:06.701383 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48c33b2c-08b9-49be-9275-14660b4de57d" path="/var/lib/kubelet/pods/48c33b2c-08b9-49be-9275-14660b4de57d/volumes" Apr 02 14:54:31 crc kubenswrapper[4732]: I0402 14:54:31.924679 4732 patch_prober.go:28] interesting pod/machine-config-daemon-6vtmw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Apr 02 14:54:31 crc kubenswrapper[4732]: I0402 14:54:31.925202 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 02 14:54:54 crc kubenswrapper[4732]: I0402 14:54:54.786593 4732 scope.go:117] "RemoveContainer" containerID="46f44336eb7395e1a1e82ff8cacc5899bdd2e483c6ad0652ef728f274a4d9ec9" Apr 02 14:55:01 crc kubenswrapper[4732]: I0402 14:55:01.924717 4732 patch_prober.go:28] interesting pod/machine-config-daemon-6vtmw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 02 14:55:01 crc kubenswrapper[4732]: I0402 14:55:01.925230 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6vtmw" podUID="38409e5e-4545-49da-8f6c-4bfb30582878" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" var/home/core/zuul-output/logs/crc-cloud-workdir-crc-all-logs.tar.gz0000644000175000000000000000005515163501543024450 0ustar coreroot  Om77'(var/home/core/zuul-output/logs/crc-cloud/0000755000175000000000000000000015163501543017365 5ustar corerootvar/home/core/zuul-output/artifacts/0000755000175000017500000000000015163470066016515 5ustar corecorevar/home/core/zuul-output/docs/0000755000175000017500000000000015163470066015465 5ustar corecore